PC Report on Government Services – Approach to Performance Measurement

As the old saying goes, you can’t manage what you can’t measure. Perhaps that is why the ‘Approach to performance measurement’ report is the first part of the Productivity Commission’s 2018 Report on Government Services series.

This first component of the report focuses on the importance of measurement, the scope of the report, and its methodology. It says that measuring the performance of government service delivery, and reporting on it publicly, creates incentives for better performance by:

  • helping to clarify government objectives and responsibilities.
  • promoting analysis of the relationships between agencies and between programs, enabling governments to coordinate policy within and across agencies.
  • making performance more transparent through informing the community.
  • providing governments with indicators of policy and program performance over time.
  • encouraging ongoing performance improvements in service delivery and effectiveness, by highlighting improvements and innovation.

A key focus of the Report is on measuring the comparative performance of government services across Australia’s various jurisdictions. “Reporting on comparative performance can provide incentives for service providers to improve performance where there is no or little competition, and provides a level of accountability to consumers, who have little opportunity to express their preferences by accessing services elsewhere.”

The Report focuses on broadly defined ‘social services’, which aim to enhance the wellbeing of people and communities by improving largely intangible outcomes, such as health, education and community safety. It does this across 17 service areas and seven broad policy areas.

The amounts involved are staggering, and dwarf those of individual private sector companies. Government recurrent expenditure on services in Australia is approximately $224 billion. This is 71 per cent of all government recurrent expenditure, and represents around 13 per cent of Australia’s Gross Domestic Product (GDP).

The largest component is health, at $96.7 billion in 2017. It is followed by child care, education and training ($70.8 billion), and community services ($31.2 billion). All other expenditure totals $25 billion.

The report points out that governments use a mix of methods to deliver these services to the community: directly as a service provider; by funding external providers through grants or the purchase of services; and by subsidising users through vouchers or cash payments to purchase services from external providers.

“As non‑government organisations are often involved in the delivery of services, funding from government may not meet the full cost of delivering a service to the community. Since the purpose of the Report is to provide information to assist governments in making decisions about the effectiveness and efficiency of government purchase or supply of services, it is confined to the cost to government.”

Similarly, it does not provide detailed information on general government income support. For example, the Report covers aged care but not the aged pension, and child care but not family payments.

Each of the 17 service areas in the Report has a performance indicator framework. Each framework reflects the process through which inputs are transformed into outputs and outcomes in order to achieve desired objectives. Service providers transform resources (inputs) into services (outputs). The rate at which resources are used to make this transformation is known as ‘technical efficiency’.

Each service area has a set of objectives against which performance is reported. The structure of objectives is consistent across service areas and has three components:

  • The high-level objectives or vision for the service, which describes the desired impact of the service area on individuals and the wider community.
  • The service delivery objectives, which highlight the characteristics of services that will enable them to be effective.
  • The objectives for services to be provided in an equitable and efficient manner.

The report has a number of ‘guiding principles’:

  • Comprehensiveness — performance should be assessed against all important objectives.
  • Streamlined reporting — a concise set of information about performance against the identified objectives of a sector or service should be included.
  • A focus on outcomes — high‑level performance indicators should focus on outcomes, reflecting whether service objectives have been met.
  • Hierarchical — high-level outcome indicators should be underpinned by lower‑level output indicators and additional disaggregated data where a greater level of detail is required.
  • Meaningful — reported data must measure what it claims to measure. Proxy indicators should be clearly identified and the development of more meaningful indicators to replace proxy indicators is encouraged where practicable.
  • Comparability — data should be comparable across jurisdictions and over time. However, comparability may be affected by progressive data availability. Where data are not yet comparable across jurisdictions, time series data within jurisdictions is particularly important.
  • Completeness and progressive data availability — aim to report data for all jurisdictions (where relevant), but where this is not possible report data for those jurisdictions that can report (not waiting until data are available for all).
  • Timeliness — data published are the most recent possible. Incremental reporting when data become available, and then updating all relevant data over recent years, is preferable to waiting until all data are available.
  • Use acceptable performance indicators — relevant performance indicators that are already in use in other national reporting arrangements are used wherever appropriate.
  • Understandable — data must be reported in a way that is meaningful to a broad audience, many of whom will not have technical or statistical expertise.
  • Accurate — data published will be of sufficient accuracy to provide confidence in analysis based on information in the Report.
  • Validation — data can vary in the extent to which they have been reviewed or validated (at a minimum, all data are endorsed by the provider and subjected to peer review by the Working Group for the relevant service area).
  • Full costing of services — efficiency estimates should reflect the full costs to government (where possible).
