Representing quality private education
providers in Australia

Confusion and governance

Monday, September 26, 2016

This week, a different angle… let’s talk about data transparency as a key to quality student outcomes.

Last Thursday the Productivity Commission issued its Preliminary Findings Report into Introducing Competition and Informed User Choice into Human Services: Identifying Sectors for Reform. This report builds on the Competition Policy Review (Harper Review) conducted in 2015, which included a section considering competition, contestability and user choice in the human services sector.

While the Productivity Commission report mostly focuses on identifying the sectors that should be a priority for greater contestability and user choice, it does highlight the failings of the VET FEE-HELP program as an example of how ‘not to’ deliver contestability in the human services sector.

The federal government’s failure to provide appropriate stewardship of the program, along with weak price signals and a lack of accessible information, is identified as a key reason for the VET FEE-HELP mess.

This element is hardly new. The concept of ‘informed’ user choice, though, is prominent in the report. As the report says, “Increased availability and use of human services data is necessary to realise the potential benefits from greater competition, contestability and user choice.

To make informed choices, users need to understand the range of services that are available to them. Providers require data to analyse and improve their services. Governments need data to identify community needs and expectations, the demand for services and gaps in service provision”. I think this is an area where more should and must be done to inform higher education and VET students in particular.

Of course, there is no shortage of data. Only a few weeks ago we saw (belatedly) the release of the 2015 full year higher education student data. It outlined the continuing strong growth of private sector providers with student numbers up 10.9 per cent on 2014, compared to a 2.7 per cent gain across the sector. An ACPET Data Snapshot provides a detailed analysis into the make-up of private higher education provision in 2015.

Some very interesting data, published by the Department of Education and Training for only the second year, was the success rate for each provider. So as well as the data on their student numbers, we can now see how successful each of these (Higher Education Support Act approved) providers was in transitioning students through their courses. While it is not the same thing as the attrition rate data for the universities, at least we now have this important outcomes measure for these providers. Combined with the information available through the Quality Indicators for Learning and Teaching (QILT), we have some information that can help students make better informed choices.

In the VET sector, provider data is generally only published at the aggregate level. So while we know from recent reports the quantum of training across a range of dimensions, and that completion rates are improving, we don’t have provider-level detail showing which providers are delivering strong outcomes and which may not be achieving the best for their students.

The availability of VET data is very much under consideration through the recent Review of the Vocational Education and Training (VET) Data Policy – essentially, the policy that governs the collection and dissemination of VET data.

The current Policy is symptomatic of governance in VET – confusing.

I hope you can follow: providers largely send their government-funded student data to the relevant states and territories, which on-forward it to NCVER on a periodic basis rather than immediately; fee-for-service data is either forwarded directly to NCVER or sent to the provider’s respective state/territory for on-forwarding to NCVER; and all of this operates under a range of different reporting requirements and timeframes.

Seriously, this is the model.

Of particular interest, given the current debate about quality and outcomes, are the current restrictions that leave NCVER severely hamstrung in its ability to share this information with other government agencies and regulators, students and the broader community.

However, the recent damage to the sector caused by some program failures and the actions of a minority of providers means we need to do more to improve the transparency of the VET sector so that students, governments and the community can have real confidence in its quality and outcomes. That’s why I believe there need to be some fundamental changes.

It is time for the publication of VET data that is meaningful to the users of our system. This means information on program outcomes and student and employer satisfaction (where relevant). The current aggregated data on these measures adds little to informing student choice or the broader market. The information available on various Australian and state and territory web sites is mostly descriptive.

It is also important to consider how the information is published. In an age of digital disruption, it is not reasonable to expect students to navigate complex government-centric web sites and reports. A student-friendly solution involving interactive apps and web sites will be more effective. The approach taken in the higher education sector, including through the QILT initiative, provides guidance on a possible direction.

I should add that this is not an industry-wide view, though it does happen to be mine. There are legitimate industry concerns about the publication of simplistic or limited measures like completion rates, including the risk that this data is misinterpreted or that outcomes are manipulated.

Of course, the solution is leadership. To address these concerns and provide meaningful data for students and others, a broader suite of measures needs to be developed and reported at provider level. This could include student enrolments, students in training, cancellations, course and module completions, and employment and other outcomes, along with student satisfaction.

Noting some data limitations and the concerns that arise in relation to reporting fee-for-service activity, this approach could initially focus on government-funded activity, with providers able to ‘opt in’ their fee-for-service activity.

As I noted above, there is a range of reporting timeframes depending on program funding source and jurisdiction. For Total VET Activity there is annual reporting aligned to a specific reporting ‘window’. These arrangements are based on historical developments rather than a contemporary approach to data management. They are no longer fit-for-purpose. It is not satisfactory that we can’t produce point-in-time data reports at any time.

To address the complexity of reporting, there needs to be a single data reporting repository and the opportunity for real-time reporting by providers. That is, subject to meeting minimum reporting timeframes, providers should be able to forward data at a time that best aligns with their other student enrolment and management processes. That would mean a constantly updating data set and reporting to match.

Let’s hope the outcomes of the current VET Data Policy review recognise the importance of data availability in supporting informed choices, assisting providers to improve their services, and helping governments identify community needs and gaps in service provision.

Rod Camm
