Quality assessments: practice and perspectives

By Josie Misko, Sian Halliday-Wynes, John Stanwick, Sinan Gemici
Research report, 25 July 2014
ISBN 978 1 922056 90 0

Description

The quality and rigour of assessments in vocational education and training (VET) have been key concerns of VET policy-makers. These concerns raise questions about the credibility of VET qualifications and the competence of the graduates who hold them. The authors consider the issues surrounding assessment in certificate III qualifications in aged care, electro-technology (electrical) and business. Validation is seen as more important than moderation, which is still not fully understood or practised in a way that is commonly accepted by assessment experts and commentators. Moreover, the streamlining of recognition of prior learning is still not occurring to any great extent. The authors also explore VET practitioners' knowledge of assessment and how they apply it in practice, and identify tensions relating to where practitioners put their efforts in ensuring quality assessments.

Summary

About the research

The quality and rigour of assessments in vocational education and training (VET) have been key concerns for VET policy-makers, industry stakeholders, employers, and teachers and trainers in recent years. The issue of quality in assessments has implications for the credibility of VET qualifications and the competence of the graduates who hold these qualifications. In this study the authors investigate this issue by looking at certificate III qualifications in aged care, electro-technology (electrical) and business. The authors also explore practitioners' knowledge of assessment and how they apply it in practice. They identify some clear tensions relating to where practitioners put their efforts in ensuring quality assessments.

Key messages

  • Regulatory frameworks drive quality assessments in electrical and aged care qualifications. They are less critical for general business qualifications.
  • Practitioners understand the requirements for gathering sufficient evidence of practical skills and underpinning knowledge to establish competency against established performance criteria in training packages. Applying this in practice presents more of a challenge.
  • Practitioners also express low confidence in making fair, reliable, consistent and valid judgments about performance, particularly about ‘non-competent performance’.
  • Practitioners are more concerned with implementing processes for ensuring the relevance, clarity and user-friendliness of their assessment instruments than they are with moderating the results of assessments. The general view is that the use of rigorous up-front validation approaches minimises the need for moderation. Also challenging is ensuring that their assessment tools keep pace with changing legislative frameworks and standards.
  • External assessments conducted by regulatory authorities (or their equivalents) in electrical qualifications or external assessors in aged care or business qualifications standardise skill assessments. This helps to ensure comparability and consistency of performance to industry standards.
  • Employers’ time constraints and inadequate experience or expertise in specific units (especially those dealing with theoretical components) work against their increased involvement in assessment validation or in conducting assessments.
  • The streamlining of recognition of prior learning processes is limited because providers want to guard against non-compliance judgments from regulators and auditors.
  • Other than accelerating the progress of existing workers with considerable experience, there is little support for condensing the length of entry-level courses.

Rod Camm
Managing Director, NCVER

Executive summary

In recent years the quality and rigour of assessments in vocational education and training (VET) have been key concerns for VET policy-makers, industry stakeholders, employers, and teachers and trainers. These concerns have often been fuelled by perceptions that some providers in the sector are taking short cuts to qualifications. This has raised issues about the credibility of VET qualifications and the competence of the graduates who hold these qualifications. Assessment experts and commentators have identified the main issues as the lack of:

  • systematic and regular moderation and validation practices in training systems to ensure the consistency and validity of assessments
  • knowledge among VET practitioners about the processes and techniques of assessment (Clayton 2009).

A recent small-scale study (Halliday-Wynes & Misko 2013) by researchers from the National Centre for Vocational Education Research (NCVER) looked at the assessment approaches being used by private and public providers in three Australian states. The study focused on providers delivering qualifications in the aged care and childcare sectors, sectors where there has been a push to get large groups of workers to attain or upgrade their qualifications or specific units of competence to meet regulatory requirements. The study also investigated issues relating to the delivery of training and assessment qualifications. This research builds on the work already done by NCVER and others in this area and especially the considerable work recently done for the former National Quality Council (NQC) by the Centre for Work-based Education of Victoria University, Precision Consulting, and Bateman and Giles.

Research aims

The aim of this research is to better understand the extent to which quality (that is, effective and rigorous) assessments are implemented in qualifications in the aged care, business services and electro-technology (mainly electrical) industry sectors, as well as in training and assessment. We investigate these issues in the states of South Australia, Victoria, New South Wales and Queensland. The following research questions guided the collection and analysis of information.

  • What do practitioners understand about the need for quality assessments, including consistency and validity?
  • To what extent do organisations practise external validation through the use of external assessors and industry stakeholders (especially employers)? Are there any issues with involving employers in assessments?
  • How prevalent is the implementation of recognition of prior learning (RPL) assessments? To what extent are streamlined approaches to RPL being applied?
  • What is the impact of course timeframes on qualifications? Is there any justification for condensing course durations?
Information was collected via in-person and telephone interviews with education and training practitioners and students, as well as with government and industry advisors involved in one of the Independent Validation of Assessment projects conducted by the Council of Australian Governments (COAG). Desktop analyses of websites were conducted to investigate training providers' promotion of training and education courses, tuition fees and course durations. Analyses of national VET statistics were used to report on the uptake of recognition of prior learning, course durations and the level of student satisfaction with assessment experiences in VET courses. A mystery shopping exercise (where an interviewer calls institutions and asks for information about what is required to complete a qualification) was used to delve more deeply into issues of course duration.

Findings

What do practitioners understand about the need for quality assessments, including consistency and validity?

Trainers and assessors generally understand the accepted key criteria for effective and quality assessments. However, there seems to be variation in the extent to which moderation and validation practices are applied across providers. The terms ‘validation’ and ‘moderation’ still appear to cause some confusion: some trainers and assessors identified as validation practices what are clearly approaches to moderation, and vice versa. A measure of uncertainty surrounds the term moderation, with trainers and assessors citing the regular review of assessment tools and the development and application of marking guides as key moderation techniques, even though these are up-front activities rather than a process that ideally takes place after assessment. They are understood as ways to ensure consistency among assessors and clarity for students undertaking the assessments. Even when moderation is used as a post-assessment process for ensuring comparability between student results, it is rare for teachers and trainers to alter any assessment decisions; the only time this might occur is when trainers are uncertain about the performance of students. When post-assessment moderation does occur, it is normally for the purpose of modifying the assessment tools to ensure no issues arise on future occasions.

Providers are far more likely to claim they understand what is meant by the term validation and to give examples of both internal and external validation, the latter being the review of assessment tools and tests by other assessors, generally from other training providers, as well as by employers. Employers are especially involved in validation processes in courses such as aged care, where students undertake industry work placements. It is also common for employers to supervise work placements and provide feedback to registered training organisations (RTOs) on students’ practical performance. Regular workplace visits to verify assessments are another way in which training organisations maintain their links with industry practice, thereby validating or confirming the suitability of their assessments and assessment tools.

Providers also differ in their access to professional development activities. In the public system, providers often have access to professional development for their trainers and assessors, whereas trainers in private registered training organisations are more likely to be responsible for their own professional development. The sharing of information across providers is not uncommon, although there is a view that competition among providers delivering the Certificate III in Aged Care qualification tends to constrain full collaboration.

To what extent do organisations practise external validation through the use of external assessors and industry stakeholders (especially employers)? Are there any issues for involving employers in assessments?

In business courses, the prior or current practical workplace experience and knowledge of trainers and assessors who have worked in the industry, as well as of students who are existing workers, help to ensure that assessments are customised to current industry practice. The Capstone Test provides an external validation of skills acquisition in electrical apprenticeships because it is set by an independent body; some trainers hold the view that its integrity can be compromised if the test is conducted by the registered training organisation itself. In Victoria the external assessment for A-grade licensing is conducted by an external agency such as EPIC Industry Training Board, acting under the auspices of the regulatory authority that issues A-grade licences for electricians (Energy Safe Victoria). This external approach standardises the assessment of skills to ensure the comparability and consistency of performance to industry standards.

How easy is it to involve employers in the validation of assessments? It is clearly quite straightforward for some courses and difficult for others. The mandatory use of practical work placements in some courses requires registered training organisations to develop strong relationships with employers to ensure access to workplaces for their students. For these courses, providers report no issues in involving employers in assessment validation practices, provided this does not require intensive involvement. For electrical services programs, where students are in the workplace with workplace supervisors, it appears to be more difficult. In these instances providers generally believe that employers prefer to leave the process of assessment to the experts, claiming they do not have the time to spend on training documentation or validation. However, the need for employers to provide evidence of apprentice performance in e-profiles or hard-copy logbooks offers them an opportunity to be involved in the validation of on-the-job assessments. Time constraints and inadequate experience or lack of expertise in specific units also work against the increased involvement of employers in assessment validation or external assessments.

How prevalent is the implementation of recognition of prior learning assessments? To what extent are streamlined approaches to RPL being applied?

There is little emphasis on the recognition of prior learning for entry-level certificate III programs across the three sectors, although there is evidence of higher usage in business programs and among electrical apprentices transferring from the defence forces (navy) who already possess extensive knowledge of and experience in communications. Apart from the obligatory formal offer of recognition that all providers made prior to enrolment, there was little encouragement for students to undertake RPL. At the same time, practitioners reported that students are not always keen to participate in recognition assessments and that many want to start their courses from scratch. This is also confirmed by the students who provided information to this study. An investigation of 2010 commencements matched to 2010 and 2011 completions in the Certificate III in Aged Care, the Certificate III in Business and the Certificate IV in Training and Assessment finds that almost 7% of students in aged care and business courses undertake recognition of prior learning processes, while around four times this proportion (25%) receive recognition for the Certificate IV in Training and Assessment.

Providers varied in what they expected from the students who did request recognition of prior learning: some implemented a streamlined approach, while others required extensive documentation to support students’ claims. Less intensive approaches often relied on workplace observations, third-party reports and critical questioning techniques, while more complex processes required extensive mapping of units to the evidence required, as well as extensive documentation. Generally, the complex approach was used to meet what providers perceived to be the likely requirements of quality auditors.

What is the impact of course timeframes on qualifications? Is there any justification for condensing course durations?

Course durations vary, depending on delivery methods and whether students are undertaking the course in full-time or part-time mode. In general, online and distance courses are allocated longer lead times for completion than in-class or face-to-face courses. There seems to be little appetite for condensing the length of entry-level courses, as it is commonly accepted that acquiring sufficient practical skills and knowledge to work in a new occupation takes time.

Key challenges for students and teachers

Students were generally satisfied with their assessment experiences but reported some frustrations over having insufficient time to absorb the information and to complete assignments. Some believed the course they were doing should be upgraded to a higher certificate level because of the work and responsibility involved. Others reported minor frustrations in relation to the benefits or drawbacks of non-graded assessments and group assessments. Data from the 2012 Student Outcomes Survey also indicated general satisfaction with assessment processes, with around 90% of all graduates of certificate IIIs in aged care, a range of business courses and electro-technology highly satisfied with the way they had been assessed in their courses. This includes knowing how they would be assessed, the fairness of assessments, the regularity of assessments and the assessment being a good test of what was taught. Satisfaction with the amount of feedback received on assessment attracted the lowest ratings (83% for the total group, and around 78% for electrical, 86% for business and 87% for aged care).

The key assessment challenges that trainers and assessors perceived for themselves relate to arriving at fair, valid and consistent judgments and having confidence in their judgments about ‘not competent’ performance. They also feel challenged by the need to ensure that their assessment tools keep pace with changing legislative frameworks and standards.

Teachers believe that the personal attributes of students also affect their ability to deliver quality assessments. These include students’ lack of commitment to learning and their inadequate knowledge of content (especially theory in electrical and business courses, and manual handling in aged care courses), poor language and literacy skills, and difficulties in demonstrating in assessments what they can do.

Download

Quality-assessments-2709.pdf (625.6 KB)
Quality-assessments-2709.docx (362.5 KB)