Implementing Performance Based Assessments
Written by Michelle Jackson of Lenovo
Recent advances in computing have made performance-based exams a realistic option for many IT organizations. Adding a performance-based component to exams can be a compelling way to enhance the value proposition for certification programs. This is particularly true for newer technologies with faster life cycles. Furthermore, it allows candidates to demonstrate their skills and prove they can perform a job.
The ITCC August Member Meeting featured a panel of three leaders in the area of performance-based assessments. They shared insights from their experiences and offered valuable considerations and suggestions in this subject area. Graham Livingstone from TrueAbility served as moderator. See “About the Panelists” below.
Members can log in to access the session recording.
Delivery Considerations and Challenges
Integration of the lab with the test delivery engine is critically important. The delivery engine must know when to spin up the lab environment and when to make it available to the candidate, and the score has to be sent back to the delivery engine afterward. If the exam contains both traditional item types and performance items, the scores must be combined before being returned. All of the service providers involved have to be fully committed to success so the candidate has a good experience.
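As a rough illustration of that hand-off, the sketch below combines traditional and performance-item scores before reporting back to a delivery engine. The function names, equal weighting, and payload fields are all assumptions for illustration, not any vendor's actual API.

```python
# Hypothetical sketch: combine traditional and performance-item scores,
# then build the result payload a test delivery engine might receive.
# Weights, field names, and the cut score are illustrative assumptions.

def combine_scores(traditional_points, performance_points,
                   traditional_weight=0.5, performance_weight=0.5):
    """Weighted combination of the two score components, as a fraction 0-1."""
    trad = sum(traditional_points) / len(traditional_points)
    perf = sum(performance_points) / len(performance_points)
    return traditional_weight * trad + performance_weight * perf

def report_to_delivery_engine(candidate_id, combined, cut_score=0.7):
    """Assemble the pass/fail result to send back to the delivery engine."""
    return {
        "candidate": candidate_id,
        "score": round(combined, 3),
        "passed": combined >= cut_score,
    }

combined = combine_scores([1, 0, 1, 1], [0.8, 1.0])
result = report_to_delivery_engine("C-1001", combined)
```

In practice the combination rule (compensatory vs. conjunctive, weighting by item count or blueprint) is a psychometric decision, not just an engineering one.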
When deciding whether to deliver the exam online or in a test center, there are further considerations. If the test center hardware does not meet the requirements for the exam, online delivery will be required. Two other points in favor of online delivery were candidate preference for using their own machine and the challenge of taking a multiple-hour exam in a test center.
Another perspective is that test centers typically have greater control over bandwidth and internet connectivity needed for lab delivery. In some cases, the test center might offer the best candidate experience.
The panelists agreed that performance-based assessments are less prone to security risks. This is especially true if dynamic items are used. In addition, if the session is online and recorded, this can be a deterrent to security violations. Finally, if exams are designed using parameters and with extensibility in mind, then it is easier to make changes if needed.
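A minimal sketch of what "dynamic items with parameters" can mean in practice: each candidate receives a different concrete task generated from a template, which blunts the value of leaked content. The template, choices, and seeding scheme here are hypothetical.

```python
# Hypothetical parameterized ("dynamic") performance item: a per-candidate
# seed produces a unique but reproducible task variant, and the expected
# values drive automated scoring. All names and choices are illustrative.
import random

TEMPLATE = ("Create a user named {user} with home directory "
            "/home/{user} and login shell {shell}.")

def generate_item(seed):
    rng = random.Random(seed)  # seeded per candidate so scoring can reproduce it
    user = rng.choice(["deploy", "audit", "backup"])
    shell = rng.choice(["/bin/bash", "/bin/sh"])
    task = TEMPLATE.format(user=user, shell=shell)
    expected = {"user": user, "shell": shell}
    return task, expected

task, expected = generate_item(seed=42)
```

Because the expected values are generated alongside the prompt, the same parameters can feed both the candidate-facing task and the scoring script.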
Psychometrics presents interesting challenges here. Classical test theory approaches don't necessarily make sense in performance-based testing because, fundamentally, a performance task usually has a single achieved outcome. This changes the conversation about reliability and validity. It is a complex area, to say the least, and much must be considered to ensure performance is measured effectively.
Offering sample items and/or practice labs is recommended so candidates get a feel for what to expect. One panelist mentioned offering free exam retakes (i.e., if the candidate does not pass on the first attempt, they automatically get a free second attempt). This also offers a remedy if the candidate encountered latency issues on the first attempt.
One aspect that cannot be overlooked is the cost of the resources required to deliver the exam. Analysis may be needed to determine whether a performance-based exam can be delivered cost-effectively. The ROI comes from the value of candidates being able to demonstrate they can perform the task(s) in the assessment. This is the value proposition for your candidates, hiring organizations, and the internal leadership team.
Be Prepared for Surprises!
- Candidates might think they are prepared when, in reality, they are not. You will need to over-communicate about what is expected in the exam experience. When you think you've communicated enough, do it some more!
- Internet limitations can be a showstopper. Discuss this with your delivery provider.
- On a positive note, developing performance-based exams may be easier for some SMEs.
Words of Advice
- Be cautious with setting the cut score. It is very easy to set it too high.
- Realize you won’t get it right straight out of the gate. Be prepared to go back and learn and improve.
- Set expectations with your leadership team that performance-based testing is, by and large, unlike traditional testing and difficult to do well. There will be some bad candidate experiences. Get agreement on what a good delivery success rate is and what is realistic for your program to accomplish.
- You will need SMEs with expertise in writing the scoring scripts. Provide a sandbox environment where your item writers can work together with the scoring team.
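To make the scoring-script point concrete, here is a hypothetical script for a lab task such as "ensure the server config sets max_connections=100." The file path, section, and key are illustrative; a real script would inspect the candidate's live lab environment, and partial-credit schemes would return fractions rather than 0/1.

```python
# Hypothetical lab scoring script: award 1 point if the candidate's config
# file contains the expected setting, else 0. Paths/keys are illustrative.
import configparser
import os
import tempfile

def score_config(path, expected_key="max_connections", expected_value="100"):
    """Return 1 if [server] sets expected_key to expected_value, else 0."""
    parser = configparser.ConfigParser()
    if not parser.read(path):  # missing or unreadable file scores 0
        return 0
    value = parser.get("server", expected_key, fallback=None)
    return 1 if value == expected_value else 0

# Simulate a candidate's lab artifact for demonstration.
with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write("[server]\nmax_connections = 100\n")
    artifact = f.name
point = score_config(artifact)
os.unlink(artifact)
```

A shared sandbox lets item writers run scripts like this against sample lab states and agree with the scoring team on edge cases (missing files, extra whitespace, alternative valid configurations) before the exam goes live.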
The panelists provided a good understanding of the potential benefits as well as the operational requirements and challenges of adding performance-based items to the portfolio. These assessments present a valuable way to meet the demand for skills in the context of rapid adoption of new technologies.
About the Panelists
Clyde Seepersad is responsible for the training and certification arms of The Linux Foundation. Over the past decade, Clyde has held senior leadership positions in the education space, most recently as head of operations at 360training.com and before that as a senior executive of Houghton Mifflin Harcourt, a global leader in education. Prior to his involvement in education, Clyde was a Principal at the Boston Consulting Group and started his career in the public sector, working within the Ministry of Finance in Trinidad and Tobago.
Liberty Munson is the Senior Psychometrician for the Microsoft Worldwide Learning organization and is responsible for ensuring that the skills assessments in Microsoft Technical Certification are valid and reliable measures of the content areas that they are intended to measure. Prior to Microsoft, she worked at Boeing in the Employee Selection Group, assisted with the development of their internal certification exams, and acted as a co-project manager to Boeing’s Employee Survey. She received her BS in Psychology from Iowa State University and MA and PhD in Industrial/Organizational Psychology with minors in Quantitative Psychology and Human Resource Management from the University of Illinois at Urbana-Champaign.
Nathaneal Letchford is Senior Certification Engineer at National Instruments.
Graham Livingstone is Head of Sales for TrueAbility and oversees growth strategy efforts, whilst building the company’s channel partner program. With more than 20 years of experience in executive management and sales roles in the technology sector, he has held director positions at several technology and e-commerce companies in the U.S. and UK. Prior to moving to the U.S., he was the Managing Director for IT consulting company Cheriton Computers and has also held Senior Sales and Channel leadership positions at Unilink Solutions, Tempest Computer Software Limited, Protocol Solutions, and Avnet Technology Solutions.