Seven themes from the Surpass Conference

The future direction of the assessment industry and the importance of making incremental progress while learning from others were key themes at this year’s Surpass Conference in October.*

Topics included examples of innovation, AI, accessibility issues, and the opportunities and challenges in rolling out on-screen testing more widely. As Surpass Assessment founder Bob Gomersall said, “These are exciting times to be involved in educational assessment.”

Read on for seven trends that were highlighted.

1. Change must be incremental

The move to on-screen assessment was described as “inevitable” by respondents to AQA’s recent survey, and headteachers believe digital assessment will reflect the lives children now lead.

However, despite the benefits, there were widespread worries about the pace of change and the logistical challenges involved.

2. The future direction of the assessment industry is bright

A fantastic plenary panel, including Patrick Craven at City & Guilds, Lynne Frederickson at Michigan Language Assessment and Tanya Hollister at Chartered Professional Accountants (CPA) Canada, discussed how similar challenges occur across many different industry sectors. Predictions for the timescale of implementing widespread on-screen assessment varied, as the vocational, general qualifications, language testing and professional sectors each face their own challenges. These included:

  • Vocational: access to kit, learner readiness and risks – although the pandemic may have produced more tolerance for flaws in technology;
  • General qualifications: political opposition, tight regulation, extremely high-stakes and large-volume tests;
  • Language testing: lack of test center experience in high-stakes digital exams and the scoring of some components. However, there have been huge developments in testing the four modes (speaking, reading, writing and listening) on-screen, and MLA’s innovative MET Go! Digital test won the 2022 Surpass Innovation Award;
  • Professional qualifications: Tanya Hollister at CPA Canada highlighted not just resistance to change, but also “issues of scalability and device availability” for high-stakes large-scale exams.

The lively panel closed by highlighting the lessons that could be learned from other countries, such as Finland and New Zealand, where on-screen testing is much more common (albeit under two very different models).

3. The importance of feedback

Formative and summative assessments and their varied challenges were much discussed. Ben Rockcliffe (Alphaplus) and Graeme Clark (SQA) explained how complex formative feedback can be, and showed how digital assessments can help deliver high-quality, instant feedback that is personalized for the individual learner. There is still much to do, but Artificial Intelligence (AI) was highlighted as one way that things may progress…

4. AI could overcome some scoring validity barriers, including the fallibility of humans

Paul Edelblut from Vantage AI threw down the gauntlet by declaring that AI scoring can ‘beat the humans’, providing fairer, unbiased, customized and personalized instant feedback. Sharing examples of how new technologies are often mistrusted at first, he said:

“Innovation scares people – it’s new, it’s different, change is hard. You don’t have to go all-in. You can go incrementally and build over time.”

Paul Edelblut, Vantage AI

Computer marking can have huge benefits: once an AI engine has been trained by humans, it can produce scores in 1/100th of a second. AI can also offer reliability, reduced costs, plagiarism checks and diagnoses of language-learning difficulties in formative testing.

One language testing organization revealed how AI scoring allows it to offer quicker feedback – and it is looking to automate feedback for even the writing and speaking parts of its test.

AI could also potentially create real-world assessments showing “the authenticity of experience” and what people can do – a very important aim for vocational qualifications, explained Patrick Craven at City & Guilds.

5. Accessibility is important – and improving

On-screen assessment can help make tests more accessible for those with additional needs, and designing for accessibility can ultimately create a better test experience for everyone.

Gavin Evans from the Digital Accessibility Centre explained how accessibility can also be good for business, with £16 billion of online spending power from disabled people in the UK, and 12 million older screen users in the UK alone.

Adam Norris then showcased the many accessibility features in Surpass:

  • Colour preferences and contrasts
  • Skip links, landmarks and semantic markup to help assistive technology users to navigate the page
  • Options to author accessible content, including alt text and HTML tables
  • Ability for reasonable adjustments within tests to be added by test creators
  • Compatibility with all common tools and assistive technologies

6. New developments in test creation item types can improve the testing experience

Advanced item types and simulations can help advance on-screen assessment and make exams represent true-to-life situations more accurately. Customisable Question Types (CQTs) – the name given to any advanced item or simulation built using the Surpass CQT Framework – allow for rapid development, with requirements tailor-made to user needs.

WJEC and OCR shared their experience of working with Surpass on the new Advanced Essay CQT, and ACT showcased the new Customisable Spreadsheet item type.

7. Can we (and should we) untether on-screen tests from paper?

WJEC hoped for end-to-end digital learning and to be able to untether testing from paper, but suggested that this requires overcoming both system and culture challenges.

Others saw the removal of paper tests as key, but there were also predictions of a hybrid model for the foreseeable future. To echo the Vantage AI session, which showed that newly developed forms of media (e.g. radio, TV, the internet) did not replace the ones that came before, should we accept that paper-based testing may never end entirely?

Interesting differences in candidate responses on paper and on-screen were noted across sectors. For example, presenters reported seeing answers of different lengths on-screen compared with paper. CPA Canada noticed that on-screen assessments led to longer, better-quality responses, whereas AQA had seen shorter answers (possibly because they were edited before submission).

Public appetite for mixing paper and on-screen testing also seemed to differ – feedback from AQA was that the general public does not think results from different formats can be comparable, and this is something to bear in mind in future plans.


There were lots of great discussions at the conference and we look forward to seeing more people at upcoming events, including ATP, and at the next Surpass Community webinars.

The recordings from the Conference are now available – register to request access.

Don’t miss future Surpass Community events – sign up for Community emails today.


* All opinions and figures are taken from presentations at the conference, and do not necessarily reflect Surpass Assessment views or research.
