With telc: Language Tests
In this case study, discover how telc Language Tests have used functionality in Surpass to:
- Expand their exam offering to reach more regions and candidates.
- Increase efficiency and reduce the logistical costs of delivering assessments.
- Standardise assessments across a large and varied geographical area.
Organisation type: Language tester – remotely delivered and internet-based
Number of countries reached: 20
Number of examinations: 15,000
telc Language Tests are the largest foreign language testing organisation in the European Union.
In 2021, telc were awarded a contract by the European Personnel Selection Office to conduct digital exams in 24 EU languages. They deliver Common European Framework of Reference for Languages (CEFR) level exams to provide reliable, highly reputable and effective assessments that demonstrate linguistic capability. Covering all aspects of language, including reading, writing, speaking and listening, within real-world scenarios has led to some interesting approaches to assessment.
These language tests help people to get jobs, enter university and meet immigration requirements. For candidates ranging from school leavers to foreign nationals and refugees who depend on the qualifications to remain in their chosen country, these are high-stakes assessments.
telc’s use of technology in Surpass to create standardised assessments across 24 languages is remarkable.
Aims and key challenges
For telc, equivalence across languages is important to maintain the validity of assessments when delivering on the global scale and to adhere to the Common European Framework of Reference for Languages (CEFR). The CEFR Levels measure competencies in learning European languages (including English) in a consistent manner so that test-takers, developers, employers and other stakeholders can measure ability across the four disciplines of listening, reading, writing and speaking.
“We had success with Surpass. Using the systems available to us in Surpass has allowed us to create a core template area which can be used across the board in 24 different languages to create testing and hopefully successful assessment for all of these candidates across Europe.”
Sean McDonald, E-Learning and Assessment Manager at telc
CEFR levels form a global scale that breaks the broad candidate groupings of beginner, intermediate and advanced into more detailed categories. Each level comes with a series of ‘can do’ statements which describe to stakeholders exactly what a learner is able to do in a language. These competencies must be identical for each of the 24 languages in which telc offer exams, regardless of where in the world a candidate is taking the test, to assure the validity and reliability of the qualifications.
The telc Language exams are considered ‘high stakes’, as the qualification is required in many cases for foreign nationals to achieve their career goals in fields such as medicine and healthcare. Candidates include people who need to pass the test for immigration purposes, to get into university or to get a job.
Surpass aids telc in the creation and standardisation process in several ways:
- The organisational features in Surpass, including folder and item banking structures, tags and workflows, allow content creators to easily distinguish items that require standardisation from those that do not, all from a single place.
- Using templates and duplication features, telc are able to enforce a prescriptive format within the Surpass user interface to minimise mistakes, remove inconsistencies and provide a standard appearance for tests taken around the world.
- The efficiency gained from this functionality also brings significant cost savings, avoiding expensive translation and outsourcing processes.
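As an illustration of the tagging approach described above, here is a minimal Python sketch of filtering an item bank by a standardisation tag. The item fields, IDs and tag names are hypothetical, invented for this example, and do not represent the actual Surpass data model:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A hypothetical item-bank entry; field names are illustrative, not Surpass's schema."""
    item_id: str
    language: str          # e.g. "de", "fr"
    cefr_level: str        # e.g. "B1"
    tags: set = field(default_factory=set)

bank = [
    Item("IT-001", "de", "B1", {"standardise", "reading"}),
    Item("IT-002", "fr", "B1", {"reading"}),
    Item("IT-003", "es", "B2", {"standardise", "listening"}),
]

# Select only the items flagged for standardisation, from a single place
to_standardise = [i for i in bank if "standardise" in i.tags]
```

A single query like this lets content creators separate items needing standardisation from region-specific ones without maintaining parallel banks.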
Standardisation across such a large and varied geographical area presents a huge challenge to any awarding organisation on this scale, particularly in the field of language testing. Using Surpass technology overcomes this operational headache.
Using Surpass has enabled telc to track items regardless of where in the world they are produced, so that their format, content and the competencies they test are identical across each region, country and language variant. This is vital to language testing and to the integrity and reputation of both telc and the CEFR levels. Candidates, employers and governments (when issuing documentation such as visas) need to see each CEFR level as a mark of standardised quality.
An example of a telc language test delivered in the Surpass secure test driver
Trust is key to any organisation that wishes to test onscreen. Moving assessments on-screen can be a big change for any organisation, and change on that scale is likely to bring about misgivings. To build this trust, telc used a number of methodologies.
telc assembled focus groups of hundreds of people to take tests both on-screen and on paper, after which their behaviour and item performance were extensively analysed to identify any inconsistencies between the paper and onscreen versions. This research provided quantitative data to support the move to onscreen testing, as well as evidence of cost savings, environmental benefits and reduced risk compared with traditional paper-based testing and its physical logistics.
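One simple way to frame the paper-versus-onscreen comparison described above is a check on item facility values (the proportion of candidates answering an item correctly) across the two modes. The figures and the 10-point threshold below are invented for illustration and are not telc's actual data or methodology:

```python
# Hypothetical facility values: proportion of candidates answering each item correctly
paper = {"Q1": 0.72, "Q2": 0.65, "Q3": 0.81}
onscreen = {"Q1": 0.70, "Q2": 0.52, "Q3": 0.80}

# Flag items whose facility differs by more than 10 percentage points between modes
THRESHOLD = 0.10
flagged = sorted(q for q in paper if abs(paper[q] - onscreen[q]) > THRESHOLD)
```

Items flagged this way would then be examined to decide whether the onscreen presentation itself changed the item's difficulty.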
In some geographical locations where telc have encountered resistance to the move onscreen, these studies have been used to encourage a shift towards a positive mindset.
A team of Subject Matter Experts (SMEs) in language testing and the CEFR levels performed multiple rounds of content review to ensure the items adequately assessed the relevant CEFR levels for that test. Surpass supports telc in achieving standardisation even in remote locations, and allows teams of SMEs to cooperate across continents.
After initial reviewing, a standardised test template can be created and shared easily from the item bank in the Surpass platform with a wider testing committee of SMEs, including a dedicated linguist and project manager per language. This committee collaborate to lend their specific expertise to the discussion, improving the test form through further review and consultation.
After this review, the exam is shared with SMEs so that the test content is appropriately translated and displayed for that region.
Translated content is again reviewed by each committee within the context of the CEFR levels so that telc know the appropriate competencies are being tested correctly for each achievement level. The final stage is to add the content to live exams, statistically evaluate the questions and flag any inconsistency across regions or languages. This process is also supported in Surpass, as telc are able to use these items in tests as non-scored items, and to collect the vital item performance data without impacting exam results with questions that have not yet been fully validated.
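In outline, the flagging step described above might compare how a non-scored pilot item performs across language variants and flag versions that deviate noticeably from the average. The facility values and deviation threshold here are illustrative assumptions, not telc's actual statistics:

```python
from statistics import mean

# Hypothetical facility values for one pilot (non-scored) item across language variants
facility = {"de": 0.64, "fr": 0.61, "it": 0.38, "es": 0.66}

avg = mean(facility.values())

# Flag any language variant whose facility deviates from the mean by more than 15 points
MAX_DEVIATION = 0.15
inconsistent = sorted(lang for lang, p in facility.items() if abs(p - avg) > MAX_DEVIATION)
```

A flagged variant would suggest the translation or localisation changed the item's difficulty, prompting another review before the item is used as a scored question.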
For items that do need to be marked, telc can take advantage of effective functionality within Surpass to mark items according to specific criteria and match candidates’ abilities directly to the CEFR level competencies. There are also logistical benefits: project coordinators need not be fluent in all 24 delivery languages to understand the marks and grades given to a candidate. Using codes and a consistent format, they have an accurate overview of candidates’ performance against CEFR criteria, contributing to reliability.
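The coded, language-independent marking format could be sketched as follows. The criterion codes, the 0–5 score scale and the CEFR cut-offs are hypothetical examples for illustration, not telc's actual marking scheme:

```python
# Hypothetical criterion codes, readable by coordinators regardless of test language
CRITERIA = {
    "W1": "Task fulfilment",
    "W2": "Coherence and cohesion",
    "W3": "Vocabulary range",
}

def overall_band(scores: dict) -> str:
    """Map averaged criterion scores (0-5 scale) to a CEFR band; cut-offs are illustrative."""
    avg = sum(scores.values()) / len(scores)
    if avg >= 4.5:
        return "C1"
    if avg >= 3.5:
        return "B2"
    if avg >= 2.5:
        return "B1"
    return "A2"

band = overall_band({"W1": 4, "W2": 3, "W3": 4})
```

Because results are expressed as criterion codes and a band rather than free text, a coordinator who does not read the test language can still interpret a candidate's performance.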
By using the Surpass platform and the innovations and standardisation this makes possible, telc have been able to offer their exams online, while increasing the efficiency and reducing the logistical costs of delivering assessments.
Through online assessment, telc has found that accessibility has improved. Paper documents no longer need to be sent, and examiners aren’t required to travel.
telc hope to build on this to grow the number of candidates taking language tests online, and to offer less commonly requested languages in future.
In a presentation from the 2021 Surpass Conference, Sean offered the Surpass Community an insight into the developments in telc’s language testing.