When the UK government announced the cancellation of exams to prevent the spread of the coronavirus, there was a collective intake of breath from parents and a giant exhalation from students. These yearly trials usually dominate the beginning of summer for British families with school-age children, who are tested to within an inch of their lives. A-levels — taken at 18 in three or four subjects following a two-year course of study — determine whether and where a pupil can go to university.
Unsurprisingly, most students greeted the cancellation with optimism. Prime Minister Boris Johnson had promised that this year’s students wouldn’t be worse off. He pledged to “make sure that pupils get the qualifications they need and deserve for their academic career.” An algorithm would take previous exam grades, teacher references, student rankings in each subject and the school’s historic performance as inputs, and award grades that would be “fair and robust.” What emerged was the opposite. No wonder Johnson has taken the humiliating step of abandoning the technology and falling back on the flawed model of teacher-predicted grades.
The government failed to deliver on that promise: It quickly became apparent that pupils — especially poorer ones — weren’t getting the grades they deserved. No algorithm can predict how a student copes in the final stretch of that two-year marathon, those months when endless practice papers and the refining of exam technique can make the difference. Most students don’t know themselves how they will fare until they’ve been through it.
The first sign that artificial intelligence wasn’t up to the job of awarding grades to school leavers was the debacle over the International Baccalaureate, another pre-university qualification that’s popular with students in many countries. The IB is a rigorous and well-regarded body, so it seemed well suited to the experiment of grading students who couldn’t sit their final exams because of lockdown.
When teachers predict A-level grades, typically three-quarters of them are overstated. But IB predictions are usually highly accurate, with more than 90% of grades matching teacher predictions. The International Baccalaureate Organization had hundreds of thousands of data points going back decades to help fine-tune its algorithm. It didn’t help. When the IB results came out in early July, there was an immediate outcry from schools around the world: Tens of thousands of grades were much lower than predicted. Protests, petitions and lawsuits followed.
Britain’s A-level results prompted a similar response when they were published last week. While beleaguered education minister Gavin Williamson tried to insist no changes were required, it was a bit like Noah denying it would rain. As results poured in, the government tried to put a brave face on it: There were more students than ever going to university and more from disadvantaged backgrounds, ministers noted, and a 2% increase overall in the number of pupils getting the top two grades — A and A*.
—Bloomberg