Roche Limit

Education from the inside. Find me on Mastodon @rochelimit@mastodon.technology. Mirrored from my Gemini capsule at gemini://rochelimit.uk

Historical Developments

The English educational system, up until the late 1980s, was focused on the General Certificate of Education (GCE), taken at Ordinary (or O) Level by 16-year-olds and at Advanced (A) Level by 18-year-olds. Both were aimed at providing the most academic students with a route into sixth form college and on to university degrees, and they would be considered thoroughly inaccessible in today's climate. There were other courses, set at a lower standard, such as the CSE (Certificate of Secondary Education), but they were not well regarded.

Despite the elitist course arrangements, schools could focus largely on their teaching and their pupils' needs, as there was little in the way of oversight. There were no league tables and no exam-results-focused Office for Standards in Education (Ofsted). As someone who was educated under this system, my experience was that the teachers were not particularly bothered about exam results. This meant that much of the teaching was lacklustre, but the best teachers could really focus on the learning experience.

Exams then were really designed for academic selection. A fixed proportion of students on each course was awarded each grade every year. The primary role of the grades was to sort students into the right level of higher education institution, so a degree of stability year on year was desirable.
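
To make the mechanism concrete, here is a minimal sketch of norm-referenced grading, assuming purely illustrative grade shares and a made-up cohort: candidates are ranked by raw mark and the cohort is carved into fixed proportions, so the grade distribution comes out the same every year regardless of how well the cohort actually performed.

```python
# Minimal sketch of norm-referenced grading: a fixed share of the cohort
# gets each grade, whatever the raw marks. The shares and names below are
# invented for the example, not historical O Level figures.

GRADE_SHARES = [("A", 0.10), ("B", 0.15), ("C", 0.25),
                ("D", 0.25), ("E", 0.15), ("U", 0.10)]

def norm_reference(marks):
    """Map each candidate's raw mark to a grade by rank order."""
    ranked = sorted(marks.items(), key=lambda kv: kv[1], reverse=True)
    grades, start = {}, 0
    for grade, share in GRADE_SHARES:
        count = round(share * len(ranked))
        for name, _ in ranked[start:start + count]:
            grades[name] = grade
        start += count
    for name, _ in ranked[start:]:      # rounding leftovers fall to the bottom grade
        grades[name] = GRADE_SHARES[-1][0]
    return grades

cohort = {"Ann": 78, "Bob": 64, "Cat": 91, "Dan": 55, "Eve": 70,
          "Fay": 47, "Gus": 83, "Hal": 60, "Ida": 72, "Joe": 39}
print(norm_reference(cohort))
```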

Grade Inflation

Enter the GCSE (General Certificate of Secondary Education), which switched the assessments from the O Levels' norm-referencing to a criterion-referenced system. Exam grade boundaries are primarily calculated against a set of subject-specific statements, best-fit criteria that a student must match to be awarded any particular grade. The exam boards that set the exams review the results every year to ensure consistency from year to year, under the supervision of the national regulator, the Office of Qualifications and Examinations Regulation (Ofqual). But Ofqual allows a certain tolerance in the grading each year, and it turns out that the variation compared to previous years is uniformly upwards, towards slightly higher grade averages, peaking in the 1990s at a rate of roughly one grade every two or three years.

Combined with the publication of grade outcomes per subject, down to the level of individual schools, this produced an annual carnival of celebration in which school performance records tumble for most schools and successive governments claim responsibility for the implausible ongoing improvements in the education system.

Of course, the improvements have always been illusory, as is clear when you compare the local results to international standardised assessments such as TIMSS and PISA. These projects show English educational performance to be generally static over the years of supposed massive improvement. Indeed, if the international assessments matched the improvements in GCSE grades over the last two decades, England would have an education system so advanced that other nations wouldn't have a hope of catching up.

So now each government congratulates itself each year when the exam results creep a little higher. The schools inspector, Ofsted, nudged by ministers, makes grade increases a key part of the regulation system, and schools whose 'grade outputs' are steady year on year are punished for 'coasting'. Under this unwavering political pressure, head teachers make grades the key success measure for departments and individual teachers. It takes a confident teacher, secure in their skills, to resist the short-term quick fix of driving pupils to focus their studies on exam performance rather than on the long-term goal of developing usable knowledge and skills. And when this obsession with exam grades takes over an entire education system, the collateral damage to the actual education becomes significant.

Manipulating the Metric

It is a truism in statistics that when a measure becomes the target, it stops being a good measure. When GCSEs stopped being solely a measure of how well a pupil had learned a subject and became the political target, the focus naturally moved from improving the quality of education to doing whatever would push the numbers up.

Now of course there are plenty of positive interventions that will improve both education and the headline grade averages, but these are difficult and relatively expensive, so a head teacher under pressure to rescue flagging results will often resort to manipulating the metric rather than tackling the harder-to-manage educational culture and the skills of their teachers and students.

One popular way was the 'dash for BTECs'. BTEC is a vocational alternative to GCSE courses, and some of these qualifications are high quality and useful for developing technical skills, but not all were like this. One in particular, an IT skills course, could be taught in a single term and, when completed, was treated by government as equivalent to four GCSEs passed at the A* to C grade level. The effect was to massively increase the school's average GCSE 'grade C and above' figure, which could push that school to the top of the local rankings. Parents looking for a school for their child will naturally favour the 'best' school, so those near the top of the table become oversubscribed, while others suffer from shrinking rolls, poor Ofsted assessments and high staff turnover.
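
As a back-of-the-envelope illustration of the effect, with entirely invented numbers (the real equivalence and league-table rules varied over the years), counting one short course as four GCSE passes at grade C or above can transform a school's headline share without any change in actual attainment:

```python
# Invented figures for illustration only.
avg_c_plus_passes = 3.5   # average GCSE passes at C or above per pupil (made up)
avg_entries = 8.0         # average GCSE entries per pupil (made up)

before = avg_c_plus_passes / avg_entries
print(f"'C and above' share before: {before:.0%}")   # 44%

# Every pupil also takes the one-term course, counted as four entries
# and four passes at C or above.
after = (avg_c_plus_passes + 4) / (avg_entries + 4)
print(f"'C and above' share after:  {after:.0%}")    # 62%
```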

The students suffer too: a focus on grades pushes them to abandon deep learning in favour of whatever they see the teachers treating as more important, rewarding shallow learning over deep understanding.

At a management level, 'good' is defined as that which improves grades, so grades become the only metric of success, for teachers and students alike. Teachers who develop strong pedagogical understanding and are able to foster deep learning in their charges will not be supported as much as those who can drive short-term behaviours in pupils to nudge up their grade profile, even if it undermines the pupils as they move on to more advanced courses.

Time to Change

GCSEs were a good idea once, and they initially worked well, offering academic instruction to children from a wider range of backgrounds than the older O Levels managed. But the move away from norm-referenced grading produced grade inflation, while political pressures forced schools and teachers to adopt short-term strategies. Focusing on the grades has significantly weakened their correlation with educational quality, to the extent that grade expectations dominate most discussion in schools, to the exclusion of ideas that could radically improve the quality of secondary school leavers.

The exam regulator debacle last spring, when summer exams were summarily cancelled across England and Wales, was an avoidable mess.

From the beginning it was clear that it was quite possible to run the essential exams even with coronavirus in the ascendancy. A Levels could have been saved by scrapping all GCSE exams, except perhaps English and Maths. GCSEs used to be a school leaver qualification, but now that every student has to attend some form of education or training until they are 18, their usefulness has declined to the point where they are mostly just a distraction. Exam boards could have held two sittings for each A Level, making use of the alternate papers they hold in case the primary papers are compromised, which would have allowed socially distanced exams to take place. Students in exam halls are already spread out for obvious reasons, so only a little extra space would have been needed to keep the whole process covid-safe. But then the government panicked, although it is hard to criticise them given the uniqueness of the situation.

The result of the rushed arrangement was that the national averages were little different last year, while a very large number of students were awarded grades far from those their teachers had expected. After a national outcry, raw teacher predictions were used unmoderated, leaving final grades inconsistent from school to school.

Was it possible to make a better fist of awarding estimated grades? It was always going to be difficult to award grades that were trusted as much as normal without the evidence of actual exam results, but the government made a big mistake. The nominally independent Ofqual quango, which regulates qualifications, explained the risks of each option and recommended that exams be scrapped and replaced with a certification of performance, to avoid damaging comparisons with genuine A Levels. This would have allowed students to progress to university on the basis of their offers, and it would have allowed for the necessary grade inflation as exam boards sought individual fairness.

But Ofqual came under pressure from the government to issue actual A Level grades, with the government also keen to avoid politically damaging grade inflation. This was a fine aspiration, but the complexity of the statistical model needed to assign grades placed that desire in opposition to the reliability of individual awards: forcing the grades to fit the same distribution as previous years meant the model risked introducing inconsistencies at the individual level. The government ignored the warnings and went for issuing moderated simulated exam grades, with the instruction to focus on the average results at the expense of individual variations. Students missed their university offers by the thousand. Students who had never received anything other than top grades were awarded several grades lower, and others failed their courses, much to the surprise of teachers who had firmly predicted good passes. Small independent schools had class sizes too small to moderate, so their high predictions went through without reductions, a likelihood also predicted by Ofqual and ignored by a blinkered Education Secretary.
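
A much-simplified sketch of that kind of moderation (assuming an invented small-cohort threshold and centre history, and not claiming to reproduce Ofqual's actual 2020 model) shows where the individual-level damage comes from: ranks within a centre are forced onto the centre's historical distribution, while tiny cohorts escape moderation entirely.

```python
# Simplified illustration: within each centre, students are ranked by their
# teachers and the centre's historical grade distribution is imposed on that
# ranking. Cohorts below a threshold keep their teacher grades unmoderated.

SMALL_COHORT = 5   # illustrative cut-off, chosen for the example

def moderate(rank_order, teacher_grades, historical_shares):
    """rank_order: students, best first; teacher_grades: {student: grade};
    historical_shares: [(grade, proportion), ...] from previous cohorts."""
    if len(rank_order) < SMALL_COHORT:
        return dict(teacher_grades)              # too few to moderate
    awarded, start = {}, 0
    for grade, share in historical_shares:
        count = round(share * len(rank_order))
        for student in rank_order[start:start + count]:
            awarded[student] = grade
        start += count
    for student in rank_order[start:]:           # rounding remainder
        awarded[student] = historical_shares[-1][0]
    return awarded

# A strong individual in a centre with weak historical results is pulled down,
# however high the teacher's prediction.
rank = ["P1", "P2", "P3", "P4", "P5", "P6", "P7", "P8", "P9", "P10"]
teacher = {s: "A" for s in rank}
history = [("A", 0.1), ("B", 0.2), ("C", 0.4), ("D", 0.2), ("E", 0.1)]
print(moderate(rank, teacher, history))
```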

And the rest is history. The government of Scotland cracked and switched to the unreliable, unmoderated Centre Assessed Grades from the teachers, followed inevitably by England two weeks later, introducing a whole new set of irregularities, now brushed under the carpet. One can only hope that if exams are cancelled in 2021, there will be a more informed plan in place.