How the grades were generated

For transparency, we would like to share with you the overall process we went through at The Deepings School when submitting Centre-Assessed Grades (CAGs) and Rankings to each exam board earlier this summer. We have also provided the particular evidence each subject used when calculating its CAGs (linked on the left for Years 13 and 11).

An initial meeting was held with all Heads of Department to outline the process and ensure consistency in its application. In summary, students due to sit exams would be awarded the grade they would most likely have achieved had the exams gone ahead, based on a range of evidence and data, including performance in mock exams and non-exam assessment.

Exam boards asked exam centres to:

  • generate, for each subject, centre assessment grades for their students, and then
  • rank order, for each GCSE and A level grade awarded for summer 2020, the students within that grade, where 1 is the most secure/highest attaining student, and so on.

Once this information was submitted to each examination board, a ‘standardisation process’ took place and final results were released to the school. More details about this process can be found here.

Evidence that could be used towards grades and ranking:

  • records of each student’s performance over the course of study, including for example progress review data, classwork, bookwork, and/or participation in performances in subjects such as music, drama and PE.
  • performance on any class or homework assessments and mock exams taken over the course of study.

At The Deepings School, we were fortunate to have completed two sets of Pre-Public Examinations (PPEs) in Year 11 and Year 13 before closure. A further PPE had also taken place the previous year, so most subjects had at least three formal assessment points to evidence their CAGs and rankings. We allowed each subject to decide which PPE(s) would give the best indication for their qualification(s). For students with Exam Access Arrangements (EAA), assessments taken without their support arrangements in place could not be used as evidence in this process.

Additional checks for accuracy

Although most subjects based their judgements primarily on the most recent PPEs, all other assessment data was used alongside these to check for consistency of performance. This was especially important for students who had missed any of their PPE exams, or where there had clearly been underperformance due to illness.

Where additional work was completed after schools closed on 20 March, we exercised caution if that evidence suggested a change in performance. Therefore, for incomplete Non-Examination Assessments (NEAs), a judgement had to be made on the basis of what was completed prior to closure.

Additional checks and moderation of previous data, especially the February PPE and prediction data, also took place to ensure that the grades submitted were consistent across different teachers within a subject.

Data was also shared to guard against ‘unconscious bias’. For each subject, data showing how previous cohorts of students performed compared with teacher predictions was reviewed to ensure no group of students was disadvantaged by this process.

The above data also indicated the amount of progress each subject typically made between the final PPE and the summer examinations. Comparisons were made with previous outcomes in each subject, and the progress made since then, to ensure that the CAGs submitted were a fair reflection of the most likely outcome in each subject.

Each subject's CAGs and rankings were discussed with the department at least twice and signed off by two senior members of staff, usually the Head of Department, the Senior Leadership Team link and/or a Key Stage co-ordinator. These checks ensured that the data submitted was not a set of individual ‘Teacher’ Assessed Grades but had been through a thorough moderation process, allowing us to submit our ‘Centre’ Assessed Grades.