
Bias, COVID19 and SQA Results

It was announced three weeks ago that, for the first time in Scottish education history, Scottish Qualifications Authority (SQA) exams were cancelled as a result of the COVID19 pandemic. The exam diet was supposed to take place during the months of April and May 2020.


At first, I was swept by a wave of relief. My colleagues and I had been trying to imagine how on earth we were meant to host exams in our school. Days before the announcement, the SQA contingency plans were still being discussed. We imagined being asked to support the invigilation of exams, making sure that pupils were all sitting at desks two metres apart, spread in different rooms across the school, keeping our eyes peeled for anyone who might be stifling a cough.


As the gravity of the pandemic began to sink in, it seemed ludicrous to keep schools open any longer. If exams were to run, many pupils would have had to sit the same exam at different times, as a result of self-isolation periods triggered whenever a family member developed a cough. Getting pupils to sit the same exams at different times, while making sure that questions and answers were not shared between each group, seemed logistically impossible. And, for pupils living with elderly and/or immuno-compromised relatives, it would have forced an unfair choice between gaining their Scottish qualifications and protecting their relatives’ lives. Now, with schools in Scotland likely to remain closed until the end of the summer, cancelling exams was a necessary solution.

The Complications


However, with this solution came many complications. I specialise in teaching English and the main Scottish qualifications typically offered in secondary schools are:


  • National 4: based entirely on coursework completed in school. As far as I’m aware, COVID19 should not affect this qualification.

  • National 5: 30% of the final mark consists of a portfolio marked externally (scripts were sent before schools’ closure) and 70% is based on the exam.

  • Higher (what most college and university courses require): 30% of the final mark consists of a portfolio marked externally (scripts were not sent before the closure of schools) and 70% is based on the exam.

  • Advanced Higher: portfolio work marked externally is handed in on the day of the exam, and the final mark is based on the exam, the portfolio essays and a dissertation.

While all the coursework in my English department was complete, other subjects had not been able to collect all of theirs and were relying on pupils coming back to school to finish it. Eventually, it was (rightly) deemed unsafe for pupils to return to school to complete coursework, and the SQA concluded that it was logistically impossible to mark coursework externally as usual. Thus, all the coursework completed for external SQA marking will instead be used as additional evidence that teachers should consider when estimating grades.

The Tears


With waves of relief came floods of tears. Normally, when a pupil misses an exam due to exceptional circumstances, their qualification results are based on their prelim scores, from exams usually sat in December or January. Naturally, this has been an incredibly stressful period for pupils who rely on their exam results to get into the courses they need for their dream careers. If you didn’t do well in your prelims, you risk being penalised in your final estimated grades. Many pupils who underperformed in their prelims saw their dreams of entering their preferred college and university courses shatter before them. In these exceptional circumstances, it would be cruel and unfair to make a decision about young people’s futures based on a couple of days of prelims, before they have had the chance to consolidate their learning.

The Rush for Gathering Evidence


And so the rush to gather evidence began. Prelim scripts are typically stored safely in schools for exceptional circumstances. In the space of a week or so, teachers and pupils had to gather as much coursework evidence as possible, beyond the prelim scripts, to demonstrate that pupils had improved since they sat their prelims.


According to the SQA, teachers should use their professional judgement to estimate pupils’ National 5, Higher and Advanced Higher grades based on pupils’ “demonstrated and inferred attainment of the required skills, knowledge and understanding for each National Course” and based on “a holistic review of a learner’s performance in the assessment evidence available.” For my subject, this would mean estimating grades based on:


  • Prelim results

  • Portfolio work: two essays drafted several times by pupils throughout the year with limited teacher input.

  • Timed practices: exam-style assessments completed in class under timed conditions and without notes (e.g. writing a critical essay on a studied text in 45 minutes, as would be done in the final exam)


The challenge for many teachers is locating timed practices completed in class. Teachers keep track of pupils’ results, but we don’t necessarily keep their papers for various reasons (e.g. they are useful revision tools for pupils). We have relied on gathering the papers that pupils have kept (and not lost or binned). And with the closure of schools, we have had to resort to asking pupils to send us photos of their previously marked work.


A paperwork nightmare; necessary nonetheless.


First of all, from a teacher’s perspective, it has been difficult to distinguish what will be accepted as coursework evidence. To be on the safe side, teachers have been advised to look for timed practices completed in class that consisted of SQA past papers or P&N papers (prelim papers), as opposed to papers that teachers created themselves. However, I find that this disadvantages teachers, such as myself, who decided to teach newer material for which there are fewer past papers. For example, I taught the George Mackay Brown short stories for my Higher Scottish Set Text and, since the stories were changed last year, I have had to make a lot of the practice papers myself. In addition, a lot of teachers will have spent the time after the prelims teaching more course content rather than completing many timed practices that early on.


Second of all, this rush for evidence favours pupils who are organised and who have kept the work they completed in class. From my teaching experience, pupils who struggle with organisation also tend to have Additional Support Needs (ASN, such as dyslexia) or tend to live across different homes (with divorced parents, for instance). For ASN pupils who require Alternative Assessment Arrangements (AAA, such as extra time and IT) for their exam, there is a risk of further disadvantage, since they may not have had the necessary arrangements in place until more recently. What often happens is that prelims are used as a baseline to determine whether ASN pupils need alternative exam arrangements to make the exam more inclusive and accessible. As a result, such pupils tend to perform more poorly early on, until the correct AAA have been put in place.


Moreover, the ability to gather better evidence which could bring up a pupil’s estimated grade will depend on the number of timed practices the class has undertaken in school. Practice makes perfect. Anybody can improve their performance if they are given ten timed practices of the same assessment instead of two or three. This disadvantages pupils in larger classes, since a teacher needs to spend more time marking a larger class set and so won’t necessarily be able to give their class as many timed practices in school. As a result, pupils in larger classes may be more at risk, while pupils in smaller classes may gain more of an advantage.

Seeking Consistency or Entrenching Privilege?


Schools have until the end of April to submit their estimated grades. Eventually, the SQA will probably need to sample evidence from schools in some way to check the consistency of their estimated grades. As far as I’m aware, this may depend on how consistent schools are with their performances from previous years. According to the SQA website, consistency “may be informed by previous subject and qualification performance at both a national and centre level and prior attainment information where that is available.”


In other words, once teachers have determined their estimated grades, schools need to collate the grades and adjust these to match previous years’ performances. While this measure makes sense as a way of ensuring consistency across the years, there are some drawbacks. Placing so much importance on a school’s previous performance privileges schools that have performed better in the past (these tend to be private schools and schools in more affluent areas) and it condemns schools that have previously had lower performances (which tend to be schools in less affluent areas). As a result, there is a greater risk of disadvantaging pupils from lower socioeconomic backgrounds and from ethnic minority backgrounds.
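To make the concern concrete, here is a deliberately simplified sketch, written in Python, of how a cap based on historical results could play out. This is not the SQA’s published procedure; the rule, the grades and the numbers are all invented for illustration. The point is that two classes with identical teacher estimates can end up with very different results purely because of their schools’ past performance.

```python
# Purely illustrative sketch - NOT the SQA's actual moderation method.
# Hypothetical rule: a school's share of A estimates may not exceed its
# historical share, so the lowest-ranked A estimates are pulled down to B.

def moderate(estimates, historical_share_of_as):
    """Cap the proportion of A grades at the school's historical share."""
    allowed_as = round(historical_share_of_as * len(estimates))
    a_count = 0
    # Strongest pupils (rank 1) are considered first, so the weakest
    # A estimates are the ones that get downgraded.
    for pupil in sorted(estimates, key=lambda p: p["rank"]):
        if pupil["estimate"] == "A":
            a_count += 1
            if a_count > allowed_as:
                pupil["estimate"] = "B"
    return estimates


def make_class():
    # Ten pupils, all estimated an A by their teacher; rank 1 = strongest.
    return [{"rank": i, "estimate": "A"} for i in range(1, 11)]


# Identical teacher judgements, different school histories.
affluent_school = moderate(make_class(), historical_share_of_as=0.8)
deprived_school = moderate(make_class(), historical_share_of_as=0.3)
print(sum(p["estimate"] == "A" for p in affluent_school))  # 8 pupils keep an A
print(sum(p["estimate"] == "A" for p in deprived_school))  # only 3 keep an A
```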


It is clear that we are in a dangerous, unprecedented situation which could cause structural disadvantage to deepen. And that’s before even exploring the elephant in the room: bias.



The Role of Bias


I have often found it frustrating when racism is simplified to, or even justified by, unconscious bias. First things first: the word “unconscious” is problematic because it draws attention away from the victims of the racist action and leads to excessive attention on the perpetrator’s intentions. Intentions don’t matter as much as the harm caused. For that reason, I prefer referring to “implicit” bias.


The Teacherist blog published a detailed, insightful article with relevant research on bias and education in England, much of which is applicable to Scotland. I won’t reiterate everything that was mentioned there; instead, I am providing an overview of the implications in Scotland alongside my personal understanding of implicit bias.


When I was studying implicit bias research for my Masters dissertation a few years ago, I realised that bias is learned unwittingly through a process of socialisation over the course of our lives, often outwith our control and that of our families when we’re growing up. I spent a while dabbling with Implicit Association Tests (IATs), trying to make my own tests to research teachers’ racial biases, before finding myself blocked by institutional barriers and endless unsuccessful ethics applications (proving that teachers have racist biases was simply unacceptable!).


IATs are computer-based tests that measure how rapidly people are able to categorise various words and images, and they demonstrate that implicit bias affects us all. You can check out the links at the end of this post to try one out or even to create your own.

In an IAT, you might be shown a white man’s face and associate it faster with the word “intelligent.” Then you might be shown a black man’s face and associate it faster with the word “lazy” or “violent.” Unfortunately, this is incredibly common and, so to speak, normal, when we consider the accumulation of negative racial stereotypes we are all exposed to throughout our lives. That is why representation matters. It becomes even more intriguing when IATs reveal that ethnic minorities are biased against members of their own groups; this equates to internalised racism, and it shows that nobody is immune to bias, even when it disadvantages your own group.
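For readers curious about how an IAT turns reaction times into a score, here is a minimal sketch of the widely used D-score idea: the difference between average response times in the “incongruent” and “congruent” pairings, scaled by the spread of all the responses. It leaves out the error penalties and latency trimming of the full scoring algorithm, and the reaction times below are invented.

```python
from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified IAT effect size: slower responses when a pairing clashes
    with someone's learned associations push the score above zero."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)  # spread across both blocks
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# Invented reaction times (milliseconds) for one respondent.
congruent = [620, 650, 700, 640, 660]    # e.g. a white man's face paired with "intelligent"
incongruent = [780, 820, 760, 800, 790]  # e.g. a black man's face paired with "intelligent"
print(round(iat_d_score(congruent, incongruent), 2))  # well above zero: a strong implicit bias
```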

IATs have faced criticism for not accurately portraying the decisions we make when we are given time to reflect and weigh up the potential outcomes of our decisions – as you would for a job interview, for instance. However, IATs do reveal the type of biased decision-making that can take place when we are under pressure and running out of time – very much like the decision-making that is taking place during the COVID19 pandemic.


Implicit bias could influence a teacher’s decision to give a pupil a higher or lower mark on assessments throughout the year, and it could influence their final estimated grades. Biases are formed by societal stereotypes based on socioeconomic background, gender, disability, race and ethnicity. It is easy to imagine how teachers’ biases could advantage the estimates of white pupils, pupils of Asian descent and boys in Maths while disadvantaging girls, dyspraxic, working-class, Gypsy-Traveller and Black pupils in that same subject. The good news is that implicit bias can be modified. The bad news is that, for a lot of teachers making decisions about estimates, it will be a bit too late. Too late because their estimates have to be based on previously marked coursework. Too late because counteracting bias requires a certain amount of time and effort. Nevertheless, there is some hope.

Breaking the Prejudice Habit


In their 2016 paper “Breaking the Prejudice Habit,” Forscher et al. outline the most effective ways of reducing racial biases. According to them,

enduring change in unintentional bias can be achieved by treating unintentional bias as an unwanted habit that can be broken through a combination of motivation, awareness, and effort.

This can be done with the following strategies:


  • Increasing opportunities for contact: meeting, getting to know and socialising with people from backgrounds that tend to be disadvantaged.

This is obviously more challenging now that everybody needs to practise social distancing. However, in normal circumstances this strategy should make it easier for teachers to dispel negative stereotypes as they develop strong bonds with their pupils. Yet, on its own, it can also increase a person’s bias through confirmation bias. For instance, if a black girl happens to struggle with Maths, this may strengthen her teacher’s bias that black girls don't do well in Maths.

  • Perspective-taking: stepping into the racialised target’s shoes, imagining their lived experiences and feeling more empathy for them.

  • Counter-stereotypic imaging: seeing images that challenge stereotypes.

For example, this can be done by watching the movie Hidden Figures, which celebrates black female NASA geniuses, or by simply visualising a black girl acing her Maths exam.


  • Increasing knowledge about how biases can negatively affect behaviour unintentionally.

While teachers and educational leaders are already under a huge amount of pressure, dismissing the role of bias in such key decisions is dangerous. Instead, educators should be raising their awareness of implicit bias and its negative impacts. For those of us who claim to already know about it, we should be revising our knowledge of implicit bias and its implications for young people’s futures.

Implications for Attainment


To illustrate both the implications of implicit bias and the strategy of perspective-taking, I draw on Akala’s autobiographical account of implicit bias and its impact on his education. After recounting his experience of being underestimated as a black boy at school in London, being bullied by his teacher and being placed in a “Special Needs” class for no valid reason, Akala draws on English educational research (Gillborn, 2008; Burgess & Greaves, 2009):

“Unsurprisingly, the outcome of Foundation Stage Profile (FSP), teacher-assessed tests, has been to conclude that white children are actually the smartest of all ethnic groups, despite the fact that Indian students have been dramatically outperforming them on average for many years. (…) We know for certain that this trend of under-estimating children’s intelligence continues right throughout schooling, which tallies with my experience and makes sense of Local Education Agency data [mentioned earlier in Gillborn’s 2008 research], where black children fall further behind the longer they stay in school.

“It is not complex; if a fair portion of your teachers assume you are less clever than you actually are simply because you are black, and treat you accordingly, you are going to resent them and it will naturally affect your self-esteem and grades.”



While Akala’s experiences and the research he mentions come from the context of England, similar racialised attainment gaps and undervaluing of pupils based on race and ethnicity are undoubtedly happening in Scotland. In Netto et al.’s Review on Poverty and Ethnicity 2011, a figure from the most recent school census data at the time (a sample of 58,445 pupils in 2008/9) demonstrates the racialised attainment gap for pupils affected by poverty:


Source: Review on Poverty and Ethnicity (2011)

Netto et al. highlight the concern of low attainment for Gypsy/Traveller children, but they do not have a separate category for this particular group in their data collection; they are probably included in the “Other” category. The evidence presented is also based on the Scottish Index of Multiple Deprivation (SIMD) which, as Netto et al. acknowledge in their review of statistical datasets, is imperfect due to its reliance on postcodes and its consequent failure to capture deprivation that is not geographically concentrated. For example, Black and Minority Ethnic (BME) pupils living in Pollokshields in Glasgow will not be considered deprived, even though there is a high concentration of BME pupils from deprived backgrounds living alongside more affluent white British pupils in that area (a possible result of gentrification).


What remains clear in these representations of attainment disparities is that power, privilege and structural disadvantage affect attainment, and that teachers’ implicit biases may unwittingly magnify these inequalities in Scottish education.

Moving Forward


I have to admit, the past few weeks have been some of the most challenging and uncertain times in my experience of the teaching profession. Teachers, education leaders and the SQA are under a huge amount of pressure trying to make sense of the changes, adapt to remote teaching, reassure their pupils and secure the best outcomes for young people. Perhaps the SQA should consider offering the option of sitting the exam at a later date, as is the case in England. Maybe there is a stronger argument for getting rid of exams altogether, in the future, with a greater emphasis on coursework assessments marked by the teacher AND by external markers – marking anonymously to reduce the impact of implicit bias.


I like to believe that many of us, if not all, genuinely want the very best for young people, regardless of their gender, ASN, race, ethnicity or socioeconomic background. However, implicit bias seeps into our decision-making process whether we want it or not, especially under pressure. Therefore, I urge us all to do our best to be as inclusive as possible by paying particular attention to the learners who tend to be disadvantaged:


  • Boys

  • Black pupils

  • Gypsy-Traveller pupils

  • ASN pupils

  • Pupils with English as an Additional Language

  • Working-class pupils

  • Bangladeshi and Pakistani pupils


Bear in mind that this disadvantage can vary depending on subject area. These identities can also intersect and amplify the inequality. These are unprecedented, exceptional times for the nation, educators, pupils and their families. While staying safe, we must continue to question the biases in our decisions at present and in the future.


Now is not the time for complacency.

References:

  • Akala (2018). Natives: Race and Class in the Ruins of Empire.

  • Burgess, S. & Greaves, E. (2009). “Test Scores, Subjective Assessment and Stereotyping of Ethnic Minorities.”

  • Forscher, P., Mitamura, C., Dix, E., Cox, W. & Devine, P. (2016). “Breaking the Prejudice Habit: Mechanisms, Timecourse and Longevity.” Madison: University of Wisconsin.

  • Gillborn, D. (2008). “Racism and Education: Coincidence or Conspiracy? White Success, Black Failure.” London: Routledge.

  • Melfi, T. (2016). Hidden Figures.

  • Netto, G., Sosenko, F. & Bramley, G. (2011). Poverty and Ethnicity in Scotland: Review of the Literature and Datasets. Joseph Rowntree Foundation.

  • SQA update on arrangements for quality assurance and the certification of National Courses and Awards – a message to schools and colleges.

  • The Teacherist blog post “2020 Exams Results and Bias.”

Trying out Implicit Association Tests:
