Warnings over remote proctoring data protection and the need for DPIAs, after trainee barristers' exam imbroglio.

'Utterly baffling'...

You are about to take the most important examination of your professional or academic career: some 60 seconds before you are due to start, you are asked to consent to retinal and full-face scans, with little longer than that to digest a potentially troubling privacy policy. You’re asked to maintain “eye contact” with a faceless invigilator or their software proxy for hours on end – whilst a black-box algorithm purports to make inferences about your “psychological trends, preferences, predispositions”. You're in? You may end up peeing in a bottle.

Remote proctoring technology has become the norm amid a sharp rise in distance learning – a trend accelerated by the pandemic. But deeper concerns over privacy, inclusivity, and data protection have also put a spotlight on the urgent need for organisations using such software to have conducted robust Data Protection Impact Assessments (DPIAs) – with one recent clash between trainee barristers and the organisation that regulates their training proving a salutary lesson in how badly things can go awry for those caught napping.

Exam proctoring, or the remote supervision of tests, is being widely deployed by schools, colleges and professional training courses across the UK and US. The services typically involve the use of third-party software companies such as Pearson VUE, Respondus, ProctorU, Proctorio and Examity.

Examinees are typically required to accept sharing a host of biometric and personal data in order to take tests online. In many cases a human examiner is also watching the student via their camera at all times and, controversially, may ask them to give a "tour" or full pan of their room (something that has triggered outrage and frustration from many external examiners, particularly for doctoral defences).

A flavour of the data collected: Pearson VUE's privacy policy notes that it may collect and/or disclose to third parties “inferences about preferences, characteristics, psychological trends, preferences, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes” as well as biometrics, geolocation data and details about the student's “interaction with an Internet Web site, application, or advertisement”.

"It's all opaque..."

Meg Foulkes, Director of the Open Knowledge Justice Project at non-profit the Open Knowledge Foundation, has been pushing back against the use of such software, launching a "strategic litigation" project. She told The Stack that her concerns include the opacity of algorithmic decision making (including behavioural analysis), biometric data retention, inadequate privacy policies, and more. As she put it with regard to the algorithms: “It's all opaque and it can stay opaque because this third-party controlled entity is protected by trade secrets legislation.”

Foulkes added that the discussion around data collection, and the safeguards for data gathered during remote proctoring, is thin, particularly “if you take a comparison with the kind of safeguards that exist for the police taking a fingerprint, and what kind of grounds they have to have a suspicion to invade someone's privacy.”

Foulkes had a personal encounter with these systems in 2020, when the time came for her to sit the final exam of the Bar Professional Training Course (BPTC), an exam required to become a barrister in the UK. That exam is run by the Bar Standards Board (BSB) and its remote test was facilitated by the US proctoring firm Pearson VUE. That particular sitting became a widely reported imbroglio, with software glitches meaning nearly 30% of candidates couldn't finish their exams, students forced to pee in bottles, and a raft of inclusion issues.

Foulkes refused to consent to the collection of her biometric data.

“The method of consent was a Terms and Conditions notice, tick a box, probably less than 60 seconds before one of the most difficult professional exams that exists,” Foulkes noted to The Stack. As she claimed in a blog in February 2021, students were "coerced into surrendering their privacy rights". According to the GDPR, consent must be “freely given and not imposed as a condition of operation”.

As well as challenging what she calls the "mockery of the whole mechanism of consent. It's not really consent", Foulkes noted that data collected by Pearson VUE is retained "for only the purpose of making their software better. So for them to profit, if you like, by enhancing their software on the basis of extremely sensitive biometric information.”

Her subsequent challenge to the BSB triggered an independent review of the Bar Standards Board's management of the exams. Published in May 2021, the 62-page report is a critical read for those running remote exams, highlighting both institutional failings on inclusivity and communication around data protection, as well as emphasising the critical importance of an early Data Protection Impact Assessment.

Inclusivity failures

Foulkes also recalls the case of a woman “who wears the hijab and was told to remove it because the facial recognition software wouldn't detect her face properly. She was told to remove it, a hugely significant act for a Muslim woman.” The woman asked if she could have a female examiner oversee her test via the camera, but was told they could not guarantee she would not be watched by a man.

“She was left feeling that she had the decision about whether to have a professional career or her faith; she said ‘no it's my faith I'm stepping back’”, Foulkes recalls.

The independent report adds: "We received very concerning evidence from two students with visual impairments who reported that it was they who contacted the BSB in May 2020 to enquire about how their needs would be accommodated, to be told throughout May, June and July it was for the Providers to arrange.

"In their particular cases, this should have been regarded as the highest priority because of the limitations of Pearson VUE’s systems. They cannot support screen reading software such as JAWS, a programme which allows blind and visually impaired users to read the screen with a text-to-speech output. It took weeks and weeks of proactive intervention by the students with their Providers and the BSB to reach a solution, with support of the Thomas Pocklington trust."

Independent report slates "impenetrable" data from Pearson VUE and "baffling" response.

The independent report commissioned by the BSB was published in May 2021. It drew barely any attention beyond those affected. Yet the report details valuable cautionary lessons for those using such software. It describes confusion over data protection, noting that “in the original contract between Pearson VUE and the BSB... the clauses of the contract go no further than summarise the duties and obligations of either side if they are acting as data processor or data controller. They do not specify which party is the data processor or data controller in any given situation.

"In the contract variation of 2020, there is an additional clause in respect of candidate identification validation, facial comparison, and automated processing of candidate personal data by Pearson VUE. This seems to indicate implicitly, that Pearson VUE was the data processor for these purposes rather than data controller, although this was not agreed in terms between the parties.”

(The report's authors don't pull their punches, noting that "some of the data supplied to this Review by Pearson VUE were impenetrable" and that even the company's managers admitted other figures and explanations were 'utterly baffling' after the issue was escalated.)

A Data Protection Impact Assessment (DPIA) by the BSB, as required under GDPR, forced some policy changes vis-à-vis the exams from Pearson VUE but ultimately left students “unclear as to what legal protections applied to them,” the May 2021 independent report noted, adding that “there was a considerable email debate about a term in Pearson VUE’s privacy statement in respect of the use of biometric data for ‘the purpose of further developing, upgrading, and improving our applications and systems’. This would inevitably have made Pearson VUE a data controller.

“The DPO had several email exchanges with Pearson VUE in respect of this and Pearson VUE finally agreed in writing that this would not apply to the BPTC candidates’ data. The DPO was then able to amend the Bar Council’s privacy statement on the BSB website on 22nd May 2020, which confirmed their role as data controller, but not explicitly confirming that the use of data for “the purpose of further developing” et cetera of Pearson VUE’s testing processes would not apply.

“Issues relating to the encryption of data and international data transfer were also not clarified,” the independent report adds, emphasising that the imbroglio “would have been anticipated and mitigation measures or alternative methods put in place had an effective DPIA been conducted.”

(This would not, needless to say, have mitigated all concerns about the remote proctoring data gathering and use, but would have provided clarification and mitigation for students. Pearson VUE's latest privacy policy – as updated in March 2021 – notes that "by sending information and Personal Data to us electronically, you consent to trans-border and international transmission of any data that you may choose to supply us to any country in the world, including countries without an adequate level of data protection". It contains no mention of how or whether biometric and other data is encrypted at rest.)

In the May report the BSB meanwhile states that it will improve its communication with students and training providers, make the examination process more accessible and inclusive for students and “introduce a critical incidents policy and improve data protection and project management; and clarify the roles and responsibilities of the BSB and training providers in the management of the centralised exams.”

In the final conclusion of the May report, Mark Neale, BSB Director-General, notes: “It would have been better, I think, to have bound Pearson VUE, our partners in delivering the examinations, into a project structure. We relied on the assurances we received from Pearson VUE that the processes of booking in students to testing centres, of delivering the examinations themselves, including the provision of support to students and of providing reliable management information would all work smoothly.

“In the event, they did not.”

Pearson VUE told The Stack: “Pearson VUE fully supports the ‘Independent Review of the Bar Standards Board’s management of the August 2020 sittings of the Centralised Examinations’. The in-depth findings appropriately bring attention to areas where Pearson VUE did not fully meet the expectations of the BSB or its candidate cohort. Pearson VUE is deeply sorry for any stress and frustration experienced by BSB candidates… Complex arrangements and quick decisions were made by both the BSB and Pearson VUE in difficult circumstances. As a result, aspects of our handling and communications fell short.”

Last December the BSB held a competitive tender and is currently finalising a new third-party contractor for exam proctoring. The Open Knowledge Foundation, meanwhile, is contemplating a complaint to the ICO regarding the BSB’s use of exam proctoring software. However, Foulkes notes that they are “really happy that the inquiry took so many of our recommendations on and that's a real win for us. But we have to see in terms of what the next steps are, how this kind of narrative continues.”
