January 24, 2021

AND THE ANSWER IS NONE. NONE MORE RACIST: “Software that monitors students during tests perpetuates inequality and violates their privacy,” and Technology Review is on it!

I’m a university librarian and I’ve seen the impacts of these systems up close. My own employer, the University of Colorado Denver, has a contract with Proctorio.

It’s become clear to me that algorithmic proctoring is a modern surveillance technology that reinforces white supremacy, sexism, ableism, and transphobia. The use of these tools is an invasion of students’ privacy and, often, a civil rights violation.

If you’re a student taking an algorithmically proctored test, here’s how it works: When you begin, the software starts recording your computer’s camera, audio, and the websites you visit. It measures your body and watches you for the duration of the exam, tracking your movements to identify what it considers cheating behaviors. If you do anything that the software deems suspicious, it will alert your professor to view the recording and provide them with a color-coded probability of your academic misconduct.
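To make the mechanism concrete, here is a toy sketch of how a pipeline like the one described above might turn tracked events into a color-coded flag for the professor. To be clear, this is not Proctorio’s (or any vendor’s) actual algorithm: the event names, weights, and thresholds below are all invented for illustration.

```python
# Hypothetical sketch of an automated proctoring "suspicion" score.
# Event names, weights, and color thresholds are invented; real
# products do not publish their scoring logic.

def suspicion_score(events):
    """Map counts of tracked behaviors to a score between 0.0 and 1.0."""
    weights = {                      # invented weights
        "face_not_detected": 0.30,   # camera can't find/validate the face
        "gaze_away": 0.10,           # eyes leave the screen
        "tab_switch": 0.25,          # student visits another website
        "audio_spike": 0.15,         # unexpected sound in the room
    }
    score = sum(weights.get(name, 0.0) * count
                for name, count in events.items())
    return min(score, 1.0)

def color_code(score):
    """Bucket the score into the color bands an instructor might see."""
    if score < 0.33:
        return "green"
    if score < 0.66:
        return "yellow"
    return "red"

flags = {"face_not_detected": 2, "tab_switch": 1}
print(color_code(suspicion_score(flags)))  # 0.85 -> "red"
```

Note what even this caricature makes visible: a face detector that repeatedly fails on a student drives the score up fastest, so any demographic skew in the underlying detection gets passed straight through to the “probability of misconduct” the professor sees.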

Depending on which company made the software, it will use some combination of machine learning, AI, and biometrics (including facial recognition, facial detection, or eye tracking) to do all of this. The problem is that facial recognition and detection have proven to be racist, sexist, and transphobic over, and over, and over again.

In general, technology has a pattern of reinforcing structural oppression like racism and sexism. Now these same biases are showing up in test proctoring software that disproportionately hurts marginalized students.

A Black woman at my university once told me that whenever she used Proctorio’s test proctoring software, it always prompted her to shine more light on her face. The software couldn’t validate her identity and she was denied access to tests so often that she had to go to her professor to make other arrangements. Her white peers never had this problem.

Similar kinds of discrimination can happen if a student is trans or non-binary. But if you’re a white cis man (like most of the developers who make facial recognition software), you’ll probably be fine.

It’s not only the software that’s racist and homophobic — it’s [Al Pacino voice] the whole damn system itself! [/Pacino voice] According to this professor, “Science, statistics, and technology are all inherently racist”:

University of Rhode Island professor and Director of Graduate Studies Erik Loomis recently claimed “science, statistics and technology” are racist.

“Science, statistics, and technology are all inherently racist because they are developed by racists who live in a racist society, whether they identify as racists or not,” Loomis tweeted in reference to a New York Times article.

“This is why I have so much contempt for those, including many liberals, who ‘just want the data.’ The data is racist!” he continued.

“You’re out of order! You’re out of order! The whole trial is out of order! They’re out of order!”
