Race also matters in proctoring software built to monitor students during remote exams. Such software does not always accurately assess people with darker skin. At the University of Wisconsin at Madison and elsewhere, students have been barred from tests, or forced to pause mid-exam, because the software failed to recognize the faces of people with darker skin. The technology itself is not racist in any intentional sense. Yet, as scholars such as Ruha Benjamin and Safiya Noble have shown, the algorithms and code structuring such technologies can perpetuate racial biases and stereotypes.