Image caption: Growing pains: children have proved difficult subjects for facial recognition, as their faces change rapidly as they grow.
Arguments against using facial recognition cameras in schools
1. Facial recognition technology is not sufficiently accurate
Opponents of the use of facial recognition technology in schools argue that it is not sufficiently accurate to be relied upon. Inaccuracies make these devices unsuitable both as a means of alerting a school that a potentially dangerous intruder is on campus and as a means of monitoring student attendance.
The American Civil Liberties Union of Northern California tested Amazon's Rekognition facial recognition system by loading it with photos of members of Congress and letting it run comparisons to arrest photos. The test resulted in 28 false matches, of which nearly 40 percent were of people of color, even though they make up only 20 percent of Congress.
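The disparity in the ACLU's test can be checked with simple arithmetic. The sketch below uses the figures reported above; the exact count of 11 false matches involving people of color is an inference from the reported "nearly 40 percent" of 28, not a figure stated in the source:

```python
# Figures from the ACLU of Northern California's Rekognition test, as reported above.
false_matches = 28            # members of Congress wrongly matched to arrest photos
poc_false_matches = 11        # inferred: "nearly 40 percent" of 28 (11/28 ≈ 39.3%)
poc_share_of_congress = 0.20  # people of color as a share of Congress

poc_share_of_errors = poc_false_matches / false_matches
overrepresentation = poc_share_of_errors / poc_share_of_congress

print(f"Share of false matches involving people of color: {poc_share_of_errors:.1%}")
print(f"Overrepresentation relative to share of Congress: {overrepresentation:.1f}x")
```

On these figures, people of color appear among the false matches at roughly twice their share of Congress, which is the disparity the ACLU highlighted.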
Joy Buolamwini, a researcher at the Massachusetts Institute of Technology, has demonstrated that for some commercial facial recognition software the accuracy rate is 99 percent when the subject is a white male; this rate drops dramatically for other racial groups and for women. For darker-skinned women the error rate rose to nearly 35 percent.
This error rate reflects the data bias of the information bank used to establish the system. The more instances of a particular race or gender the databank holds, the more accurate the system will be for that group. This means that for minority groups, or groups otherwise underrepresented in the databank, there is a far greater risk of misidentification.
Such misidentification should be less likely to occur in a roll-marking system where the databank is composed of all the children in a particular school; however, where a system is alerting a school to the presence on its campus of someone on its 'blacklist', the biases in that databank could well lead to misidentification.
It has been noted that the accuracy of the technology is reduced when the subject to be identified has been photographed in a different light or in motion. It has also been noted that children pose particular problems for facial recognition technology. This is significant whether the technology is being used to monitor student attendance or to indicate the presence of intruders. Demonstrating the potential recognition problems children pose, Apple has recently warned that its new facial recognition tool, Face ID, should not be used by children under the age of 13. The firm claims that children's faces 'may not have fully developed' and are too similar, increasing the chance that their iPhone could be unlocked by someone else.
For children, whose appearances change rapidly as they grow, the accuracy of this technology is questionable. A spokesperson for the American Civil Liberties Union of Northern California has stated, 'False positives for a student entering school or going about their day can result in traumatic interactions with law enforcement, loss of class time, disciplinary action, and potentially a criminal record.'
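The ACLU's warning about false positives follows from base rates: even a system with a low per-scan error rate will generate frequent false alerts once it scans every student every school day. The sketch below uses purely hypothetical numbers (a 1 percent false-match rate, a school of 1,000 students and a 200-day school year, none of which come from the sources above) to illustrate the arithmetic:

```python
# Hypothetical illustration only: the rate, enrolment and school-year length
# below are assumptions for the sake of the arithmetic, not cited figures.
false_match_rate = 0.01  # assumed: 1% chance a single scan wrongly flags a student
students = 1000          # assumed enrolment, each student scanned once per day
school_days = 200        # assumed number of school days per year

false_alerts_per_day = false_match_rate * students
false_alerts_per_year = false_alerts_per_day * school_days

print(f"Expected false alerts per day: {false_alerts_per_day:.0f}")
print(f"Expected false alerts per year: {false_alerts_per_year:.0f}")
```

Under these assumptions a school would face around ten false alerts every day, which is why critics argue that each false positive carries real consequences for the student involved.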
2. Facial recognition technology does not guarantee student safety
Critics of facial recognition technology as a means of ensuring student safety argue that such systems are unlikely to be effective.
In an opinion piece published on The Intercept on May 31, 2018, Ava Kofman criticised the probable ineffectiveness of the facial recognition-based surveillance system being employed within the Lockport School District, New York. Kofman stated, 'Given the nature of gun violence at schools, Lockport's purchase of surveillance technology appears inefficient and expensive. All of the major school shootings in the last five years in the U.S. have been carried out by current students or alumnae of the school in question. "These are students for whom the school wouldn't have a reason to have their face entered into the face recognition system's blacklist," explained Rachel Levinson-Waldman, a security and policing expert at the Brennan Center for Justice.'
Similar claims were made by Toni Smith-Thompson in an opinion piece published by the American Civil Liberties Union on August 15, 2018. Smith-Thompson stated, 'While the current call for increased safety against school shooters has fuelled a wave of increased surveillance, this technology does not mitigate the risk. The vast majority of school shooters are first-time offenders and would not be included in any database to prevent them from entering a school. Indeed, perpetrators who are themselves students would easily gain access to school facilities.'
Smith-Thompson further noted that facial recognition technology can be easily subverted. She wrote, 'Facial recognition technology is especially prone to sabotage: for 22 cents, you can purchase a pair of cardboard glasses to fool it.'
Researchers from Carnegie Mellon University have shown that specially designed spectacle frames can mislead state-of-the-art facial recognition software. The glasses can make the wearer disappear to such automated systems. They can also cause these systems to misidentify the wearer as someone else.
Andrew Ferguson, a law professor at the University of the District of Columbia, has claimed that surveillance companies are preying on the dread of schools and parents by selling experimental 'security theatre' systems that offer only the appearance of safer schools.
Professor Ferguson has stated, 'These companies are taking advantage of the genuine fear and almost impotence of parents who want to protect their kids and they're selling them surveillance technology at a cost that will do very little to protect them.'
3. Use of facial recognition technology can be overly restrictive
Opponents of the constant use of facial recognition technology in schools argue that this undermines students' privacy in a way that damagingly restricts their freedom, limiting their capacity to explore different sorts of behaviour because they fear being judged or punished.
Martin Chorzempa, a fellow at the Peterson Institute for International Economics, has considered the way in which surveillance can be used to shape behaviour and force compliance. Chorzempa stated, 'The whole point is that people don't know if they're being monitored, and that uncertainty makes people more obedient.' He described the approach as a panopticon (an institutional arrangement through which people are potentially watched at all times), the idea that people will follow the rules precisely because they believe they may be under observation.
It has been claimed that one of the values of privacy is that it is an opportunity to act without fear of judgement. Dr Niels Wouters, of the Microsoft Research Centre for Social Natural User Interfaces at the University of Melbourne, has stated, 'A child trying out who they are under the constant gaze of an intelligent camera could mean they become forever judged for pulling weirdo faces.' Dr Wouters is equally concerned that fear of this potential judgement means that children do not explore different types of behaviour. He has stated, 'This takes away a huge sense of freedom for these children...It's an important thing in childhood to explore boundaries.'
The New York Civil Liberties Union (NYCLU) has similarly warned of the damagingly restrictive quality of facial recognition technology used within schools. Responding to its use within the Lockport City School District, Toni Smith-Thompson of NYCLU stated, 'Schools should be safe environments for students to learn and play. They should be places where students can test out and practice ideas, interactions, and activities and be supported to make their own (safe) choices. Pervasive monitoring and collection of children's most sensitive information - including their biometric data - can turn students into perpetual suspects. It exposes every aspect of a child's life to unfair scrutiny.'
A similar point was made by two other members of the NYCLU in an article published on June 18, 2018. Stefanie Coyle and John A. Curr III warn, 'In the system Lockport purchased, once a person's facial image is captured by the technology and uploaded, the system can go back and track that person's movements around the school over the previous 60 days.
It's easy to imagine that students will feel like they are constantly under suspicion. Lockport is sending the message that it views students as unpredictable, potential criminals who must have their faces scanned wherever they go.'
Privacy advocates are concerned about the atmosphere of suspicion that ongoing facial recognition surveillance could create within schools. In a letter of protest written by the NYCLU on June 18, 2018, its authors noted, 'Lockport plans to utilize this technology in each school in the District, including elementary schools, resulting in facial imaging of four- and five-year-old children. That fact should shock the conscience of any person who cares about education. We are concerned that no one at the District...questioned the wisdom of this purchase from the perspective of school climate, or the message it sends to our young people about their futures, their relationships with adults, or their sense of belonging in their school.
Rather than protecting them, the District is treating every child as a threat; rather than human relationships, the District is relying on machines to do its job.'
4. Facial recognition technology is often installed without adequate community consultation
Critics of the use of facial recognition cameras within schools are concerned that these devices are being introduced without the consent of the whole school community and without the nature of their operation being properly explained.
The New York Civil Liberties Union (NYCLU) has investigated the manner in which facial recognition cameras were introduced into schools in the Lockport School District, New York, and has discovered a lack of consultation with parents and students before the devices were introduced. The NYCLU states, 'The Smart Schools Bond Act, which provides funding for technology in schools, includes specific requirements for engaging community stakeholders, including children, teachers and parents... [The] documents the Lockport School District provided show they held only one public meeting to introduce the community to the idea of using state money for technology in the classroom to purchase surveillance technology. After the meeting, the school moved forward with an application to acquire the facial recognition software. That meeting was in the middle of a weekday afternoon in August, when many parents are at work or out of town. There were no emails or flyers showing engagement of students and parents in the process of deciding to adopt this technology. In fact, the head of the Lockport Education Association told a reporter they were not consulted.'
Lockport resident, Jim Shultz, has created a petition asking the Lockport City School District to postpone the implementation of the facial recognition system within the district's schools until proper community consultation has taken place. Mr Shultz has complained that concerns about the cost and the relative effectiveness of the system have not been addressed and that there has been no adequate opportunity for school communities to give their view.
In a comment published in Times Union on November 6, 2018, Mr Shultz stated, 'State education officials need to halt the state funding for the Lockport plan until it is audited for its financial irregularities and its lack of community consultation, and until serious privacy protections are guaranteed.'
5. There need to be stronger laws and regulations to ensure that facial recognition technology is not misused
Critics of the use of facial recognition technology within schools argue that, for systems intended to increase control of school environments, their operation is often not properly regulated. Serious questions about the use and misuse of the data are often left unanswered, and critics are concerned that transparent protocols for its application and storage have not been established. Underlying this is the concern that in many jurisdictions there are inadequate legal guarantees to protect citizens' privacy.
The New York Civil Liberties Union (NYCLU) has expressed concern about the lack of regulation to control the use of facial recognition data collected in the Lockport School District. The NYCLU has stated, 'There are...no regulations in place to account for the serious inaccuracy of this technology, which is most likely to misidentify people of color. And there is no policy in place to limit who will have access to the data collected from the cameras that scan the faces of thousands of parents, teachers and children every day.'
A similar complaint was made by a parent, Jim Shultz, whose daughter attends Lockport High School. Mr Shultz's comment was published in Times Union, on November 6, 2018. Mr Shultz stated, 'No rules or policies exist regarding who can put a student's facial image in the system, how long the images can be held, or who has the authority to use the system to track a student's movements. Can the local police demand access to the system? Federal agencies? None of this is spelled out. State education officials approved the funds for this new surveillance system without any consideration as to how it can be used.'
On July 13, 2018, Microsoft President, Brad Smith, made a public appeal for governments to establish some regulation of facial recognition systems. In a public post on the company's blog site, Smith stated, 'It seems especially important to pursue thoughtful government regulation of facial recognition technology, given its broad societal ramifications and potential for abuse.'
In Australia, concern regarding inadequate regulation of the use of facial recognition technology in schools is underpinned by concerns regarding a more general lack of safeguards surrounding citizens' privacy. In an opinion piece published in The Conversation on October 6, 2017, Bruce Baer Arnold, Assistant Professor, School of Law, University of Canberra, stated, '[Australia is] a nation where Commonwealth, state and territory privacy law is inconsistent. That law is weakly enforced, in part because watchdogs such as the Office of the Australian Information Commissioner (OAIC) are under-resourced, threatened with closure or have clashed with senior politicians.
Australia does not have a coherent enforceable right to privacy. Instead we have a threadbare patchwork of law (including an absence of a discrete privacy statute in several jurisdictions).'
There have been similar concerns expressed about a lack of effective overarching privacy legislation in the United States. In an analysis written by Drew Harwell of the Washington Post and published on June 7, 2018, it was noted, 'No federal law restricts the use of facial-recognition technology, and only Illinois and Texas have passed laws requiring companies to get people's consent before collecting what the industry calls "faceprints." That allows local police forces, cities, employers and school boards to largely set their own policies.'
Those who are concerned about the lack of adequate laws to protect privacy argue that China's leading role in developing and using facial recognition technology is concerning. China has a more problematic attitude to the protection of individual liberties than do Western democracies. Tiffany Lee, writing for The World Post, in an opinion piece published on August 7, 2018, warned that China's relative indifference to privacy issues is likely to influence international attitudes. Lee states, 'China's rapidly advancing technology industries and massive consumer market are already influencing norms around the world. China will likely impact the way privacy is understood and protected.' Lee argues that China's increasing influence has created an even greater need for strengthened national and international digital privacy laws.