Dozens of civil rights and education groups have sent a letter to the U.S. Department of Education asking it to ban the use of federal funds to purchase school surveillance technologies.

The No Tech Criminalization in Education (NOTICE) Coalition wrote in the March 18 letter that the groups are concerned about “the rapid expansion of artificial intelligence and big data technologies in K-12 public schools,” which they argue have the potential to “violate the civil and human rights of students from historically marginalized communities.”

The groups include GLSEN, a nonprofit advocacy organization focusing on LGBTQ+ students in K-12 schools; research and advocacy nonprofit Education Law Center; NAACP Legal Defense and Educational Fund, Inc.; and Teachers Unite, an independent membership organization of New York City public school educators.

Schools have increased their reliance on high-tech solutions, such as AI-powered facial- and weapons-recognition technologies, to ensure the physical safety of their students and staff. Many schools have also turned to software that monitors students' online activity in response to rising gun violence in schools and student mental health challenges that pose a risk to the school community.

“One of the things we have seen is that a lot of those COVID-era funds have been used to procure a lot of these technologies,” said Clarence Okoh, one of the leaders of the coalition and a senior policy counsel for the Center for Law and Social Policy, an anti-poverty advocacy nonprofit. “The private sector companies that sell these technologies actually market the fact that there are these federal grant programs that are available and encourage schools to leverage them.”

These “problematic” technologies have “devastating consequences for young people” and don’t necessarily improve student safety and well-being, the coalition wrote in the letter addressed to Secretary of Education Miguel Cardona; Catherine Lhamon, the assistant secretary for civil rights; and Monique Dixon, the deputy assistant secretary for policy.

Researchers have found that schools that tighten security and surveillance in response to shootings or other acts of violence may worsen long-term discipline disparities and hinder academic progress, particularly for Black students.

Student surveys suggest that surveillance technologies, such as device monitoring, can make students less likely to express themselves openly or less willing to seek support for their mental and behavioral needs, according to a 2022 report from the Center for Democracy and Technology, a nonprofit that advocates online civil liberties.

Many school districts lack the technical expertise they need to fully evaluate surveillance technologies before they use them, the letter pointed out. So far, only a few states have issued guidance around the use of AI for a variety of purposes in schools.

“Even in [that] guidance, we’re not seeing any kind of significant mention of the implications of these technologies in relation to student civil rights protections, especially as it relates to student discipline and the use of the technologies by law-enforcement officials in schools,” Okoh said. He emphasized that’s why it’s important for the Education Department to get involved.

The New York State Education Department last year permanently banned the use of facial-recognition technology in schools, making New York the first state to do so. Okoh and the NOTICE coalition said the federal Education Department should follow New York's lead.

Along with banning the use of federal funds to purchase school surveillance technologies, the coalition also asks that the Education Department study the prevalence of these technologies in public schools; issue technical guidance to help districts evaluate AI-powered technologies; and include the voices of youth and caregivers when developing policies around the use of AI technologies in schools.

Drawing the line between safety and surveillance

Studying the prevalence of AI-powered monitoring systems and offering technical guidance to districts are vital priorities, said Amelia Vance, the president of the Public Interest Privacy Center, which advocates effective, ethical, and equitable privacy safeguards for all children and students.

“We need to know more. We don’t know what has been adopted,” Vance said. “A lot of times, when it is adopted, the actual efficacy rate is not something that is accurately provided to districts.”

However, when it comes to banning “police-surveillance technologies,” Vance said it could be “difficult to define” what falls under that category.

“A longtime responsibility that I think pretty much everybody in society would say that schools have is to supervise their students,” she said. “And there’s an open question about what is the difference between supervising students and surveilling students?”
