Biden Order on AI Tackles Tech-Enabled Discrimination in Schools
Tue, 31 Oct 2023 — Updated Nov. 1

As artificial intelligence rapidly expands its presence in classrooms, President Biden signed an executive order Monday requiring federal education officials to create guardrails that prevent tech-driven discrimination. 

The executive order, which the White House called “the most sweeping actions ever taken to protect Americans from the potential risks of AI systems,” offers several directives that are specific to the education sector. The order, which addresses emerging technologies like ChatGPT, directs the Justice Department to coordinate with federal civil rights officials on ways to investigate discrimination perpetuated by algorithms.




Within a year, the education secretary must release guidance on the ways schools can use the technology equitably, with a particular focus on the tools’ effects on “vulnerable and underserved communities.” Meanwhile, an Education Department “AI toolkit” released within the next year will offer guidance on how to implement the tools so that they enhance trust and safety while complying with federal student privacy rules. 

For civil rights advocates who have decried AI’s potentially unintended consequences, the order was a major step forward. 

The order’s focus on civil rights investigations “aligns with what we’ve been advocating for over a year now,” said Elizabeth Laird, the director of equity and civic technology at the nonprofit Center for Democracy and Technology. Her group has called on the Education Department’s Office for Civil Rights to open investigations into the ways AI-enabled tools in schools could have a disparate impact on students based on their race, disability, sexual orientation and gender identity. 

“It’s really important that this office, which has been focused on protecting marginalized groups of students for literally decades, is more involved in conversations about AI and can bring that knowledge and skill set to bear on this emerging technology,” Laird told The 74.

In guidance to federal agencies on Wednesday, the Office of Management and Budget spelled out the types of AI education technologies that pose civil rights and safety risks. They include tools that detect student cheating, monitor students’ online activities, project academic outcomes, make discipline recommendations or facilitate surveillance online and in person.

An Education Department spokesperson didn’t respond to a request for comment Monday on how the agency plans to respond to Biden’s order. 

Schools nationwide have adopted artificial intelligence in divergent ways, from software that provides students individualized lessons to the growing use of chatbots like ChatGPT by both students and teachers. The technology has also generated heated debates over its role in exacerbating harms to at-risk youth, including through educators’ use of early warning systems that mine data about students — including their race and disciplinary records — to predict their odds of dropping out of school.

“We’ve heard reported cases of using data to predict who might commit a crime, so very Minority Report,” Laird said. “The bar that schools should be meeting is that they should not be targeting students based on protected characteristics unless it meets a very narrowly defined purpose that is within the government’s interests. And if you’re going to make that argument, you certainly need to be able to show that this is not causing harm to the groups that you’re targeting.” 

AI and student monitoring tools

An unprecedented degree of student surveillance has also been facilitated by AI, including online activity monitoring tools, remote proctoring software to detect cheating on tests and campus security cameras with facial recognition capabilities. 

Beyond its implications for schools, the Biden order requires certain technology companies to conduct AI safety testing before their products are released to the public and to provide their results to the government. It also orders new regulations to ensure AI won’t be used to produce nuclear weapons, recommends that AI-generated photos and videos be transparently identified as such with watermarks and calls on Congress to pass federal data privacy rules “to protect all Americans, especially kids.”

In September, the Center for Democracy and Technology released a report warning that schools’ use of AI-enabled digital monitoring tools, which track students’ behaviors online, could have a disparate impact on students — particularly LGBTQ+ youth and those with disabilities — in violation of federal civil rights laws. As teachers punish students for allegedly using ChatGPT to cheat on classroom assignments, a survey suggested that children in special education were more likely to face discipline than their general education peers. They also reported higher levels of surveillance and subsequent discipline as a result.

In response to the report, a coalition of Democratic lawmakers penned a letter urging the Education Department’s civil rights office to investigate districts that use digital surveillance and other AI tools in ways that perpetuate discrimination. 

Education technology companies that use artificial intelligence could come under particular federal scrutiny as a result of the order, said consultant Amelia Vance, an expert on student privacy regulations and president of the Public Interest Privacy Center. The order notes that the federal government plans to enforce consumer protection laws and enact safeguards “against fraud, unintended bias, discrimination, infringements on privacy and other harms from AI.” 

“Such protections are especially important in critical fields like healthcare, financial services, education, housing, law and transportation,” the order notes, “where mistakes by or misuse of AI could harm patients, cost consumers or small businesses or jeopardize safety or rights.”

Schools rely heavily on third-party vendors like education technology companies to provide services to students, and those companies are subject to Federal Trade Commission rules against deceptive and unfair business practices, Vance noted. The order’s focus on consumer protections, she said, “was sort of a flag for me that maybe we’re going to see not only continuing interest in regulating ed tech, but more specifically regulating ed tech related to AI.”

While the order was “pretty vague when it came to education,” Vance said it was important that it did acknowledge AI’s potential benefits in education, including for personalized learning and adaptive testing. 

“As much as we keep talking about AI as if it showed up in the past year, it’s been there for a while and we know that there are valuable ways that it can be used,” Vance said. “It can surface particular content, it can facilitate better connections to people when they need certain content.” 

AI and facial recognition cameras

As school districts pour billions of dollars into school safety efforts in the wake of mass school shootings, security vendors have heralded the promises of AI. Yet civil rights groups have warned that facial recognition and other AI-driven technology in schools could perpetuate biases — and could miss serious safety risks. 

Just last month, the gun-detection company Evolv Technology, which pitches its hardware to schools, acknowledged it was the subject of a Federal Trade Commission inquiry into its marketing practices. The agency is reportedly probing whether the company employs artificial intelligence in the ways that it claims. 

In September, New York became the first state to ban facial recognition in schools, a move that followed outcry when an upstate school district announced plans to roll out a surveillance camera system that tracked students’ biometric data.

A new Montana law bans facial recognition statewide with one notable exception: schools. Citing privacy concerns, the law adopted this year prohibits government agencies from using facial recognition, but includes a specific carveout for schools. One rural education system, the 250-student Sun River School District, employs a 30-camera security system from Verkada that uses facial recognition to track the identities of people on its property — roughly one camera for every eight students.

In an email on Wednesday, a Verkada spokesperson said the company is in the process of reviewing Biden’s order to understand its implications for the company.

Verkada offers a cautionary tale about the potential security vulnerabilities of campus surveillance systems. In 2021, the company suffered a massive data breach in which hackers claimed to expose the live feeds of 150,000 surveillance cameras — including those in place at Sandy Hook Elementary School in Newtown, Connecticut, the site of a mass shooting in 2012. An investigation conducted on behalf of the company found the breach was more limited, affecting some 4,500 cameras.

Hikvision has similarly made inroads in the school security market with its facial recognition surveillance cameras — including during a pandemic-era push to enforce face mask compliance. Yet the company, owned in part by the Chinese government, has also faced significant allegations of civil rights abuses and in 2019 was placed on a U.S. trade blacklist after being implicated in China’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities.

Though multiple U.S. school districts continue to use Hikvision cameras, a recent investigation found the company’s software still referenced the ability to detect ethnic minorities, despite the company claiming for years it had ended the practice.

In an email, a Hikvision spokesperson didn’t comment on how Biden’s executive order could affect its business, including in schools, but offered a letter the company shared with its customers in response to the investigation, saying an outdated reference to ethnic detection appeared on its website erroneously.

“It has been a longstanding Hikvision policy to prohibit the use of minority recognition technology,” the letter states. “As we have previously stated, that functionality was phased out and completely prohibited by the company in 2018.”

Data scientist David Riedman, who built a national database to track school shootings dating back decades, said that artificial intelligence is at “the forefront” of the school safety conversation and emerging security technologies can be built in ways that don’t violate students’ rights. 

Riedman became a figure in the national conversation about school shootings as the creator of the K12 School Shooting Database but has since taken on an additional role as director of industry research and content for ZeroEyes, a surveillance software company that uses security cameras to ferret out guns. Instead of using facial recognition, the ZeroEyes algorithm was trained to identify firearms and notify law enforcement within seconds of spotting one.

The company argues its approach — as opposed to facial recognition — can “evade privacy and bias concerns that plague other AI models,” and its internal research found that “only 0.06546% of false positives were humans detected as guns.”

“The simplicity” of ZeroEyes’ technology, Riedman said, puts the company in good standing as far as the Biden order is concerned.

“ZeroEyes isn’t looking for people at all,” he said. “It’s only looking for objects and the only objects it is trying to find, and it’s been trained to find, are images that look like guns. So you’re not getting student records, you’re not getting student demographics, you’re not getting anything related to people or even a school per se. You just have an algorithm that is constantly searching for images to see if there is something that looks like a firearm in them.”

However, false positives remain a concern. Just last week at a high school in Texas, a false alarm from ZeroEyes prompted a campus lockdown that set off student and parent fears of an active shooting. The company said the false alarm was triggered by an image of a student outside who the system believed was armed based on shadows and the way his arm was positioned.

Schools Bought Security Cameras to Fight COVID. Did it Work?
Wed, 30 Mar 2022

This story is part of a series produced in partnership with The Guardian exploring the increasing role of artificial intelligence and surveillance in our everyday lives during the pandemic, including in schools.

When students in suburban Atlanta returned to school for in-person classes amid the pandemic, they were required to cover their faces with cloth masks, as in many places across the U.S. Yet in this 95,000-student district, officials took mask compliance a step further than most.

Through a network of security cameras, officials harnessed artificial intelligence to identify students whose masks drooped below their noses. 




“If they say a picture is worth a thousand words, if I send you a piece of video — it’s probably worth a million,” said Paul Hildreth, the district’s emergency operations coordinator. “You really can’t deny, ‘Oh yeah, that’s me, I took my mask off.’”

The school district in Fulton County had installed the surveillance network, made by Avigilon, years before the pandemic shuttered schools nationwide in 2020. Under a constant fear of mass school shootings, districts in recent years have increasingly deployed controversial surveillance networks like cameras with facial recognition and gun detection.

With the pandemic, security vendors switched directions and began marketing their wares as a solution to stop the latest threat. In Fulton County, the district used Avigilon’s “No Face Mask Detection” technology to identify students with their faces exposed. 

During remote learning, the pandemic ushered in a new era of digital student surveillance as schools turned to AI-powered services like remote proctoring and activity monitoring in search of threats and mental health warning signs. Back on campus, districts have rolled out tools like badges that track students’ every move.

But one of the most significant developments has been in AI-enabled cameras. Twenty years ago, security cameras were present in 19 percent of schools, according to federal data. Today, the vast majority of schools use them. Powering those cameras with artificial intelligence makes automated surveillance possible, enabling things like temperature checks and the collection of other biometric data.

Districts across the country have said they’ve bought AI-powered cameras to fight the pandemic. But as pandemic-era protocols like mask mandates end, experts said the technology will remain. Some educators have stated plans to leverage pandemic-era surveillance tech for student discipline, while others hope AI cameras will help them identify youth carrying guns.

The cameras have faced sharp resistance from civil rights advocates, who question their effectiveness and argue they trample students’ privacy rights.

Noa Young, a 16-year-old junior in Fulton County, said she knew that cameras monitored her school but wasn’t aware of their high-tech features like mask detection. She agreed with the district’s now-expired mask mandate but felt that educators should have been more transparent about the technology in place.

“I think it’s helpful for COVID stuff but it seems a little intrusive,” Young said in an interview. “I think it’s strange that we were not aware of that.”

‘Smoke and mirrors’

Outside of Fulton County, educators have used AI cameras to fight COVID on multiple fronts. 

In Rockland, Maine’s Regional School Unit 13, officials used federal pandemic relief money to procure a network of cameras for contact tracing. Through advanced surveillance, the cameras, made by Verkada, allow the 1,600-student district to identify students who came in close contact with classmates who tested positive for COVID-19. In its marketing materials, Verkada explains how districts could use federal funds tied to the public health crisis to buy its cameras for contact tracing and crowd control.

At a district in suburban Houston, officials spent nearly $75,000 on AI-enabled cameras from Hikvision, a surveillance company owned in part by the Chinese government, and deployed thermal imaging and facial detection to identify students with elevated temperatures and those without masks.

The cameras can screen as many as 30 people at a time and are therefore “less intrusive” than slower processes, said Ty Morrow, the Brazosport Independent School District’s head of security. The checkpoints have helped the district identify students who later tested positive for COVID-19, Morrow said, although industry researchers have argued Hikvision’s claim of accurately scanning 30 people at once is not possible.

“That was just one more tool that we had in the toolbox to show parents that we were doing our due diligence to make sure that we weren’t allowing kids or staff with COVID into the facilities,” he said.  

Yet it’s this mentality that worries consultant Kenneth Trump, the president of Cleveland-based National School Safety and Security Services. Security hardware for the sake of public perception, the industry expert said, is simply “smoke and mirrors.”

“It’s creating a façade,” he said. “Parents think that all the bells and whistles are going to keep their kids safer and that’s not necessarily the case. With cameras, in the vast majority of schools, nobody is monitoring them.”

‘You don’t have to like something’

When the Fulton County district upgraded its surveillance camera network in 2018, officials were wooed by Avigilon’s AI-powered “Appearance Search,” which allows security officials to sift through a mountain of video footage and identify students based on characteristics like their hairstyle or the color of their shirt. When the pandemic hit, the company’s mask detection became an attractive add-on, Hildreth said.

He said the district didn’t actively advertise the technology to students, but said they likely became aware of it quickly after classmates were called out for breaking the rules. He doesn’t know students’ opinions about the cameras — and didn’t seem to care.

“I wasn’t probably as much interested in their reaction as much as their compliance,” Hildreth said. “You don’t have to like something that’s good for you, but you still need to do it.”

A Fulton County district spokesman said they weren’t aware of any instances where students were disciplined because the cameras caught them without masks. 

After the 2018 mass school shooting in Parkland, Florida, Athena Security pitched its cameras with AI-powered “gun detection” as a promising school safety strategy. Similar to facial recognition, the gun detection system uses artificial intelligence to spot when a weapon enters a camera’s field of view. By identifying people with guns before shots are fired, the service is “like Minority Report but in real life,” a company spokesperson wrote in an email at the time, referring to the science-fiction film that depicts a dystopian future of mass surveillance. During the pandemic, the company rolled out thermal cameras that a company spokesperson wrote in an email could “accurately pre-screen 2,000 people per hour.”

The spokesperson declined an interview request but said in an email that Athena is “not a surveillance company” and did not want to be portrayed as “spying on” students. 

Among the school security industry’s staunchest critics is Sneha Revanur, a 17-year-old high school student from San Jose, California, who founded the youth advocacy group Encode Justice to highlight the dangers of artificial intelligence on civil liberties.

Revanur said she’s concerned by districts’ decisions to implement surveillance cameras as a public health strategy and that the technology in schools could result in harsher discipline for students, particularly youth of color. 



Verkada offers a cautionary tale about the potential harms of pervasive school surveillance and student data collection. Last year, a hack exposed the live feeds of 150,000 surveillance cameras, including those inside Tesla factories, jails and at Sandy Hook Elementary School in Newtown, Connecticut. The Newtown district, which suffered a mass school shooting in 2012, said no compromising information about students was exposed. The breach dissuaded some educators from contracting with the California-based company.

After a back-and-forth with the Verkada spokesperson, the company would not grant an interview or respond to a list of written questions. 

Revanur called the Verkada hack at Sandy Hook Elementary a “staggering indictment” of educators’ rush for “dragnet surveillance systems that treat everyone as a constant suspect” at the expense of student privacy. Constant monitoring, she argued, “creates this culture of fear and paranoia that truly isn’t the most proactive response to gun violence and safety concerns.” 

In Fayette County, Georgia, the district spent about $500,000 to purchase 70 Hikvision cameras with thermal imaging to detect students with fevers. But it later disabled them over concerns about their efficacy and Hikvision’s ties to the Chinese government. In 2019, the U.S. government placed the company on a trade blacklist, alleging it was implicated in China’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities.

The school district declined to comment. In a statement, a Hikvision spokesperson said the company “takes all reports regarding human rights very seriously” and has engaged governments globally “to clarify misunderstandings about the company.” The company is “committed to upholding the right to privacy,” the spokesperson said.

Meanwhile, Regional School Unit 13’s decision to use Verkada security cameras as a contact tracing tool could run afoul of restrictions on the use of facial recognition in Maine schools. The district didn’t respond to requests for comment.

Michael Kebede, the ACLU of Maine’s policy counsel, cited recent studies on facial recognition’s flaws in identifying people of color and women, and called on the district to reconsider its approach.

“We fundamentally disagree that using a tool of mass surveillance is a way to promote the health and safety of students,” Kebede said in a statement. “It is a civil liberties nightmare for everyone, and it perpetuates the surveillance of already marginalized communities.”

Security officials at the Brazosport Independent School District in suburban Houston use AI-enabled security cameras to screen educators for elevated temperatures. District leaders mounted the cameras to carts so they could be used in various locations across campus. (Courtesy Ty Morrow)

White faces

In Fulton County, school officials wound up disabling the face mask detection feature in cafeterias because it was triggered by people eating lunch. Other times, it identified students who pulled their masks down briefly to take a drink of water. 

In suburban Houston, Morrow ran into similar hurdles. When white students wore light-colored masks, for example, the face detection sounded alarms. And if students rode bikes to school, the cameras flagged their elevated temperatures. 

“We’ve got some false positives but it was not a failure of the technology,” Hildreth said. “We just had to take a look and adapt what we were looking at to match our needs.”

With those lessons learned, Hildreth said he hopes to soon equip Fulton County campuses with AI-enabled cameras that identify students who bring guns to school. He sees a future where algorithms identify armed students “in the same exact manner” as Avigilon’s mask detection. 

In a post-pandemic world, Albert Fox Cahn, founder of the nonprofit Surveillance Technology Oversight Project, worries the entire school security industry will take a similar approach. In February, educators in Waterbury, Connecticut, rolled out a new network of campus surveillance cameras with weapons detection.

“With the pandemic hopefully waning, we’ll see a lot of security vendors pivoting back to school shooting rhetoric as justification for the camera systems,” he said. Due to the potential for errors, Cahn called the embrace of AI gun detection “really alarming.” 

Disclosure: This story was produced in partnership with The Guardian. It is part of a reporting series supported by the Open Society Foundations, which works to build vibrant and inclusive democracies whose governments are accountable to their citizens. All content is editorially independent and overseen by Guardian and 74 editors.
