Gaggle – The 74, America’s Education News Source

‘Spy High:’ Amazon Documentary Probes Dangers of Online Student Surveillance
Mon, 21 Apr 2025

It all began with a pixelated image of a Mike and Ike: the colorful, fruity candy that, with a digital blur and authorities’ preconceived notions, could perhaps be mistaken for a pill.

That’s what happened to 15-year-old Blake Robbins, who was accused by officials in Pennsylvania’s affluent Lower Merion School District of dealing drugs in 2009 after they surreptitiously snapped a photo of him at home with the chewy candy in his hand. The moment was captured by the webcam on his school-issued laptop, one of some 66,000 covert student images collected by the district, including one of Robbins asleep in his bed. 

Robbins sued, and the subsequent case, dubbed “WebcamGate,” is at the center of “Spy High,” a documentary now streaming on Amazon Prime that examines the high-profile student surveillance scandal and the explosion of student privacy threats that followed it.




The Lower Merion School District, which settled the class-action lawsuit, was an early adopter of one-to-one education technology programs that provide school-issued laptops to students. Such programs have since proliferated, particularly since the pandemic. So, too, have digital surveillance tools like Gaggle and GoGuardian, which alert educators when students express thoughts of self-harm or discuss topics deemed taboo, like sex, violence or drugs. 

Directed by Jody McVeigh-Schultz and executive produced by Mark Wahlberg, the documentary offers a cautionary tale about what happens when student monitoring initiatives — often intended to promote young people’s safety and well-being — go awry. It also explores how covert student surveillance intersects with far-reaching school equity issues involving race, disability, privilege and discipline. 

After years of reporting on digital student surveillance myself, I caught up last week with McVeigh-Schultz, whose other documentaries include a series about reality TV’s seemingly wholesome Duggar family and an Emmy-nominated film that delves into the brutal 1960 killing of three women in an Illinois state park. We talked about what he wants viewers to take away from the Robbins scandal 15 years after it unfolded and the lessons it holds for contemporary student privacy debates and schools’ growing reliance on ed tech. 

The interview was edited for length and clarity. 

What motivated you to take a deep dive into the Robbins case, and why is it important right now?

I grew up just outside of Philly in a suburb called Cheltenham and I had heard about this story. I knew Lower Merion as the high school that Kobe [Bryant] went to. That’s what it was famous for, but I knew about the Robbins story and I was like, “That’s crazy,” when I heard about it back in 2010 and then I kind of never heard anything more about it. It was a really big story and then just kind of went away. 

When we talked to folks from wealthy suburbs outside of Philadelphia, it became very clear to me that one of the key indicators of status is education. It’s more important than anything else to people. 

The public schools in Lower Merion are really highly rated and people care a ton about the quality of the education and the image of the institution. What are the real world implications of that? 

In this case, the way it played out, some of the things that happened were counterintuitive. Many folks from that community didn’t want to see a lawsuit come to bear against their school. It was like, “Oh well, you know, this actually is perhaps going to affect our home values,” if you’re selling your home and the biggest selling point is the quality of the education.

Blake Robbins, then a high school student in Pennsylvania’s affluent Lower Merion School District, speaks to the press about his 2010 lawsuit alleging covert digital surveillance by educators. (Unrealistic Ideas)

That’s something that you wouldn’t expect to be one of the first reactions to finding out that the schools may be surveilling your kids. But it was, and the fact that the Robbins family had lived in the community for a long time but just weren’t considered part of the in-group just because of who they were was very interesting and, I think, led to people being skeptical of them.

The documentary leaves it up to you to decide whether that skepticism is deserved or not.

Absolutely. The documentary certainly highlights how people are complex and have complicated stories.

What did you learn about debates over personal privacy, especially when it comes to information about children?

People’s expectations of how much privacy you should be afforded, and how much you should expect without having to ask any questions, those expectations vary a lot. 

Somebody who was interviewed in a news piece that ran in 2010 said, “You know, this is the school district’s laptop, they could tap in at any time and rightfully so.” I’m a parent, I have a 2-year-old and a 7-year-old who’s in first grade. To me, that seems a bit absurd, but the truth is, I think there are certain contexts where a school-issued laptop is going to be surveilled. We know it’s going to be surveilled, but we don’t expect that it will be able to take pictures in our kids’ bedrooms. 

To me it’s a matter of where are [the] spaces where we should reasonably expect privacy? Transparency is the most important aspect of all of this. There were no conversations going on like, “Hey look, these laptops are going to be surveilled in a number of ways. You should not be leaving them open in your bedroom. You should not be going on any website you wouldn’t want your principal to also see.” On the contrary, the IT department specifically thought it would be a bad idea if parents and students were alerted to the existence of the software that could take images. They felt like, “Well then we won’t be able to recover the stolen laptop because people will just put tape over it.”

Well, that is their decision not to have images taken of them in their bedroom, right? One of the journalists we interviewed said it was like trying to kill a fly with a bazooka. This level of surveillance was not required to track inventory. It just wasn’t. 

Hindsight is 20/20 but it’s obvious from what transpired that they spent a lot more money on legal fees and settling these lawsuits than they ever saved by making sure a handful of laptops were not stolen or lost.

What did you learn about the motives of the school district officials, the lawyers and the families involved?

When I’m making a documentary I’m never thinking in terms of quote-unquote good guys and bad guys. Everyone in this story thought they were doing what was best for the students involved. But in the end, I think there was this balance of protecting students’ privacy and protecting the image of the school district. When a mistake is made, there is a reluctance to admit it, take responsibility and accept blame. Once you do that, you are admitting to what happened and then there are all these legal ramifications. 

Multiple people are like, you know, these kids need therapists, they need somebody to check on them and to be like, “Hey, your privacy was violated, are you doing OK?” and that did not happen.

I can’t say why that didn’t happen but to me it seems likely that part of not offering people help is that the minute you say this person needs a therapist because of what we did, you’re admitting to a pretty major violation. 

The documentary doesn’t focus just on the Robbins case. It offers a deep dive into education policy debates around racial inequities, school integration, gender equality and LGBTQ+ rights. What did you find were the implications of surveillance for these populations? 

We talked to Elizabeth Laird at the Center for Democracy and Technology and one of the things she said she sees all the time is that when surveillance is ubiquitous and regularly used in education, vulnerable populations end up feeling the brunt of the negative repercussions. 

In this case, back in 2010, people discovered that a disproportionate amount of the students that were surveilled were African American. There was a sense that if this technology was being misused to discipline students or to check up on students then the chances are it was going to be misused for somebody that was a student of color. 

When we started talking to students of color who had their images taken, we started to understand, “Oh, there is this whole context to what they’re experiencing.” Somebody said you can’t understand the laptop issue without understanding all these other battles that were happening at the time. There was a history of an achievement gap there and African-American parents felt like if you wanted to get an equal education for your kids, you had to fight for it. In this context, there was a real lack of trust of the school district by African-American parents. 

Keron Williams and his mother really wanted to tell his story. It was a story of somebody suspecting him of stealing a bracelet and him being brought into the principal’s office. He says his laptop webcam was activated a couple days later after they searched his pockets and found nothing but a Boy Scouts handkerchief. 

There’s racial profiling but also this idea of the misuse of technology meant to keep laptops from being stolen. If something like this is misused, vulnerable populations are going to feel the brunt of it more. 

That brings me to one of the other stories we talked about, which was more recent. 

In 2020, with the pandemic, school-issued devices and remote learning became the norm. We talked to two students who started high school online, went to classes on Zoom, and they were using their school-issued laptops for everything. 

The way they communicated instead of seeing their friends at lunch was through a Google Hangouts chat. What they didn’t realize was their school was using monitoring software that essentially scooped up everything they wrote while logged into their school account, including private chats. They were brought to the principal’s office and were confronted with what they wrote. 

The context of it is that the school decided it was bullying. What we reveal is that they were using the word “gay” because they were. The term they used was “we’re a pretty gay friend group. Gay was a descriptor to us.”

One of these kids had to come out in the principal’s office with his father there. Luckily his parents were pretty great about it, but that’s a really awful position to put a kid in and, you know, again, a vulnerable population bearing the brunt of overzealous surveillance. 

The goal of this surveillance is to protect kids, it’s to make sure kids aren’t hurting themselves, hurting other students. There’s obviously a mental health crisis going on in terms of high school-aged kids, but there really has to be a discussion about whether these tactics are making the mental health crisis better or worse. 

You’re talking about the tools that schools nationally have increasingly used to collect and analyze reams of information about students in the name of keeping them safe. This includes tools like Gaggle and GoGuardian. Given the growth in these tools, do any guardrails need to be put in place? 

First of all, it’s so important that students know what is being used to surveil any device they’re using. The fact that kids hadn’t heard of Gaggle is really a problem. 

But if they know about it, that doesn’t solve all the problems because what you’re asking high schoolers especially to do is to find their own voice, understand how to freely express themselves, to be vulnerable. In some of my best creative writing courses my teachers were saying, “Look, if it scares you to write this, you’re probably going in the right direction.” 

The minute a kid realizes, “Well, everything that I’m writing in a creative writing class — a poem, a personal essay — is going through this software, maybe going to my principal, maybe going to law enforcement,” they’re going to express themselves differently. That’s just a really dangerous road to go down.

Students and parents have to be aware, but also I just think it should be less powerful. I don’t think we should be able to say there are no ways in which you can use our technology, which is kind of unavoidable if you’re a high school student, without being constantly surveilled.

In Minnesota, the story we cover, the state stepped in to rein this in. That’s a pretty huge step, and I think that’ll happen more and more as people become more aware of this stuff. 

There are just places where we should not be allowing this.

Computer Programs Monitor Students’ Every Word in the Name of Safety
Sat, 26 Oct 2024

This article was originally published in Stateline.

Whether it’s a research project on the Civil War or a science experiment on volcano eruptions, students in the Colonial School District near Wilmington, Delaware, can look up just about anything on their school-provided laptops.

But in one instance, an elementary school student searched “how to die.”

In that case, Meghan Feby, an elementary school counselor in the district, got a phone call through a platform called GoGuardian Beacon, whose algorithm flagged the phrase. The system, sold by educational software company GoGuardian, allows schools to monitor and analyze what students are doing on school-issued devices and flag any activities that signal a risk of self-harm or threats to others.




The student who had searched “how to die” did not want to die and showed no indicators of distress, Feby said — the student was looking for information but in no danger. Still, she values the program.

“I’ve gotten into some situations with GoGuardian where I’m really happy that they came to us and we were able to intervene,” Feby said.

School districts across the country have widely adopted such computer monitoring platforms. With the youth mental health crisis worsened by the COVID-19 pandemic and school violence affecting more K-12 students nationwide, teachers are desperate for a solution, experts say.

But critics worry about the lack of transparency from companies that have the power to monitor students and choose when to alert school personnel. Constant student surveillance also raises concerns regarding student data, privacy and free speech.

While available for more than a decade, the programs saw a surge in use during the pandemic as students transitioned to online learning from home, said Jennifer Jones, a staff attorney at the Knight First Amendment Institute.

“I think because there are all kinds of issues that school districts have to contend with — like student mental health issues and the dangers of school shootings — I think they [school districts] just view these as cheap, quick ways to address the problem without interrogating the free speech and privacy implications in a more thoughtful way,” Jones said.

According to the most recent youth risk behavior survey from the federal Centers for Disease Control and Prevention, nearly all indicators of poor mental health, suicidal thoughts and suicidal behaviors increased from 2013 to 2023. During the same period, the percentage of high school students who were threatened or injured at school, missed school because of safety concerns or experienced forced sex increased, according to the CDC data.

And the threat of school shootings remains on many educators’ minds. Since the Columbine High School shooting in 1999, more than 383,000 students have experienced gun violence at school, according to The Washington Post’s database.

GoGuardian CEO Rich Preece told Stateline that about half of the K-12 public schools in the United States have installed the company’s platforms.

As her school’s designee, Feby gets an alert when a student uses certain search terms or combinations of words on their school-issued laptops. “It will either come to me as an email, or, if it is very high risk, it comes as a phone call.”

Once she’s notified, Feby will decide whether to meet with the student or call the child’s home. If the system flags troubling activity outside of school hours, GoGuardian Beacon contacts another person in the county — including law enforcement, in some school districts.

Feby said she’s had some false alarms. One student was flagged because of the song lyrics she had looked up. Another one had searched for something related to anime.

About a third of the students in Feby’s school come from a home where English isn’t their first language, so students often use worrisome English terms inadvertently. Kids can also be curious, she said.

Still, having GoGuardian in the classroom is important, Feby said. Before she became a counselor 10 years ago, she was a school teacher. And after the 2012 Sandy Hook Elementary School mass shooting, she realized school safety was more important than ever.

Data and privacy

Teddy Hartman, GoGuardian’s head of privacy, taught high school English literature in East Los Angeles and was a school administrator before joining the technology company about four years ago.

Hartman was brought to GoGuardian to help with creating a robust privacy program, he said, including guardrails on its use of artificial intelligence.

“We thought, ‘How can we co-create with educators, the best of the data scientists, the best of the technologists, while also remembering that students and our educators are first and foremost?’” Hartman said.

GoGuardian isn’t using any student data outside of the agreements that school districts have allowed, and that data isn’t used to train the company’s AI, Hartman said. Companies that regulate what children can do online are also required to adhere to federal rules regarding the safety and privacy of minors, including the Family Educational Rights and Privacy Act and the Children’s Online Privacy Protection Rule.

But privacy experts are still concerned about just how much access these types of companies should have to student data.

School districts across the country are spending hundreds of thousands of dollars on contracts with some of the leading computer monitoring vendors — including GoGuardian, Gaggle and others — without fully assessing the privacy and civil rights implications, said Clarence Okoh, a senior attorney at the Center on Privacy and Technology at the Georgetown University Law Center.

In 2021, while many schools were just beginning to see the effects of online learning, The 74, a nonprofit news outlet covering education, published an investigation into how Gaggle was operating in Minneapolis schools. Hundreds of documents revealed how students at one school system were subject to constant digital surveillance long after the school day was over, including at home, the outlet reported.

That level of pervasive surveillance can have far-reaching implications, Okoh said. For one, in jurisdictions where legislators have expanded censorship of “divisive concepts” in schools, including critical race theory and LGBTQ+ themes, the ability for schools to monitor conversations including those terms is concerning, he said.

A report by the Electronic Frontier Foundation, a nonprofit digital rights group based in San Francisco, illustrates what kinds of keyword triggers are blocked or flagged for administrators. In one example, GoGuardian had flagged a student for visiting the text of a Bible verse including the word “naked,” the report said. In another instance, a Texas House of Representatives site with information regarding “cannabis” bills was flagged.
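Hits like these are what context-free keyword matching produces by design: the scanner sees only that a listed word appears somewhere on a page, not what the page is. A minimal sketch of the mechanism (the keyword list and matching rule are assumptions for illustration; vendors’ real rules are not public):

```python
import re

# Illustrative keyword list; real vendor lists are far larger and not public.
KEYWORDS = {"naked", "cannabis"}

def flagged_terms(page_text: str) -> set[str]:
    """Context-free matching: any occurrence of a listed word fires,
    whether the page is scripture, legislation or anything else."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    return KEYWORDS & words

# Both kinds of benign pages described in the report fire under this scheme.
verse = "And they were both naked, the man and his wife, and were not ashamed."
bills = "Texas House: pending bills relating to the regulation of cannabis."
```

The false positives are not a bug in the matcher; they are the cost of a design that has no notion of context, which is why the flagged examples above are exactly what such systems produce.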

GoGuardian and Gaggle both also dropped LGBTQ+ terms from their keyword lists after the foundation’s initial records request, the group said.

But getting a full understanding of the way these companies monitor students is challenging because of a lack of transparency, Jones said. It’s difficult to get information from private tech companies, and the majority of their data isn’t made public, she said.

Do they work?

Years before the 2022 shooting at Robb Elementary School in Uvalde, Texas, the school district purchased a technology service to monitor what students were doing on social media, according to newspaper reporting. The district sent two payments to the Social Sentinel company totaling more than $9,900, according to the paper.

While the cost varies, some school districts are spending hundreds of thousands of dollars on online monitoring programs. Muscogee County School District in Georgia paid $137,829 in initial costs to install GoGuardian on the district’s Chromebooks. In Maryland, Montgomery County Public Schools discontinued the software for the 2024-2025 school year after spending $230,000 annually on it, according to the Wootton Common Sense.

Despite the spending, there’s no way to prove that these technologies work, said Chad Marlow, a senior policy counsel at the American Civil Liberties Union who authored a report on education surveillance programs.

In 2019, Bark, a content monitoring platform, claimed to have helped prevent 16 school shootings in an announcement describing its Bark for Schools program. The Gaggle company website says it helped save 5,790 lives between 2018 and 2023.

These data points are measured by the number of alerts the systems generate that indicate a student may be very close to harming themselves or others. But there is little evidence that this kind of school safety technology is effective, according to the ACLU report.

“You cannot use data to say that, if there wasn’t an intervention, something would have happened,” Marlow said.

Computer monitoring programs are just one example of an overall increase in school surveillance nationwide, including cameras, facial recognition technology and more. And increased surveillance does not necessarily deter harmful conduct, Marlow said.

“A lot of schools are saying, ‘You know what, we’ve got $50,000 to spend, I’m going to spend it on a student surveillance product that doesn’t work, instead of a door that locks or a mental health counselor,’” Marlow said.

Some experts are advocating for more mental health resources, including hiring more guidance counselors, and school policies that support mental health, which could prevent violence or suicide, Jones said. Community programs, including volunteer work or community events, also can contribute to emotional and mental well-being.

But that’s in an ideal world, GoGuardian’s Hartman said. Computer monitoring platforms aren’t the only solution for solving the youth mental health and violence epidemic, but they aim to help, he said.

“We were founded by engineers,” Hartman said. “So, in our slice of this world, is there something we can do, from a school technology perspective that can help by being a tool in the toolbox? It’s not an end-all, be-all.”

Stateline is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501c(3) public charity. Stateline maintains editorial independence. Contact Editor Scott S. Greenberger with questions: info@stateline.org.

Biden Order on AI Tackles Tech-Enabled Discrimination in Schools
Tue, 31 Oct 2023 (Updated Nov. 1)

As artificial intelligence rapidly expands its presence in classrooms, President Biden signed an executive order Monday requiring federal education officials to create guardrails that prevent tech-driven discrimination. 

The executive order, which the White House called “the most sweeping actions ever taken to protect Americans from the potential risks of AI systems,” offers several directives that are specific to the education sector. The order, which deals with emerging technologies like ChatGPT, directs the Justice Department to coordinate with federal civil rights officials on ways to investigate discrimination perpetuated by algorithms. 




Within a year, the education secretary must release guidance on the ways schools can use the technology equitably, with a particular focus on the tools’ effects on “vulnerable and underserved communities.” Meanwhile, an Education Department “AI toolkit” released within the next year will offer guidance on how to implement the tools so that they enhance trust and safety while complying with federal student privacy rules. 

For civil rights advocates who have decried AI’s potentially unintended consequences, the order was a major step forward. 

The order’s focus on civil rights investigations “aligns with what we’ve been advocating for over a year now,” said Elizabeth Laird, the director of equity and civic technology at the nonprofit Center for Democracy and Technology. Her group has called on the Education Department’s Office for Civil Rights to open investigations into the ways AI-enabled tools in schools could have a disparate impact on students based on their race, disability, sexual orientation and gender identity. 

“It’s really important that this office, which has been focused on protecting marginalized groups of students for literally decades, is more involved in conversations about AI and can bring that knowledge and skill set to bear on this emerging technology,” Laird told The 74. 

In guidance to federal agencies on Wednesday, the Office of Management and Budget spelled out the types of AI education technologies that pose civil rights and safety risks. They include tools to detect student cheating, monitor their online activities, project academic outcomes, make discipline recommendations or facilitate surveillance online and in-person.

An Education Department spokesperson didn’t respond to a request for comment Monday on how the agency plans to respond to Biden’s order. 

Schools nationwide have adopted artificial intelligence in divergent ways, including in software that provides students individualized lessons and with the growing use of chatbots like ChatGPT by both students and teachers. The technology has also generated heated debates over its role in exacerbating harms to at-risk youth, including educators’ use of early warning systems that mine data about students — including their race and disciplinary records — to predict their odds of dropping out of school. 

“We’ve heard reported cases of using data to predict who might commit a crime, so very Minority Report,” Laird said. “The bar that schools should be meeting is that they should not be targeting students based on protected characteristics unless it meets a very narrowly defined purpose that is within the government’s interests. And if you’re going to make that argument, you certainly need to be able to show that this is not causing harm to the groups that you’re targeting.” 

AI and student monitoring tools

An unprecedented degree of student surveillance has also been facilitated by AI, including online activity monitoring tools, remote proctoring software to detect cheating on tests and campus security cameras with facial recognition capabilities. 

Beyond its implications for schools, the Biden order requires certain technology companies to conduct AI safety testing before their products are released to the public and to provide their results to the government. It also orders new regulations to ensure AI won’t be used to produce nuclear weapons, recommends that AI-generated photos and videos be transparently identified as such with watermarks and calls on Congress to pass federal data privacy rules “to protect all Americans, especially kids.”

In September, The Center for Democracy and Technology released a report that warned that schools’ use of AI-enabled digital monitoring tools, which track students’ behaviors online, could have a disparate impact on students — particularly LGBTQ+ youth and those with disabilities — in violation of federal civil rights laws. As teachers punish students for using ChatGPT to allegedly cheat on classroom assignments, a survey suggested that children in special education were more likely to face discipline than their general education peers. They also reported higher levels of surveillance and subsequent discipline as a result. 

In response to the report, a coalition of Democratic lawmakers penned a letter urging the Education Department’s civil rights office to investigate districts that use digital surveillance and other AI tools in ways that perpetuate discrimination. 

Education technology companies that use artificial intelligence could come under particular federal scrutiny as a result of the order, said consultant Amelia Vance, an expert on student privacy regulations and president of the Public Interest Privacy Center. The order notes that the federal government plans to enforce consumer protection laws and enact safeguards “against fraud, unintended bias, discrimination, infringements on privacy and other harms from AI.” 

“Such protections are especially important in critical fields like healthcare, financial services, education, housing, law and transportation,” the order notes, “where mistakes by or misuse of AI could harm patients, cost consumers or small businesses or jeopardize safety or rights.”

Schools rely heavily on third-party vendors like education technology companies to provide services to students, and those companies are subject to Federal Trade Commission rules against deceptive and unfair business practices, Vance noted. The order’s focus on consumer protections, she said, “was sort of a flag for me that maybe we’re going to see not only continuing interest in regulating ed tech, but more specifically regulating ed tech related to AI.”

While the order was “pretty vague when it came to education,” Vance said it was important that it did acknowledge AI’s potential benefits in education, including for personalized learning and adaptive testing. 

“As much as we keep talking about AI as if it showed up in the past year, it’s been there for a while and we know that there are valuable ways that it can be used,” Vance said. “It can surface particular content, it can facilitate better connections to people when they need certain content.” 

AI and facial recognition cameras

As school districts pour billions of dollars into school safety efforts in the wake of mass school shootings, security vendors have heralded the promises of AI. Yet civil rights groups have warned that facial recognition and other AI-driven technology in schools could perpetuate biases — and could miss serious safety risks. 

Just last month, the gun-detection company Evolv Technology, which pitches its hardware to schools, acknowledged it was the subject of a Federal Trade Commission inquiry into its marketing practices. The agency is reportedly probing whether the company employs artificial intelligence in the ways that it claims. 

In September, New York became the first state to ban facial recognition in schools, a move that followed outcry when an upstate school district announced plans to roll out a surveillance camera system that tracked students’ biometric data. 

A new Montana law bans facial recognition statewide with one notable exception — schools. Citing privacy concerns, the law adopted this year prohibits government agencies from using facial recognition, but with a specific carveout for schools. One rural education system, the 250-student Sun River School District, employs a 30-camera security system from Verkada that uses facial recognition to track the identities of people on its property. As a result, the district has roughly one camera for every eight students. 

In an email on Wednesday, a Verkada spokesperson said the company is in the process of reviewing Biden’s order to understand its implications for the company.

Verkada offers a cautionary tale about the potential security vulnerabilities of campus surveillance systems. In 2021, the company suffered a massive data breach and hackers claimed to expose the live feeds of 150,000 surveillance cameras — including those in place at Sandy Hook Elementary School in Newtown, Connecticut, the site of a mass shooting in 2012. An investigation conducted on behalf of the company found the breach was more limited, affecting some 4,500 cameras.

Hikvision has similarly made inroads in the school security market with its facial recognition surveillance cameras — including during a pandemic-era push to enforce face mask compliance. Yet the company, owned in part by the Chinese government, has also faced significant allegations of civil rights abuses and in 2019 was placed on a U.S. trade blacklist after being implicated in the country’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities. 

Though multiple U.S. school districts continue to use Hikvision cameras, a recent investigation found the company’s software could still detect ethnic minorities, despite the company claiming for years it had ended the practice.

In an email, a Hikvision spokesperson didn’t comment on how Biden’s executive order could affect its business, including in schools, but offered a letter the company shared with its customers in response to the investigation, saying an outdated reference to ethnic detection appeared on its website erroneously.

“It has been a longstanding Hikvision policy to prohibit the use of minority recognition technology,” the letter states. “As we have previously stated, that functionality was phased out and completely prohibited by the company in 2018.”

Data scientist David Riedman, who built a national database to track school shootings dating back decades, said that artificial intelligence is at “the forefront” of the school safety conversation and emerging security technologies can be built in ways that don’t violate students’ rights. 

Riedman became a prominent figure in the national conversation about school shootings as the creator of the K12 School Shooting Database but has since taken on an additional role as director of industry research and content for ZeroEyes, a surveillance software company that uses security cameras to ferret out guns. Instead of relying on facial recognition, the ZeroEyes algorithm was trained to identify firearms and notify law enforcement within seconds of spotting one. 

The company’s focus on object detection — as opposed to facial recognition — can “evade privacy and bias concerns that plague other AI models,” and internal research found that “only 0.06546% of false positives were humans detected as guns.” 

“The simplicity” of ZeroEyes’ technology, Riedman said, puts the company in good standing as far as the Biden order is concerned.

“ZeroEyes isn’t looking for people at all,” he said. “It’s only looking for objects and the only objects it is trying to find, and it’s been trained to find, are images that look like guns. So you’re not getting student records, you’re not getting student demographics, you’re not getting anything related to people or even a school per se. You just have an algorithm that is constantly searching for images to see if there is something that looks like a firearm in them.”

However, false positives remain a concern. Just last week at a high school in Texas, a false alert from ZeroEyes prompted a campus lockdown that set off student and parent fears of an active shooter. The company said the false alarm was triggered by an image of a student outside who the system believed was armed, based on shadows and the way his arm was positioned. 

Exclusive: Dems Urge Federal Action on Student Surveillance Citing Bias Fears /article/exclusive-dems-urge-federal-action-on-student-surveillance-citing-discrimination-fears/ Thu, 19 Oct 2023 18:01:00 +0000 /?post_type=article&p=716619 A coalition of Democratic lawmakers on Thursday called on the U.S. Education Department to investigate school districts that use digital surveillance and other artificial intelligence tools in ways that trample students’ civil rights. 

In a letter to the agency, the coalition expressed concerns that AI-enabled student monitoring tools could foster discrimination against marginalized groups, including LGBTQ+ youth and students with disabilities. The Education Department’s Office for Civil Rights should issue guidance on the appropriate uses of emerging classroom technologies, the lawmakers wrote, and crack down on practices that run afoul of existing federal anti-discrimination laws. 

“While the expansion of educational technology helped facilitate remote learning that was critical to students, parents and teachers during the pandemic,” the lawmakers wrote, “these technologies have also amplified student harms.” 


Lawmakers asked the Education Department’s civil rights office whether it has received complaints alleging discrimination facilitated by education technology software and whether it has taken any enforcement action related to potential civil rights violations. 

The letter comes in response to a recent national survey of educators, parents and students, the findings of which suggest that schools’ use of digital tools to monitor children online has disparate impacts on students based on their race, disability, sexual orientation and gender identity. The survey, conducted by the nonprofit Center for Democracy and Technology, found that while activity monitoring has become ubiquitous in schools and is intended to keep students safe, it’s used regularly as a discipline tool and routinely brings youth into contact with the police.

Findings from the CDT survey, lawmakers wrote, “raise serious concerns about the application of civil rights laws to schools’ use of these technologies.” Letter signatories include Democratic Reps. Lori Trahan of Massachusetts, Sara Jacobs of California, Hank Johnson of Georgia, Bonnie Watson Coleman of New Jersey and Adam Schiff of California. Trahan, who serves on the House Energy and Commerce Committee’s Innovation, Data and Commerce Subcommittee, has previously called for tighter student data privacy protections in the ed tech sector. 

The monitoring tools, such as those offered by for-profit companies GoGuardian and Gaggle, rely on artificial intelligence to sift through students’ online activities and flag school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Two-thirds of teachers reported that a student at their school was disciplined as a result of activity monitoring and a third said they know a student who was contacted by the police because of an alert generated by the software. 

Children with disabilities were more likely than their peers to report being watched, and special education teachers reported heightened rates of discipline as a result of activity monitoring. The findings, researchers argue, could run afoul of federal laws that entitle children with disabilities to equal access to an education. Even beyond the technologies, students with disabilities are subjected to disproportionate levels of school discipline, including restraint and seclusion, when compared to their general education peers. 

Half of all students said their schools responded fairly to alerts generated by monitoring software, a sentiment shared by just 36% of LGBTQ+ youth. In fact, LGBTQ+ youth were more likely than their straight and cisgender peers to report that they or someone they know was disciplined as a result of monitoring. And nearly a third of LGBTQ+ youth reported that they or someone they know was outed because of the technology. 

More than a third of teachers said their school monitors students’ online behaviors outside of school hours — and sometimes on their personal devices. 

In a similar student survey, released this month by the American Civil Liberties Union, a majority of respondents expressed worries that the monitoring tools — despite being designed to keep them safe — could actually cause harm and a third said they “always feel” like they’re being watched. 

The 74 has reported extensively on schools’ use of digital surveillance tools to monitor students’ online behaviors, and the tools’ implications for youth civil rights. The company Gaggle previously flagged to administrators student communications that referenced LGBTQ+ keywords like “gay” and “lesbian.” The company says it halted the practice last year in the wake of pushback from civil rights activists. 

Given the survey findings, the lawmakers urged the Education Department to clarify “how educators can fulfill their civil rights obligations” as they develop policies related to artificial intelligence, whose rapidly evolving role in education more broadly — including students’ use of tools like ChatGPT — has become a topic of debate. 

“This research is particularly concerning due to linkages between school disciplinary policies and incarceration rates of our nation’s youth,” the coalition wrote, adding concerns that the tools can create hostile learning environments. 

New Report: School Shootings Spawned ‘Digital Dystopia’ of Student Surveillance /article/new-report-school-shootings-spawned-digital-dystopia-of-student-surveillance/ Tue, 03 Oct 2023 18:48:00 +0000 /?post_type=article&p=715730 Updated, Oct. 4

Reeled in by deceptive, fear-based marketing and an influx of federal cash, school leaders have purchased and pervasively deployed student surveillance tools while failing to consider their detrimental consequences to young people’s civil rights, a new ACLU report concludes. 

In a youth survey accompanying the report, a majority of students expressed worries that the tools — designed to keep them safe — could actually cause harm and a third said they “always feel” like they’re being watched. 

The 61-page report, titled “Digital Dystopia,” also offers an in-depth look at the rise of schools’ reliance on surveillance technology over the last few decades, arguing the tools have failed to improve campus safety while subjecting students — particularly students of color and those who are undocumented, LGBTQ or from low-income households — to discrimination. 


“The ed tech surveillance companies, after fanning the flames of fear, were making these broad statements about the efficacy of their products, about their ability to keep students safe” from threats like school shootings and suicide, despite a lack of evidence to back up their claims, report lead author and ACLU senior policy counsel Chad Marlow told The 74. 

Rather than making kids safe, Marlow said, the tools could be damaging to their development and well-being. “The harm is actually significant and, by not acknowledging the harms that are caused, there’s less incentive to look at other interventions,” he said.

Three-quarters of students worry about at least one negative consequence of student surveillance, which includes the widespread proliferation of digital tools that monitor their online communications for references to sex, drugs, violence or self-harm, according to the online survey. Commissioned by the ACLU, the polling firm YouGov queried 502 teens throughout the country in October 2022. Nearly a quarter of respondents said that digital monitoring tools limit the resources they feel they can access online while a similar percentage worried the information collected about them could be shared with the police or be used against them in the future by a college or an employer. Some 27% feared the tools could be used for disciplinary purposes.

As a result, students alter their behaviors due to fears that “deviating from expectations is punishable in the world that they’re growing up in,” Marlow said. “What does that tell them about innovation or exploring new ideas?”

Survey findings echo a similar report, released last month by the nonprofit Center for Democracy and Technology, which found that while a majority of parents and students still embrace digital tools that monitor students’ online behaviors, their support has dwindled over the last year. 

Both reports identified detrimental effects of digital surveillance that researchers said run counter to federal civil rights laws that protect students from discrimination based on race, disability, sexual orientation or gender identity. 

In the student survey conducted by the Center for Democracy and Technology, researchers found that while districts bought digital monitoring tools to keep students safe, they are used regularly as discipline tools that routinely bring youth in contact with the police. LGBTQ+ youth and those with disabilities were significantly more likely to experience the harms of surveillance. For example, 65% of LGBTQ+ youth said they or someone they knew got into trouble due to online activity monitoring, compared to 56% of their straight and cisgender peers. Meanwhile, nearly a third of LGBTQ+ students said that they or someone they know has been “outed” by the technology.

In the absence of rigorous, independent research on the efficacy of school surveillance tools to improve campus safety, the ACLU report argues that schools are left to make purchasing decisions based on what the group called fear-based marketing tactics. Security companies hype the risks of school violence and student self-harm while overstating the utility of their products, the report says. Security industry lobbying efforts, meanwhile, have successfully steered hundreds of millions of dollars in government school safety spending toward unproven technologies. 

“It would be like going to buy a car and the only source of information is the car salesperson,” Marlow said. “That’s probably not the best way to make a car purchasing decision, but that’s what’s happening with student surveillance.” 

The Security Industry Association, a trade group that represents security companies and lobbies on their behalf, didn’t immediately respond to a request for comment. 

The ACLU survey results suggest, however, that students have a complicated relationship with school surveillance: While recognizing its potential harms, many also believe it serves its intended purpose. Specifically, 40% of students reported that surveillance technology makes them feel “safe” and 43% said it makes them feel “protected.” Meanwhile, just 14% said it makes them feel “anxious” and a fraction of respondents, 7%, said the tools made them feel “unsafe.” 

Marlow said this support may be the result, at least in part, of successful marketing and a belief that few other options exist. 

“When you talk about keeping students safe, I think students are smart enough to realize that in too many places in this country, gun control is off the table,” he said. Because of “the dominance of money and power of the ed tech surveillance industry” in marketing and lobbying, he said, “the discussion is almost entirely centered around, ‘Do we use or do we not use student surveillance technologies?’” while alternatives like mental health screenings fail to receive similar consideration. “In that option, between a highly questionable, harmful protection or nothing at all, no one wants to pick nothing at all.” 

While the report focuses largely on digital tools that monitor students’ behaviors online, it also questions the efficacy of surveillance cameras in creating physical safety for students in schools. Cameras have become nearly ubiquitous, with them in the 2019-20 school year, according to the most recent data included in a U.S. Department of Education report released last month. 

Meanwhile, just 55% of schools offered students mental health assessments, according to the most recent federal data, and 42% offered mental health treatment services. 

Despite a sharp rise in schools’ reliance on surveillance and other tools in the last two decades, the number of school shootings has grown. 

There were a record 188 school shootings resulting in injuries or deaths in the 2021-22 school year, according to the federal report. That’s twice as many shootings on campus as the previous record — set just one year earlier. Placing security cameras in schools, Marlow argues, has failed to deter the very crimes they were installed to prevent. In an ACLU analysis of the 10 deadliest school shootings in the last two decades, for example, researchers found that surveillance cameras were present at eight, including in Parkland, Florida, and Uvalde, Texas. 

Along with scrutiny from researchers and civil rights groups, schools’ use of digital monitoring tools has led to several lawsuits alleging they’re ineffective and violate students’ civil liberties. 

In one class-action lawsuit, filed this year in California, the parents of two students claim the student surveillance company Securly harvested students’ data and sold the information to targeted advertising vendors without their knowledge or consent. 

A separate federal negligence lawsuit, filed in 2021 in Oklahoma, accuses Gaggle of being ineffective at keeping kids safe from self-harm. The lawsuit, filed by the parents of a 15-year-old boy who died by suicide, accuses the surveillance company and the state’s third-largest school district of failing to act on warning signs that could have prevented the teenager’s 2019 death. 

The student submitted a “personal odyssey” essay in his freshman English class that was riddled with references to self-harm and suicide, but his teacher failed to act, the complaint alleges, giving him a grade of 100%. The district used Gaggle to identify and flag troubling student digital communications, including references to self-harm and suicide. Yet the lawsuit alleges the company “failed to notify school administration” about the student’s warning signs, including the essay titled “Running Out of Reasons” and an email with a classmate where the two contemplated a plan to “go out at the same time.”

A Gaggle spokesperson didn’t immediately respond to a request for comment. Securly spokesperson Josh Mukai called the lawsuit “baseless and uninformed.”

“Securly has never sold student data to third parties, nor have we ever used student data to target advertisements,” Mukai said in an email. “Securly’s suite of student safety solutions upholds the highest standards for student data privacy and complies with all international, federal and state privacy regulations.”

ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds /article/chatgpt-is-landing-kids-in-the-principals-office-survey-finds/ Wed, 20 Sep 2023 04:01:00 +0000 /?post_type=article&p=715056 Ever since ChatGPT burst onto the scene last year, a heated debate has centered on its potential benefits and pitfalls for students. As educators worry students could use artificial intelligence tools to cheat, a new survey makes clear its impact on young people: They’re getting into trouble. 

Half of teachers say they know a student at their school who was disciplined or faced negative consequences for using — or being accused of using — generative artificial intelligence like ChatGPT to complete a classroom assignment, according to a new survey from the Center for Democracy and Technology, a nonprofit think tank focused on digital rights and expression. The proportion was even higher, at 58%, for those who teach special education. 

Cheating concerns were clear, with survey results showing that teachers have grown suspicious of their students. Nearly two-thirds of teachers said that generative AI has made them “more distrustful” of students and 90% said they suspect kids are using the tools to complete assignments. Yet students themselves who completed the anonymous survey said they rarely use ChatGPT to cheat, but are turning to it for help with personal problems.


“The difference between the hype cycle of what people are talking about with generative AI and what students are actually doing, there seems to be a pretty big difference,” said Elizabeth Laird, the group’s director of equity in civic technology. “And one that, I think, can create an unnecessarily adversarial relationship between teachers and students.”   

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat. 

The results on ChatGPT’s educational impacts were included in the Center for Democracy and Technology’s broader annual survey analyzing the privacy and civil rights concerns of teachers, students and parents as tech, including artificial intelligence, becomes increasingly ingrained in classroom instruction. Beyond generative AI, researchers observed a sharp uptick in digital privacy concerns among students and parents over the last year. 

Among parents, 73% said they’re concerned about the privacy and security of student data collected and stored by schools, a considerable increase from the 61% who expressed those reservations last year. A similar if less dramatic trend was apparent among students: 62% had data privacy concerns tied to their schools, compared with 57% just a year earlier. 

Those rising levels of anxiety, researchers theorized, are likely the result of the growing frequency of cyberattacks on schools, which have become a primary target for ransomware gangs. High-profile breaches, including in Los Angeles and Minneapolis, have compromised a massive trove of highly sensitive student records. Exposed records, investigative reporting by The 74 has found, include student psychological evaluations, reports detailing campus rape cases, student disciplinary records, closely guarded files on campus security, employees’ financial records and copies of government-issued identification cards. 

Survey results found that students in special education, whose records are among the most sensitive that districts maintain, and their parents were significantly more likely than the general education population to report school data privacy and security concerns. As attacks ratchet up, 1 in 5 parents say they’ve been notified that their child’s school experienced a data breach. Such breach notices, Laird said, led to heightened apprehension. 

“There’s not a lot of transparency” about school cybersecurity incidents “because there’s not an affirmative reporting requirement for schools,” Laird said. But in instances where parents are notified of breaches, “they are more concerned than other parents about student privacy.” 

Parents and students have also grown increasingly wary of another set of education tools that rely on artificial intelligence: digital surveillance technology. Among them are student activity monitoring tools, such as those offered by the for-profit companies Gaggle and GoGuardian, which rely on algorithms in an effort to keep students safe. The surveillance software employs artificial intelligence to sift through students’ online activities and flag school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Among parents surveyed this year, 55% said they believe the benefits of activity monitoring outweigh the potential harms, down from 63% last year. Among students, 52% said they’re comfortable with academic activity monitoring, a decline from 63% last year. 

Such digital surveillance, researchers found, frequently has disparate impacts on students based on their race, disability, sexual orientation and gender identity, potentially violating longstanding federal civil rights laws. 

The tools also extend far beyond the school realm, with 40% of teachers reporting their schools monitor students’ personal devices. More than a third of teachers say they know a student who was contacted by the police because of online monitoring, the survey found, and Black parents were significantly more likely than their white counterparts to fear that information gleaned from online monitoring tools and AI-equipped campus surveillance cameras could fall into the hands of law enforcement. 

Meanwhile, as states nationwide pull literature from school library shelves amid a conservative crusade against LGBTQ+ rights, the nonprofit argues that digital tools that filter and block certain online content “can amount to a digital book ban.” Nearly three-quarters of students — and disproportionately LGBTQ+ youth — said that web filtering tools have prevented them from completing school assignments. 

The nonprofit highlights how disproportionalities identified in the survey could run counter to federal laws that prohibit discrimination based on race and sex, and those designed to ensure equal access to education for children with disabilities. In a letter sent Wednesday to the White House and Education Secretary Miguel Cardona, the Center for Democracy and Technology was joined by a coalition of civil rights groups urging federal officials to take a harder tack on ed tech practices that could threaten students’ civil rights. 

“Existing civil rights laws already make schools legally responsible for their own conduct, and that of the companies acting at their direction in preventing discriminatory outcomes on the basis of race, sex and disability,” the coalition wrote. “The department has long been responsible for holding schools accountable to these standards.”

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns /article/gaggle-drops-lgbtq-keywords-from-student-surveillance-tool-following-bias-concerns/ Fri, 27 Jan 2023 12:15:00 +0000 /?post_type=article&p=703034 Digital monitoring company Gaggle says it will no longer flag students who use words like “gay” and “lesbian” in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination of LGBTQ teens in a quest to keep them safe.

A spokesperson for the company, which describes itself as a student safety company, cited a societal shift toward greater acceptance of LGBTQ youth — rather than criticism of its product — as the impetus for the change as part of a “continuous evaluation and updating process.”

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they’re more likely to attempt suicide than their straight and cisgender classmates. 


But in practice, Gaggle’s critics argued, the keywords put LGBTQ students at a heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of digital activity monitoring, according to survey results released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark. 

Gaggle’s decision to remove several LGBTQ-specific keywords, including “queer” and “bisexual,” from its dictionary of words that trigger alerts was first reported in . It follows extensive reporting by The 74 into the company’s business practices and sometimes negative effects on students who are caught in its surveillance dragnet. 

Though Gaggle’s software is generally limited to monitoring school-issued accounts, including those by Google and Microsoft, the software can scan through photos on students’ personal cell phones if they plug them into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states advance anti-LGBTQ measures. Legislation has looked to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. Such a hostile political climate, along with pandemic-era disruptions, a recent youth survey by The Trevor Project revealed, has contributed to an uptick in LGBTQ youth who have seriously considered suicide. 

The U.S. Education Department received 453 discrimination complaints involving students’ sexual orientation or gender identity last year, according to data provided to The 74 by its civil rights office. That’s a significant increase from previous years, including 2021, when federal officials received 249 such complaints. The Trump administration took a narrower view of LGBTQ students’ civil rights protections and complaints dwindled. In 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a broader surge in complaints to the office, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year. 

In September, The 74 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle’s website as a collaboration to “improve mental health outcomes for LGBTQ young people.” 

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle “having a role in negatively impacting LGBTQ students.” 

The Trevor Project didn’t respond to requests for comment on Gaggle’s decision to pull certain LGBTQ-specific keywords from its systems. 

In a statement to The 74, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students’ digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company’s efforts to protect students from abuse and were purged late last year.

“At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations,” Hetherington said. “Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list.”

Hetherington said Gaggle will continue to monitor students’ use of the words “faggot,” “lesbo,” and others that are “commonly used as slurs.” A previous review by The 74 found that Gaggle regularly flagged students for harmless speech, like profanity in fiction submitted to a school’s literary magazine and entries in students’ private journals. 

Privacy advocates warn that in the era of “Don’t Say Gay” laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn’t been independently verified and there’s a dearth of research to suggest digital monitoring is an effective school-safety tool. A recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. The Vice News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled “Essay on the Reasons Why I Want to Kill Myself but Can’t/Didn’t.” Adults wouldn’t have known she was struggling without Gaggle, she said. 

“I do think that it’s helpful in some ways,” the student said, “but I also kind of think that it’s — I wouldn’t say an invasion of privacy — but if obviously something gets flagged and a person who it wasn’t intended for reads through that, I think that’s kind of uncomfortable.” 

Student surveillance critic Evan Greer, director of the nonprofit digital rights group Fight for the Future, said the tweaks to Gaggle’s keyword dictionary are unlikely to have a significant effect on LGBTQ teens and blasted the company’s stated justification for the move as being “out of touch” with the state of anti-LGBTQ harassment in schools. Greer noted that LGBTQ youth frequently refer to each other using “reclaimed slurs,” reappropriating words that are generally considered derogatory and that remain in Gaggle’s dictionary. 

“This is just like lipstick on a pig — no offense to pigs — but I don’t see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators,” Greer said. “I don’t see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances.”

Gaggle and its competitors — including GoGuardian, Securly and Bark — have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth. 


Gaggle founder and CEO Jeff Patterson has said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has “no context or background on students,” including their race or sexual orientation. He also said the monitoring services are not meant to be used as a disciplinary tool. 

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students. Black and Hispanic students reported being far more likely than white students to get into trouble because of online monitoring. 

In October, the White House cautioned school districts against the “continuous surveillance” of students if monitoring tools are likely to trample students’ rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.


As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

“There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers,” Greer said. “If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago — and that’s why this surveillance is so dangerous.”

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project’s toll-free support line at 866-488-7386.

White House Cautions Schools Against ‘Continuous Surveillance’ of Students

Oct. 4, 2022 | Updated, Oct. 5

The Biden administration on Tuesday urged school districts nationwide to refrain from subjecting students to “continuous surveillance” if the use of digital monitoring tools — already accused of targeting at-risk youth — is likely to trample students’ rights. 

The White House recommendation was included in an in-depth but non-binding white paper, dubbed the Blueprint for an AI Bill of Rights, that seeks to rein in the potential harms of rapidly advancing artificial intelligence technologies, from smart speakers featuring voice assistants to campus surveillance cameras with facial recognition capabilities. 

The blueprint, which was released by the White House Office of Science and Technology Policy and extends far beyond the education sector, lays out five principles: Tools that rely on artificial intelligence should be safe and effective, avoid discrimination, ensure reasonable privacy protections, be transparent about their practices and offer the ability to opt out “in favor of a human alternative.”


Though the blueprint lacks enforcement, schools and education technology companies should expect greater federal scrutiny soon. The White House announced that the Education Department would release recommendations by early 2023 on schools’ use of artificial intelligence that “define specifications for the safety, fairness and efficacy of AI models used within education” and introduce “guardrails that build on existing education data privacy regulations.” 

Education Secretary Miguel Cardona said officials at the department “embrace utilizing Ed Tech to enhance learning” but recognize “the need for us to change how we do business.” The future guidance, he said, will focus on student data protections, ensuring that digital tools are free of biases and incorporate transparency so parents know how their children’s information is being used.

“This has to be baked into how we do business in education, starting with the systems that we have in our districts but also teacher preparation and teacher training as well,” he said.

Amelia Vance, president and founder of Public Interest Privacy Consulting, said the document amounts to a “massive step forward for the advocacy community, the scholars who have been working on AI and have been pressuring the government and companies to do better.” 

The blueprint, which offers a harsh critique of systems that predict student success based on factors like poverty, follows in-depth reporting by The 74 on schools’ growing use of digital surveillance and the tech’s impact on student privacy and civil rights.

But local school leaders should ultimately decide whether to use digital student monitoring tools, said Noelle Ellerson Ng, associate executive director of advocacy and governance at AASA, The School Superintendents Association. Ellerson Ng opposes “unilateral federal action to prohibit” the software.

“That’s not the appropriate role of the federal government to come and say this cannot happen,” she said. “But smart guardrails that allow for good practices, that protect students’ safety and privacy, that’s a more appropriate role.”

The nonprofit Center for Democracy and Technology praised the report. The group recently released a survey highlighting the potential harms of student activity monitoring on at-risk youth, who are already disproportionately disciplined and referred to the police as a result. In a statement Tuesday, it said the blueprint makes clear “the ways in which algorithmic systems can deepen inequality.” 

“We commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses and for lifting up examples of practical steps that companies and agencies can take to reduce harm,” CEO Alexandra Reeve Givens said in a media release. 

The document also highlights several areas where artificial intelligence has been beneficial, including improved agricultural efficiency and algorithms that have been used to identify diseases. But the technologies, which have grown rapidly with few regulations, have introduced significant harm, it notes, including tools that screen job applicants and facial recognition technology. 

After the pandemic shuttered schools nationwide in early 2020 and pushed students into makeshift remote learning, companies that sell digital activity monitoring software to schools saw an increase in business. But the tools have faced significant backlash for subjecting students to relentless digital surveillance. 

In April, Massachusetts Sens. Elizabeth Warren and Ed Markey warned in a report that the technology could carry significant risks — particularly for students of color and LGBTQ youth — and promoted a “need for federal action to protect students’ civil rights, safety and privacy.” Such concerns have become particularly acute as states implement new anti-LGBTQ laws and abortion bans and advocates warn that digital surveillance tools could expose youth to legal peril. 

Vance said that she and others focused on education and privacy “had no idea this was coming,” and that it would focus so heavily on schools. Over the last year, the department sought input from civil rights groups and technology companies, but Vance said that education groups had lacked a meaningful seat at the table. 

The lack of engagement was apparent, she said, by the document’s failure to highlight areas where artificial intelligence has been beneficial to students and schools. For example, the document discusses a tool used by universities to predict which students were likely to drop out. It considered students’ race as a predictive factor, leading to discrimination fears. But she noted that if implemented equitably, such tools can be used to improve student outcomes. 

“Of course there are a lot of privacy and equity and ethical landmines in this area,” Vance said. “But we also have schools who have done this right, who have done a great job in using some of these systems to assist humans in counseling students and helping more students graduate.” 

Ellerson Ng, of the superintendents association, said her group is still analyzing the blueprint’s on-the-ground implications, but that student data privacy efforts present schools with “a balancing act.”

“You want to absolutely secure the privacy rights of the child while understanding that the data that can be generated, or is generated, has a role to play, too, in helping us understand where kids are, what kids are doing, how a program is or isn’t working,” she said. “Sometimes that’s broader than just a pure academic indicator.”

Others were more critical. Some of the most outspoken privacy proponents and digital surveillance critics, such as Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, argued the blueprint falls short of a critical policy move: outright bans.

As Cahn and other activists mount campaigns against student surveillance tools, they’ve highlighted how student data can wind up in the hands of the police.

“When police and companies are rolling out new and destructive forms of AI every day, we need to push pause across the board on the most invasive technologies,” he said in a media release. “While the White House does take aim at some of the worst offenders, they do far too little to address the everyday threats of AI, particularly in police hands.”

Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias

Sept. 30, 2022 | Updated 3:15 p.m. ET

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool’s “role in negatively impacting LGBTQ students.”

“Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students,” the nonprofit said in the statement. “We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health.” 

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project’s “guidance on how to do what we do better.” The company also updated the page on its website where it previously touted the partnership. 

“We’re disappointed that The Trevor Project has decided to pause our collaboration,” she said. “However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify.” 

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, recently began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, touted the partnership on its own site, noting the two were collaborating to “improve mental health outcomes for LGBTQ young people.” 

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project’s work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy. 

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle’s surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices. 

“It really does feel like a ‘We paid you, now say we’re fine,’ kind of thing,” said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle’s “seal of approval” from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth. 

“People who want to defend Gaggle can just point to their little Trevor Project thing and say, ‘See, they have the support of “The Gays” so it’s fine actually,’ and all it does is make it easier to deflect and defend actual issues with Gaggle.” 

Student surveillance company Gaggle is listed among “Corporate Partners” on The Trevor Project’s website (screenshot)

Gaggle’s monitoring practices have been the subject of an investigation by The 74. The company’s algorithm relies on keyword matching to compare students’ online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are “gay” and “lesbian,” verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide. 
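To make concrete how blunt bare keyword matching can be, here is a minimal, hypothetical sketch in Python. It is not Gaggle’s code; the word list and function names are invented for illustration, standing in for the references to violence and drugs described above.

```python
import re

# Hypothetical word list for illustration; Gaggle's actual dictionary of
# thousands of terms is proprietary and not public.
FLAGGED_TERMS = {"violence", "overdose"}

def flag_message(text, terms=FLAGGED_TERMS):
    """Return, sorted, any flagged terms appearing in a message.

    Matches whole lowercase words only. A real system would also weigh
    context and route hits to human reviewers rather than auto-flagging.
    """
    words = set(re.findall(r"[a-z]+", text.lower()))
    return sorted(words & terms)

# A harmless essay trips the flag exactly like a genuine threat would,
# because the match is purely lexical.
print(flag_message("An essay about gun violence in America."))  # ['violence']
```

Because the match ignores context, a student writing about violence prevention is flagged the same way as one making a threat; that blind spot is the failure mode critics describe.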

But privacy and civil rights advocates have accused the company of discrimination by subjecting LGBTQ youth to heightened surveillance — a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and could require schools to out LGBTQ youth to their parents.  

A recent survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported the tools were more often used to discipline students. LGBTQ youth were disproportionately affected. 

In a statement, a Trevor Project spokesperson said it’s important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle’s “desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students.” 

“It’s true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring — many worry that these tools could out them to teachers or parents against their will,” the statement continued. “It is because of that very real concern that we have worked in a limited capacity with digital safety companies — to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies.” 

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement the company is “honored to be able to align with The Trevor Project to better serve LGBTQ youth,” and that the company is “always looking for ways to learn and to improve upon what we do to better support students and keep them safe.” 

‘Faceless bureaucracy’ 

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, the founder and president of Public Interest Privacy Consulting. But their approaches to solving the problem, she said, are fundamentally different. 

By combing through digital materials on students’ school-issued Microsoft and Google accounts, Gaggle seeks to alert educators — and in some cases the police — of students’ online behaviors that suggest they might harm themselves or others.

“It really is about collecting details that kids may not be voluntarily sharing — information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives,” Vance said. At The Trevor Project, “you have proactive outreach from youth who know that they need help or they need a community.” 

Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which receives funding from corporate donors including Macy’s and AT&T, was founded in 1998. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally. 

The Trevor Project uses artificial intelligence to train volunteer crisis counselors and assess the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they may decide to call the police. But it’s ultimately up to youth to decide which information they share with adults. 

It’s important for LGBTQ students to have trusting adults with whom they can confide their experiences, Vance said, rather than a system where “some faceless bureaucracy is finding out and informing your parents” about information they intended to keep private. 

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year and 14% said they made a suicide attempt. 

This isn’t the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously worked as a McKinsey consultant and helped create a strategic plan to boost opioid sales amid an addiction epidemic — one that’s implicated in suicide attempts among LGBTQ youth. 

The group knows firsthand how data can be weaponized. Just last month, online trolls who target the transgender community launched a campaign to clog up The Trevor Project’s suicide prevention hotline. 

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report. 

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity — typically called “outing” — due to student activity monitoring, according to a survey by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime. 

A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students’ civil liberties and to state its intent “to take enforcement action against violations that result in discrimination.” The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools. 

Among the letter signatories is the nonprofit LGBT Tech, which has warned about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group’s co-founder and executive director, said The Trevor Project’s partnership with Gaggle could be positive if it’s used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said. 

Gaggle says on its website that the student surveillance company “is proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people.” (Screenshot)

“If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they’ve done is harm the student,” Wood said. 

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools’ growth as an inevitable result of districts seeking “to avoid liability issues.”  

“It is our stance that since these tools are not going anywhere, we think it’s important to do our part to offer our expertise around LGBTQ experiences,” the spokesperson said. 

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.                                                                                                       

“I have friends who are queer and/or trans who are out at school but not to their parents,” he said. “If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying.” 

In The Trevor Project’s recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity. Just 37% said their homes are affirming, and 55% said the same about their schools. 

Given that reality, few reported sharing information about their sexual orientation with teachers or guidance counselors. 

While Gaggle has maintained that keywords like “gay” and “lesbian” can also prevent bullying, Logsdon-Wallace said the company’s approach is out of touch with how students generally interact. At school, he said he’s been called just about every “slur for a queer or a trans person that isn’t from like 80 years ago.” While slurs are common, terms like “lesbian” are not.

“As an actual teenager going to an actual public school, those words are not being used to bully people,” he said. “They’re just not.”


With ‘Don’t Say Gay’ Laws & Abortion Bans, Student Surveillance Raises New Risks

Sept. 8, 2022

While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn’t explain who they were as a person. 

“It was very confusing trying to navigate understanding who I am and my identity,” said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. “I was able to find the words to understand who I am — words that I wouldn’t be able to piece together in a sentence if the internet wasn’t there.” 


But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details that are gleaned from students’ conversations with friends, diary entries and search histories. Meanwhile, student information collected by student surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities are concerning to Elizabeth Laird, the center’s director of equity in civic technology. Following the Supreme Court’s decision overturning Roe v. Wade in June, she said, information about youth sexuality could be weaponized. 

“Right now — without doing anything — schools may be getting alerts about students who are searching the internet for resources related to reproductive health,” Laird said. “If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws.”

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the fall of Roe has led some to take extra privacy precautions. 

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted messaging app. “The fear, even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled, it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for violating abortion laws, including a Nebraska case where police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy. 

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills and about a dozen have become law. They include so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender. 

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance tools should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told The 74. 

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, federal law has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering has been used to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling.  

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

They’ve also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools’ widespread adoption of the tools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In letters sent in July, Warren and Markey cautioned that the tools could pose new risks following the repeal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued a data broker, accusing the company of selling location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the agency said, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.” 

School surveillance companies have acknowledged their tools track student references to sex but sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began to immediately purge all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s repeal was imminent – despite maintaining a 30-day retention period for most other data. 

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a letter to the senators, which the company shared with The 74. 

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.” 

Yet tracking conversations about sex is a primary part of Gaggle’s business — more than references to suicide, violence or drug use, according to nearly 1,300 incident reports generated by the company for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by The 74, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape,” and, simply, “sex.” 

Patterson, the Gaggle CEO, has acknowledged that a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told The 74 his company uncovered the girl’s diary entry, where she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what they’d learned from her diary, Patterson said. 

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools. 

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to a report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a Mississippi woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. By working in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which determines whether the material “should be forwarded to the school or district.” 

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.” 

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots after New York City police raided the Stonewall Inn gay bar, police routinely raided LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.” 

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott issued a directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.” 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self. 

Her teacher said that no one — not even him — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning,” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.” 

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.” 

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

Survey Reveals Extent that Cops Surveil Students Online — in School and at Home /article/survey-reveals-extent-that-cops-surveil-students-online-in-school-and-at-home/ Wed, 03 Aug 2022 04:01:00 +0000 /?post_type=article&p=694119 When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have contracted with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.

Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey by the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real-time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

Center for Democracy and Technology

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent repeal of Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In a letter to the lawmakers, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In testimonials on the company’s website, school officials in Wichita Falls, Texas, Cincinnati, Ohio, and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company wrote to lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but they cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” Bark wrote in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

Center for Democracy and Technology

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter sent Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime. 

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a reality company founder and CEO Jeff Patterson has said was created to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination. 

Center for Democracy and Technology

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination. 

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states. 

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” 

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said. 

Last week, the Senate advanced legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after revelations that the social media app Instagram had a harmful effect on youth mental well-being, especially teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the group wrote. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.” 

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards. 

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

Minneapolis Schools to Halt Controversial Student Surveillance Initiative /article/minneapolis-schools-to-halt-controversial-student-surveillance-initiative/ Mon, 27 Jun 2022 19:56:23 +0000 /?post_type=article&p=692269 The Minneapolis school district has announced plans to end its relationship with Gaggle, a controversial digital surveillance tool that monitored students’ online behaviors during pandemic-induced remote learning. 

The announcement, which follows extensive reporting by The 74 about how the tool subjected the city’s youth to pervasive round-the-clock digital surveillance, was outlined last week at the bottom of a newsletter alerting families to changes at the district. Gaggle, which uses artificial intelligence and human content moderators to track students’ online activities and notify district officials of “inappropriate behaviors or potential threats to self or others,” will no longer be used beginning on July 1, the district announced. 

A week after schools went remote in Minneapolis and nationally in March 2020, the district sidestepped typical procurement rules and used federal pandemic relief money to contract with Gaggle, a for-profit company that reported significant business growth when classes went online. The district has spent more than $355,000 on the tool, which monitors student behaviors on school-issued Google and Microsoft accounts, and has a contract with the company through September 2023. 

District officials said the tool saved lives but civil rights advocates and students targeted by the program have questioned its efficacy and accused the company of violating students’ privacy rights. 

In an email, district spokesperson Julie Schultz Brown attributed the change to a decision “made in order to honor the terms of our new contract” with educators. Gaggle founder and CEO Jeff Patterson said the Minneapolis district will stop using the tool at a moment when “students across the United States are suffering.” In June, the company alerted Minneapolis officials to 15 “critical incidents” related to suicide, death threats, violence and drug use, Patterson wrote in a statement. Nationally, the pandemic has led to a surge in youth mental health issues. 

A recent report by Democratic Sens. Elizabeth Warren and Ed Markey warned that Gaggle and similar services could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars. Gaggle claims its tool saved students’ lives during the 2020-21 school year, yet independent research on the tool’s effectiveness doesn’t exist. 

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Teeth Logsdon-Wallace, a rising freshman in Minneapolis, saw the district’s decision to cut ties with Gaggle as a major victory. He became an outspoken Gaggle critic after a homework assignment, which discussed a previous suicide attempt and how he learned important coping skills, got flagged by the tool’s surveillance dragnet. Officials at Gaggle and the district said the tool helps identify students who are struggling emotionally and need adult intervention. But 14-year-old Logsdon-Wallace and other critics argue that digital surveillance is an inappropriate way to pinpoint students who need mental health care. Rather than helping, he said the experience “felt violating and gross.” 

“When you’re spying on kids and their stuff, especially about mental health stuff, they’re just going to be more secretive about it,” he said. “That can just cause more danger.”

While Gaggle relies on technology to ferret out students with issues like depression, Logsdon-Wallace said that he and other students are more likely to share their mental health struggles with adults at school if there’s a culture of trust. Monitoring communications through an algorithm and a team of low-paid remote workers who the students don’t even know, he said, had the opposite effect and left students more apprehensive about district computers, “which could be positive and negative.”

While his peers learned how to better protect their own privacy online “even when it’s inherently being violated,” he said, he worried that some may have been “bottling up mental health issues because of it.”

The district will no longer use Gaggle’s student activity monitoring tool or the company’s anonymous tip line, SpeakUp for Safety, which allows students to report potential safety threats confidentially. Instead of turning to SpeakUp, concerned parents and students should report issues to police officials with the state Bureau of Criminal Apprehension, the district wrote in its newsletter. 

District officials have said the anonymous tip line was central to its decision to contract with Gaggle, yet previous reporting by The 74 found that the service was rarely used. Meanwhile, the digital surveillance tool routinely flagged students who made references to sex, drugs and violence on district technology. An analysis of nearly 1,300 alerts found the service flagged Minneapolis students for discussing violent impulses, eating disorders, abuse at home and suicidal plans. 

But Gaggle regularly flagged benign student chatter and personal files, including classroom assignments, casual conversations between teens and sensitive journal entries. Gaggle flags students who use keywords related to sexual orientation including “gay” and “lesbian,” and on at least one occasion school officials in Minneapolis outed an LGBT student to their parents. The sheer volume of student communications that got flagged by Gaggle was at times overwhelming, the Minneapolis school district’s head of security acknowledged, but he also felt like he was able to save students from dying by suicide. 

In interviews with The 74, former content moderators at Gaggle — hundreds of whom are paid just $10 an hour on month-to-month contracts — raised serious questions about the company’s efficacy, its employment practices and its effect on students’ civil rights. 

Moderators said they received little training before they were given access to students’ sensitive materials and were pressured to prioritize speed over quality. They also reported insufficient safeguards to protect students’ sensitive files, including nude selfies. Patterson acknowledged that moderators, who work remotely with little supervision or oversight, could easily save copies of students’ nude photographs and share them on the dark web. 

As a transgender teenager who believes the school district has done too little to address bullying, Logsdon-Wallace said he already had little trust in district leaders. While Gaggle didn’t address the abuse from peers, having his sensitive experiences caught in the company’s algorithm made the situation worse.

“The very little trust I had in the administration is just destroyed,” he said. “You can’t expect students to trust you if you’ve done nothing to earn that trust.”

]]>
Meet the Gatekeepers of Students’ Private Lives /article/meet-the-gatekeepers-of-students-private-lives/ Mon, 02 May 2022 11:15:00 +0000 /?post_type=article&p=588567 If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. 

The mother from Pittsburgh didn’t want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs and violence, and a team of content moderators like Waskiewicz, the company sifts through billions of students’ emails, chat messages and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats or bullying, incidents that would prompt Gaggle to notify school leaders and, in urgent cases, law enforcement.

As a result, kids’ deepest secrets — like nude selfies and suicide notes — regularly flashed onto Waskiewicz’s screen. Though she felt “a little bit like a voyeur,” she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions and stiff performance quotas left her feeling burned out. Gaggle’s moderators face pressure to review 300 incidents per hour and Waskiewicz knew she could get fired on a moment’s notice if she failed to distinguish mundane chatter from potential safety threats in a matter of seconds. She lasted about a year.

“In all honesty I was sort of half-assing it,” Waskiewicz admitted in an interview with ĂŰĚŇÓ°ĘÓ. “It wasn’t enough money and you’re really stuck there staring at the computer reading and just click, click, click, click.”

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students’ private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about “a tsunami of youth suicide headed our way” and said that schools have “a moral obligation to protect the kids on their digital playground.” 

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also surfaced significant questions about the company’s efficacy, its employment practices and its effect on students’ civil rights.

Among the moderators who worked on a contractual basis, none had prior experience in school safety, security or mental health. Their employment histories instead included retail and customer service work; they were drawn to Gaggle while searching for remote jobs that promised flexible hours. 

They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers — either in-person, on the phone or over Zoom — before landing jobs.

Once hired, moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits, including mental health care, and one former moderator said he quit after repeated exposure to explicit material left him unable to sleep, without “any money to show for what I was putting up with.”

Gaggle’s content moderation team encompasses as many as 600 contractors at any given time; just two dozen moderators work as employees with access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors’ role with the company, arguing they use “common sense” to distinguish false flags generated by the algorithm from potential threats and do “not require substantial training.” 

While the experiences reported by Gaggle’s moderator team resemble those described at platforms like Meta-owned Facebook, Patterson said his company relies on “U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines,” as some social media companies do. He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

“Some people are not fast decision-makers. They need to take more time to process things and maybe they’re not right for that job,” he told ĂŰĚŇÓ°ĘÓ. “For some people, it’s no problem at all. For others, their brains don’t process that quickly.”

Executives also sought to minimize the contractors’ access to students’ personal information; a spokeswoman said they only see “small snippets of text” and lacked access to what’s known as students’ “personally identifiable information.” Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students’ names. Several former moderators said they struggled to determine whether something should be escalated as harmful due to “gray areas,” such as whether a Victoria’s Secret lingerie ad would be considered acceptable or not. 

“Those people are really just the very, very first pass,” Gaggle spokeswoman Paget Hetherington said. “It doesn’t really need training, it’s just like if there’s any possible doubt with that particular word or phrase it gets passed on.” 

Molly McElligott, a former content moderator and customer service representative, said management was laser-focused on performance metrics, appearing more interested in business growth and profit than in protecting kids. 

“I went into the experience extremely excited to help children in need,” McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney’s Office in New York. “I realized that was not the primary focus of the company.”

Gaggle is part of a burgeoning campus security industry that’s seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999 by offering schools student email accounts that could be monitored for inappropriate content, said its focus now is mitigating the youth mental health crisis.

Patterson said the team talks about “lives saved” and child safety incidents at every meeting, and they are open about sharing the company’s financial outlook so that employees “can have confidence in the security of their jobs.”

Content moderators work at a Facebook office in Austin, Texas. Unlike the social media giant, Gaggle’s content moderators work remotely. (Ilana Panich-Linsman / Getty Images)

‘We are just expendable’

Under new federal scrutiny, along with three other companies that monitor students online, Gaggle has said it relies on a “highly trained content review team” to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle’s content review team, described their training as “a joke” consisting of a slideshow and an online quiz, one that left them ill-equipped for a job with such serious consequences for students and schools.

As an employee on the company’s safety team, McElligott said she underwent two weeks of training, but the disorganized instruction left her and other moderators “more confused than when we started.”

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators’ feedback to ĂŰĚŇÓ°ĘÓ.

“If you want to be not cared about, not valued and be completely stressed/traumatized on a daily basis this is totally the job for you,” one reviewer wrote on Indeed. “Warning, you will see awful awful things. No they don’t provide therapy or any kind of support either.

“That isn’t even the worst part,” the reviewer continued. “The worst part is that the company does not care that you hold them on your backs. Without safety reps they wouldn’t be able to function, but we are just expendable.” 

As the first layer of Gaggle’s human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students’ communications for additional consideration. Designated employees on Gaggle’s Safety Team are in charge of calling or emailing school officials to notify them of troubling material identified in students’ files, Patterson said.

Gaggle’s staunchest critics have questioned the tool’s efficacy and describe it as a student privacy nightmare. In March, Democratic Sens. Elizabeth Warren and Ed Markey called on Gaggle and similar companies to protect students’ civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

The information shared by the former Gaggle moderators with ĂŰĚŇÓ°ĘÓ “struck me as the worst-case scenario,” said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. Content moderators’ limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, “is not acceptable.”

In its letter to lawmakers, Gaggle described a two-tiered review procedure but didn’t disclose that low-wage contractors were the first line of defense. CEO Patterson told ĂŰĚŇÓ°ĘÓ they “didn’t have nearly enough time” to respond to lawmakers’ questions about their business practices and didn’t want to divulge proprietary information. Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren’t interviewed before getting placed on the job.

“There’s a lot of contractors. We can’t do a physical interview of everyone and I don’t know if that’s appropriate,” he said. “It might actually introduce another set of biases in terms of who we hire or who we don’t hire.”

‘Other eyes were seeing it’

In a previous investigation, ĂŰĚŇÓ°ĘÓ analyzed a cache of public records to expose how Gaggle’s algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools’ authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle’s algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students’ online activities including diary entries, classroom assignments and casual conversations between students and their friends. 
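Keyword matching of the kind described above can be illustrated with a minimal sketch. Everything in it (the category names, the word list and the `flag_text` function) is an illustrative assumption, not Gaggle’s actual dictionary or code:

```python
import re

# Hypothetical watchword dictionary; Gaggle's real keyword lists are proprietary.
KEYWORDS = {
    "self_harm": ["suicide", "kill myself"],
    "violence": ["gun", "shoot"],
    "drugs": ["weed", "pills"],
}

def flag_text(text):
    """Return (category, keyword) pairs whose keywords appear in the text."""
    hits = []
    lowered = text.lower()
    for category, words in KEYWORDS.items():
        for word in words:
            # Whole-word match so "pills" doesn't fire on "caterpillars".
            if re.search(r"\b" + re.escape(word) + r"\b", lowered):
                hits.append((category, word))
    return hits

# A benign book report trips the filter exactly as a real threat would,
# which is why every flag still lands in front of a human moderator.
print(flag_text("My essay on The Catcher in the Rye discusses suicide."))
```

The sketch also shows why moderators reported so many false flags: matching words alone cannot distinguish a student in crisis from a student writing a literature essay, so the judgment call falls entirely on the human reviewer.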

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle saw a surge in students’ online materials and in the number of school districts interested in its services. The company grew as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a spike in references to suicide and self-harm, accounting for more than 40% of all flagged incidents. 

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students’ online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles. 

“I felt kind of bad because the kids didn’t have the ability to have stuff of their own and I wondered if they realized that it was public,” she said. “I just wonder if they realized that other eyes were seeing it other than them and their little friends.”

Student activity monitoring software like Gaggle has become ubiquitous in U.S. schools, and 81% of teachers work in schools that use tools to track students’ computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of using such tools, which can block obscene material and monitor students’ screens in real time, outweigh potential risks.

Likewise, students generally recognize that their online activities on school-issued devices are being observed, the survey found, and alter their behaviors as a result. More than half of student respondents said they don’t share their true thoughts or ideas online as a result of school surveillance and 80% said they were more careful about what they search online. 

A majority of parents reported that the benefits of keeping tabs on their children’s activity exceeded the risks. Yet they may not have a full grasp on how programs like Gaggle work, including the heavy reliance on untrained contractors and weak privacy controls revealed by ĂŰĚŇÓ°ĘÓ’s reporting, said Elizabeth Laird, the group’s director of equity in civic technology. 

“I don’t know that the way this information is being handled actually would meet parents’ expectations,” Laird said. 

Another former contractor, who reached out to ĂŰĚŇÓ°ĘÓ to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said he felt unsafe continuing his previous job as a caregiver for people with disabilities so he applied to Gaggle because it offered remote work. 

About a week after he submitted an application, Gaggle gave him a key to kids’ private lives — including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn’t come with health insurance. 

“I went to a mental hospital in high school due to some hereditary mental health issues and seeing some of these kids going through similar things really broke my heart,” said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. “It broke my heart that they had to go through these revelations about themselves in a context where they can’t even go to school and get out of the house a little bit. They have to do everything from home — and they’re being constantly monitored.” 

In this screenshot, Gaggle explains its terms and conditions for contract content moderators. The screenshot, which was provided to ĂŰĚŇÓ°ĘÓ by a former contractor who asked to remain anonymous, has been redacted.

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice per month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators, but sought to downplay its effects on contractors and said they’re warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford. 

“Quite honestly, we’re dealing with school districts with very limited budgets,” Patterson said. “There have to be some tradeoffs.” 

The anonymous contractor said he wasn’t as concerned about his own well-being as he was about the welfare of the students under the company’s watch. The company, he said, lacked adequate safeguards to protect students’ sensitive information from leaking outside the digital environment that Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students’ nude images, which are reported to school districts and the National Center for Missing & Exploited Children. Nudity and sexual content accounted for about 17% of emergency phone calls and email alerts to school officials last school year, according to the company. 

Contractors, he said, could easily save the images for themselves or share them on the dark web. 

Patterson acknowledged the possibility but said he wasn’t aware of any data breaches. 

“We do things in the interface to try to disable the ability to save those things,” Patterson said, but “you know, human beings who want to get around things can.”

‘Made me feel like the day was worth it’

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation and a contract position with Gaggle was her first foot in the door. She was left feeling baffled by the impersonal hiring process, especially given the high stakes for students. 

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company that contracts with more than 1,500 school districts was legitimate or a scam. 

“It was a little weird when they were asking for the banking information, like ‘Wait a minute is this real or what?’” Waskiewicz said. “I Googled them and I think they’re pretty big.”

Heyman said that sense of disconnect continued after being hired, with communications between contractors and their supervisors limited to a Slack channel. 

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student’s suicide note. 

“Knowing I was able to help with that made me feel like the day was worth it,” she said. “Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need.” 

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district’s contract with Gaggle has saved lives. Earlier this year, for example, the company detected a student’s suicide note early in the morning, allowing school officials to spring into action. The district uses Gaggle to keep kids safe, she said, but acknowledged it can be a disciplinary tool if students violate the district’s code of conduct. 

“No tool is perfect, every organization has room to improve, I’m sure you could find plenty of my former employees here in Highline that would give you an earful about working here as well,” said Enfield, one of 23 current or former superintendents from across the country who Gaggle cited as references in its letter to Congress. 

“There’s always going to be pros and cons to any organization, any service,” Enfield told ĂŰĚŇÓ°ĘÓ, “but our experience has been overwhelmingly positive.”

True safety threats were infrequent, former moderators said, and most of the content was mundane, in part because the company’s artificial intelligence lacked sophistication. They said the algorithm routinely flagged students’ papers on the novels To Kill a Mockingbird and The Catcher in the Rye. They also reported being inundated with spam emailed to students, acting as human spam filters for a task that’s long been automated in other contexts. 

Conor Scott, who worked as a contract moderator while in college, said that “99% of the time” Gaggle’s algorithm flagged pedestrian materials including pictures of sunsets and students’ essays about World War II. Valid safety concerns, including references to violence and self-harm, were rare, Scott said. But he still believed the service had value and felt he was doing “the right thing.”

McElligott said that managers’ personal opinions added another layer of complexity. Though moderators were “held to strict rules of right and wrong decisions,” she said they were ultimately “being judged against our managers’ opinions of what is concerning and what is not.” 

“I was told once that I was being overdramatic when it came to a potential inappropriate relationship between a child and adult,” she said. “There was also an item that made me think of potential trafficking or child sexual abuse, as there were clear sexual plans to meet up — and when I alerted it, I was told it was not as serious as I thought.” 

Patterson acknowledged that gray areas exist and that human discretion is a factor in deciding what materials are ultimately elevated to school leaders. But such materials, he said, are not the most urgent safety issues. He said their algorithm errs on the side of caution and flags harmless content because district leaders are “so concerned about students.” 

The former moderator who spoke anonymously said he grew alarmed by the sheer volume of mundane student materials that were captured by Gaggle’s surveillance dragnet, and pressure to work quickly didn’t offer enough time to evaluate long chat logs between students having “heartfelt and sensitive” conversations. On the other hand, run-of-the-mill chatter offered him a little wiggle room. 

“When I would see stuff like that I was like ‘Oh, thank God, I can just get this out of the way and heighten how many items per hour I’m getting,’” he said. “It’s like ‘I hope I get more of those because then I can maybe spend a little more time actually paying attention to the ones that need it.’” 

Ultimately, he said he was unprepared for such extensive access to students’ private lives. Because Gaggle’s algorithm flags keywords like “gay” and “lesbian,” for example, it alerted him to students exploring their sexuality online. Hetherington, the Gaggle spokeswoman, said such keywords are included in its dictionary to “ensure that these vulnerable students are not being harassed or suffering additional hardships,” but critics have accused the company of subjecting LGBTQ students to disproportionate surveillance. 

“I thought it would just be stopping school shootings or reducing cyberbullying but no, I read the chat logs of kids coming out to their friends,” the former moderator said. “I felt tremendous power was being put in my hands” to distinguish students’ benign conversations from real danger, “and I was given that power immediately for $10 an hour.” 

Minneapolis student Teeth Logsdon-Wallace, who posed for this photo with his dog Gilly, used a classroom assignment to discuss a previous suicide attempt and explained how his mental health had since improved. He became upset after Gaggle flagged his assignment. (Photo courtesy Alexis Logsdon)

A privacy issue

For years, student privacy advocates and civil rights groups have warned about the potential harms of Gaggle and similar surveillance companies. Fourteen-year-old Teeth Logsdon-Wallace, a Minneapolis high school student, fell under Gaggle’s watchful eye during the pandemic. Last September, he used a class assignment to write about a previous suicide attempt and explained how music helped him cope after being hospitalized. Gaggle flagged the assignment to a school counselor, a move the teen called a privacy violation. 

He said it’s “just really freaky” that moderators can review students’ sensitive materials in public places like at basketball games, but ultimately felt bad for the contractors on Gaggle’s content review team. 

“Not only is it violating the privacy rights of students, which is bad for our mental health, it’s traumatizing these moderators, which is bad for their mental health,” he said. Relying on low-wage workers with high turnover, limited training and without backgrounds in mental health, he said, can have consequences for students. 

“Bad labor conditions don’t just affect the workers,” he said. “It affects the people they say they are helping.” 

Gaggle cannot prohibit contractors from reviewing students’ private communications in public settings, Heather Durkac, the senior vice president of operations, said in a statement. 

“However, the contractors know the nature of the content they will be reviewing,” Durkac said. “It is their responsibility and part of their presumed good and reasonable work ethic to not be conducting these content reviews in a public place.” 

Gaggle’s former contractors also weighed students’ privacy rights. Heyman said she “went back and forth” on those implications for several days before applying to the job. She ultimately decided that Gaggle was acceptable since it is limited to school-issued technology. 

“If you don’t want your stuff looked at, you can use Hotmail, you can use Gmail, you can use Yahoo, you can use whatever else is out there,” she said. “As long as they’re being told and their parents are being told that their stuff is going to be monitored, I feel like that is OK.” 

Logsdon-Wallace and his mother said they didn’t know Gaggle existed until his classroom assignment got flagged to a school counselor. 

Meanwhile, the anonymous contractor said that chat conversations between students that got picked up by Gaggle’s algorithm helped him understand the effects that surveillance can have on young people. 

“Sometimes a kid would use a curse word and another kid would be like, ‘Dude, shut up, you know they’re watching these things,’” he said. “These kids know that they’re being looked in on,” even if they don’t realize their observer is a contractor working from the couch in his living room. “And to be the one that is doing that — that is basically fulfilling what these kids are paranoid about — it just felt awful.” 

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Disclosure: Campbell Brown is the head of news partnerships at Facebook. Brown co-founded ĂŰĚŇÓ°ĘÓ and sits on its board of directors.

]]>
Senate Inquiry Warns About Harms of Digital School Surveillance Tools /article/senate-inquiry-warns-about-harms-of-digital-school-surveillance-tools-calls-on-fcc-to-clarify-student-monitoring-rules/ Mon, 04 Apr 2022 21:37:00 +0000 /?post_type=article&p=587388 Updated, April 5

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students’ online activities, warning that educators’ widespread use of digital surveillance tools could trample students’ civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain student groups. 

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country — often 24 hours a day, seven days a week — to provide information on how they use artificial intelligence to glean information about students. 

Based on their responses, the senators said:

  • The companies’ software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey where 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use — and potential misuse — of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire “need for federal action to protect students’ civil rights, safety and privacy.”

“While the intent of these products, many of which monitor students’ online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns,” the lawmakers wrote. “Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations.”

An FCC spokesperson said the agency is reviewing the senators’ letter, and an Education Department spokesperson said they “look forward to corresponding with the senators” about its findings.

Lawmakers’ inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by ĂŰĚŇÓ°ĘÓ into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. ĂŰĚŇÓ°ĘÓ used public records to expose how Gaggle’s algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self-harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported a surge in business during the pandemic.

Bark didn’t respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators’ March 30 report and looks forward “to continuing our dialogue with Senators Warren and Markey on the important topics they have raised.”

“Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus,” GoGuardian spokesman Jeff Gordon said in a statement. “Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices.” 

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers’ recommendations “to assess how we can further strengthen our work to better protect students.”

“We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms,” Patterson continued. “We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators.”

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country’s “terrible history of bias in school discipline” by removing the decisions of individual teachers and administrators.

“While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face,” Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies’ letters acknowledge that in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders “prefer that we contact public safety agencies directly in lieu of a district contact.”

Under the Clinton-era Children’s Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students’ internet use to ensure they don’t access material “harmful to minors,” such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law’s scope. Meanwhile, advocates have questioned whether schools’ use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students’ computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police and more than half of students said they decline to share their “true thoughts or ideas because I know what I do online is being monitored.”  

Elizabeth Laird, the group’s director of equity in civic technology, said it has been calling on student surveillance companies to be more transparent about their business practices but it’s “disappointing that it took a letter from Congress to get this information.” She said she hopes the FCC and Education Department adopt lawmakers’ recommendations.

“None of these companies have researched whether their products are biased against certain groups of students,” she said in an email while questioning their justification for holding off on such an inquiry. “They cite privacy as the reason for not doing so while simultaneously monitoring students’ messages, documents and sites visited 24 hours a day, seven days a week.” 

ĂŰĚŇÓ°ĘÓ’s investigation, which used data on Gaggle’s foothold in Minneapolis Public Schools, was unable to determine whether the tool’s algorithm disproportionately targeted Black students, who are more often subjected to school discipline than their white classmates. However, it highlighted instances in which keywords like “gay” and “lesbian” were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation.

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online, and students caught viewing violent or sexually explicit materials would likely face consequences. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency.

However, Vance said in an email that FCC clarification “would do little at best and may provide counterproductive guidance at worst.” Many schools, she said, are likely to use the tools regardless of the federal rules. 

“Schools aren’t required to monitor social media, and many have chosen to do so anyway,” said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said. 

Asking the FCC to issue guidance “could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring.”

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies: 

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it’s ‘Not That Smart’ /article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/ Tue, 12 Oct 2021 11:15:00 +0000

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, emotional distress that occurs when someone’s gender identity differs from their sex assigned at birth. His deepening depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised how

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since “graduated” from weekly therapy sessions and has found a better headspace, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security. 

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song “Your Heart is a Muscle the Size of Your Fist” helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was “a reminder to keep on loving, keep on fighting and hold on for your life.” (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, ĂŰĚŇÓ°ĘÓ analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company’s digital algorithm and human content moderators. 

But technology experts and families with first-hand experience with Gaggle’s surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective. 

While the system flagged Logsdon-Wallace for referencing the word “suicide,” context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed. 

“I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counselor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context,” and while some districts have requested to be notified about references to previous suicide attempts, it’s ultimately up to administrators to “decide the proper response, if any.”  

‘A crisis on our hands’

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students’ online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by ĂŰĚŇÓ°ĘÓ through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saves lives, including those of students during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic’s effects on suicide rates remain fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists.

“Before the pandemic, we had a crisis on our hands,” he said. “I believe there’s a tsunami of youth suicide headed our way that we are not prepared for.” 

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s little evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace’s mother Alexis Logsdon didn’t know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled. 

“That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart — it’s just like ‘Woop, we saw the word.’” 

‘Random and capricious’

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as “really scary.”

“If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”
—
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of “Inappropriate Use” while she was walking to her first high school biology midterm, and her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school’s literary journal and, according to her, Gaggle had ultimately flagged profanity in students’ fictional article submissions.

“The link at the bottom of this email is for something that was identified as inappropriate,” Gaggle warned in its email while pointing to one of the fictional articles. “Please refrain from storing or sharing inappropriate content in your files.” 

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn’t catch everything. Even as she got flagged when students shared documents with her, the articles’ authors weren’t receiving similar alerts, she said. Nor did Gaggle’s AI pick up on it when she wrote about the discrepancy in an article where she included a four-letter swear word to make a point. In the article, which Dockter wrote in Google Docs, she argued that Gaggle’s monitoring system is “random and capricious,” and could be dangerous if school officials rely on its findings to protect students.

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

“With such a seemingly random service, that doesn’t seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats,” she said in an interview. “If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”

Lucy Dockter

Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times “does not properly indicate the author of a document and assigns a random collaborator.”

“We are hoping Google will improve this functionality so we can better protect students,” Patterson said. 

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn’t notify her that Gaggle had identified a threat. 

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she’d shoot her “puny little brain with my grandpa’s rifle.”

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter’s teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts can face legal liability if they fail to act on credible threats.

“I didn’t hear a word from Gaggle about it,” she said. “If I hadn’t brought it to the teacher’s attention, I don’t think that anything would have been done.” 

The incident, which occurred in April, fell outside the six-month period for which ĂŰĚŇÓ°ĘÓ obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it “does not have any insight into the steps the district took to address this particular matter.” 

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials “would never discuss with a community member any communication flagged by Gaggle.” 

“That unrelated but concerned parent would not have been provided that information nor should she have been,” she wrote in an email. “That is private.” 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

‘The big scary algorithm’

When identifying potential trouble, Gaggle’s algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they’re delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.
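Gaggle’s system is proprietary, but the dictionary-based keyword matching described above can be illustrated with a minimal sketch; the tiny sample dictionary and function names below are hypothetical, not drawn from the company’s actual software:

```python
# Toy illustration of dictionary-based keyword flagging (hypothetical,
# not Gaggle's actual code). Each document is checked against a flat
# keyword list with no sense of context, tense or intent.
import re

KEYWORD_DICTIONARY = {"suicide", "gun", "gay"}  # hypothetical sample entries

def flag_document(text: str, keywords=KEYWORD_DICTIONARY):
    """Return the dictionary keywords found in `text`, ignoring case and punctuation."""
    tokens = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(tokens & keywords)

# A student reflecting on a *past* crisis is flagged exactly like an
# active threat, because matching sees words, not meaning.
print(flag_document("That song helped me after my suicide attempt last year."))  # ['suicide']
print(flag_document("We read a fictional story in class today."))                # []
```

Because this kind of matching operates on words rather than meaning, the sketch flags a reflection on a past crisis exactly as it would an active threat, the context-blind false positive families describe in this story.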

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 

That’s where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

“We’re using the big scary algorithm term here when I don’t think it applies. This is not Netflix’s recommendation engine. This is not Spotify.”
—
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum

Sara Jordan

She also noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context.

“You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat,” she said. “What’s the utility of that? That seems pretty low.” 

She said that Gaggle’s utility could be impaired because it doesn’t adjust to students’ behaviors over time, comparing it to Netflix, which recommends television shows based on users’ ever-evolving viewing patterns. “Something that doesn’t learn isn’t going to be accurate,” she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle’s marketing materials appear to overhype the tool’s sophistication to schools, she said. 

“We’re using the big scary algorithm term here when I don’t think it applies,” she said. “This is not Netflix’s recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location.” 

“Artificial intelligence without human intelligence ain’t that smart.”
—
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle’s proprietary algorithm is updated regularly “to adjust to student behaviors over time and improve accuracy and speed.” The tool monitors “thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work.” 

Ultimately, the algorithm to identify keywords is used to “narrow down the haystack as much as possible,” Patterson said, and Gaggle content moderators review materials to gauge their risk levels. 

“Artificial intelligence without human intelligence ain’t that smart,” he said. 

In Minneapolis, officials denied that Gaggle infringes on students’ privacy and noted that the tool only operates within school-issued accounts. The district’s internet use policy states that students should “expect only limited privacy,” and that the misuse of school equipment could result in discipline and “civil or criminal liability.” District leaders have also cited compliance with the Clinton-era Children’s Internet Protection Act, which became law in 2000 and requires schools to monitor “the online activities of minors.”

Patterson suggested that teachers aren’t paying close enough attention to keep students safe on their own and “sometimes they forget that they’re mandated reporters.” On the company’s website, Patterson says he launched the company in 1999 to provide teachers with “an easy way to watch over their gaggle of students.” Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company’s role in meeting it. As technology becomes a key facet of American education, Patterson said that schools “have a moral obligation to protect the kids on their digital playground.”

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student “tracking” through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn’t be “construed to require the tracking of internet use by any identifiable minor or adult user.” Her group has urged the government to clarify the Children’s Internet Protection Act’s requirements and distinguish monitoring from tracking individual student behaviors.

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they’re concerned the tools “may extend beyond” the law’s intent “to surveil student activity or reinforce biases.” Around-the-clock surveillance, they wrote, demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.” 

“Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students’ mental health due to stigmatization and differential treatment following even a false report,” the senators wrote. “Flagging students as ‘high-risk’ may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color.”

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd’s murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful. 

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Artificial intelligence systems have repeatedly been shown to suffer biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that racial bias could be baked into Gaggle’s algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings.

Data obtained by ĂŰĚŇÓ°ĘÓ offer a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data. 

Gaggle and Minneapolis district leaders acknowledged that students’ digital communications are forwarded to police in rare circumstances. The Minneapolis district’s internet use policy explains that educators could contact the police if students use technology to break the law and a document given to teachers about the district’s Gaggle contract further highlights the possibility of law enforcement involvement. 

Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that law enforcement is not a “regular partner” when responding to incidents flagged by Gaggle. It doesn’t deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by ĂŰĚŇÓ°ĘÓ.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

“Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them,” Matlock said, though it’s unclear if any students have faced legal consequences. “It’s the question as to why they’re doing it,” and to raise the issue with their parents.

Gaggle’s keywords could also have a disproportionate impact on LGBTQ children. In three-dozen incident reports, Gaggle flagged keywords related to sexual orientation, including “gay” and “lesbian.” On at least one occasion, school officials reportedly outed an LGBTQ student to their parents.

Logsdon-Wallace, the 13-year-old student, called the incident “disgusting and horribly messed up.” 

“They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it’s going to be false-positive because they are acting as if the word gay is inherently sexual,” he said. “When people are just talking about being gay, anything they’re writing would be flagged.” 

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in ĂŰĚŇÓ°ĘÓ’s data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

“That’s definitely really messed up, especially when the school is like ‘Oh no, no, no, please keep these Chromebooks over the summer,’” an invitation that gave students “the go-ahead to use them” for personal reasons, he said.

“Especially when it’s during a pandemic when you can’t really go anywhere and the only way to talk to your friends is through the internet.”

Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’ /article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/ Mon, 04 Oct 2021 20:41:00 +0000

Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent to the companies last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked them to explain steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases,” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.




“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there’s little evidence they work. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by ĂŰĚŇÓ°ĘÓ, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers differing levels of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Other services also monitor students’ social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 — including when students are at home — and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” by the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn’t respond to requests for comment.

The Clinton-era Children’s Internet Protection Act, passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.

Report: Most Parents, Teachers Support Student Surveillance Tech /article/new-research-most-parents-and-teachers-have-accepted-student-surveillance-as-a-safety-tool-but-see-the-potential-for-serious-harm/ Tue, 21 Sep 2021 16:30:00 +0000

Tools that monitor students’ online behavior have become ubiquitous in U.S. schools — and grew rapidly as the pandemic closed campuses nationwide — but a majority of parents and teachers believe the benefits of such digital surveillance outweigh the risks.

Similarly, half of students said they are comfortable with schools’ use of monitoring software while a quarter reported feeling queasy about the idea, according to the new research by the Center for Democracy and Technology, a nonprofit group based in Washington, D.C. Despite their overall comfort with digital software, teachers, parents and students each worried about how the tools could have detrimental side effects. Specifically, many parents and teachers were concerned that digital surveillance could be used to discipline students and young people reported becoming more reserved when they knew they were being watched.




“In response to the pandemic, the focus on technology and its use has never been greater,” said report co-author Elizabeth Laird, the center’s director of equity in civic technology. As technology tightens its grip on education, she said it’s important for school leaders and policymakers to remain focused on protecting students’ individual rights. She worried that student surveillance technology could have a damaging impact on students, especially youth of color and those from low-income households.

“I don’t think it’s a slam dunk,” Laird said.

Though the report didn’t highlight specific tools used, schools deploy a range of digital monitoring software to track student activity, including programs that block online material deemed inappropriate, track when students log into school applications, and allow teachers to view students’ screens in real time and even take control of their computers.

Last week, an investigative report by The 74 exposed how the Minneapolis school district’s use of the digital surveillance tool Gaggle had subjected children to relentless online surveillance long after classes ended for the day — including inside students’ homes. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day by sifting through data stored on their school-issued Google and Microsoft accounts. In Minneapolis, the company alerted school security officials when moderators believed students could harm themselves or others, but it also picked up students’ classroom assignments, journal entries, chats with friends and fictional stories.

Among teachers surveyed by the Center for Democracy and Technology, 81 percent said their schools use software that tracks students’ computer activity, including to block obscene material, monitor students’ screens in real time and prohibit students from using websites unrelated to school like YouTube. A majority of both parents and students reported such tools were used in their schools, but they were also more likely than teachers to be unsure about whether youth were being actively monitored by educators. In interviews with administrators, researchers found that many school leaders weren’t sure how best to be transparent with families about their monitoring practices.

“Certainly there is an imbalance in information and transparency around what is happening,” Laird said. “School districts have been clear [that] students shouldn’t have an expectation of privacy but they haven’t been as clear about what they are tracking, how they are tracking it, how long they keep that information. They really should be doing that.”

Four-fifths of surveyed teachers said their schools used digital tools to track students online. Both parents and students were more likely than teachers to be unsure whether such tools were in use in their schools. (Photo courtesy Center for Democracy and Technology)

Among teachers, 66 percent said the benefits of activity monitoring outweigh student privacy concerns and 62 percent of parents reached a similar conclusion. Meanwhile, 78 percent of teachers reported that digital surveillance helps keep students safe by identifying problematic online behaviors and 72 percent said it helps keep students on task. But their answers also revealed equity concerns: 71 percent of teachers reported that monitoring software is applied to all students equally, 51 percent worried that it could come with unintended consequences like “outing” LGBTQ students and 49 percent said it violates students’ privacy.

Many teachers reported that such monitoring tools are used on students long after classes end for the day. In total, 30 percent of educators said the tools are active “all of the time,” and 16 percent said the software tracks kids on their personal devices.

Nearly a third of teachers who reported their schools use digital services like Gaggle to track students online said the tools monitor youth behaviors 24 hours a day. (Photo by Center for Democracy and Technology)

Among parents, 75 percent said digital surveillance helps keep students safe and 73 percent said it ensures children remain focused on schoolwork. Yet many parents also reported potential downsides: 61 percent worried about long-term harm if the tools were used to discipline students, 51 percent were concerned about unintended consequences and 49 percent said it violates students’ privacy rights.

Perhaps unsurprisingly, students were less at ease with educators watching their online behaviors. Half said they were comfortable with monitoring tools, a quarter said they were uncomfortable with them and another quarter were unsure.

The data also suggest that students alter their behaviors as a result of being watched: 58 percent said they don’t share their true thoughts or ideas online as a result of being monitored at school and 80 percent said they were more careful about what they search online. While just 39 percent of students said it was unfair that educators monitored their school-issued services, 74 percent opposed the surveillance of their own devices like their cell phones. Tools like Gaggle are among those that could track students’ behaviors on their own technology.

The data raise significant equity concerns. For many students, school-issued devices are their only method of connectivity.

“The privacy and security of personal devices is a luxury not all can afford,” Alexandra Givens, the center’s president and CEO, said in a press release. “Constant online monitoring — especially of students who cannot afford or don’t have access to personal devices — risks creating disparities in the ways student privacy is protected nationwide.”

To reach its findings, researchers conducted online surveys in June that were completed by 1,001 teachers, 1,663 parents and 420 high school students. Researchers also conducted interviews with school administrators to understand their motives in deploying digital surveillance. Among the justifications is a federal law that requires schools to monitor students online. But the law also includes a disclaimer noting that the statute does not “require the tracking of internet use by any identifiable minor or adult user.”

Understanding context is critical, Laird said, adding that the law’s authors hadn’t fully envisioned a world where students could be surveilled by artificial intelligence long after classes end for the day.

“What was happening at the time was students were in a school computer lab for part of the day and monitoring meant having an adult walking around a computer lab and physically looking at what was on students’ computer monitors,” she said. But today, she said the statute is being interpreted very differently.

In response, the center, along with the American Civil Liberties Union and the Center for Learner Equity, sent a letter to federal officials Tuesday urging them to clarify the law’s stipulations and inform educators it “does not require broad, invasive and constant surveillance of students’ lives online.”

“Systemic monitoring of online activity can reveal sensitive information about students’ personal lives, such as their sexual orientation, or cause a chilling effect on their free expression, political organizing, or discussion of sensitive issues such as mental health,” the letter continued. “These harms likely fall disproportionately on already vulnerable, over-policed and over-disciplined communities.”

An Inside Look at Spy Tech Used on Students During Remote Classes — and Beyond
Tue, 14 Sep 2021 10:30:00 +0000
/article/gaggle-spy-tech-minneapolis-students-remote-learning/

A week after the pandemic forced Minneapolis students to attend classes online, the city school district’s top security chief got an urgent email, its subject line in all caps, alerting him to potential trouble. Just 12 seconds later, he got a second ping. And two minutes after that, a third.

In each instance, the emails warning Jason Matlock of “QUESTIONABLE CONTENT” pointed to a single culprit: Kids were watching cartoon porn.

Over the next six months, Matlock got nearly 1,300 similar emails from Gaggle, a surveillance company that monitors students’ school-issued Google and Microsoft accounts. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day. The sheer volume of reports was overwhelming at first, Matlock acknowledged, and many incidents were utterly harmless. About 100 were related to animated pornography and, on one occasion, a member of Gaggle’s remote surveillance team flagged a fictional story that referenced “underwear.”

Hundreds of others, however, suggested imminent danger.

In emails and chat messages, students discussed violent impulses, eating disorders, abuse at home, bouts of depression and, as one student put it, “ending my life.” At a moment of heightened social isolation and elevated concern over students’ mental health, references to self-harm stood out, accounting for nearly a third of incident reports over a six-month period. In a document titled “My Educational Autobiography,” students at Roosevelt High School on the south side of Minneapolis discussed bullying, drug overdoses and suicide. “Kill me,” one student wrote in a document titled “goodbye.”

Nearly a year after The 74 submitted public records requests to understand the Minneapolis district’s use of Gaggle during the pandemic, a trove of documents offers an unprecedented look into how one school system deploys a controversial security tool that grew rapidly during COVID-19 but carries significant civil rights and privacy implications.

The data, gleaned from those 1,300 incident reports in the first six months of the crisis, highlight how Gaggle’s team of content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In fact, only about a quarter of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., bringing into sharp relief how the service extends schools’ authority far beyond their traditional powers to regulate student speech and behavior, including at home.

Now, as COVID-era restrictions subside and Minneapolis students return to in-person learning this fall, a tool that was pitched as a remote learning necessity isn’t going away anytime soon. Minneapolis officials reacted swiftly when the pandemic engulfed the nation and forced students to learn from the confines of their bedrooms, paying more than $355,000 — including nearly $64,000 in federal emergency relief money — to partner with Gaggle until 2023. Faced with a public health emergency, the district circumvented normal procurement rules, a reality that prevented concerned parents from raising objections until it was too late.

A mental health dilemma

With each alert, Matlock and other district officials were given a vivid look into students’ most intimate thoughts and online behaviors, raising significant privacy concerns. It’s unclear, however, if any of them made kids safer. Independent research on the efficacy of Gaggle and similar services is scarce.

When students’ mental health comes into play, a complicated equation emerges. In recent years, schools have ramped up efforts to identify and provide interventions to children at risk of harming themselves or others. Gaggle executives see their tool as a key to identifying youth who are struggling with hardships or discussing violent plans. On average, Gaggle notifies school officials within 17 minutes of zeroing in on student content related to suicide and self-harm, according to the company, and officials claim it saved more than 1,400 lives during the 2020-21 school year.


“As a parent you have no idea what’s going on in your kid’s head, but if you don’t know you can’t help them,” said Jeff Patterson, Gaggle’s founder and CEO. “And I would always want to err on trying to identify kids who need help.”

Critics, however, have questioned Gaggle’s effectiveness and worry that rummaging through students’ personal files and conversations — and in some cases outing students for exhibiting signs of mental health issues including depression — could backfire.

Using surveillance to identify children in distress could exacerbate feelings of stigma and shame and could ultimately make students less likely to ask for help, said Jennifer Mathis, the director of policy and legal advocacy at the Bazelon Center for Mental Health Law in Washington, D.C.

“Most kids in that situation are not going to share anything anymore and are going to suffer for that,” she said. “It suggests that anything you write or say or do in school — or out of school — may be found and held against you and used in ways that you had not envisioned.”

Minneapolis parent Holly Kragthorpe-Shirley had a similar concern and questioned whether kids “actually have a safe space to raise some of their issues in a safe way” if they’re stifled by surveillance.

In Minneapolis, for instance, Gaggle flagged the keywords “feel depressed” in a document titled “SEL Journal,” a reference to social-emotional learning. In another instance, Gaggle flagged “suicidal” in a document titled “mental health problems workbook.”

District officials acknowledged that Gaggle had captured student assignments and other personal files, an issue that civil rights groups have long been warning about. The documents obtained by The 74 put hard evidence behind those concerns, said Amelia Vance, the director of youth and education privacy at the Future of Privacy Forum, a Washington-based think tank.


“The hypotheticals we’ve been talking about for a few years have come to fruition,” she said. “It is highly likely to undercut the trust of students not only in their school generally but in their teacher, in their counselor — in the mental health problems workbook.” 

Patterson brushed aside any privacy reservations, including those related to monitoring sensitive materials like journal entries, which he characterized as “cries for help.”

“Sometimes when we intervene we might cause some challenges, but more often than not the kids want to be helped,” he said. Though Gaggle only monitors student files tied to school accounts, he cited a middle school girl’s private journal as a success story. He said the girl wrote in a digital journal that she suffered from self-esteem issues and guilt after being raped.

“No one in her life knew about this incident and because she journaled about it,” Gaggle was able to notify school officials about what they’d learned, he said. “They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own.”

‘Needles in haystacks’

Tools like Gaggle have become ubiquitous in classrooms across the country, according to forthcoming research by the D.C.-based Center for Democracy and Technology. In a recent survey, 81 percent of teachers reported having such software in place in their schools. Though most students said they’re comfortable being monitored, 58 percent said they don’t share their “true thoughts or ideas” as a result and 80 percent said they’re more careful about what they search online.

Such data suggest that youth are being primed to accept surveillance as an inevitable reality, said Elizabeth Laird, the center’s director of equity in civic technology. In return, she said, they’re giving up the ability to explore new ideas and learn from mistakes.

Gaggle, in business since 1999 and recently relocated to Dallas, monitors the digital files of more than 5 million students across the country each year, and the pandemic has been very good for its bottom line. Since the onset of the crisis, the number of students surveilled by the privately held company, which does not report its yearly revenue, has grown sharply. Through artificial intelligence, Gaggle scans students’ emails, chat messages and other materials uploaded to their Google or Microsoft accounts in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. Moderators evaluate flagged material and notify school officials about content they find troubling — a bar that Matlock acknowledged is quite low, as “the system is always going to err on the side of caution” and requires district administrators to evaluate materials’ context.

“We’re looking for needles in haystacks to basically save kids.”
—Jeff Patterson, founder and CEO of Gaggle, which analyzed more than 10 billion online student communications in the 2020-21 school year.

In Minneapolis, a majority of incidents — more than half — originated in files stored in students’ Google Drive, including word-processing documents and spreadsheets. Meanwhile, 22 percent originated in emails and 23 percent came from Google Hangouts, the chat feature.

School officials are alerted to only a tiny fraction of student communications caught up in Gaggle’s dragnet. Last school year, Gaggle collected more than 10 billion items nationally but just 360,000 incidents resulted in notifications to district officials, according to the company. Nationally, 41 percent of incidents during the 2020-21 school year related to suicide and self-harm, according to Gaggle, and a quarter centered on violence.

“We are looking for needles in haystacks to basically save kids,” Patterson said.

‘A really slippery slope’

It was Google Hangouts that had Matt Shaver on edge. When the pandemic hit, classrooms were replaced by video conferences and casual student interactions in hallways and cafeterias were relegated to Hangouts. For Shaver, who taught at a Minneapolis elementary school during the pandemic, students’ Hangouts use became overwhelming.

Students were so busy chatting with each other, he said, that many had lost focus on classroom instruction. So he proposed a blunt solution to district technology officials: Shut it down.

“The thing I wanted was ‘Take the temptation away, take the opportunity away for them to use that,’” said Shaver, who has since left teaching and is now policy director at the education reform group EdAllies. “And I actually got pushback from IT saying ‘No we’re not going to do that, this is a good social aspect that we’re trying to replicate.’”

But unlike those hallway interactions, nobody was watching. Matlock, the district’s security head, said he was initially in the market for a new anonymous reporting tool, which allows students to flag their friends for behaviors they find troubling. He turned to Gaggle, which operates the anonymous reporting system SpeakUp for Safety, and saw the company’s AI-powered digital surveillance tool, which goes well beyond SpeakUp’s powers to ferret out potentially alarming student behavior, as a possibility to “enhance the supports for students online.”

“We wanted to get something in place quickly, as we were moving quickly with the lockdown,” he said, adding that going through traditional procurement hoops could take months. “Gaggle had a strong national presence and a reputation.”

The district signed an initial six-month, $99,603 contract with Gaggle just a week after the virus shuttered schools in Minneapolis. Board of Education Chair Kim Ellison signed a second, three-year contract at an annual rate of $255,750 in September 2020.

The move came with steep consequences. Though SpeakUp was used just three times during the six-month window included in The 74’s data, Gaggle’s surveillance tool flagged students nearly 1,300 times.

During that time, which coincided with the switch to remote learning, the largest share of incidents — 38 percent — were pornographic or sexual in nature, including references to “sexual activity involving a student,” professional videos and explicit, student-produced selfies, which trigger alerts to the National Center for Missing and Exploited Children.

“I’m trying to imagine finding out about this as a high schooler, that every single word I’ve written on a Google Hangout or whatever is being monitored … we live in a country with laws around unreasonable search and seizure — and surveillance is just a really slippery slope.”
—Matt Shaver, former Minneapolis Public Schools teacher

An additional 30 percent were related to suicide and self-harm, including incidents that were triggered by keywords including “cutting,” “feeling depressed,” “want to die,” and “end it all.” An additional 18 percent were related to violence, including threats, physical altercations, references to weapons and suspected child abuse. Such incidents were triggered by keywords including “bomb,” “Glock,” “going to fight,” and “beat her.” About a fifth of incidents were triggered by profanity.

Concerns over Gaggle’s reach during the pandemic weren’t limited to Minneapolis. In December 2020, a group of civil rights organizations including the American Civil Liberties Union of Northern California alleged that by using Gaggle, the Fresno Unified School District had violated the California Electronic Communications Privacy Act, which requires officials to obtain search warrants before accessing electronic information. Such monitoring, the groups contend, infringes on students’ free-speech and privacy rights with little ability to opt out.

Shaver, whose students used Google Hangouts to the point of it becoming a distraction, was alarmed to learn that those communications were being analyzed by artificial intelligence and pored over by a remote team of people he didn’t even know.

“I’m trying to imagine finding out about this as a high schooler, that every single word I’ve written on a Google Hangout or whatever is being monitored,” he said. “There is, of course, some lesson in this, obviously like, ‘Be careful of what you put online.’ But we live in a country with laws around unreasonable search and seizure — and surveillance is just a really slippery slope.”

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

The potential to save lives

To Matlock, Gaggle is a lifesaver — literally. When the tool flagged a Minneapolis student’s suicide note in the middle of the night, Matlock said he rushed to intervene. In a late-night phone call, the security chief said he warned the unnamed parents, who knew their child was struggling but didn’t fully recognize how bad things had become. Because of Gaggle, school officials were able to get the student help. To Matlock, the possibility that he saved a student’s life offers a feeling he “can’t even measure in words.”

“If it saved one kid, if it supported one caregiver, if it supported one family, I’ll take it,” he said. “That’s the bottom line.”

Despite heightened concern over youth mental health issues during the pandemic, its effect on youth suicide rates remains fuzzy. Preliminary data from the Minnesota health department show a decline. Between 2019 and 2020, suicides among people 24 years old and younger decreased by more than 20 percent statewide. Nationally, the share of emergency room visits for suspected suicide attempts among adolescents has surged during the pandemic, according to the Centers for Disease Control and Prevention, but preliminary mortality data for people of all ages show a 5.6 percent decline in self-inflicted fatalities in 2020 compared to 2019.

Meanwhile, Gaggle reported that it identified a significant increase in threats related to suicide, self-harm and violence nationwide between March 2020 and March 2021. During that period, Gaggle observed a 31 percent increase in flagged content overall, including a 35 percent increase in materials related to suicide and self-harm. Gaggle officials said the data highlight a mental health crisis among youth during the pandemic. But other factors could be at play. Among them is the surge in time students spent online during remote learning, creating additional opportunities for Gaggle to tag youth behavior. Meanwhile, the number of students monitored by Gaggle nationally grew markedly during the pandemic.

But that hasn’t stopped Gaggle from capitalizing on the moment as it markets a new service: Gaggle Therapy. In school districts that sign up for the service, students who are flagged by Gaggle’s digital monitoring tool are matched with counselors for weekly teletherapy sessions. Therapists available through the service are independent contractors for Gaggle, and districts can either pay Gaggle for “blanket coverage,” which makes all students eligible, or a “retainer” fee, which allows them to “use the service as you need it.” Under the second scenario, Gaggle would have a financial incentive to identify more students in need of teletherapy.

In Minneapolis, Matlock said that school-based social workers and counselors lead intervention efforts when students are identified for materials related to self-harm. “The initial moment may be a shock” when students are confronted by school staff about their online behaviors, he said, but providing them with help “is much better in the long run.”

A presentation sent to Minneapolis teachers explains how the district responds after Gaggle flags a “possible student situation” that officials say presents an imminent threat. (Photo obtained by The 74)

As the district rolled out the service, many parents and students were out of the loop. Among them was Nathaniel Genene, a recent graduate who served as the Minneapolis school board’s student representative at the time. He said that classmates contacted him after initial news of the Gaggle contract was released.

“I had a couple of friends texting me like ‘Nathaniel, is this true?’” he said. “It was kind of interesting because I had no idea it was even a thing.”

Yet as students gained a greater awareness that their communications were being monitored, Matlock said they began to test Gaggle’s parameters using potential keywords “and then say ‘Hi’ to us while they put it in there.”

As students became conditioned to Gaggle, “the shock is probably a little bit less,” said Rochelle Cox, an associate superintendent at the Minneapolis school district. Now, she said students have an outlet to get help without having to explicitly ask. Instead, they can express their concerns online with an understanding that school officials are listening. As a result, school-based mental health professionals are able to provide the care students need, she said.

Mathis, with The Bazelon Center for Mental Health Law, called that argument “ridiculous.” Officials should make sure that students know about available mental health services and ensure that they feel comfortable reaching out for help, she said.

“That’s very different than deciding that we’re going to catch people by having them write into the ether and that’s how we’re going to find the students who need help,” she said. “We can be a lot more direct in communicating than that, and we should be a lot more direct and a lot more positive.”

In fact, subjecting students to surveillance could push them further into isolation and condition them to lie when officials reach out to inquire about their digital communications, argued Vance of the Future of Privacy Forum.

“Effective interventions are rarely going to be built on that, you know, ‘I saw what you were typing into a Google search last night’ or ‘writing a journal entry for your English class,’” Vance said. “That doesn’t feel like it builds a trusting relationship. It feels creepy.”
