school surveillance – The 74: America's Education News Source

New Report: School Shootings Spawned ‘Digital Dystopia’ of Student Surveillance
Tue, 03 Oct 2023

Updated, Oct. 4

Reeled in by deceptive, fear-based marketing and an influx of federal cash, school leaders have purchased and pervasively deployed student surveillance tools while failing to consider their detrimental consequences for young people’s civil rights, a new ACLU report concludes.

In a youth survey accompanying the report, a majority of students expressed worries that the tools — designed to keep them safe — could actually cause harm, and a third said they “always feel” like they’re being watched.

The 61-page report, titled “Digital Dystopia,” also offers an in-depth look at the rise of schools’ reliance on surveillance technology over the last few decades, arguing the tools have failed to improve campus safety while subjecting students — particularly students of color and those who are undocumented, LGBTQ or from low-income households — to discrimination. 




“The ed tech surveillance companies, after fanning the flames of fear, were making these broad statements about the efficacy of their products, about their ability to keep students safe” from threats like school shootings and suicide, despite a lack of evidence to back up their claims, report lead author and ACLU senior policy counsel Chad Marlow told The 74.

Rather than making kids safe, Marlow said, the tools could be damaging to their development and well-being. “The harm is actually significant and, by not acknowledging the harms that are caused, there’s less incentive to look at other interventions,” he said.

(Chart: ACLU)

Three-quarters of students worry about at least one negative consequence of student surveillance, including the now-widespread digital tools that monitor their online communications for references to sex, drugs, violence or self-harm, according to the online survey. The polling firm YouGov, commissioned by the ACLU, queried 502 teens throughout the country in October 2022. Nearly a quarter of respondents said that digital monitoring tools limit the resources they feel they can access online, while a similar percentage worried the information collected about them could be shared with the police or used against them in the future by a college or an employer. Some 27% feared the tools could be used for disciplinary purposes.

As a result, students alter their behaviors due to fears that “deviating from expectations is punishable in the world that they’re growing up in,” Marlow said. “What does that tell them about innovation or exploring new ideas?”

The survey findings echo a separate report, released last month by the nonprofit Center for Democracy and Technology, which found that while a majority of parents and students still embrace digital tools that monitor students’ online behaviors, their support has dwindled over the last year.

Both reports identified detrimental effects of digital surveillance that researchers said run counter to federal civil rights laws that protect students from discrimination based on race, disability, sexual orientation or gender identity. 

In the student survey conducted by the Center for Democracy and Technology, researchers found that while districts bought digital monitoring tools to keep students safe, they are used regularly as discipline tools that routinely bring youth in contact with the police. LGBTQ+ youth and those with disabilities were significantly more likely to experience the harms of surveillance. For example, 65% of LGBTQ+ youth said they or someone they knew got into trouble due to online activity monitoring, compared to 56% of their straight and cisgender peers. Meanwhile, nearly a third of LGBTQ+ students said that they or someone they know has been “outed” by the technology.

In the absence of rigorous, independent research on the efficacy of school surveillance tools to improve campus safety, the ACLU report argues that schools are left to make purchasing decisions based on what the group called fear-based marketing tactics. Security companies hype the risks of school violence and student self-harm while overstating the utility of their products, the report says. Security industry lobbying efforts, meanwhile, have successfully steered hundreds of millions of dollars in government school safety spending toward unproven technologies. 

“It would be like going to buy a car and the only source of information is the car salesperson,” Marlow said. “That’s probably not the best way to make a car purchasing decision, but that’s what’s happening with student surveillance.” 

The Security Industry Association, a trade group that represents security companies and lobbies on their behalf, didn’t immediately respond to a request for comment. 

The ACLU survey results suggest, however, that students have a complicated relationship with school surveillance: While recognizing its potential harms, many also believe it serves its intended purpose. Specifically, 40% of students reported that surveillance technology makes them feel “safe” and 43% said it makes them feel “protected.” Meanwhile, just 14% said it makes them feel “anxious” and a fraction of respondents, 7%, said the tools made them feel “unsafe.” 

Marlow said this support may be the result, at least in part, of successful marketing and a belief that few other options exist. 

“When you talk about keeping students safe, I think students are smart enough to realize that in too many places in this country, gun control is off the table,” he said. Because of “the dominance of money and power of the ed tech surveillance industry” in marketing and lobbying, Marlow argued, “the discussion is almost entirely centered around, ‘Do we use or do we not use student surveillance technologies?’” while alternatives like mental health screenings fail to receive similar consideration. “In that option, between a highly questionable, harmful protection or nothing at all, no one wants to pick nothing at all.”

While the report focuses largely on digital tools that monitor students’ behaviors online, it also questions the efficacy of surveillance cameras in creating physical safety for students in schools. Cameras have become nearly ubiquitous: the vast majority of public schools reported using them in the 2019-20 school year, according to the most recent data included in a U.S. Department of Education report released last month.

Meanwhile, just 55% of schools offered students mental health assessments, according to the most recent federal data, and 42% offered mental health treatment services. 

Despite a sharp rise in schools’ reliance on surveillance and other tools in the last two decades, the number of school shootings has grown. 

There were a record 188 school shootings resulting in injuries or deaths in the 2021-22 school year, according to the federal report. That’s twice as many shootings on campus as the previous record — set just one year earlier. Placing security cameras in schools, Marlow argues, has failed to deter the very crimes they were installed to prevent. In an ACLU analysis of the 10 deadliest school shootings in the last two decades, for example, researchers found that surveillance cameras were present in eight, including in Parkland, Florida, and Uvalde, Texas.

Along with scrutiny from researchers and civil rights groups, schools’ use of digital monitoring tools has led to several lawsuits alleging they’re ineffective and violate students’ civil liberties. 

In one class-action lawsuit, filed this year in California, the parents of two students claim the student surveillance company Securly collected their children’s data and sold the information to targeted advertising vendors without their knowledge or consent.

A separate federal negligence lawsuit, filed in 2021 in Oklahoma, accuses the student surveillance company Gaggle of being ineffective at keeping kids safe from self-harm. The lawsuit, filed by the parents of a 15-year-old boy who died by suicide, accuses the surveillance company and the state’s third-largest school district of failing to act on warning signs that could have prevented the teenager’s 2019 death.

The student submitted a “personal odyssey” essay in his freshman English class that was riddled with references to self-harm and suicide, but his teacher failed to act, the complaint alleges, instead giving him a grade of 100%. The district used Gaggle to identify and flag troubling student digital communications, including references to self-harm and suicide. Yet the lawsuit alleges the company “failed to notify school administration” about the student’s warning signs, including the essay titled “Running Out of Reasons” and an email exchange with a classmate in which the two contemplated a plan to “go out at the same time.”

A Gaggle spokesperson didn’t immediately respond to a request for comment. Securly spokesperson Josh Mukai called the lawsuit “baseless and uninformed.”

“Securly has never sold student data to third parties, nor have we ever used student data to target advertisements,” Mukai said in an email. “Securly’s suite of student safety solutions upholds the highest standards for student data privacy and complies with all international, federal and state privacy regulations.”

ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds
Wed, 20 Sep 2023

Ever since ChatGPT burst onto the scene last year, a heated debate has centered on its potential benefits and pitfalls for students. As educators worry students could use artificial intelligence tools to cheat, a new survey makes clear its impact on young people: They’re getting into trouble.

Half of teachers say they know a student at their school who was disciplined or faced negative consequences for using — or being accused of using — generative artificial intelligence like ChatGPT to complete a classroom assignment, according to a new survey from the Center for Democracy and Technology, a nonprofit think tank focused on digital rights and expression. The proportion was even higher, at 58%, for those who teach special education.

Cheating concerns were clear, with survey results showing that teachers have grown suspicious of their students. Nearly two-thirds of teachers said that generative AI has made them “more distrustful” of students and 90% said they suspect kids are using the tools to complete assignments. Yet students themselves who completed the anonymous survey said they rarely use ChatGPT to cheat, but are turning to it for help with personal problems.




“The difference between the hype cycle of what people are talking about with generative AI and what students are actually doing, there seems to be a pretty big difference,” said Elizabeth Laird, the group’s director of equity in civic technology. “And one that, I think, can create an unnecessarily adversarial relationship between teachers and students.”   

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat. 

(Chart: Center for Democracy and Technology)

The results on ChatGPT’s educational impacts were included in the Center for Democracy and Technology’s broader annual survey analyzing the privacy and civil rights concerns of teachers, students and parents as tech, including artificial intelligence, becomes increasingly engrained in classroom instruction. Beyond generative AI, researchers observed a sharp uptick in digital privacy concerns among students and parents over last year. 

Among parents, 73% said they’re concerned about the privacy and security of student data collected and stored by schools, a considerable increase from the 61% who expressed those reservations last year. A similar if less dramatic trend was apparent among students: 62% had data privacy concerns tied to their schools, compared with 57% just a year earlier. 

(Chart: Center for Democracy and Technology)

Those rising levels of anxiety, researchers theorized, are likely the result of the growing frequency of cyberattacks on schools, which have become a primary target for ransomware gangs. High-profile breaches, including in Los Angeles and Minneapolis, have compromised a massive trove of highly sensitive student records. Exposed records, investigative reporting by The 74 has found, include student psychological evaluations, reports detailing campus rape cases, student disciplinary records, closely guarded files on campus security, employees’ financial records and copies of government-issued identification cards.

Survey results found that students in special education, whose records are among the most sensitive that districts maintain, and their parents were significantly more likely than the general education population to report school data privacy and security concerns. As attacks ratchet up, 1 in 5 parents say they’ve been notified that their child’s school experienced a data breach. Such breach notices, Laird said, led to heightened apprehension. 

“There’s not a lot of transparency” about school cybersecurity incidents “because there’s not an affirmative reporting requirement for schools,” Laird said. But in instances where parents are notified of breaches, “they are more concerned than other parents about student privacy.” 

Parents and students have also grown increasingly wary of another set of education tools that rely on artificial intelligence: digital surveillance technology. Among them are student activity monitoring tools, such as those offered by the for-profit companies Gaggle and GoGuardian, which rely on algorithms in an effort to keep students safe. The surveillance software employs artificial intelligence to sift through students’ online activities and flag school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Among parents surveyed this year, 55% said they believe the benefits of activity monitoring outweigh the potential harms, down from 63% last year. Among students, 52% said they’re comfortable with academic activity monitoring, a decline from 63% last year. 

Such digital surveillance, researchers found, frequently has disparate impacts on students based on their race, disability, sexual orientation and gender identity, potentially violating longstanding federal civil rights laws. 

The tools also extend far beyond the school realm, with 40% of teachers reporting their schools monitor students’ personal devices. More than a third of teachers say they know a student who was contacted by the police because of online monitoring, the survey found, and Black parents were significantly more likely than their white counterparts to fear that information gleaned from online monitoring tools and AI-equipped campus surveillance cameras could fall into the hands of law enforcement. 

(Chart: Center for Democracy and Technology)

Meanwhile, as states nationwide pull literature from school library shelves amid a conservative crusade against LGBTQ+ rights, the nonprofit argues that digital tools that filter and block certain online content “can amount to a digital book ban.” Nearly three-quarters of students — and disproportionately LGBTQ+ youth — said that web filtering tools have prevented them from completing school assignments. 

The nonprofit highlights how disproportionalities identified in the survey could run counter to federal laws that prohibit discrimination based on race and sex, and those designed to ensure equal access to education for children with disabilities. In a letter sent Wednesday to the White House and Education Secretary Miguel Cardona, the Center for Democracy and Technology was joined by a coalition of civil rights groups urging federal officials to take a harder tack on ed tech practices that could threaten students’ civil rights. 

“Existing civil rights laws already make schools legally responsible for their own conduct, and that of the companies acting at their direction in preventing discriminatory outcomes on the basis of race, sex and disability,” the coalition wrote. “The department has long been responsible for holding schools accountable to these standards.”


Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias
Fri, 30 Sep 2022

Updated 3:15 p.m. ET

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool’s “role in negatively impacting LGBTQ students.”

“Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students,” the nonprofit said in the statement. “We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health.” 

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project’s “guidance on how to do what we do better.” The company also removed the page on its website where it previously touted the partnership.

“We’re disappointed that The Trevor Project has decided to pause our collaboration,” she said. “However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify.” 

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

Recently, The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, touted the relationship on its own site, noting the two were collaborating to “improve mental health outcomes for LGBTQ young people.”

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project’s work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy. 

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle’s surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices. 

“It really does feel like a ‘We paid you, now say we’re fine,’ kind of thing,” said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle’s “seal of approval” from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth. 

“People who want to defend Gaggle can just point to their little Trevor Project thing and say, ‘See, they have the support of “The Gays” so it’s fine actually,’ and all it does is make it easier to deflect and defend actual issues with Gaggle.” 

A screenshot showing that Gaggle is a corporate partner of The Trevor Project
Student surveillance company Gaggle is listed among “Corporate Partners” on The Trevor Project’s website (screenshot)

Following an investigation by The 74 into Gaggle’s monitoring practices, the company’s keyword dictionary has drawn particular scrutiny. Gaggle’s algorithm relies on keyword matching to compare students’ online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are “gay” and “lesbian,” verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide.
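To make the mechanism concrete, here is a minimal, purely illustrative sketch of dictionary-based keyword flagging of the kind described above. The word list, category labels and function names are hypothetical, not Gaggle’s actual dictionary or code:

```python
import re

# Hypothetical keyword-to-category dictionary. Real systems reportedly
# use thousands of entries, including identity terms like "gay" and
# "lesbian" alongside references to violence, drugs and sex.
FLAGGED_KEYWORDS = {
    "suicide": "self-harm",
    "overdose": "drugs",
    "shoot": "violence",
    "gay": "at-risk/bullying",
    "lesbian": "at-risk/bullying",
}

def scan_text(text: str) -> list[tuple[str, str]]:
    """Return (keyword, category) pairs found in a student's message."""
    hits = []
    for word in re.findall(r"[a-z']+", text.lower()):
        if word in FLAGGED_KEYWORDS:
            hits.append((word, FLAGGED_KEYWORDS[word]))
    return hits

# Any match queues the snippet for human review, regardless of context.
print(scan_text("I'm gay and proud"))  # [('gay', 'at-risk/bullying')]
```

Because matching of this sort is context-blind, a benign message about identity and a genuine cry for help can look identical at the first pass, which is the crux of the discrimination concern advocates describe.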

But privacy and civil rights advocates have accused the company of discrimination by subjecting LGBTQ youth to heightened surveillance — a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and could lead schools to out LGBTQ youth to their parents.

A recent survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported that they were more often used to discipline them. LGBTQ youth were disproportionately affected.

In a statement, a Trevor Project spokesperson said it’s important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle’s “desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students.” 

“It’s true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring — many worry that these tools could out them to teachers or parents against their will,” the statement continued. “It is because of that very real concern that we have worked in a limited capacity with digital safety companies — to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies.” 

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement the company is “honored to be able to align with The Trevor Project to better serve LGBTQ youth,” and that the company is “always looking for ways to learn and to improve upon what we do to better support students and keep them safe.” 

‘Faceless bureaucracy’ 

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, the founder and president of Public Interest Privacy Consulting. But their approaches to solving the problem, she said, are fundamentally different.

By combing through digital materials on students’ school-issued Microsoft and Google accounts, Gaggle seeks to alert educators — and in some cases the police — of students’ online behaviors that suggest they might harm themselves or others.

“It really is about collecting details that kids may not be voluntarily sharing — information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives,” Vance said. At The Trevor Project, “you have proactive outreach from youth who know that they need help or they need a community.” 

Katy Perry smiles in front of a Trevor Project background, holding a poster that says "Be proud of who you are."
Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which draws support and funding from corporate donors including Macy’s and AT&T, was founded in 1998 and reported millions of dollars in contributions in 2020. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally.

The Trevor Project uses artificial intelligence to train volunteer crisis counselors and assess the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they may decide to call the police. But it’s ultimately up to youth to decide which information they share with adults.

It’s important for LGBTQ students to have trusting adults with whom they can confide their experiences, Vance said, rather than a system where “some faceless bureaucracy is finding out and informing your parents” about information they intended to keep private. 

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year and 14% said they made a suicide attempt.

This isn’t the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously worked as a consultant for McKinsey & Company and helped create a strategic plan to boost opioid sales amid an addiction epidemic — one that has been linked to a rise in suicide attempts among LGBTQ youth.

The group knows firsthand how data can be weaponized. Just last month, online trolls who target the transgender community launched a campaign to clog up The Trevor Project’s suicide prevention hotline.

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report.

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity — typically called “outing” — due to student activity monitoring, according to a recent survey by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime.

A bar chart showing LGBTQ+ students are more likely to get in trouble for visiting a website or saying something inappropriate online; were more likely to be contacted by counselors or other adults at school about their mental health; and were more likely to be contacted by a police officer or other adult due to concerns about them committing a crime.
A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students’ civil liberties and to state its intent “to take enforcement action against violations that result in discrimination.” The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools. 

Among the letter signatories is the nonprofit LGBT Tech, which has raised alarms about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group’s co-founder and executive director, said The Trevor Project’s partnership with Gaggle could be positive if it’s used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said.

A screenshot from Gaggle's website. Gray box with text that says Gaggle is a Proud Sponsor of The Trevor Project.
Gaggle says on its website that the student surveillance company “is proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people.” (Screenshot)

“If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they’ve done is harm the student,” Wood said. 

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools’ growth as an inevitable result of districts seeking “to avoid liability issues.”

“It is our stance that since these tools are not going anywhere, we think it’s important to do our part to offer our expertise around LGBTQ experiences,” the spokesperson said. 

A student holds up a peace sign with one hand and has the other wrapped around his dog
Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.

“I have friends who are queer and/or trans who are out at school but not to their parents,” he said. “If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying.” 

In The Trevor Project’s recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity, while just 37% said their homes are affirming and 55% said the same about their schools.

Given that reality, relatively few LGBTQ youth reported sharing information about their sexual orientation with teachers or guidance counselors.

While Gaggle has maintained that keywords like “gay” and “lesbian” can also prevent bullying, Logsdon-Wallace said the company’s approach is out of touch with how students generally interact. At school, he said he’s been called just about every “slur for a queer or a trans person that isn’t from like 80 years ago.” While slurs are common, terms like “lesbian” are not.

“As an actual teenager going to an actual public school, those words are not being used to bully people,” he said. “They’re just not.”


Minneapolis Schools to Halt Controversial Student Surveillance Initiative
Mon, 27 Jun 2022

The Minneapolis school district has announced plans to end its relationship with Gaggle, a controversial digital surveillance tool that monitored students’ online behaviors during pandemic-induced remote learning.

The announcement, which follows extensive reporting by The 74 about how the tool subjected the city’s youth to pervasive round-the-clock digital surveillance, was outlined last week at the bottom of a newsletter alerting families to changes at the district. Gaggle, which uses artificial intelligence and human content moderators to track students’ online activities and notify district officials of “inappropriate behaviors or potential threats to self or others,” will no longer be used beginning on July 1, the district announced.

A week after schools went remote in Minneapolis and nationally in March 2020, the district sidestepped typical procurement rules and used federal pandemic relief money to contract with Gaggle, a for-profit company that reported significant business growth when classes went online. The district has spent more than $355,000 on the tool, which monitors student behaviors on school-issued Google and Microsoft accounts, and has a contract with the company through September 2023. 

District officials said the tool saved lives but civil rights advocates and students targeted by the program have questioned its efficacy and accused the company of violating students’ privacy rights. 

In an email, district spokesperson Julie Schultz Brown attributed the change to budget decisions “made in order to honor the terms of our new contract” with educators. Gaggle founder and CEO Jeff Patterson said the Minneapolis district will stop using the tool at a moment when “students across the United States are suffering.” In June, the company alerted Minneapolis officials to 15 “critical incidents” related to suicide, death threats, violence and drug use, Patterson wrote in a statement. Nationally, the pandemic has led to a surge in youth mental health issues.

A recent report by Democratic Sens. Elizabeth Warren and Ed Markey warned that Gaggle and similar services could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars. Gaggle claims it saved the lives of some 1,400 students during the 2020-21 school year, yet independent research on the tool’s effectiveness doesn’t exist.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Teeth Logsdon-Wallace, a rising freshman in Minneapolis, saw the district’s decision to cut ties with Gaggle as a major victory. He became an outspoken Gaggle critic after a homework assignment, which discussed a previous suicide attempt and how he learned important coping skills, got flagged by the tool’s surveillance dragnet. Officials at Gaggle and the district said the tool helps identify students who are struggling emotionally and need adult intervention. But 14-year-old Logsdon-Wallace and other critics argue that digital surveillance is an inappropriate way to pinpoint students who need mental health care. Rather than helping, he said the experience “felt violating and gross.” 

“When you’re spying on kids and their stuff, especially about mental health stuff, they’re just going to be more secretive about it,” he said. “That can just cause more danger.”

While Gaggle relies on technology to ferret out students with issues like depression, Logsdon-Wallace said that he and other students are more likely to share their mental health struggles with adults at school if there’s a culture of trust. Monitoring communications through an algorithm and a team of low-paid remote workers who the students don’t even know, he said, had the opposite effect and left students more apprehensive about district computers, “which could be positive and negative.”

While his peers learned how to better protect their own privacy online “even when it’s inherently being violated,” he said, he worried that some may have been “bottling up mental health issues because of it.”

The district will no longer use Gaggle’s student activity monitoring tool or the company’s anonymous tip line, SpeakUp for Safety, which allows students to report potential safety threats confidentially. Instead of turning to SpeakUp, concerned parents and students should report issues to police officials with the state Bureau of Criminal Apprehension, the district wrote in its newsletter. 

District officials have said the anonymous tip line was central to the district’s decision to contract with Gaggle, yet previous reporting by The 74 found that the service was rarely used. Meanwhile, the digital surveillance tool routinely flagged students who made references to sex, drugs and violence on district technology. An analysis of nearly 1,300 alerts found the service flagged Minneapolis students for discussing violent impulses, eating disorders, abuse at home and suicidal plans.

But Gaggle regularly flagged benign student chatter and personal files, including classroom assignments, casual conversations between teens and sensitive journal entries. Gaggle flags students who use keywords related to sexual orientation including “gay” and “lesbian,” and on at least one occasion school officials in Minneapolis outed an LGBT student to their parents. The sheer volume of student communications that got flagged by Gaggle was at times overwhelming, the Minneapolis school district’s head of security acknowledged, but he also felt like he was able to save students from dying by suicide. 

In interviews with The 74, former content moderators at Gaggle — hundreds of whom are paid just $10 an hour on month-to-month contracts — raised serious questions about the company’s efficacy, its employment practices and its effect on students’ civil rights.

Moderators said they received little training before they were given access to students’ sensitive materials and were pressured to prioritize speed over quality. They also reported insufficient safeguards to protect students’ sensitive files, including nude selfies. Patterson acknowledged that moderators, who work remotely with little supervision or oversight, could easily save copies of students’ nude photographs and share them on the dark web. 

As a transgender teenager who believes the school district has done too little to address bullying, Logsdon-Wallace said he already had little trust in district leaders. While Gaggle didn’t address the abuse from peers, having his sensitive experiences caught in the company’s algorithm made the situation worse.

“The very little trust I had in the administration is just destroyed,” he said. “You can’t expect students to trust you if you’ve done nothing to earn that trust.”

Philly Schools Are Screening Middle Schoolers for Weapons
Sat, 14 May 2022

The School District of Philadelphia is now periodically screening students for weapons in an effort to combat gun violence.

Effective Monday, sixth through eighth grade students will be subject to the screenings, which will happen in the mornings when students first come to school. The screenings will be conducted at six schools per day, covering every middle school and every elementary school with middle grades. The district was already conducting screenings at high schools.




The measure comes as Philadelphia is experiencing an uptick in gun violence.

“We understand that there are mixed emotions about this,”  district spokesperson Monica Lewis said of the move.

“There are some who are in favor of [it] because they know that it’s an effort to keep students and staff safe and we know that there are some who aren’t in favor of [it] because they feel like it’s intrusive and we understand that and we respect their opinion, but we hope that they understand that anything that we are doing is with the safety and the well-being of the students and staff in mind,” Lewis said.

The screenings will be done throughout the remainder of the school year, which ends on June 11.

The screenings are being conducted by a team of School Safety personnel and will include searches by a hand wand or metal detector and a physical check of all bags, backpacks and personal items.

The district said any student in possession of a firearm will be detained by School Safety and referred to the Philadelphia Police Department.

Lewis said the district will not be commenting on whether any weapons were found during the process.

The district said it is giving students the opportunity to dispose of any illegal or inappropriate items, without consequence, prior to being screened.

Ayana Jones is a reporter for the Philadelphia Tribune.

The Pennsylvania Capital-Star is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501c(3) public charity. Pennsylvania Capital-Star maintains editorial independence. Contact Editor John Micek for questions: info@penncapital-star.com. Follow Pennsylvania Capital-Star on social media.

Meet the Gatekeepers of Students’ Private Lives
Mon, 02 May 2022

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. 

The mother from Pittsburgh didn’t want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs, and violence and a team of content moderators like Waskiewicz, the company sifts through billions of students’ emails, chat messages and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats or bullying, incidents that would prompt Gaggle to notify school leaders and, in some cases, the police.

As a result, kids’ deepest secrets — like nude selfies and suicide notes — regularly flashed onto Waskiewicz’s screen. Though she felt “a little bit like a voyeur,” she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions and stiff performance quotas left her feeling burned out. Gaggle’s moderators face pressure to review 300 incidents per hour, roughly one every 12 seconds, and Waskiewicz knew she could get fired on a moment’s notice if she failed to distinguish mundane chatter from potential safety threats in that sliver of time. She lasted about a year.

“In all honesty I was sort of half-assing it,” Waskiewicz admitted in an interview with The 74. “It wasn’t enough money and you’re really stuck there staring at the computer reading and just click, click, click, click.”

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students’ private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about “a tsunami of youth suicide headed our way” and said that schools have “a moral obligation to protect the kids on their digital playground.” 

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also surfaced significant questions about the company’s efficacy, its employment practices and its effect on students’ civil rights.

Among the moderators who worked on a contractual basis, none had prior experience in school safety, security or mental health. Instead, their employment histories included retail work and customer service, but they were drawn to Gaggle while searching for remote jobs that promised flexible hours. 

They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers — either in-person, on the phone or over Zoom — before landing jobs.

Once hired, moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits including mental health care, and one former moderator said he quit after repeated exposure to explicit material left him unable to sleep and without “any money to show for what I was putting up with.”

Gaggle’s content moderators encompass as many as 600 contractors at any given time; just two dozen work as employees who have access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors’ role with the company, arguing they use “common sense” to distinguish false flags generated by the algorithm from potential threats and do “not require substantial training.”

While the experiences reported by Gaggle’s moderator team mirror those of content moderators at platforms like Meta-owned Facebook, Patterson said his company relies on “U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines,” as some social media companies do. He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

“Some people are not fast decision-makers. They need to take more time to process things and maybe they’re not right for that job,” he told The 74. “For some people, it’s no problem at all. For others, their brains don’t process that quickly.”

Executives also sought to minimize the contractors’ access to students’ personal information; a spokeswoman said they only see “small snippets of text” and lacked access to what’s known as students’ “personally identifiable information.” Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students’ names. Several former moderators said they struggled to determine whether something should be escalated as harmful due to “gray areas,” such as whether a Victoria’s Secret lingerie ad would be considered acceptable or not. 

“Those people are really just the very, very first pass,” Gaggle spokeswoman Paget Hetherington said. “It doesn’t really need training, it’s just like if there’s any possible doubt with that particular word or phrase it gets passed on.” 

Molly McElligott, a former content moderator and customer service representative, said management was laser focused on performance metrics, appearing more interested in business growth and profit than protecting kids. 

“I went into the experience extremely excited to help children in need,” McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney’s Office in New York. “I realized that was not the primary focus of the company.”

Gaggle is part of a burgeoning campus security industry that’s seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999 to give schools student email accounts that could be monitored for inappropriate content, said its focus now is mitigating the youth mental health crisis.

Patterson said the team talks about “lives saved” and child safety incidents at every meeting, and they are open about sharing the company’s financial outlook so that employees “can have confidence in the security of their jobs.”

Content moderators work at a Facebook office in Austin, Texas. Unlike the social media giant, Gaggle’s content moderators work remotely. (Ilana Panich-Linsman / Getty Images)

‘We are just expendable’

Facing new federal scrutiny along with three other companies that monitor students online, Gaggle has said it relies on a “highly trained content review team” to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle’s content review team, described their training, a slideshow and an online quiz, as “a joke” that left them ill-equipped to complete a job with such serious consequences for students and schools.

As an employee on the company’s safety team, McElligott said she underwent two weeks of training, but the disorganized instruction left her and other moderators “more confused than when we started.”

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators’ feedback to The 74.

“If you want to be not cared about, not valued and be completely stressed/traumatized on a daily basis this is totally the job for you,” one reviewer wrote on Indeed. “Warning, you will see awful awful things. No they don’t provide therapy or any kind of support either.

“That isn’t even the worst part,” the reviewer continued. “The worst part is that the company does not care that you hold them on your backs. Without safety reps they wouldn’t be able to function, but we are just expendable.” 

As the first layer of Gaggle’s human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students’ communications for additional consideration. Designated employees on Gaggle’s Safety Team are in charge of calling or emailing school officials to notify them of troubling material identified in students’ files, Patterson said.
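To illustrate that two-tiered flow, here is a rough sketch under the assumption that the process works as Patterson describes; the class and function names are hypothetical, not Gaggle’s actual code:

```python
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    student_id: str
    snippet: str   # text the algorithm matched
    category: str  # e.g. "self-harm", "violence"

def contractor_first_pass(item: FlaggedItem) -> bool:
    """Tier 1: a contractor decides, in seconds, whether the algorithm's
    hit looks like a genuine concern (True) or a false flag (False)."""
    # Stand-in logic; in reality this is a rapid human judgment call.
    return item.category in {"self-harm", "violence"}

def safety_team_notify(item: FlaggedItem) -> None:
    """Tier 2: a designated employee calls or emails school officials
    about the troubling material, per the process described above."""
    print(f"Notifying district: student {item.student_id}, {item.category}")

def process(item: FlaggedItem) -> None:
    if contractor_first_pass(item):
        safety_team_notify(item)  # escalate to Gaggle's Safety Team
    # otherwise the item is dismissed as a false flag

process(FlaggedItem("s-123", "essay excerpt", "self-harm"))
```

Note that this arrangement concentrates the fastest, highest-volume judgments in the lowest-paid tier, which is the dynamic the moderators quoted in this story describe as error-prone.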

Gaggle’s staunchest critics have questioned the tool’s efficacy and describe it as a student privacy nightmare. In March, Democratic Sens. Elizabeth Warren and Ed Markey called on Gaggle and similar companies to protect students’ civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

The information shared by the former Gaggle moderators with The 74 “struck me as the worst-case scenario,” said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. Content moderators’ limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, “is not acceptable.”

In its response to lawmakers, Gaggle described a two-tiered review procedure but didn’t disclose that low-wage contractors were the first line of defense. CEO Patterson told The 74 they “didn’t have nearly enough time” to respond to lawmakers’ questions about their business practices and didn’t want to divulge proprietary information. Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren’t interviewed before getting placed on the job.

“There’s a lot of contractors. We can’t do a physical interview of everyone and I don’t know if that’s appropriate,” he said. “It might actually introduce another set of biases in terms of who we hire or who we don’t hire.”

‘Other eyes were seeing it’

In a previous investigation, The 74 analyzed a cache of public records to expose how Gaggle’s algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools’ authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle’s algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students’ online activities including diary entries, classroom assignments and casual conversations between students and their friends.

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle oversaw a surge in students’ online materials and a wave of school districts interested in its services. Gaggle’s business grew as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a surge in references to suicide and self-harm, accounting for more than 40% of all flagged incidents.

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students’ online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles. 

“I felt kind of bad because the kids didn’t have the ability to have stuff of their own and I wondered if they realized that it was public,” she said. “I just wonder if they realized that other eyes were seeing it other than them and their little friends.”

Student activity monitoring software like Gaggle has become ubiquitous in U.S. schools, and 81% of teachers work in schools that use tools to track students’ computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of using such tools, which can block obscene material and monitor students’ screens in real time, outweigh potential risks.

Likewise, students generally recognize that their online activities on school-issued devices are being observed, the survey found, and alter their behaviors as a result. More than half of student respondents said they don’t share their true thoughts or ideas online as a result of school surveillance and 80% said they were more careful about what they search online. 

A majority of parents reported that the benefits of keeping tabs on their children’s activity exceeded the risks. Yet they may not have a full grasp on how programs like Gaggle work, including the heavy reliance on untrained contractors and weak privacy controls revealed by The 74’s reporting, said Elizabeth Laird, the group’s director of equity in civic technology.

“I don’t know that the way this information is being handled actually would meet parents’ expectations,” Laird said. 

Another former contractor, who reached out to The 74 to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said he felt unsafe continuing his previous job as a caregiver for people with disabilities so he applied to Gaggle because it offered remote work.

About a week after he submitted an application, Gaggle gave him a key to kids’ private lives — including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn’t come with health insurance. 

“I went to a mental hospital in high school due to some hereditary mental health issues and seeing some of these kids going through similar things really broke my heart,” said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. “It broke my heart that they had to go through these revelations about themselves in a context where they can’t even go to school and get out of the house a little bit. They have to do everything from home — and they’re being constantly monitored.” 

In this screenshot, Gaggle explains its terms and conditions for contract content moderators. The screenshot, which was provided to The 74 by a former contractor who asked to remain anonymous, has been redacted.

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice per month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators, but sought to downplay its effects on contractors and said they’re warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford. 

“Quite honestly, we’re dealing with school districts with very limited budgets,” Patterson said. “There have to be some tradeoffs.” 

The anonymous contractor said he wasn’t as concerned about his own well-being as he was about the welfare of the students under the company’s watch. The company, he said, lacked adequate safeguards to protect students’ sensitive information from leaking outside the digital environment that Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students’ nude images, which are reported to school districts. Nudity and sexual content accounted for about 17% of emergency phone calls and email alerts to school officials last school year. 

Contractors, he said, could easily save the images for themselves or share them on the dark web. 

Patterson acknowledged the possibility but said he wasn’t aware of any data breaches. 

“We do things in the interface to try to disable the ability to save those things,” Patterson said, but “you know, human beings who want to get around things can.”

‘Made me feel like the day was worth it’

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation, and a contract position with Gaggle was her foot in the door. She was left baffled by the impersonal hiring process, especially given the high stakes for students. 

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company that contracts with more than 1,500 school districts was legitimate or a scam. 

“It was a little weird when they were asking for the banking information, like ‘Wait a minute is this real or what?’” Waskiewicz said. “I Googled them and I think they’re pretty big.”

Heyman said that sense of disconnect continued after being hired, with communications between contractors and their supervisors limited to a Slack channel. 

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student’s suicide note. 

“Knowing I was able to help with that made me feel like the day was worth it,” she said. “Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need.” 

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district’s contract with Gaggle has saved lives. Earlier this year, for example, the company detected a student’s suicide note early in the morning, allowing school officials to spring into action. The district uses Gaggle to keep kids safe, she said, but acknowledged it can be a disciplinary tool if students violate the district’s code of conduct. 

“No tool is perfect, every organization has room to improve, I’m sure you could find plenty of my former employees here in Highline that would give you an earful about working here as well,” said Enfield, one of 23 current or former superintendents from across the country whom Gaggle cited as references in its letter to Congress. 

“There’s always going to be pros and cons to any organization, any service,” Enfield told ĂŰĚŇÓ°ĘÓ, “but our experience has been overwhelmingly positive.”

True safety threats were infrequent, former moderators said, and most of the content was mundane, in part because the company’s artificial intelligence lacked sophistication. They said the algorithm routinely flagged students’ papers on the novels To Kill a Mockingbird and The Catcher in the Rye. They also reported being inundated with spam emailed to students, acting as human spam filters for a task that’s long been automated in other contexts. 

Conor Scott, who worked as a contract moderator while in college, said that “99% of the time” Gaggle’s algorithm flagged pedestrian materials, including pictures of sunsets and students’ essays about World War II. Valid safety concerns, including references to violence and self-harm, were rare, Scott said. But he still believed the service had value and felt he was doing “the right thing.”

McElligott said that managers’ personal opinions added another layer of complexity. Though moderators were “held to strict rules of right and wrong decisions,” she said they were ultimately “being judged against our managers’ opinions of what is concerning and what is not.” 

“I was told once that I was being overdramatic when it came to a potential inappropriate relationship between a child and adult,” she said. “There was also an item that made me think of potential trafficking or child sexual abuse, as there were clear sexual plans to meet up — and when I alerted it, I was told it was not as serious as I thought.” 

Patterson acknowledged that gray areas exist and that human discretion is a factor in deciding which materials are ultimately elevated to school leaders. But such materials, he said, are not the most urgent safety issues. He said the algorithm errs on the side of caution and flags harmless content because district leaders are “so concerned about students.” 

The former moderator who spoke anonymously said he grew alarmed by the sheer volume of mundane student materials captured by Gaggle’s surveillance dragnet, and the pressure to work quickly left too little time to evaluate long chat logs between students having “heartfelt and sensitive” conversations. On the other hand, run-of-the-mill chatter offered him a little wiggle room. 

“When I would see stuff like that I was like ‘Oh, thank God, I can just get this out of the way and heighten how many items per hour I’m getting,’” he said. “It’s like ‘I hope I get more of those because then I can maybe spend a little more time actually paying attention to the ones that need it.’” 

Ultimately, he said he was unprepared for such extensive access to students’ private lives. Because Gaggle’s algorithm flags keywords like “gay” and “lesbian,” for example, it alerted him to students exploring their sexuality online. Hetherington, the Gaggle spokeswoman, said such keywords are included in its dictionary to “ensure that these vulnerable students are not being harassed or suffering additional hardships,” but critics have accused the company of subjecting LGBTQ students to disproportionate surveillance. 

“I thought it would just be stopping school shootings or reducing cyberbullying but no, I read the chat logs of kids coming out to their friends,” the former moderator said. “I felt tremendous power was being put in my hands” to distinguish students’ benign conversations from real danger, “and I was given that power immediately for $10 an hour.” 

Minneapolis student Teeth Logsdon-Wallace, who posed for this photo with his dog Gilly, used a classroom assignment to discuss a previous suicide attempt and explained how his mental health had since improved. He became upset after Gaggle flagged his assignment. (Photo courtesy Alexis Logsdon)

A privacy issue

For years, student privacy advocates and civil rights groups have warned about the potential harms of Gaggle and similar surveillance companies. Fourteen-year-old Teeth Logsdon-Wallace, a Minneapolis high school student, fell under Gaggle’s watchful eye during the pandemic. Last September, he used a class assignment to write about a previous suicide attempt and explained how music helped him cope after being hospitalized. Gaggle flagged the assignment to a school counselor, a move the teen called a privacy violation. 

He said it’s “just really freaky” that moderators can review students’ sensitive materials in public places like at basketball games, but ultimately felt bad for the contractors on Gaggle’s content review team. 

“Not only is it violating the privacy rights of students, which is bad for our mental health, it’s traumatizing these moderators, which is bad for their mental health,” he said. Relying on low-wage workers with high turnover, limited training and without backgrounds in mental health, he said, can have consequences for students. 

“Bad labor conditions don’t just affect the workers,” he said. “It affects the people they say they are helping.” 

Gaggle cannot prohibit contractors from reviewing students’ private communications in public settings, Heather Durkac, the senior vice president of operations, said in a statement. 

“However, the contractors know the nature of the content they will be reviewing,” Durkac said. “It is their responsibility and part of their presumed good and reasonable work ethic to not be conducting these content reviews in a public place.” 

Gaggle’s former contractors also weighed students’ privacy rights. Heyman said she “went back and forth” on those implications for several days before applying to the job. She ultimately decided that Gaggle was acceptable since it is limited to school-issued technology. 

“If you don’t want your stuff looked at, you can use Hotmail, you can use Gmail, you can use Yahoo, you can use whatever else is out there,” she said. “As long as they’re being told and their parents are being told that their stuff is going to be monitored, I feel like that is OK.” 

Logsdon-Wallace and his mother said they didn’t know Gaggle existed until his classroom assignment got flagged to a school counselor. 

Meanwhile, the anonymous contractor said that chat conversations between students that got picked up by Gaggle’s algorithm helped him understand the effects that surveillance can have on young people. 

“Sometimes a kid would use a curse word and another kid would be like, ‘Dude, shut up, you know they’re watching these things,’” he said. “These kids know that they’re being looked in on,” even if they don’t realize their observer is a contractor working from the couch in his living room. “And to be the one that is doing that — that is basically fulfilling what these kids are paranoid about — it just felt awful.” 

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Disclosure: Campbell Brown is the head of news partnerships at Facebook. Brown co-founded ĂŰĚŇÓ°ĘÓ and sits on its board of directors.

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it’s ‘Not That Smart’ /article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/ Tue, 12 Oct 2021 11:15:00 +0000 /?post_type=article&p=578988 In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens. 

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, the emotional distress that occurs when someone’s gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised better days would come. 

Eventually they did. 

Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since “graduated” from weekly therapy sessions and has found a better headspace, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security. 

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song “Your Heart is a Muscle the Size of Your Fist” helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was “a reminder to keep on loving, keep on fighting and hold on for your life.” (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, ĂŰĚŇÓ°ĘÓ analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company’s digital algorithm and human content moderators. 

But technology experts and families with first-hand experience with Gaggle’s surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective. 

While the system flagged Logsdon-Wallace for referencing the word “suicide,” context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed. 

 â€œI was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counselor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context,” and while some districts have requested to be notified about references to previous suicide attempts, it’s ultimately up to administrators to “decide the proper response, if any.”  

‘A crisis on our hands’

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and its content moderator team, Gaggle tracks students’ online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling. 
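
To make the division of labor concrete, here is a minimal sketch of that scan-flag-review-notify flow in Python. It is an illustration only: the names (FlaggedItem, scan, review), the tiny keyword list and the alert format are all invented, not Gaggle’s actual code or API.

```python
# Illustrative sketch of the scan -> flag -> human review -> notify flow
# described above. All names, the keyword list and the alert format are
# invented for illustration; this is not Gaggle's actual implementation.
from dataclasses import dataclass
from typing import Optional

KEYWORDS = {"suicide", "self-harm", "gun"}  # stand-in for thousands of dictionary terms


@dataclass
class FlaggedItem:
    student_id: str
    source: str   # e.g. "email", "chat" or "doc" on a school-issued account
    excerpt: str


def scan(student_id: str, source: str, text: str) -> Optional[FlaggedItem]:
    """Stage 1: automated keyword scan of a school-account artifact."""
    if any(keyword in text.lower() for keyword in KEYWORDS):
        return FlaggedItem(student_id, source, text[:200])
    return None


def review(item: FlaggedItem) -> None:
    """Stage 2: a human moderator, not the algorithm, decides whether
    the district hears about a flagged item."""
    if moderator_finds_troubling(item):  # human judgment call
        notify_school_officials(item)


def moderator_finds_troubling(item: FlaggedItem) -> bool:
    # Placeholder: in practice this is a person reading the excerpt.
    return "suicide" in item.excerpt.lower()


def notify_school_officials(item: FlaggedItem) -> None:
    print(f"ALERT ({item.source}) student {item.student_id}: {item.excerpt!r}")


flagged = scan("s-123", "doc", "Journal entry that mentions suicide")
if flagged:
    review(flagged)
```

Nothing in the first stage understands context; as the families quoted later in this story argue, that distinction is left entirely to the moderators.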

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by ĂŰĚŇÓ°ĘÓ through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saves lives, pointing to students it says it rescued during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic’s effects on suicide rates remain fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists. 

“Before the pandemic, we had a crisis on our hands,” he said. “I believe there’s a tsunami of youth suicide headed our way that we are not prepared for.” 

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s little evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace’s mother Alexis Logsdon didn’t know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled. 

“That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart — it’s just like ‘Woop, we saw the word.’” 

‘Random and capricious’

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as “really scary.”

On one occasion, Gaggle sent her an email notification of “Inappropriate Use” while she was walking to her first high school biology midterm; her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school’s literary journal and, she said, Gaggle had flagged profanity in students’ fictional article submissions. 

“The link at the bottom of this email is for something that was identified as inappropriate,” Gaggle warned in its email while pointing to one of the fictional articles. “Please refrain from storing or sharing inappropriate content in your files.” 

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn’t catch everything. Even as she got flagged when students shared documents with her, the articles’ authors weren’t receiving similar alerts, she said. Nor did Gaggle’s AI pick up on an article she wrote about the discrepancy, in which she included a four-letter swear word to make a point. In the article, which Dockter wrote with Google Docs, she argued that Gaggle’s monitoring system is “random and capricious,” and could be dangerous if school officials rely on its findings to protect students. 

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

“With such a seemingly random service, that doesn’t seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats,” she said in an interview. “If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”

Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times “does not properly indicate the author of a document and assigns a random collaborator.”

“We are hoping Google will improve this functionality so we can better protect students,” Patterson said. 

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn’t notify her that Gaggle had identified a threat. 

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she’d shoot her “puny little brain with my grandpa’s rifle.”

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter’s teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts could face legal consequences if they fail to act on credible threats.

“I didn’t hear a word from Gaggle about it,” she said. “If I hadn’t brought it to the teacher’s attention, I don’t think that anything would have been done.” 

The incident, which occurred in April, fell outside the six-month period for which ĂŰĚŇÓ°ĘÓ obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it “does not have any insight into the steps the district took to address this particular matter.” 

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials “would never discuss with a community member any communication flagged by Gaggle.” 

“That unrelated but concerned parent would not have been provided that information nor should she have been,” she wrote in an email. “That is private.” 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

‘The big scary algorithm’

When identifying potential trouble, Gaggle’s algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they’re delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.  

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 
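
As a rough illustration of why an uncustomized dictionary misfires, consider this minimal sketch of context-blind keyword matching. The word list and function names are invented and far smaller and simpler than what the company describes; this is not Gaggle’s actual implementation.

```python
# Illustrative sketch of context-blind dictionary matching with optional
# per-district customization. The word list is invented and tiny.
import re

DEFAULT_DICTIONARY = {"suicide", "kill", "gay", "lesbian"}


def build_dictionary(district_overrides=None):
    """Districts can tailor the stock list; per Patterson, fewer than
    5 percent of them actually do."""
    return DEFAULT_DICTIONARY | set(district_overrides or [])


def flag_words(text, dictionary):
    """Return every dictionary word found in the text. There is no sense
    of context: a book report and a genuine threat look identical here."""
    words = set(re.findall(r"[a-z'-]+", text.lower()))
    return sorted(words & dictionary)


print(flag_words("My essay on To Kill a Mockingbird", build_dictionary()))
# -> ['kill'], the kind of false positive former moderators describe
```

The researcher quoted below makes the mirror-image point: if students express suicidal ideation in words the stock dictionary lacks, this stage simply returns nothing.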

That’s where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, the language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context. 

“You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat,” she said. “What’s the utility of that? That seems pretty low.” 

She said that Gaggle’s utility could be impaired because it doesn’t adjust to students’ behaviors over time, comparing it to Netflix, which recommends television shows based on users’ ever-evolving viewing patterns. “Something that doesn’t learn isn’t going to be accurate,” she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle’s marketing materials appear to overhype the tool’s sophistication to schools, she said. 

“We’re using the big scary algorithm term here when I don’t think it applies,” she said. “This is not Netflix’s recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location.” 

Patterson said Gaggle’s proprietary algorithm is updated regularly “to adjust to student behaviors over time and improve accuracy and speed.” The tool monitors “thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work.” 

Ultimately, the keyword-matching algorithm is used to “narrow down the haystack as much as possible,” Patterson said, and Gaggle content moderators review materials to gauge their risk levels. 

“Artificial intelligence without human intelligence ain’t that smart,” he said. 

In Minneapolis, officials denied that Gaggle infringes on students’ privacy and noted that the tool only operates within school-issued accounts. The district’s internet use policy states that students should “expect only limited privacy,” and that the misuse of school equipment could result in discipline and “civil or criminal liability.” District leaders have also cited compliance with the Clinton-era Children’s Internet Protection Act, which became law in 2000 and requires schools to monitor “the online activities of minors.” 

Patterson suggested that teachers aren’t paying close enough attention to keep students safe on their own and “sometimes they forget that they’re mandated reporters.” He has said he launched the company in 1999 to provide teachers with “an easy way to watch over their gaggle of students.” Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company’s role in meeting it. As technology becomes a key facet of American education, Patterson said that schools “have a moral obligation to protect the kids on their digital playground.” 

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student “tracking” through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn’t be “construed to require the tracking of internet use by any identifiable minor or adult user.” In a letter, her group urged the government to clarify the Children’s Internet Protection Act’s requirements and distinguish monitoring from tracking individual student behaviors. 

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they’re concerned the tools “may extend beyond” the law’s intent “to surveil student activity or reinforce biases.” Around-the-clock surveillance, they wrote, demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.” 

“Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students’ mental health due to stigmatization and differential treatment following even a false report,” the senators wrote. “Flagging students as ‘high-risk’ may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color.”

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd’s murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful. 

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Artificial intelligence systems have repeatedly been shown to suffer biases. 

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that bias could be baked into Gaggle’s algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings. 

Data obtained by ĂŰĚŇÓ°ĘÓ offer a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data. 

Gaggle and Minneapolis district leaders acknowledged that students’ digital communications are forwarded to police in rare circumstances. The Minneapolis district’s internet use policy explains that educators could contact the police if students use technology to break the law and a document given to teachers about the district’s Gaggle contract further highlights the possibility of law enforcement involvement. 

Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that law enforcement is not a “regular partner,” when responding to incidents flagged by Gaggle. It doesn’t deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by ĂŰĚŇÓ°ĘÓ.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

“Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them,” Matlock said, though it’s unclear if any students have faced legal consequences. “It’s the question as to why they’re doing it,” and to raise the issue with their parents.

Gaggle’s keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation, including “gay” and “lesbian.” On at least one occasion, school officials reportedly outed an LGBTQ student to their parents. 

Logsdon-Wallace, the 13-year-old student, called the incident “disgusting and horribly messed up.” 

“They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it’s going to be false-positive because they are acting as if the word gay is inherently sexual,” he said. “When people are just talking about being gay, anything they’re writing would be flagged.” 

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in ĂŰĚŇÓ°ĘÓ’s data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

“That’s definitely really messed up, especially when the school is like ‘Oh no, no, no, please keep these Chromebooks over the summer,’” an invitation that gave students “the go-ahead to use them” for personal reasons, he said.

“Especially when it’s during a pandemic when you can’t really go anywhere and the only way to talk to your friends is through the internet.”

Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’ /article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/ Mon, 04 Oct 2021 20:41:00 +0000 /?post_type=article&p=578691 Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent to the companies last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked them to explain steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases,” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.

“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there’s little evidence to back up their claims. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by ĂŰĚŇÓ°ĘÓ, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers differing levels of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Other services monitor students’ social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 — including when students are at home — and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” by the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn’t respond to requests for comment.

The Clinton-era Children’s Internet Protection Act, passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to an analysis by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.

Report: Most Parents, Teachers Support Student Surveillance Tech /article/new-research-most-parents-and-teachers-have-accepted-student-surveillance-as-a-safety-tool-but-see-the-potential-for-serious-harm/ Tue, 21 Sep 2021 16:30:00 +0000 /?post_type=article&p=577984 Tools that monitor students’ online behavior have become ubiquitous in U.S. schools — and grew rapidly as the pandemic closed campuses nationwide — but a majority of parents and teachers believe the benefits of such digital surveillance outweigh the risks, a new report finds.

Similarly, half of students said they are comfortable with schools’ use of monitoring software while a quarter reported feeling queasy about the idea, according to the new research by the Center for Democracy and Technology, a nonprofit group based in Washington, D.C. Despite their overall comfort with digital software, teachers, parents and students each worried about how the tools could have detrimental side effects. Specifically, many parents and teachers were concerned that digital surveillance could be used to discipline students and young people reported becoming more reserved when they knew they were being watched.


“In response to the pandemic, the focus on technology and its use has never been greater,” said report co-author Elizabeth Laird, the center’s director of equity in civic technology. As tech gains a greater grasp on education, she said it’s important for school leaders and policymakers to remain focused on protecting students’ individual rights. She worried that student surveillance technology could have a damaging impact on students, especially youth of color and those from low-income households.

“I don’t think it’s a slam dunk,” Laird said.

Though the report didn’t highlight specific tools used, schools deploy a range of digital monitoring software to track student activity, including programs that block online material deemed inappropriate, track when students log into school applications, and allow teachers to view students’ screens in real-time and even take control of their computers.

Last week, an investigative report by ĂŰĚŇÓ°ĘÓ exposed how the Minneapolis school district’s use of the digital surveillance tool Gaggle had subjected children to relentless online surveillance long after classes ended for the day — including inside students’ homes. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day by sifting through data stored on their school-issued Google and Microsoft accounts. In Minneapolis, the company flagged school security when moderators believed students could harm themselves or others, but it also picked up students’ classroom assignments, journal entries, chats with friends and fictional stories.

Among teachers surveyed by the Center for Democracy and Technology, 81 percent said their schools use software that tracks students’ computer activity, including to block obscene material, monitor students’ screens in real time and prohibit students from using websites unrelated to school like YouTube. A majority of both parents and students reported such tools were used in their schools, but they were also more likely than teachers to be unsure about whether youth were being actively monitored by educators. In interviews with administrators, researchers found that many school leaders weren’t sure how best to be transparent with families about their monitoring practices.

“Certainly there is an imbalance in information and transparency around what is happening,” Laird said. “School districts have been clear [that] students shouldn’t have an expectation of privacy but they haven’t been as clear about what they are tracking, how they are tracking it, how long they keep that information. They really should be doing that.”

Four-fifths of surveyed teachers said their schools used digital tools to track students online. Both parents and students were more likely than teachers to be unsure whether such tools were in use in their schools. (Photo courtesy Center for Democracy and Technology)

Among teachers, 66 percent said the benefits of activity monitoring outweigh student privacy concerns and 62 percent of parents reached a similar conclusion. Meanwhile, 78 percent of teachers reported that digital surveillance helps keep students safe by identifying problematic online behaviors and 72 percent said it helps keep students on task. But their answers also revealed equity concerns: 71 percent of teachers reported that monitoring software is applied to all students equally, 51 percent worried that it could come with unintended consequences like “outing” LGBTQ students and 49 percent said it violates students’ privacy.

Many teachers reported that such monitoring tools are used on students long after classes end for the day. In total, 30 percent of educators said the tools are active “all of the time,” and 16 percent said the software tracks kids on their personal devices.

Nearly a third of teachers who reported their schools use digital services like Gaggle to track students online said the tools monitor youth behaviors 24 hours a day. (Photo by Center for Democracy and Technology)

Among parents, 75 percent said digital surveillance helps keep students safe and 73 percent said it ensures children remain focused on schoolwork. Yet many parents also reported potential downsides: 61 percent worried about long-term harm if the tools were used to discipline students, 51 percent were concerned about unintended consequences and 49 percent said it violates students’ privacy rights.

Perhaps unsurprisingly, students were less at ease with educators watching their online behaviors. Half said they were comfortable with monitoring tools, a quarter said they were uncomfortable with them and another quarter were unsure.

The data also suggest that students alter their behaviors as a result of being watched: 58 percent said they don’t share their true thoughts or ideas online as a result of being monitored at school and 80 percent said they were more careful about what they search online. While just 39 percent of students said it was unfair that educators monitored their school-issued services, 74 percent opposed the surveillance of their own devices like their cell phones. Some monitoring services can track students’ behaviors on their personal devices.

The data raise significant equity concerns. For many students, school-issued devices are their only method of connectivity.

“The privacy and security of personal devices is a luxury not all can afford,” Alexandra Givens, the center’s president and CEO, said in a press release. “Constant online monitoring — especially of students who cannot afford or don’t have access to personal devices — risks creating disparities in the ways student privacy is protected nationwide.”

To reach its findings, researchers conducted online surveys in June that were completed by 1,001 teachers, 1,663 parents and 420 high school students. Researchers also conducted interviews with school administrators to understand their motives in deploying digital surveillance. Among the justifications is a federal law that requires schools to monitor students online. But the law also includes a disclaimer noting that the statute does not “require the tracking of internet use by any identifiable minor or adult user.”

Understanding context is critical, Laird said, adding that the law’s authors hadn’t fully envisioned a world where students could be surveilled by artificial intelligence long after classes end for the day.

“What was happening at the time was students were in a school computer lab for part of the day and monitoring meant having an adult walking around a computer lab and physically looking at what was on students’ computer monitors,” she said. But today, she said the statute is being interpreted very differently.

In response, the center, along with the American Civil Liberties Union and the Center for Learner Equity, sent a letter Tuesday asking officials to clarify the law’s stipulations and inform educators it “does not require broad, invasive and constant surveillance of students’ lives online.”

“Systemic monitoring of online activity can reveal sensitive information about students’ personal lives, such as their sexual orientation, or cause a chilling effect on their free expression, political organizing, or discussion of sensitive issues such as mental health,” the letter continued. “These harms likely fall disproportionately on already vulnerable, over-policed and over-disciplined communities.”
