Ed Markey – The 74, America's Education News Source

As Advocates and Parents Rally, Youth Online Privacy Bills on Life Support /article/as-advocates-and-parents-rally-youth-online-privacy-bills-on-life-support/ Wed, 14 Sep 2022 21:07:24 +0000

Sen. Ed Markey was getting quizzed on the viability of new online privacy laws for children when he took a brief but awkward pause.

The Democrat from Massachusetts, who has long championed consumer privacy and has become a key adversary of tech companies like Meta for monetizing user data, joined a Zoom call Tuesday evening to rally support for two bills he said would protect kids from being manipulated by social media algorithms. But he also brought some bad news: The legislation had “stalled” in Washington despite bipartisan support. 

Advocates this week are making a push to get the bipartisan bills — the Kids Online Safety Act and the Children’s Online Privacy Protection Act 2.0 — across the finish line. In a letter on Monday, 145 groups including Fairplay and Common Sense Media urged lawmakers to pass the legislation in the interests of protecting youth mental health, now considered at an all-time low in this country. 




But Markey seemed to lay out a path requiring Herculean effort. 

“Only the paranoid survive,” Markey said, adding that the legislation would pass if its supporters — and youth activists in particular — called their lawmakers and demanded they “pull this out of the pile of issues” and give it priority. “We’re going to try to get it over the finish line, but we need you to just have your energy level go higher and higher for these final couple of months and we will get it done.”

The legislative push comes a year after a Facebook whistleblower disclosed research showing that the social media app Instagram had a harmful effect on youth mental well-being, especially for teenage girls. The whistleblower, Frances Haugen, urged lawmakers to regulate social media companies — Meta owns Facebook and Instagram — that she accused of pursuing “astronomical profits” while knowingly putting their users at risk. The research she disclosed revealed the company knew Instagram made “body image issues worse for one in three teen girls,” who blamed the social media platform for driving “increases in the rate of anxiety and depression” and, for some, suicidal thoughts. 

The Kids Online Safety Act would make tech companies liable if they expose young people to content deemed harmful, including materials that promote self-harm, eating disorders and substance abuse. It would also require parental controls that could be used to block adult content, and would direct a study of systems to verify users’ age “at the device or operating system level.”

The Children’s Online Privacy Protection Act 2.0, which expands a law that Markey championed in 1998 to cover older teens, would ban targeted advertisements directed at children and require companies to offer an “eraser button” that allows children and teens to remove their personal data. 

Former Facebook employee Frances Haugen (Getty Images)

But deep-pocketed tech companies, Sen. Richard Blumenthal said Tuesday, are standing in the way. 

“Our obstacles here are the big tech lobbyists,” he said. “They have armies of lobbyists. They pay them, they pay them very well. They hire them to block this legislation.”

While the legislation is designed to protect kids, some digital privacy experts say the rules could come with significant unintended consequences — and could lead to an age-verification system where all web users are made to submit documentation like a driver’s license, requiring them to hand over personal information to tech companies. 

On the Zoom call to bolster support for the bills was Vinaya Sivakumar, a high school senior from Ohio, who created her first social media profile when she was 12. What started out as harmless, she said, quickly took a toll on her health. 

“It just snowballed into something that constantly perpetuated actions and thoughts like self-harm and eating disorders and it was really never let out of my sight,” said Sivakumar, referring to a stream of content she found harmful being fed to her by algorithms. “It almost encouraged me to make decisions that I didn’t necessarily feel were mine and my mental health was in the worst state ever.”

Kristin Bride, a mother and digital safety advocate from Oregon, implored lawmakers to pass the legislation for kids like her 16-year-old son Carson, who died by suicide in 2020 after he was “viciously bullied” by other kids on Snapchat who used third-party apps to conceal their identities. Last year, Bride sued Snap, the company that owns the social media app Snapchat, and accused it of lacking safeguards to protect children from harassment. In response, Snap suspended two of the apps, Yolo and LMK. But a similar anonymous messaging app, NGL, has since cropped up. 

“Until social media companies are held accountable for their harmful products, they will always put profit over people,” Bride said, “and kids like Carson and so many others are just collateral damage.” 

Despite the heightened focus in Washington around digital rights and tech companies’ use of user data for targeted advertising, broader digital privacy legislation has also struggled this year. The American Data Privacy and Protection Act, which would create a national digital privacy standard and limit the personal data that tech companies can collect about users, has hit roadblocks, including resistance from House Speaker Nancy Pelosi. 

Earlier this month, Ireland’s Data Protection Commission fined Meta for violating European Union data privacy laws. The commission has been investigating the company for an Instagram setting that sets the profiles of teenagers as public by default. 

Meanwhile, Meta has begun to roll out new protections for teens, including a setting that automatically routes new users younger than 16 to a version of the platform with limits on content deemed inappropriate.

The children’s safety legislation, which would strengthen rules that haven’t been updated in decades, has received support from a broad range of groups focused on youth well-being, including the American Psychological Association and The Jed Foundation. But it has drawn opposition from digital rights advocates including the Electronic Frontier Foundation, which argued that while lawmakers deserve credit “for attempting to improve online data privacy for young people,” the plan would ultimately “require surveillance and censorship” of children and teens “and would greatly endanger the rights, and safety, of young people online.” 

“Data collection is a scourge for every internet user, regardless of age,” the report notes, but the legislation could ultimately force tech companies to further track their users. Surveillance of young people does damage “even in the healthiest household,” the report argues, “and is not a solution to helping young people navigate the internet.”

Disclosure: Campbell Brown oversees global media partnerships at Meta. Brown co-founded The 74 and sits on its board of directors.

With ‘Don’t Say Gay’ Laws & Abortion Bans, Student Surveillance Raises New Risks /article/with-dont-say-gay-laws-abortion-bans-student-surveillance-raises-new-risks/ Thu, 08 Sep 2022 10:30:00 +0000 /?post_type=article&p=696150 While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn’t explain who they were as a person. 

“It was very confusing trying to navigate understanding who I am and my identity,” said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. “I was able to find the words to understand who I am — words that I wouldn’t be able to piece together in a sentence if the internet wasn’t there.” 




But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details gleaned from students’ conversations with friends, diary entries and search histories. Meanwhile, student information collected by surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities are concerning to Elizabeth Laird, the center’s director of equity in civic technology. Following the Supreme Court’s decision overturning Roe v. Wade in June, she said, information about youth sexuality could be weaponized. 

“Right now — without doing anything — schools may be getting alerts about students who are searching the internet for resources related to reproductive health,” Laird said. “If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws.”

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the overturning of Roe has led some to take extra privacy precautions. 

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted messaging app. “The fear, even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled, it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for breaking abortion rules, including a recent Nebraska case in which police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy. 

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills and about a dozen have become law. They include so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender. 

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance tools should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told The 74. 

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, the Children’s Internet Protection Act has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering has been used to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling.

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

They’ve also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools’ widespread adoption of the tools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In a letter in July, Warren and Markey cautioned that the tools could pose new risks following the repeal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued the data broker Kochava and accused the company of selling location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the agency warned, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.” 

School surveillance companies have acknowledged their tools track student references to sex but sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began to immediately purge all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s repeal was imminent – despite maintaining a 30-day retention period for most other data. 

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a letter to the senators, which the company shared with The 74. 
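The policy Bason describes — purging one sensitive category on sight while retaining everything else for 30 days — can be sketched in a few lines. This is a hypothetical illustration, not Bark's actual implementation; the record structure and category names are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION_DAYS = 30                          # ordinary flagged data expires after 30 days
PURGE_IMMEDIATELY = {"reproductive_health"}  # hypothetical category label: never retained

@dataclass
class FlaggedRecord:
    student_id: str
    category: str
    created_at: datetime

def purge(records: list[FlaggedRecord], now: datetime) -> list[FlaggedRecord]:
    """Return only the records the retention policy allows the vendor to keep."""
    cutoff = now - timedelta(days=RETENTION_DAYS)
    kept = []
    for rec in records:
        if rec.category in PURGE_IMMEDIATELY:
            continue  # deleted on sight, so it can't be produced under subpoena
        if rec.created_at < cutoff:
            continue  # ordinary record past the 30-day window
        kept.append(rec)
    return kept
```

The legal point in Bason's letter rests entirely on the first `continue`: data that is never stored cannot be handed over later, regardless of what a court orders.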

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.” 

Yet tracking conversations about sex is a primary part of Gaggle’s business — more than references to suicide, violence or drug use, according to nearly 1,300 incident reports generated by the company for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by The 74, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape,” and, simply, “sex.” 
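The incident reports above hint at why benign writing gets swept up. A toy sketch of keyword matching shows the problem; real systems like Gaggle's combine machine learning with human review, and this word list is invented for illustration, but any approach that triggers on a watched word fires regardless of context.

```python
import re

# Illustrative word list only -- not Gaggle's actual keyword set.
KEYWORDS = {"sex", "rape", "virginity", "gay", "lesbian"}

def flag(text: str) -> set[str]:
    """Return the watched keywords that appear anywhere in a student's text."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & KEYWORDS

flag("an essay on gay rights history")  # -> {'gay'}, despite the benign context
```

A history essay, a diary entry about coming out, and genuinely concerning content all look identical to the matcher, which is why the flags fall so heavily on students who merely discuss their identity.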

Patterson, the Gaggle CEO, has acknowledged that a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told The 74 his company uncovered the girl’s diary entry, where she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what they’d learned from her diary, Patterson said. 

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools. 

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to a report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a woman was charged with second-degree murder over the death of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. By working in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which determines whether the material “should be forwarded to the school or district.” 

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.” 

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots after New York City police raided the Stonewall Inn gay bar, police routinely surveilled LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.” 

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott issued a directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.” 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self. 

Her teacher said that no one — not even he — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning,” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.” 

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.” 

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

Senate Inquiry Warns About Harms of Digital School Surveillance Tools /article/senate-inquiry-warns-about-harms-of-digital-school-surveillance-tools-calls-on-fcc-to-clarify-student-monitoring-rules/ Mon, 04 Apr 2022 21:37:00 +0000 /?post_type=article&p=587388 Updated, April 5

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students’ online activities, warning that educators’ widespread use of digital surveillance tools could trample students’ civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain student groups. 

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country — often 24 hours a day, seven days a week — to provide information on how they use artificial intelligence to glean information about students. 

Based on their responses, the senators said:

  • The companies’ software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey where 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use — and potential misuse — of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire “need for federal action to protect students’ civil rights, safety and privacy.”

“While the intent of these products, many of which monitor students’ online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns,” the lawmakers wrote. “Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations.”

An FCC spokesperson said the agency is reviewing the report, and an Education Department spokesperson said they “look forward to corresponding with the senators” about its findings.

Lawmakers’ inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by The 74 into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. The 74 used public records to expose how Gaggle’s algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self-harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported a surge in business during the pandemic.

Bark didn’t respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators’ March 30 report and looks forward “to continuing our dialogue with Senators Warren and Markey on the important topics they have raised.”

“Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus,” GoGuardian spokesman Jeff Gordon said in a statement. “Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices.” 

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers’ recommendations “to assess how we can further strengthen our work to better protect students.”

“We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms,” Patterson continued. “We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators.”

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country’s “terrible history of bias in school discipline” by removing the decisions of individual teachers and administrators.

“While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face,” Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies’ letters acknowledge in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders “prefer that we contact public safety agencies directly in lieu of a district contact.”

Under the Clinton-era Children’s Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students’ internet use to ensure they don’t access material “harmful to minors,” such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law’s scope. Meanwhile, advocates have questioned whether schools’ use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students’ computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police and more than half of students said they decline to share their “true thoughts or ideas because I know what I do online is being monitored.”  

Elizabeth Laird, the group’s director of equity in civic technology, said it has been calling on student surveillance companies to be more transparent about their business practices but it’s “disappointing that it took a letter from Congress to get this information.” She said she hopes the FCC and Education Department adopt lawmakers’ recommendations.

“None of these companies have researched whether their products are biased against certain groups of students,” she said in an email while questioning their justification for holding off on such an inquiry. “They cite privacy as the reason for not doing so while simultaneously monitoring students’ messages, documents and sites visited 24 hours a day, seven days a week.” 

The 74’s investigation, which used data on Gaggle’s foothold in Minneapolis Public Schools, was unable to determine whether the tool’s algorithm disproportionately targeted Black students, who are more often subjected to student discipline than their white classmates. However, it highlighted instances in which keywords like “gay” and “lesbian” were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation.

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online, and students caught viewing violent or sexually explicit materials would likely face consequences. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency.

However, Vance said in an email that FCC clarification “would do little at best and may provide counterproductive guidance at worst.” Many schools, she said, are likely to use the tools regardless of the federal rules. 

“Schools aren’t required to monitor social media, and many have chosen to do so anyway,” said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said. 

Asking the FCC to issue guidance “could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring.”

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies: 

]]>
Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’ /article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/ Mon, 04 Oct 2021 20:41:00 +0000 /?post_type=article&p=578691 Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked the companies to explain the steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.


“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there’s little independent evidence that they keep students safe. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by The 74, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers a differing level of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Other services also monitor students’ social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 — including when students are at home — and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” from the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn’t respond to requests for comment.

The Children’s Internet Protection Act, a Clinton-era law passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to a report by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than on those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.

]]>