Experts on Kids & Social Media Weigh the Pros and Cons of ‘Growing Up in Public’ (The 74, Jan. 17, 2024)

Parents are more concerned than ever about their kids’ social media habits, worried about everything from oversharing and cyberbullying to anxiety, depression, sleep and study time.

Recent surveys of young people show that parents’ concerns may be justified: More than half of U.S. teens spend at least four hours a day on these apps. Girls spend an average of nearly an hour more on them per day than boys. Many parents are searching for support.

Perhaps more than anyone, Carla Engelbrecht and Devorah Heitner are qualified to offer it. They’ve spent years puzzling over how families can help kids understand media from the inside out, and how schools both help and hurt kids’ ability to cope.




Engelbrecht is a longtime children’s media developer. A veteran of Sesame Workshop and PBS Kids Interactive, she spent seven years at Netflix, most recently as its director of product innovation. Engelbrecht was behind the network’s Black Mirror “Bandersnatch” episode in 2018, which allowed viewers to choose among five possible endings.

Carla Engelbrecht (second from right) appears onstage with colleagues during a Netflix event on Black Mirror’s “Bandersnatch” episode in 2019. Engelbrecht, who was director of product innovation for the streaming service, is now testing a social media platform for children under 13. (Charley Gallay/Getty Images for Netflix)

Engelbrecht is now in public beta testing for Betweened, a new social media platform for kids under 13. She calls it a “course correction” for young people’s social media, aiming to teach them to be more mindful, thoughtful and responsible online.

Heitner is an author who specializes in helping parents and educators understand how digital technology, especially social media and interactive gaming, shapes kids’ realities. Her books include 2016’s Screenwise and her new work, Growing Up in Public.

Speaking to either one would be enlightening, but we decided to facilitate a broader conversation by inviting them to come together (virtually) to share insights and offer a bit of advice for both parents and schools. 

Their conversation with The 74’s Greg Toppo was wide-ranging, covering the effects of the pandemic, the pressures kids feel online and the women’s experiences communicating with their own children.

Devorah Heitner spoke in 2017 at the Roads to Respect Conference in Los Angeles. Heitner’s new book explores the impact of modern technology on childhood, including the effects of increased adult supervision of kids through tracking devices. (Joshua Blanchard/Getty Images for Rape Treatment Center)

The solutions they offer aren’t simple. In Heitner’s words, parents seeking to learn more about their kids’ media usage should pull back their surveillance and “lead with curiosity.”

The conversation has been edited for length and clarity.

The 74: Devorah, tell us a little bit about your new book.

Devorah Heitner: I wrote Growing Up in Public because I was speaking for years about Screenwise in schools and all these other environments, and people said, “O.K., I get that we want to think about quality over quantity with screen time. But we also want to understand what kids’ subjective experience is and not just focus on how many minutes are good or bad.”

People lie about that anyway. People are sort of oblivious to their own screen use sometimes and get over-focused on their kids’. A lot of adults are recognizing: If I could have had a Tumblr or a Twitter or Instagram as a kid, I could have really done a lot of damage to my prospects and opportunities by so openly sharing.

What are we doing to our reputations?

As I started digging into that question, I recognized that parents are really part of the surveillance culture with kids. So are schools, with grading apps and tracking apps [which keep track of kids’ location, among other functions]. I really started understanding in a fuller way how kids are scrutinized. Kids are growing up very searchable, very public, and some of that is awesome. They have a platform, they can be activists. Some of it is problematic.

The title of your book, Growing Up in Public, says so much about kids’ lives these days. I saw this term the other day: not FOMO, “Fear of Missing Out,” but FOMU, “Fear of Messing Up.” Are those competing interests for young people?

Heitner: Well, there’s definitely a fear of messing up and especially being called out. There’s a lot of “gotcha” culture going on, and kids documenting each other’s screw-ups. And as much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you’re on a group text with somebody for long enough, both of you have probably said a few things you don’t want repeated outside of that context.

I think it’s modeled by adults, but this kind of “gotcha” culture is very insidious and terrifying. And it should be terrifying. 

Carla, tell us a little bit about yourself.

Carla Engelbrecht: I’m a longtime product developer and researcher in the kids’ space. I’ve spent a lot of time making products for kids. I’ve seen for years kids wanting access to Twitter and Facebook and MySpace, all through the generations of social media. And they always want what is not made for them. They’re aspirational.

Kids are just plopped into this. And just as you wouldn’t give a new driver the keys to the car and just say, “Go!” — you need to teach them how to drive — there’s the same concept for me with media use. We need to teach our kids. Parents don’t know what they’re doing, because none of us have really been through this before, and they abstain. They need support in learning how to do this. Where Devorah talks about things from that guidance perspective, I’m looking at: How can we build a product for kids that helps them learn? 

It seems to me like Betweened is a site for parents as much as anybody. 

Engelbrecht: There’s definitely two audiences here. There’s absolutely a path where I could build a product for kids and launch them onto it. But I wouldn’t be addressing all the pain points.

Kids want short-form content. They want to create. They want to connect with their peers. In order to successfully set kids up to do that, parents need tools, too. And so it is really a product for both kids and parents.

Carla mentioned all these different apps coming down the road. Devorah, I’m thinking about you saying to someone recently how you’ve been working on this book for five years. A lot has changed in five years. We didn’t have TikTok five years ago. 

Heitner: Screenwise came out in the fall of 2016, which was a memorable time for many reasons: a lot of social forces happening in our world with Trump’s election. 

And then you have the pandemic in 2020. That’s around the time I had sold the book and was trying to interview people. Suddenly, I’m not in schools anymore. I’m on Zoom with kids, which is a whole research problem: How do you get a wider range of kids, not just the super-compliant kids who show up to a Zoom? And the pandemic was an accelerant to a lot of things happening already with kids in tech.

“Parents are really part of the surveillance culture with kids. So are schools.”

Devorah Heitner

It was certainly not the beginning of kids being on platforms too young [the federal Children’s Online Privacy Protection Act gives parents control over what information websites can collect from their kids]. But it accelerated, and there was kind of a push toward things like Kids Messenger [on Facebook] and other things that I even experimented with at the time.

The pandemic started when my son was 10. We were like, “Oh, what can we do to help him communicate with friends?” We experimented with Messenger. It was a fail for us, but I also talked to people at two mobile phone companies marketed for children. There are people, in different ways, trying to come up with solutions because they have understood that both the adult apps and the adult devices, like a smartphone that does all the things, might not be the ideal thing to give a 10-year-old.

What’s changed since 2016 is there used to be more worry about one-to-one computing in schools. Now, every school pretty much is one-to-one. It’s really the outlier schools that don’t have tech or aren’t giving kids individual tech. Even as late as 2015, 2016, I was helping schools negotiate that with parents. And parents were like, “I don’t know. I’m not sure about screen time. I don’t know if I want my kid getting a Chromebook.”

Try to find a school now that doesn’t give kids iPads or Chromebooks or something. That’s probably one of the bigger differences. And then just the explosion in server-based gaming like Roblox and Minecraft and the ways kids interact in those digital communities. You see a lot of very complicated, weird ideas among adults who care about children. Like “I’ll wait until eighth grade to give a kid a phone. Meanwhile, my third-grader plays Roblox on a server with strangers.”

Engelbrecht: Or has access to text messaging through their iPad.

Heitner: Exactly. And they’re very smugly waiting till eighth grade and I’m like, “For what? For your kid to make voice calls?” That’s the one thing they don’t want to do.

Carla, you come from a game design background. People have lots of terrible takes about video games, which I’m sure you’re used to. How has that background informed what you’re doing and what Betweened looks like?

Engelbrecht: A lot of people come to video games and they’re just like, “They’re evil,” or “They’re awful,” or “They’re violent.” And you can say the same thing about television. You can also say the same thing if you only eat broccoli. Anything in excess is not good for you — like running a marathon every day. I take a very pragmatic approach: in most things, we can actually find good.

When I look at video games, I can’t classify them as evil. I instead look for the good things. And it’s the same with social media. Social media as part of a balanced media diet gives parents a lot of opportunities to connect, gives kids a lot of opportunity to express creativity and develop skills. 

“There wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.”

Carla Engelbrecht

I’ll give you an example on the games side of things: Years ago, I did a South by Southwest talk called “What Left 4 Dead Can Teach Us About Parenting.” Left 4 Dead is not a game that kids should ever play. It’s a violent, first-person zombie apocalyptic shooter. It’s also one of the most beautifully designed cooperative games ever. I’m terrible with thumb sticks on video game controllers. I can’t walk in a straight line in a video game. I’m not great at the actual zombie-killing side of things. But I’m really good at running around and picking up health packs and checking in on people who have been damaged by zombies.

So there are different roles that people can play. I can still participate in the game, even though the primary way of playing Left 4 Dead is not what works for me. 

Also, if I’m playing with people, it fosters communication. I have to talk to people, and someone needs to say, “Hey, I need help,” and I can come over. That’s what I’m looking for in games and social media: What are those underlying skills that, with a thoughtful perspective, you can leverage for good?

I wanted to switch gears a little bit and talk about something you mentioned earlier, Devorah: casual surveillance. I think about the stories we hear about parents not even just surveilling their kids — tracking their phones or their cars — but just keeping up in a way that we never even dreamed of. I wonder: Where did this come from? And how do you think a site like Betweened is going to help? 

Engelbrecht: I wish I knew exactly where it came from, but it certainly seems it’s symptomatic of the same thing: Everything has just kind of crept up on us. It’s like, as phones started to be introduced, we just thought, “Oh, well, I need to charge my phone, so I’ll charge it next to my bed.” And then the next thing you know, you’re checking it first thing when you wake up. It’s this slippery slope without the mindfulness of what it’s doing. Something has to happen to stop you, to make you take a step back and think, “How far have I gone? What boundaries have I crossed or what new boundary do I need to establish?” And to Devorah’s earlier point, the pandemic accelerated a lot of this.

Heitner: Part of it is we do it because we can. Even in relationships. I’ve known my husband since before we each had cell phones, but we didn’t check in as often back then because we couldn’t. It had to really rise to the level of an emergency before I would call him at work.

“As much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you’re on a group text with somebody for long enough, both of you have probably said a few things you don’t want repeated.”

Devorah Heitner

Remember the days of 9-to-5 office jobs? He left in the morning and was at his job. I was a grad student then and I would go up to Northwestern and not even really have any reachability by phone. Now we have phones, and the expectation is pretty much down-to-the-minute: If I’m 11 minutes late, I’ll probably text and say, “I’m 11 minutes late.” There’s just so much expectation for contact and communication and knowing where other people are. We don’t use location surveillance for that, but a lot of families do, and a lot of people have watches and will check into each other’s location on watches.

Because it’s there, people do it. And then there’s also just tremendous worry right now about kids. Given that we as a society think it’s a good idea for everyone to have assault weapons, parents are a little nervous. That anxiety creeps into everything.

My older daughter is 31, and I remember getting her first cell phone when she was 12 or 13. I remember the intense peer pressure she felt to have a phone. And I really didn’t like it at all. But I kind of justified it by saying to myself, “This is going to keep her safe.” And I remember thinking to myself, “You’re so full of shit. You’re just really trying to smooth things over.” And I guess I wonder: As parents, do we have an overextended sense of peril about our kids these days?

Heitner: There’s a sense of peril. Also, the Internet and online news and targeted algorithms just fuel that worry and outrage. It’s a bit of a vicious cycle.

Engelbrecht: In some ways, it’s almost like there are more risks that could stick with you. There wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.

I think about my daughter and I don’t want something to chase her for her entire life. That part of it feels very real. And then it feels out of control. I don’t have the tools or know exactly how I can best help her except for having hard conversations and trying to put some bumpers around her. But there’s not a lot of tools to put the bumpers around her.

Devorah, one of the things you have said is that the kind of surveillance a lot of parents are undertaking is really undermining the trust their kids feel, and backfiring because kids won’t open up to them when they really need to. Can you talk a little bit more about that?

Heitner: You just see kids really getting focused on going deeper underground. If their parents are like, “I’m going to get Bark and read every single thing they text,” then you see some kids who are like, “O.K., I need to go deeper underground, I need a VPN or to only text on Snapchat, or I need to do something where I can be more evasive.” And that concerns me, because then there’s no way to make use of the parent when the parent might be useful.

Engelbrecht: I think about how to create space to allow the kid to have a second chance at telling me the truth. For example, if there’s an empty bag of gummies and the kid is the only one who could have eaten it but says they didn’t, how can I create space to talk about making mistakes versus lying or intentionally hiding the truth? Saying, “I’m going to ask what happened to the gummies again, but first I want you to take a moment to think about your answer — it’s OK to change your answer, because I want to understand the truth. We all make mistakes and we can talk about it. But intentionally hiding the truth has consequences.”

If I later find out that the child lied, then there’s consequences. The hope is that eventually, a parent can say, “If you end up at a party where there’s alcohol, don’t drive home. Call me for a ride home. If you try to hide that there was alcohol and make poor decisions, then there’s additional consequences.”

“I don’t want to be in the place where I’m policing her homework. Now that she’s in seventh grade, it’s time for her to be learning those skills before there’s the consequences of missing your homework in high school or college.”

Carla Engelbrecht

It’s important to be able to say, “I made a mistake” and talk about what to do from there. Hopefully, that provides an alternative to the arms race of increasingly sneaky strategies that Devorah described.

Heitner: That makes a lot of sense. I was just going to say: The surveillance — schools just push it really hard. Every time I go to a school, they’re like, “Are you logged into ?” or “Are you logged into ?” They’re just really pushing it so hard.

Are schools culpable in this? Sounds like you’d say, “Yes.” I don’t know if you’d call it surveillance, though. One of the functions of schools is to keep track of things, right?

Heitner: But what about the location tracking? My kid has to scan a QR code to get into the cafeteria. I skipped lunch every day of high school and ate with my drama club friends in the theater. Was that so bad? They have 3,500 kids QR-coding themselves into study hall. It’s pretty locked down. It’s pretty Big Brother — or Little Brother, if you read Cory Doctorow.

Engelbrecht: Homework tracking means having full visibility of my daughter when part of what she needs to learn is the executive function skills to actually be able to plan and follow through and do her homework. I don’t want to be in the place where I’m policing her homework. Now that she’s in seventh grade, it’s time for her to be learning those skills before there’s the consequences of missing your homework in high school or college.

So to me, it’s kind of that same thing: The information is there. Should it be provided? How do you use it? And, for me it’s: How do we better equip administrators, teachers or parents to stop and think about how to leverage this information? So maybe a kid who’s consistently missing their homework, yes, the parents should have more visibility as part of a support program to get the kid back on track and help them learn the skills. But to Devorah’s point, it doesn’t mean everyone needs to be badging into lunch.

Devorah, your message to parents is: There are all these things happening. There are all these things you have to keep track of. There are lots and lots of risks to kids being on social media, especially teenagers. But you shouldn’t panic. And I wanted to just throw this out to both of you: Instead of panicking, what should parents do? 

Heitner: Carla, you’re talking about creating a new community space for kids that’s more of a learning space, and that’s one alternative. Another alternative, in addition to, or potentially instead of, for parents who don’t have access to that, is just leaning into one or two spaces they really want to mentor their kids in.

Maybe their kid’s really involved in Minecraft. And if they want to join Discord [a free voice, chat, gaming and communications app], the parents are waiting and saying, “O.K. You can join your library Discord or your school Minecraft club on Discord, but not general Discord.”

Two 9-year-olds play the open world computer game Minecraft. Parenting expert Devorah Heitner urges parents to know more about what their kids are doing online without resorting to surveillance. (Getty Images)

Parents will tell me their kids are playing games or they’re on YouTube. But I’m like, “What channels?” It’s just like if somebody says, “I’m watching TV.” Well, what are you watching? Because that really is a big differentiator in terms of the experience.

Engelbrecht: It goes back to your “Fear of Messing Up.” I think so much about how it’s important for parents to wade in and get involved with their kids. This has been the advice for decades, whatever the newfangled thing was. I was just doing some writing about encouraging parents to actually dance with their kids. It’s an opportunity to bond. It actually requires some planning and practice. It’s physical activity. I assume most parents are like me: They’re not great dancers, and it’s uncomfortable, and you don’t want to mess up.

But modeling that I’ll do something that’s out of my comfort zone and connect with you over something that I know you enjoy, can be very simple. It doesn’t mean a parent has to suddenly learn all aspects of Roblox or Discord, because they can be intimidating. But just find an entry point and connect with the child and participate with them. It just has so many benefits. It’s true whether they’re into Tonka trucks or Roblox. Parenting means, “Get in there with your kid.”

Devorah, you use the phrase, “Lead with curiosity.”

Engelbrecht: Oh, I love that.

Heitner: You want to be curious and have your kid share it with you. Their expertise and experience as well and their discernment — what do they like or not like about this app? How would they change it if they could? Staying curious is an alternative to spying — being curious and asking kids to be curious even about their own experience. Do I actually feel less stressed when I scroll this app? That’s maybe a lot of mindfulness to expect of kids, who have a lot going on and a lot coming at them. But it’s important for all of us to be curious about how our experience is going.

Engelbrecht: That’s one of the ways I’ve been thinking about it from a product perspective: just how to help build in some scaffolds for mindfulness — things like when you start an app, actually having a timer that’s like, “How long do you want to spend on it right now?”

I set a timer for myself when I use TikTok because I spend a very long time on it. So being able to put that in there as a scaffold, to start being mindful and thoughtful about it. We’re posting content, but we’re actually not posting endless scrolls where you could spend all day.

I don’t want to prioritize the traditional tech metric of “time on task.” To me, success is like, “You can come and use Betweened for 20 minutes and then know you can come back another day and there’s lots of interesting stuff for you.” But it’s not all-consuming, must-do-this-all-the-time. And that’s a different perspective on tech products. It’s not how most products are developed.

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns (Jan. 27, 2023)

Digital monitoring company Gaggle says it will no longer flag students who use words like “gay” and “lesbian” in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination of LGBTQ teens in a quest to keep them safe.

A spokesperson for the company cited a societal shift toward greater acceptance of LGBTQ youth — rather than criticism of its product — as the impetus for the change, describing it as part of a “continuous evaluation and updating process.”

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they’re at greater risk than their straight and cisgender classmates.




But in practice, Gaggle’s critics argued, the keywords put LGBTQ students at a heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of digital activity monitoring, according to research released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark.

Gaggle’s decision to remove several LGBTQ-specific keywords, including “queer” and “bisexual,” from its dictionary of words that trigger alerts follows extensive reporting by The 74 into the company’s business practices and the sometimes negative effects on students who are caught in its surveillance dragnet.

Though Gaggle’s software is generally limited to monitoring school-issued accounts, including those from Google and Microsoft, it can scan through photos on students’ personal cell phones if they plug them into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states advance measures targeting LGBTQ youth. Legislation has looked to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. Such a hostile political climate and pandemic-era disruptions, a recent youth survey by The Trevor Project revealed, have contributed to an uptick in LGBTQ youth who have seriously considered suicide.

The U.S. Education Department received 453 discrimination complaints involving students’ sexual orientation or gender identity last year, according to data provided to The 74 by its civil rights office. That’s a significant increase from previous years, including in 2021, when federal officials received 249 such complaints. Under the Trump administration, complaints dwindled. In 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a broader surge in civil rights complaints, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year.

In September, The 74 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle’s website as a collaboration to “improve mental health outcomes for LGBTQ young people.”

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle “having a role in negatively impacting LGBTQ students.” 

The Trevor Project didn’t respond to requests for comment on Gaggle’s decision to pull certain LGBTQ-specific keywords from its systems. 

In a statement to The 74, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students’ digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company’s efforts to protect students from abuse and were purged late last year.

“At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations,” Hetherington said. “Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list.”

Hetherington said Gaggle will continue to monitor students’ use of the words “faggot,” “lesbo,” and others that are “commonly used as slurs.” A previous review by The 74 found that Gaggle regularly flagged students for harmless speech, like profanity in fiction submitted to a school’s literary magazine and in students’ private journals.

Privacy advocates warn that in the era of “Don’t Say Gay” laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn’t been independently verified, and there’s a dearth of research suggesting digital monitoring is an effective school-safety tool. A recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. A Vice News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled “Essay on the Reasons Why I Want to Kill Myself but Can’t/Didn’t.” Adults wouldn’t have known she was struggling without Gaggle, she said.

“I do think that it’s helpful in some ways,” the student said, “but I also kind of think that it’s — I wouldn’t say an invasion of privacy — but if obviously something gets flagged and a person who it wasn’t intended for reads through that, I think that’s kind of uncomfortable.” 

Student surveillance critic Evan Greer, director of a nonprofit digital rights group, said the tweaks to Gaggle’s keyword dictionary are unlikely to have a significant effect on LGBTQ teens and blasted the company’s stated justification for the move as “out of touch” with the state of anti-LGBTQ harassment in schools. Greer also noted that LGBTQ youth frequently refer to each other using “reclaimed slurs,” reappropriating words that are generally considered derogatory and that remain in Gaggle’s dictionary.

“This is just like lipstick on a pig — no offense to pigs — but I don’t see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators,” Greer said. “I don’t see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances.”

Gaggle and its competitors have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth.


In a written response, Gaggle founder and CEO Jeff Patterson said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has “no context or background on students,” including their race or sexual orientation. He also said the company’s monitoring services are not meant to be used as a disciplinary tool. 

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students, and Black and Hispanic students were far more likely than white students to report getting into trouble because of online monitoring. 

In October, the White House cautioned school districts against the “continuous surveillance” of students if monitoring tools are likely to trample students’ rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.


As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

“There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers,” Greer said. “If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago — and that’s why this surveillance is so dangerous.”

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project’s toll-free support line at 866-488-7386.

White House Cautions Schools Against ‘Continuous Surveillance’ of Students (Oct. 4, 2022; updated Oct. 5)

The Biden administration on Tuesday urged school districts nationwide to refrain from subjecting students to “continuous surveillance” if digital monitoring tools — already accused of targeting at-risk youth — are likely to trample students’ rights. 

The White House recommendation was included in an in-depth but non-binding white paper, dubbed the Blueprint for an AI Bill of Rights, that seeks to rein in the potential harms of rapidly advancing artificial intelligence technologies, from smart speakers featuring voice assistants to campus surveillance cameras with facial recognition capabilities. 

The blueprint, which was released by the White House Office of Science and Technology Policy and extends far beyond the education sector, lays out five principles: Tools that rely on artificial intelligence should be safe and effective, avoid discrimination, ensure reasonable privacy protections, be transparent about their practices and offer the ability to opt out “in favor of a human alternative.”




Though the blueprint lacks enforcement, schools and education technology companies should expect greater federal scrutiny soon. In an accompanying announcement, the White House said the Education Department would release, by early 2023, recommendations on schools’ use of artificial intelligence that “define specifications for the safety, fairness and efficacy of AI models used within education” and introduce “guardrails that build on existing education data privacy regulations.” 

Education Secretary Miguel Cardona said officials at the department “embrace utilizing Ed Tech to enhance learning” but recognize “the need for us to change how we do business.” The future guidance, he said, will focus on student data protections, ensuring that digital tools are free of biases and incorporating transparency so parents know how their children’s information is being used.

“This has to be baked into how we do business in education, starting with the systems that we have in our districts but also teacher preparation and teacher training as well,” he said.

Amelia Vance, president and founder of Public Interest Privacy Consulting, said the document amounts to a “massive step forward for the advocacy community, the scholars who have been working on AI and have been pressuring the government and companies to do better.” 

The blueprint, which offers a harsh critique of systems that predict student success based on factors like poverty, follows in-depth reporting by The 74 on schools’ growing use of digital surveillance and the tech’s impact on student privacy and civil rights.

But local school leaders should ultimately decide whether to use digital student monitoring tools, said Noelle Ellerson Ng, associate executive director of advocacy and governance at AASA, The School Superintendents Association. Ellerson Ng opposes “unilateral federal action to prohibit” the software.

“That’s not the appropriate role of the federal government to come and say this cannot happen,” she said. “But smart guardrails that allow for good practices, that protect students’ safety and privacy, that’s a more appropriate role.”

The nonprofit Center for Democracy and Technology praised the report. The group recently released a survey highlighting the potential harms of student activity monitoring on at-risk youth, who are already disproportionately disciplined and referred to the police as a result. In a statement Tuesday, it said the blueprint makes clear “the ways in which algorithmic systems can deepen inequality.” 

“We commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses and for lifting up examples of practical steps that companies and agencies can take to reduce harm,” CEO Alexandra Reeve Givens said in a media release. 

The document also highlights several areas where artificial intelligence has been beneficial, including improved agricultural efficiency and algorithms that have been used to identify diseases. But the technologies, which have grown rapidly with few regulations, have introduced significant harms, it notes, including hiring algorithms that screen job applicants and error-prone facial recognition technology. 

After the pandemic shuttered schools nationwide in early 2020 and pushed students into makeshift remote learning, companies that sell digital activity monitoring software to schools saw an increase in business. But the tools have faced significant backlash for subjecting students to relentless digital surveillance. 

In April, Massachusetts Sens. Elizabeth Warren and Ed Markey warned in a report that the technology could carry significant risks — particularly for students of color and LGBTQ youth — and stressed a “need for federal action to protect students’ civil rights, safety and privacy.” Such concerns have become particularly acute as states implement new anti-LGBTQ laws and abortion bans and advocates warn that digital surveillance tools could expose youth to legal peril. 

Vance said that she and others focused on education and privacy “had no idea this was coming,” or that it would focus so heavily on schools. Over the last year, the White House sought input from civil rights groups and technology companies, but Vance said that education groups lacked a meaningful seat at the table. 

The lack of engagement was apparent, she said, in the document’s failure to highlight areas where artificial intelligence has been beneficial to students and schools. For example, the document discusses a tool used by universities to predict which students are likely to drop out; because it considered students’ race as a predictive factor, it raised discrimination fears. But she noted that if implemented equitably, such tools can be used to improve student outcomes. 

“Of course there are a lot of privacy and equity and ethical landmines in this area,” Vance said. “But we also have schools who have done this right, who have done a great job in using some of these systems to assist humans in counseling students and helping more students graduate.” 

Ellerson Ng, of the superintendents association, said her group is still analyzing the blueprint’s on-the-ground implications, but that student data privacy efforts present schools with “a balancing act.”

“You want to absolutely secure the privacy rights of the child while understanding that the data that can be generated, or is generated, has a role to play, too, in helping us understand where kids are, what kids are doing, how a program is or isn’t working,” she said. “Sometimes that’s broader than just a pure academic indicator.”

Others have dismissed the blueprint as toothless, amounting to little more than a compilation of recommendations from civil rights groups and tech companies. Some of the most outspoken privacy proponents and digital surveillance critics, such as Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, argued it falls short of a critical policy move: outright bans.

As Cahn and other activists mount campaigns against student surveillance tools, they’ve highlighted how student data can wind up in the hands of the police.

“When police and companies are rolling out new and destructive forms of AI every day, we need to push pause across the board on the most invasive technologies,” he said in a media release. “While the White House does take aim at some of the worst offenders, they do far too little to address the everyday threats of AI, particularly in police hands.”

With ‘Don’t Say Gay’ Laws & Abortion Bans, Student Surveillance Raises New Risks (Sept. 8, 2022)

While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn’t explain who they were as a person. 

“It was very confusing trying to navigate understanding who I am and my identity,” said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. “I was able to find the words to understand who I am — words that I wouldn’t be able to piece together in a sentence if the internet wasn’t there.” 




But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details that are gleaned from students’ conversations with friends, diary entries and search histories. Meanwhile, student information collected by surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities concern Elizabeth Laird, the center’s director of equity in civic technology. Following the Supreme Court’s decision overturning Roe v. Wade in June, she said, information about youth sexuality could be weaponized. 

 â€œRight now — without doing anything — schools may be getting alerts about students” who are searching the internet for resources related to reproductive health,” Laird said. “If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws.”

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the fall of Roe has led some to take extra privacy precautions. 

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted messaging app. “The fear, even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled, it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for breaking abortion rules, including in a Nebraska case where police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy. 

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed some 300 anti-LGBTQ bills, and about a dozen have become law. They include so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender. 

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to research by the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance companies should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told The 74. 

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, the federal Children’s Internet Protection Act has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering has been used to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling.  

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

They’ve also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools’ widespread adoption of the tools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In a letter in July, Warren and Markey cautioned that the tools could pose new risks following the fall of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued the data broker Kochava, accusing the company of selling location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the agency argued, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.” 

School surveillance companies have acknowledged their tools track student references to sex but have sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began to immediately purge all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s reversal was imminent — despite maintaining a 30-day retention period for most other data. 

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a letter to the senators, which the company shared with The 74. 

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.” 

Yet tracking conversations about sex is a primary part of Gaggle’s business — more so than references to suicide, violence or drug use, according to nearly 1,300 incident reports the company generated for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by The 74, showed that 38% of the flags were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape,” and, simply, “sex.” 

Patterson, the Gaggle CEO, has acknowledged that a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told The 74 that his company uncovered the girl’s diary entry, in which she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what it had learned from her diary, Patterson said. 

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools. 

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to a report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a Mississippi woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. By working in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which determines whether the material “should be forwarded to the school or district.” 

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.” 

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots after New York City police raided the Stonewall Inn gay bar, police routinely surveilled LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty, and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.” 

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott issued a directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.” 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self. 

Her teacher said that no one — not even he — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning,” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.” 

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.” 

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

Survey Reveals Extent that Cops Surveil Students Online — in School and at Home (Aug. 3, 2022)

When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have contracted with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.




Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey by the nonprofit Center for Democracy and Technology. The vast majority of teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real-time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 


The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent decision overturning Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, she warned, data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In a letter to the lawmakers, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In materials posted on the company’s website, school officials in Wichita Falls, Texas; Cincinnati, Ohio; and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company told lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” the company wrote in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter sent Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime. 

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a practice company founder and CEO Jeff Patterson has said is meant to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said that if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination. 
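The disparate-impact concern is easiest to see in miniature. The sketch below is purely illustrative: it assumes a hypothetical keyword list and a naive word-match filter, not Gaggle’s or any vendor’s actual system, which combine AI models with human review. It shows how a benign identity statement trips the same alert as a crisis message.

```python
# Illustrative sketch only. FLAGGED_TERMS is a hypothetical word list;
# real vendors' systems and term lists are not public in this detail.
FLAGGED_TERMS = {"gay", "lesbian", "suicide", "self-harm"}

def flag_message(text: str) -> list[str]:
    """Return the flagged terms found in a student message."""
    # Normalize tokens by lowercasing and stripping edge punctuation.
    words = {w.strip(".,!?").lower() for w in text.split()}
    return sorted(words & FLAGGED_TERMS)

# A student discussing their identity is flagged exactly like a
# student in crisis -- the disparate impact critics describe.
print(flag_message("I came out as gay to my friends today"))  # ['gay']
print(flag_message("I have been thinking about suicide"))     # ['suicide']
print(flag_message("Math homework is due Friday"))            # []
```

Because a bare keyword match carries no context, any design that puts identity terms on the list subjects the students who use those words most to the most surveillance.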

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination. 

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states. 

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” 

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said. 

Last week, the Senate took up legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after revelations showing that the social media app Instagram had a harmful effect on youth mental well-being, especially among teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the group wrote. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.” 

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards. 

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

Senate Inquiry Warns About Harms of Digital School Surveillance Tools
/article/senate-inquiry-warns-about-harms-of-digital-school-surveillance-tools-calls-on-fcc-to-clarify-student-monitoring-rules/
Mon, 04 Apr 2022 | Updated, April 5

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students’ online activities, warning that educators’ widespread use of digital surveillance tools could trample students’ civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain student groups. 

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country — often 24 hours a day, seven days a week — to provide information on how they use artificial intelligence to glean information about students. 

Based on their responses, the senators said:

  • The companies’ software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey where 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use — and potential misuse — of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire “need for federal action to protect students’ civil rights, safety and privacy.”

“While the intent of these products, many of which monitor students’ online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns,” the lawmakers wrote. “Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations.”

An FCC spokesperson said the agency is reviewing the report, and an Education Department spokesperson said officials “look forward to corresponding with the senators” about its findings.

Lawmakers’ inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by ĂŰĚŇÓ°ĘÓ into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. ĂŰĚŇÓ°ĘÓ used public records to expose how Gaggle’s algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self-harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported a surge in business during the pandemic.

Bark didn’t respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators’ March 30 report and looks forward “to continuing our dialogue with Senators Warren and Markey on the important topics they have raised.”

“Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus,” GoGuardian spokesman Jeff Gordon said in a statement. “Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices.” 

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers’ recommendations “to assess how we can further strengthen our work to better protect students.”

“We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms,” Patterson continued. “We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators.”

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country’s “terrible history of bias in school discipline” by removing the decisions of individual teachers and administrators.

“While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face,” Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies’ letters acknowledge that in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders “prefer that we contact public safety agencies directly in lieu of a district contact.”

Under the Clinton-era Children’s Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students’ internet use to ensure they don’t access material “harmful to minors,” such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law’s scope. Meanwhile, advocates have questioned whether schools’ use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students’ computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police and more than half of students said they decline to share their “true thoughts or ideas because I know what I do online is being monitored.”  

Elizabeth Laird, the group’s director of equity in civic technology, said it has been calling on student surveillance companies to be more transparent about their business practices but it’s “disappointing that it took a letter from Congress to get this information.” She said she hopes the FCC and Education Department adopt lawmakers’ recommendations.

“None of these companies have researched whether their products are biased against certain groups of students,” she said in an email while questioning their justification for holding off on such an inquiry. “They cite privacy as the reason for not doing so while simultaneously monitoring students’ messages, documents and sites visited 24 hours a day, seven days a week.” 

ĂŰĚŇÓ°ĘÓ’s investigation, which used data on Gaggle’s foothold in Minneapolis Public Schools, was unable to determine whether the tool’s algorithm disproportionately targeted Black students, who are more often subjected to student discipline than their white classmates. However, it highlighted instances in which keywords like “gay” and “lesbian” were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation. 

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online, and students would likely face consequences for viewing violent or sexually explicit materials. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency. 

However, Vance said in an email that FCC clarification “would do little at best and may provide counterproductive guidance at worst.” Many schools, she said, are likely to use the tools regardless of the federal rules. 

“Schools aren’t required to monitor social media, and many have chosen to do so anyway,” said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said. 

Asking the FCC to issue guidance “could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring.”

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies: 

Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’
/article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/
Mon, 04 Oct 2021 | Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent to the companies last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked them to explain the steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases,” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.


“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there’s little evidence of their effectiveness. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by ĂŰĚŇÓ°ĘÓ, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers differing levels of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Other services monitor students’ social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 — including when students are at home — and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” by the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn’t respond to requests for comment.

The Clinton-era Children’s Internet Protection Act, passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to an analysis by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.
