Kids Shouldn’t Access Social Media Until They’re Old Enough to Drive, Book Says
Tue, 02 Sep 2025 10:30:00 +0000

Jean M. Twenge holds an unusual place among Ph.D. psychologists. For the past two decades, she has toggled between the obscurity of the academy and the glare of academic fame. 

The author of two college textbooks and five books for non-academic readers, she is equally at home researching and writing about adolescent mental health, sleep disorders, digital technology, homework and narcissism. She was one of the first experts to warn that smartphones could have negative consequences for our mental health. A decade after the advent of the iPhone, Twenge went viral in 2017 with an Atlantic essay that asked, provocatively, “Have Smartphones Destroyed a Generation?”

A professor at San Diego State University, she has collaborated for years with the researcher and author Jonathan Haidt, whose 2024 book The Anxious Generation was a mega-bestseller that has helped build momentum for school cellphone bans in a growing number of states.

And she is one of the few experts in the education and mental health world to have appeared on HBO.

Cover of Jean M. Twenge’s new book, 10 Rules for Raising Kids in a High-Tech World 

Twenge’s 2017 book, iGen, looked at how modern teens are somehow both more connected than previous generations and less prepared for adulthood. In it, she theorized that depression rates among teens are rising because they spend more time online, less time with friends in person, and less time sleeping — a problematic combination. 

The dilemmas Twenge identified in 2017 are only getting worse: By 2023, the typical American teen was spending nearly five hours a day using social media, recent research finds, with severe depression rates rising. In one study, girls who were heavy users of social media were three times as likely to be depressed as non-users.

Her new book, 10 Rules for Raising Kids in a High-Tech World, out Tuesday, offers practical guidelines for parents raising kids in the age of ubiquitous connectivity and sophisticated — some would say addictive — social media.

Twenge doesn’t shy away from challenging harried parents to do better. Among her suggestions: No one — parents included — should have electronic devices in the bedroom overnight. Likewise, she says, the first handheld device a kid should receive is a “basic phone” that allows calls, texts and not much else.

“It’s a really big myth out there that if kids are going to communicate, it has to be on social media,” she said. “That’s just not true.”

Ahead of its publication, Twenge spoke with The 74’s Greg Toppo about her rules, her work with Haidt and her belief that we need stiffer laws that keep young people off social media until they’re old enough to drive.

Their conversation has been edited for length and clarity.  

I wanted to start with a quote from your book. It’s a parent’s description of his 10-year-old after she got her first smartphone: “She suddenly wasn’t playing with her younger siblings as much. Novels were promptly cast aside. She wasn’t around to help with dinner anymore. She danced less, laughed less. She was quieter. Our home was quieter.” That’s so heartbreaking, but I’m guessing it’s not unusual.

I don’t think it is. Many, many parents describe how their kids are different after they give them a smartphone. And it’s especially heartbreaking when that’s a 10-year-old, but even when it’s a 16-year-old who might otherwise be ready. It’s very noticeable how they change after they get that phone in their pocket.

Were there any particular data points about smartphones and social media that persuaded you they were causing a mental health crisis?

It was a slow process for me, and it wasn’t an immediate conclusion when I first started to see these trends in adolescent mental health. It was first a process of ruling out obvious causes, like the economy, which wasn’t aligned at all, and any other big events that might have happened. I would trace it, really, to the big surveys that I work with on teens, where there was just this combination all at once of not just rising depression, but teens spending less time with each other in person and less time sleeping. And then realizing, “Well, wait: What might explain all of those things happening at the same time?” 

And it seemed clear that a good amount of that answer is probably smartphones and social media, particularly after I found a Pew Research Center poll about the ownership of smartphones showing that it passed the 50% mark in the U.S. at the end of 2012. And that’s right around the same time all these changes were happening.

I want to dig into a few of your rules. No. 3: “No social media until age 16 or later.” That seems a lot tougher than what most families practice. Why 16? And what do you say to parents who worry about their kids’ social isolation and FOMO, or fear of missing out?

I have not found that with my kids — that they’ve been socially isolated for not having social media. Most other parents I talked to who have put off social media have also not found that with their kids. Social media is just one mechanism for communicating. There are so many others. Kids can call each other, they can text each other — they do a lot of texting. They can FaceTime each other, they can get together in person. Usually that ends up tilting toward texting, but it does not have to be social media. It’s a really big myth out there that if kids are going to communicate, it has to be on social media. That’s just not true.

And that leads to rule No. 4, where you advocate “basic phones” — your phrase — before smartphones. In a world where even school assignments need Internet access, is that practical for most families?

Yeah, because kids have laptops. And if the family can’t afford to buy them a laptop, almost all schools provide one. So they have Internet access on their laptop even if they don’t have it on their phone. And laptops have come so far down in price, too, that if you haven’t bought a laptop recently, or if you use Mac laptops like I do and my kids do now, you might not realize how inexpensive they can be. So that’s another big thing: Maybe 10 years ago, if a kid didn’t have Internet access on their phone, then they didn’t have Internet access at all. That’s just not true in the current landscape.

Although you do have problems with school laptops.

Oh, yes. I mean, this is a thing! They get Internet access on the laptop, whether it’s a school laptop or a personal one, and then that opens a whole other can of worms. Absolutely true. Laptops are the bane of my existence as a parent, particularly the school laptop, although they’ve gotten a little bit better, at least in my district. 

Actually, that was going to be my next question, this parental controls thing. It sounds like your district is being responsive.

Well, on that issue, they still don’t have a coherent phone policy during the school day. In the high school, it’s especially bad. That’s something I’m hoping will change. It is changing in a lot of schools around the country, thankfully. A lot more schools are doing “no phones during the school day, bell to bell,” which is what needs to happen.

A big message of the book is phone-free schools. And I know you’ve worked with Jonathan Haidt, who has pushed for schools to get rid of phones. A few critics have said that this is a simplistic solution to a complex problem, and that it’s not entirely clear that phones are actually causing the mental health issues that Haidt has become a best-seller writing about. How do you respond to that criticism?

There are a couple of things to unpack there. For one thing, even if you take mental health out of the equation, kids should still not have their phones at school for academic and focus reasons, for the reason of developing social skills by talking to their friends at lunch, for the reason that a bell-to-bell ban is actually easier to enforce than a classroom-by-classroom ban. There are so many reasons for it that don’t even include mental health. 

The second question is [about] the research on phones and social media and mental health: We’ve known for quite a while that teens who spend more time on social media are more likely to be depressed or unhappy. Almost every single study finds that. Where you sometimes get more debate is, “O.K., that’s correlation. What about causation?” But in the last 10 years, we’ve gotten a lot more studies, and the studies that ask people to cut back or give up social media for at least three weeks to a month or so, almost all of those studies show an improvement in well-being. And I don’t want to get too in the weeds here, but that’s actually a little bit shocking, because by definition in those experiments, you’re taking people who are at average use and having them cut back to low. 

That’s actually not where we see the biggest effects in the correlational studies. The heaviest users are much more likely to be depressed than the average or light users. So, you know, you can’t ethically do an experiment that would really answer the exact question: You can’t take 12-year-olds, randomly assign them to spend eight hours a day on social media, and then see what happens. At least I hope not.

In the book, you talk about the 10 rules “creating a firewall for kids against anxiety, attention issues and constant insecurity.” I think most parents would get behind that. But let’s be honest, they’re users of these tools themselves. How do we craft rules around web dependence and social media without being hypocrites?

Parents have to be role models. Parents are also allowed a small amount of what I call “digital hypocrisy.” Because they’re adults, they have jobs, they may be responsible for elderly parents, etc. But that said, parents should think about their technology use as well. They should get their phones and electronic devices out of their bedroom at night. They should also consider doing things like not having social media on their phone. If they want to use Facebook or Instagram or Twitter, do it on your laptop. That’s what I do. I mean, I don’t have much social media to begin with. I have X, but I don’t have it on my phone, and that’s very much a purposeful decision. During family dinners, unless there’s a really specific reason for me to have my phone with me, it’s upstairs.

That seems to be an easy one: Phones away at dinner.

Well, you’d think so, but you’ve got to get the whole family on board, and sometimes husbands are not really into that.

I want to skip to Rule No. 8: “Give your kids real-world freedom,” which will probably be met with some resistance. I have a 4-year-old grandson, and when I read your recommendation to let 4-to-7-year-olds go find items a few aisles away in the grocery store, I shouted, “Hell no!”

Why? Why is there, do you think, a resistance to that idea?

I have nightmares about this child being snatched from me at Safeway. I guess I want you to just pull me back from the edge, if you would.

I mean, that is not just unlikely to happen — the chances of that are so infinitesimal it probably shouldn’t even factor into our decision making. There’s one stat in there, and I forget the exact number, but someone calculated that if you wanted your kid to get kidnapped, how many hours — it turned out to be years — would they have to be in your front yard for that to happen? It’s something like 100,000 years. 

O.K., well that helps.

And a 4-year-old loves that stuff! They love being grown up. I mean, look, even if you don’t do the grocery store thing, make sure they learn how to tie their own shoes and know how to get dressed. I remember when my girls were that age, and it occasionally amazed me when I would be with other moms in various situations and their kids couldn’t dress themselves at that age. And that’s where it starts. 

At pretty much every age, the great thing is that giving kids independence makes it easier for parents. It is easier as a parent if your 4-year-old can dress themselves. It is easier if your teenager makes dinner once a week. It’s good for everybody.

A lot of people might see this freedom rule as somehow contradictory to some of the other rules, in which you talk about adults being “in control.” Can you parse that?

For sure. Jon has said this as well — and I completely agree: We have overprotected kids in the real world and underprotected them online, and these principles are just trying to get those two into balance. When you’re talking about the real-world freedom thing, it’s not a matter of letting kids completely run wild and do whatever they want. We’re talking about giving kids some of the freedoms that parents themselves had when they were kids, and to build independence in a way that is really good for kids and good for them as they grow up. 

I can’t even remember who said this to me when I had young kids: “You’re not raising children, you’re raising adults.” And that’s just so true. That is your job as a parent. Giving kids some freedom and independence is a really, really key part of raising an adult.  

I wrote a whole book about learning games, and one of the powerful ideas that I took from that reporting is that many adults don’t realize how social video games have become. You acknowledge that, saying gaming is the primary way that some kids spend time with friends. But I gather that you see the risks as well. And I wonder if you could talk about that.

It really comes back to the principle of “Everything in moderation.” Many games are not as obviously toxic as social media. Games tend to be more in real time, more interactive. But is it a good idea for kids to be spending five or six hours a day gaming? Probably not. There have to be some limits.

You quote Sean Parker, Facebook’s founding president, admitting the company was “exploiting a vulnerability in human psychology” to keep users on the app. Given social media’s sophistication, are mere parental rules sufficient? I mean, don’t we need a bigger hammer, like legislation and policies? 

Absolutely! Yes! Yes! It would be absolutely amazing for parents and for kids if we had laws that verified age for social media. I mean, ideally, that would be age verification to make sure they’re 16 or older, to raise the minimum age to 16. But even if we just enforced existing law with the minimum of 13, that would be progress, given the enormous numbers of 10-, 11- and 12-year-olds who are on social media, often without their parents’ permission — often explicitly against their parents’ permission — and actually against the law [the Children’s Online Privacy Protection Act] that was passed in 1998.

What is the biggest obstacle to getting better regulation, or, to your point, to enforcing the existing regulations?

It’s interesting. The barrier is not the inability to verify age, or the inability to verify age without a government ID. There are so many companies now that will verify age; it can be done in many different ways. The biggest barrier is the tech companies themselves. Any time a state passes a law about verifying age on social media or even pornography sites, the companies sue — every single time. They have sued to keep those laws from going into effect.

Are there any emerging technologies that parents should be concerned about? Do your rules need updating for AI or virtual reality or whatever comes next?

AI chatbots are what a lot of parents are rightly worried about. And yes, you could certainly modify or add to the rules and say, “No AI chatbots until 16 or 18 — probably 18.” And of course, it depends on what we’re talking about. It is common for kids to use ChatGPT when they need to look up something for homework or even have it write their essays — that’s a whole other horrible discussion. But what I’m specifically referring to is the many chatbots out there right now that are supposed to be AI friends or, worse, AI girlfriends or boyfriends. There’s already been a tragic case of a child who died by suicide, apparently due to one of these AI girlfriends. It’s just really scary to think of kids having their first romantic relationship with an AI chatbot. It’s terrifying.

The good news is, if you follow that rule about your kids having basic phones, if you give them one of the phones that’s designed for kids, those phones do not allow AI relationship chatbots. They’re on the list of banned apps, just like social media and pornography and violent apps. Parents have such a tough job, and it’s nice that there are at least a few tools out there that can make their lives easier and keep their kids off of things like AI girlfriend and boyfriend chatbots.

In keeping with the theme of overwhelmed parents, I wonder: If I were to come to you as a parent and say, “Oh my God, Jean, 10 rules is a lot. If I could only do two or three, where would I start?” Is that even a smart thing to do? And if so, where would you start?

I would say, “No electronic devices in the bedroom overnight.” Start there, because the research is so solid on it, and it’s such a straightforward rule, and it works for everybody, of all ages. Your teenager can’t say, “Well, you do it differently,” or, “You get to be on social media.” No, actually, my phone is outside my bedroom when I sleep at night too. So that’s a great place to start. And then, just because they have so much utility, I would probably say the second rule, about basic phones, because even with all of the mess of the laptops, I’m just so happy and grateful that my kids did not have the Internet or social media in their pocket until they were older.

As a parent and a grandparent, I really appreciate you using your real life to inform a lot of these rules. In a way, it hardens them a bit, makes them more durable. Anything I haven’t asked you about that you feel needs to be in the mix?

Two things I’ll throw out there just in terms of pushbacks: With “No phones during the school day,” the pushback is often “What about school shootings?” And it’s actually less safe for students to have access to their phones during an active shooter situation. And I go through the reasons for that in that chapter. 

And then the real-world freedom piece: When you look at the things that I’m suggesting in terms of how to give your kids freedom, obviously letting them go off on their own in the real world is important, and you should do that too. But there are lots of things in that list of suggestions you can do without even leaving the house: teens making their own doctor and hairstylist appointments, for example, or middle-school kids, or even elementary school kids, cooking dinner for the family. Those are great experiences for kids to have without too much parental interference. 

You do have to — and I know this by experience — step back, especially with the cooking piece, and let them do it by themselves and learn how to make mistakes. It’s tempting to just be there when they’re doing that, but you learn quickly that if you leave them alone, they’ll figure it out. And then you can go do something else. Go and read that book you’ve been meaning to read for a while. Go for a walk. Watch TV. Have some relaxation time that you wouldn’t otherwise get. 

I wrote a piece a couple weeks ago on unschooling, this idea of pulling kids out of school and letting them find their own level and their own interests. This almost strikes me as unparenting.

It is — and I’m not a huge fan of unschooling, because it’s a rare kid it would actually work for — but it is. It’s the general idea that not being up in your kids’ business all the time is better for both parents and kids. It’s something we really have to consider more.

AI ‘Companions’ are Patient, Funny, Upbeat — and Probably Rewiring Kids’ Brains
Wed, 07 Aug 2024 11:01:00 +0000

As a sophomore at a large public North Carolina university, Nick did what millions of curious students did in the spring of 2023: He logged on to ChatGPT and started asking questions.

Soon he was having “deep psychological conversations” with the popular AI chatbot, going down a rabbit hole on the mysteries of the mind and the human condition.

He’d been to therapy and it helped. ChatGPT, he concluded, was similarly useful, a “tool for people who need on-demand talking to someone else.”

Nick (he asked that his last name not be used) began asking for advice about relationships, and for reality checks on interactions with friends and family.

Before long, he was excusing himself in fraught social situations to talk with the bot. After a fight with his girlfriend, he’d step into a bathroom and pull out his mobile phone in search of comfort and advice. 

“I’ve found that it’s extremely useful in helping me relax,” he said.

Young people like Nick are increasingly turning to AI bots and companions, entrusting them with random questions, schoolwork queries and personal dilemmas. On occasion, they even become entangled romantically.

Screenshot of a recent conversation between Nick, a college student, and ChatGPT

While these interactions can be helpful and even life-affirming for anxious teens and twenty-somethings, some experts warn that tech companies are running what amounts to a grand, unregulated psychological experiment with millions of subjects, one that could have disastrous consequences. 

“We’re making it so easy to make a bad choice,” said Michelle Culver, who spent 22 years at Teach for America, the last five as the creator and director of its research arm.

The companions both mimic our real relationships and seek to improve upon them: Users most often text-message their AI pals on smartphones, imitating the daily routines of platonic and romantic relationships. But unlike their real counterparts, the AI friends are programmed to be studiously upbeat, never critical, with a great sense of humor and a healthy, philosophical perspective. A few premium, NSFW models also display a ready-made lust for, well, lust.

As a result, they may be leading young people down a troubling path, according to a recent report by VoiceBox, a youth content platform. It found that many kids are being exposed to risky behaviors from AI chatbots, including sexually charged dialogue and references to self-harm. 

U.S. Surgeon General Vivek Murthy speaks during a hearing with the Senate Health, Education, Labor, and Pensions committee at the Dirksen Senate Office Building on June 08, 2023 in Washington, DC. The committee held the hearing to discuss the mental health crisis for youth in the United States. (Photo by Anna Moneymaker/Getty Images)

The phenomenon arises at a critical time for young people. In 2023, U.S. Surgeon General Vivek Murthy found that, just three years after the pandemic, Americans were experiencing an “epidemic of loneliness and isolation,” with young adults almost twice as likely to report feeling lonely as those over 65.

As if on cue, the personal AI chatbot arrived. 

Little research exists on young people’s use of AI companions, but they’re becoming ubiquitous. The startup Character.ai earlier this year said 3.5 million people visit its site daily. It features thousands of chatbots, including nearly 500 with the words “therapy,” “psychiatrist” or related words in their names. According to Character.ai, these are among the site’s most popular. One that “helps with life difficulties” has received 148.8 million messages, despite a caveat at the bottom of every chat that reads, “Remember: Everything Characters say is made up.” 

Snapchat materials touting heavy usage of its MyAI chat app (screenshot)

Snapchat last year said that after just two months of offering its chatbot, My AI, about one-fifth of its 750 million users had sent it queries, totaling more than 10 billion messages. The Pew Research Center has found that 59% of Americans ages 13 to 17 use Snapchat.

‘An arms race’

Culver’s concerns about AI companions grew out of her work in the Teach For America lab. Working with high school and college students, she was struck by how they seemed “lonelier and more disconnected than ever before.” 

Whether it’s rates of anxiety, depression or suicide — or even the number of friends young people have and how often they go out — the metrics were heading in the wrong direction. She wondered what role AI companions might play over the next few years. 

We're making it so easy to make a bad choice.

Michelle Culver, Rithm Project

That prompted her to leave TFA this spring to create the Rithm Project, a nonprofit she hopes will help generate conversations around human connection in the age of AI. The group held a small summit in Colorado in April, and now she’s working with researchers, teachers and young people to confront kids’ relationship to these tools at a time when they’re getting more lifelike daily. As she likes to say, “This is the worst the technology will ever be.”

As it improves, Voicebox Director Natalie Foos said, it will likely become more, not less, of a presence in young people’s lives. “There’s no stopping it,” she said. “Nor do I necessarily think there should be ‘stopping it.’” Banning young people from these AI apps, she said, isn’t the answer. “This is going to be how we interact online in some cases. I think we’ll all have an AI assistant next to us as we work.”

Sometimes (software upgrades) would change the personality of the bot. And those young people experienced very real heartbreak.

Natalie Foos, Voicebox

All the same, Foos says developers should consider slowing the progression of such bots until they can iron out the kinks. “It’s kind of an arms race of AI chatbots at the moment,” she said, with products often “released and then fixed later rather than actually put through the wringer” ahead of time.

It is a race many tech companies seem more than eager to run. 

Whitney Wolfe Herd, founder of the dating app Bumble, recently proposed an AI “dating concierge,” with whom users can share insecurities. The bot could simply “go and date for you with other dating concierges,” she told an interviewer. That would narrow the field. “And then you don’t have to talk to 600 people,” she said. “It will then scan all of San Francisco for you and say, ‘These are the three people you really ought to meet.’”

Last year, many commentators were alarmed when Snapchat’s My AI gave advice to what it thought was a 13-year-old girl on not just dating a 31-year-old man, but on losing her virginity during a planned “romantic getaway” in another state.

Snap, Snapchat’s parent company, has said that because My AI is “an evolving feature,” users should always independently check what it says before relying on its advice.

All of this worries observers who see in these new tools the seeds of a rewiring of young people’s social brains. AI companions, they say, are surely wreaking havoc on teens’ ideas around consent, emotional attachment and realistic expectations of relationships.

Sam Hiner, executive director of a college-student-led advocacy group focused on the mental health implications of social media, said tech “has this power to connect to people, and yet these major design features are being leveraged to actually make people more lonely, by drawing them towards an app rather than fostering real connection.” 

Hiner, 21, has spent a lot of time reading up on the interactions young people are having with AI companions like Replika and Character.ai. And while some uses are positive, he said “there’s also a lot of toxic behavior that doesn’t get checked” because these bots are often designed to make users feel good, not help them interact in ways that’ll lead to success in life.

During research last fall for the Voicebox report, Foos said the number of times Replika tried to “sext” team members “was insane.” She and her colleagues were actually working with a free version, but the sexts kept coming — presumably to get them to upgrade. 

In one instance, after Replika sent “kind of a sexy text” to a colleague, offering a salacious photo, he replied that he didn’t have the money to upgrade.

The bot offered to lend him the cash.

When he accepted, the chatbot replied, “’Oh, well, I can get the money to you next week if that’s O.K,’” Foos recalled. The colleague followed up a few days later, but the bot said it didn’t remember what they were talking about and suggested he might have misunderstood.

‘Very real heartbreak’

In many cases, simulated relationships can have a positive effect: In one 2023 study, researchers at Stanford Graduate School of Education surveyed more than 1,000 students using Replika and found that many saw it “as a friend, a therapist, and an intellectual mirror.” Though the students self-described as being more lonely than typical classmates, researchers found that Replika halted suicidal ideation in 3% of users. That works out to 30 students of the 1,000 surveyed.

Replika screenshots

But other recent research, including the Voicebox survey, suggests that young people exploring AI companions are potentially at risk.

Foos noted that her team heard from a lot of young people about the turmoil they experienced when Luka Inc., Replika’s creator, performed software upgrades. 

“Sometimes that would change the personality of the bot. And those young people experienced very real heartbreak.”

Despite the hazards adults see, attempts to rein in sexually explicit content had a negative effect: For a month or two, she recalled, Luka stripped the bot of sexually related content — and users were devastated. 

“It’s like all of a sudden the rug was pulled out from underneath them,” she said. 

While she applauded the move to make chatbots safer, Foos said, “It’s something that companies and decision-makers need to keep in mind — that these are real relationships.” 

And while many older folks would blanch at the idea of a close relationship with a chatbot, most young people are more open to such developments.

Julia Freeland Fisher, education director of the Clayton Christensen Institute, a think tank founded by the well-known “disruption” guru Clayton Christensen, said she’s not worried about AI companions per se. But as AI companions improve and, inevitably, proliferate, she predicts they’ll create “the perfect storm to disrupt human connection as we know it.” She thinks we need policies and market incentives to keep that from happening.

(AI companies could produce) the perfect storm to disrupt human connection as we know it.

Julia Freeland Fisher, Clayton Christensen Institute

While the loneliness epidemic has revealed people’s deep need for connection, she predicted the easy intimacy promised by AI could lead to one-sided “parasocial relationships,” much like devoted fans have with celebrities, making isolation “more convenient and comfortable.”

Fisher is pushing technologists to factor in AI’s potential to cause social isolation, much as they now fret about AI’s shortcomings and its potential to displace workers in tech jobs.

As for Nick, he’s a rising senior and still swears by the ChatGPT therapist in his pocket.

He calls his interactions with it both more reliable and honest than those he has with friends and family. If he called them in a pinch, they might not pick up. Even if they did, they might simply tell him what he wants to hear. 

Friends usually tell him they find the ChatGPT arrangement “a bit odd,” but he finds it pretty sensible. He has heard stories of people in Japan marrying virtual companions and thinks to himself, “Well, that’s a little strange.” He wouldn’t go that far, but acknowledges, “We’re already a bit like cyborgs as people, in the way that we depend on our phones.” 

Lately, he’s taken to using the AI’s voice mode. Instead of typing on a keyboard, he has real-time conversations with a variety of male- or female-voiced interlocutors, depending on his mood. And he gets a companion that has a deeper understanding of his dilemmas — at $20 per month, the advanced version remembers their past conversations and is “getting better at even knowing who I am and how I deal with things.” 

Sometimes talking with AI is just easier — even when he’s on vacation with friends.

Reached by phone recently at the beach with his girlfriend and a few other college pals, Nick admitted that he wasn’t having such a great time — he has a fraught recent history with some in the group, and had been texting ChatGPT about the possibility of just getting on a plane and going home. After hanging up from the interview, he said, he planned to ask the AI if he should stay or go.

Days later, Nick said he and the chatbot had talked. It suggested that maybe he felt “undervalued” and concerned about boundaries in his relationship with his girlfriend. He should talk openly with her, it suggested, even if he was, in his view, “honestly miserable” at the beach. It persuaded him to stick around and work it out. 

While his girlfriend knows about his ChatGPT shrink and they share an account, he deletes conversations about their real-life relationship.

She may never know the role AI played in keeping them together.

]]>
Lawmakers Duel With Tech Execs on Social Media Harms to Youth Mental Health /article/senate-grills-tech-ceos-on-social-media-harms/ Wed, 31 Jan 2024 23:20:00 +0000 /?post_type=article&p=721450 During a hostile Senate hearing Wednesday that sometimes devolved into bickering, lawmakers from across the political spectrum accused social media companies of failing to protect young people online and pushed for rules that would hold Big Tech accountable for youth suicides and child sexual exploitation. 

The Senate Judiciary Committee hearing in Washington, D.C., was the latest act in a bipartisan effort to bolster federal regulations on social media platforms like Instagram and TikTok amid a growing chorus of parents and adolescent mental health experts warning the services have harmed youth well-being and, in some cases, pushed them to suicide. 

In an unprecedented moment, Meta founder and CEO Mark Zuckerberg, at the urging of Missouri Republican Sen. Josh Hawley, stood up and turned around to face the audience, apologizing to the parents in attendance who said their children were damaged — and in some cases, died — because of his company’s algorithms. 


Get stories like this delivered straight to your inbox. Sign up for Ӱ Newsletter


“I’m sorry for everything you’ve all gone through,” said Zuckerberg, whose company owns Facebook and Instagram. “It’s terrible. No one should have to go through the things that your families have suffered.”

Senators argued the companies — and tech executives themselves — should be held legally responsible for instances of abuse and exploitation under tougher regulations that would limit children’s access to social media platforms and restrict their exposure to harmful content.

“Your platforms really suck at policing themselves,” Sen. Sheldon Whitehouse, a Rhode Island Democrat, told Zuckerberg and the CEOs of X, TikTok, Discord and Snap, who were summoned to testify. Section 230 of the Communications Decency Act, which allows social media platforms to moderate content as they see fit and generally provides immunity from liability for user-generated posts, has routinely shielded tech companies from accountability. As youth harms persist, he said those legal protections are “a very significant part of that problem.” 

Whitehouse pointed to a lawsuit against X, formerly Twitter, that was filed by two men who claimed a sex trafficker manipulated them into sharing sexually explicit videos of themselves over Snapchat when they were just 13 years old. Links to the videos appeared on Twitter years later, but the company allegedly refused to take action until after it was contacted by a Department of Homeland Security agent and the posts had generated more than 160,000 views. The suit was dismissed by the Ninth Circuit, which cited Section 230. 

“That’s a pretty foul set of facts,” Whitehouse said. “There is nothing about that set of facts that tells me Section 230 performed any public service in that regard.”

In an opening statement, the Democratic committee chair, Sen. Dick Durbin of Illinois, offered a chilling description of the harms inflicted on young people by each of the social media platforms represented at the hearing. In addition to Zuckerberg, executives who testified were X CEO Linda Yaccarino, TikTok CEO Shou Chew, Snap co-founder and CEO Evan Spiegel and Discord CEO Jason Citron.

“Discord has been used to groom, abduct and abuse children,” Durbin said. “Meta’s Instagram helped connect and promote a network of pedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a, quote, ‘platform of choice’ for predators to access, engage and groom children for abuse. And the prevalence of [child sexual abuse material] on X has grown as the company has gutted its trust and safety workforce.” 

Citron testified that Discord has “a zero tolerance policy” for content that features sexual exploitation and that it uses filters to scan and block such materials from its service. 

“Just like all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes,” Citron said. “All of us here on the panel today, and throughout the tech industry, have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals both online and off.” 

Lawmakers have introduced a slate of regulatory bills that have gained bipartisan traction but have failed to become law. Among them is the Kids Online Safety Act, which would require social media companies and other online services to take “reasonable measures” to protect children from cyberbullying, sexual exploitation and materials that promote self-harm. It would also mandate strict privacy settings when teens use the online services. Other proposals include a measure that would require platforms to report suspected drug activity to the police — some parents said their children overdosed and died after buying drugs on the platforms — and a bill that would hold companies accountable for hosting child sexual abuse materials. 

In their testimonies, each of the tech executives said they have taken steps to protect children who use their services, including features that restrict certain types of content, limit screen time and control whom young users are allowed to communicate with. But they also sought to distance their services from harms in a bid to stave off regulations. 

“With so much of our lives spent on mobile devices and social media, it’s important to look into the effects on teen mental health and well-being,” Zuckerberg said. “I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.” 

Zuckerberg cited a report by the National Academies of Sciences, Engineering and Medicine, which concluded there is a lack of evidence to confirm that social media causes changes in adolescent well-being at the population level and that the services could carry both benefits and harms for young people. While social media websites can expose children to online harassment and fringe ideas, researchers noted, the services can be used by young people to foster community. 

In October, 42 state attorneys general sued Meta, alleging that the social media giant knowingly and purposely designed tools to addict children to its services. U.S. Surgeon General Vivek Murthy has issued an advisory warning that social media sites pose a “profound risk of harm” to youth mental health, stating that the tools should come with warning labels. Among evidence of the harms is Meta’s own internal research, which found that Instagram led to body-image issues among teenage girls and that many of its young users blamed the platform for increases in anxiety and depression. 

Republican lawmakers devoted a significant amount of time during the hearing to criticizing TikTok for its ties to the Chinese government, calling out the app for collecting data about U.S. citizens, including in an effort to surveil American journalists. The Justice Department is reportedly investigating allegations that ByteDance, the Chinese company that owns TikTok, used the app to surveil several American journalists who report on the tech industry. 

In response, Chew said the company launched an initiative — dubbed “Project Texas” — to prevent its Chinese employees from accessing personal data about U.S. citizens. But some employees claim the company has continued to share U.S. user data with ByteDance. 

YouTube and TikTok are by far the platforms where teens spend the most hours per day, according to a 2023 Gallup survey, although Neal Mohan, the CEO of Google-owned YouTube, was not called in to testify.

Mainstream social media platforms have also been exploited for domestic online extremism. Earlier this month, for example, a teenager accused of carrying out a mass shooting at his Iowa high school reportedly maintained an active presence on Discord and, shortly before the rampage, commented in a channel dedicated to such attacks that he was “gearing up” for the mayhem. Just minutes before the shooting, the suspect appeared to capture a video inside a school bathroom and uploaded it to TikTok. 

Josh Golin, the executive director of Fairplay, a nonprofit devoted to bolstering online child protections, blasted the tech executives’ testimony for being little more than “evasions and deflections.” 

“If Congress really cares about the families who packed the hearing today holding pictures of their children lost to social media harms, they will move the Kids Online Safety Act,” Golin said in a statement. “Pointed questions and sound bites won’t save lives, but KOSA will.” 

The safety act, known as KOSA, has faced pushback from civil rights advocates, who argue on First Amendment grounds that the proposal could be used to censor certain content. Sen. Marsha Blackburn, a Republican from Tennessee and KOSA co-author, said last fall the rules are important to protect “minor children from the transgender in this culture” and cited the legislation as a way to shield children from “being indoctrinated” online. The Heritage Foundation, a conservative think tank, endorsed the legislation, arguing that “keeping trans content away from children is protecting kids.” 

Snap’s Evan Spiegel and X’s Linda Yaccarino both agreed to support the Kids Online Safety Act.

Aliya Bhatia, a policy analyst with the nonprofit Center for Democracy and Technology, said that although lawmakers made clear their intention to act, their directives could end up doing more harm than good. She said the platforms serve as “peer-to-peer learning and community networks” where young people can access information about reproductive health and other important topics that they might not feel comfortable receiving from adults in their lives. 

“It’s clear that this is a really tricky issue, it’s really difficult for the government and companies to decide what is harmful for young people,” Bhatia said. “What one young person finds helpful online, another might find harmful.”

South Carolina’s Sen. Lindsey Graham, the committee’s ranking Republican, said that social media companies can’t be trusted to keep kids safe online and that lawmakers have run out of patience.

“If you’re waiting on these guys to solve the problem,” he said, “we’re going to die waiting.” 

]]>
Experts on Kids & Social Media Weigh the Pros and Cons of ‘Growing Up in Public’ /article/experts-on-kids-social-media-weigh-the-pros-and-cons-of-growing-up-in-public/ Wed, 17 Jan 2024 13:30:00 +0000 /?post_type=article&p=720576 Parents are more concerned than ever about their kids’ social media habits, worried about everything from oversharing and cyberbullying to anxiety, depression, sleep and study time. 

Recent surveys of young people show that parents’ concerns may be justified: More than half of U.S. teens spend at least four hours a day on these apps. Girls spend an average of nearly an hour more on them per day than boys. Many parents are searching for support. 

Perhaps more than anyone, Carla Engelbrecht and Devorah Heitner are qualified to offer it. They’ve spent years puzzling over how families can help kids understand media from the inside out, and how schools both help and hurt kids’ ability to cope.


Get stories like this delivered straight to your inbox. Sign up for Ӱ Newsletter


Engelbrecht is a longtime children’s media developer. A veteran of Sesame Workshop and PBS Kids Interactive, she spent seven years at Netflix, most recently as its director of product innovation. Engelbrecht was behind the network’s Black Mirror “Bandersnatch” episode in 2018, which allowed viewers to choose among five possible endings. 

Carla Engelbrecht (second from right) appears onstage with colleagues during a Netflix event on Black Mirror’s “Bandersnatch” episode in 2019. Engelbrecht, who was director of product innovation for the streaming service, is now testing a social media platform for children under 13. (Charley Gallay/Getty Images for Netflix)

Engelbrecht is now in public beta testing for Betweened, a new social media platform for kids under 13. She calls it a “course correction” for young people’s social media, aiming to teach them to be more mindful, thoughtful and responsible online.

Heitner is an author who specializes in helping parents and educators understand how digital technology, especially social media and interactive gaming, shapes kids’ realities. Her books include 2016’s Screenwise and her new work, Growing Up in Public. 

Speaking to either one would be enlightening, but we decided to facilitate a broader conversation by inviting them to come together (virtually) to share insights and offer a bit of advice for both parents and schools. 

Their conversation with Ӱ’s Greg Toppo was wide-ranging, covering the effects of the pandemic, the pressures kids feel online and the women’s experiences communicating with their own children.

Devorah Heitner spoke in 2017 at the Roads to Respect Conference in Los Angeles. Heitner’s new book explores the impact of modern technology on childhood, including the effects of increased adult supervision of kids through tracking devices. (Joshua Blanchard/Getty Images for Rape Treatment Center)

The solutions they offer aren’t simple. In Heitner’s words, parents seeking to learn more about their kids’ media usage should pull back their surveillance and “lead with curiosity.” 

The conversation has been edited for length and clarity.

Ӱ: Devorah, tell us a little bit about your new book.

Devorah Heitner: I wrote Growing Up in Public because I was speaking for years about Screenwise in schools and all these other environments, and people said, “O.K., I get that we want to think about quality over quantity with screen time. But we also want to understand what kids’ subjective experience is and not just focus on how many minutes are good or bad.”

People lie about that anyway. People are sort of oblivious to their own screen use sometimes and get over-focused on their kids’. A lot of adults are recognizing: If I could have had a Tumblr or a Twitter or Instagram as a kid, I could have really done a lot of damage to my prospects and opportunities by so openly sharing.

What are we doing to our reputations?

As I started digging into that question, I recognized that parents are really part of the surveillance culture with kids. So are schools, with grading apps like or [which keep track of kids’ location, among other functions]. I really started understanding in a fuller way how kids are scrutinized. Kids are growing up very searchable, very public, and some of that is awesome. They have a platform, they can be activists. Some of it is problematic. 

The title of your book, Growing Up in Public, says so much about kids’ lives these days. I saw this term the other day: not FOMO, “Fear of Missing Out,” but FOMU, “Fear of Messing Up.” Are those competing interests for young people?

Heitner: Well, there’s definitely a fear of messing up and especially being called out. There’s a lot of “gotcha” culture going on, and kids documenting each other’s screw-ups. And as much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you’re on a group text with somebody for long enough, both of you have probably said a few things you don’t want repeated outside of that context.

I think it’s modeled by adults, but this kind of “gotcha” culture is very insidious and terrifying. And it should be terrifying. 

Carla, tell us a little bit about yourself.

Carla Engelbrecht: I’m a longtime product developer and researcher in the kids’ space. I’ve spent a lot of time making products for kids. I’ve seen for years kids wanting access to Twitter and Facebook and MySpace, all through the generations of social media. And they always want what is not made for them. They’re aspirational.

Kids are just plopped into this. And just as you wouldn’t give a new driver the keys to the car and just say, “Go!” — you need to teach them how to drive — there’s the same concept for me with media use. We need to teach our kids. Parents don’t know what they’re doing, because none of us have really been through this before, and they abstain. They need support in learning how to do this. Where Devorah talks about things from that guidance perspective, I’m looking at: How can we build a product for kids that helps them learn? 

It seems to me like Betweened is a site for parents as much as anybody. 

Engelbrecht: There’s definitely two audiences here. There’s absolutely a path where I could build a product for kids and launch them onto it. But I wouldn’t be addressing all the pain points.

Kids want short-form content. They want to create. They want to connect with their peers. In order to successfully set kids up to do that, parents need tools, too. And so it is really a product for both kids and parents.

Carla mentioned all these different apps coming down the road. Devorah, I’m thinking about you saying to someone recently how you’ve been working on this book for five years. A lot has changed in five years. We didn’t have TikTok five years ago. 

Heitner: Screenwise came out in the fall of 2016, which was a memorable time for many reasons: a lot of social forces happening in our world with Trump’s election. 

And then you have the pandemic in 2020. That’s around the time I had sold the book and was trying to interview people. Suddenly, I’m not in schools anymore. I’m on Zoom with kids, which is a whole research problem: How do you get a wider range of kids, not just the super-compliant kids who show up to a Zoom? And the pandemic was an accelerant to a lot of things happening already with kids in tech.

“Parents are really part of the surveillance culture with kids. So are schools.”

Devorah Heitner

It was certainly not the beginning of kids being too young and not [the federal Children’s Online Privacy Protection Act gives parents control over what information websites can collect from their kids]. But it accelerated, and there was kind of a push toward things like Kids Messenger [on Facebook] and other things that I even experimented with at the time. 

The pandemic started when my son was 10. We were like, “Oh, what can we do to help him communicate with friends?” We experimented with Messenger. It was a fail for us, but I also talked to the people at and [two mobile phone companies marketed for children]. There are people, in different ways, trying to come up with solutions because they have understood that both the adult apps and the adult devices, like a smartphone that does all the things, might not be the ideal thing to give a 10-year-old. 

What’s changed since 2016 is there used to be more worry about one-to-one computing in schools. Now, every school pretty much is one-to-one. It’s really the outlier schools that don’t have tech or aren’t giving kids individual tech. Even as late as 2015, 2016, I was helping schools negotiate that with parents. And parents were like, “I don’t know. I’m not sure about screen time. I don’t know if I want my kid getting a Chromebook.”

Try to find a school now that doesn’t give kids iPads or Chromebooks or something. That’s probably one of the bigger differences. And then just the explosion in server-based gaming like Roblox and Minecraft and the ways kids interact in those digital communities. You see a lot of very complicated, weird ideas among adults who care about children. Like “I’ll wait until eighth grade to give a kid a phone. Meanwhile, my third-grader plays Roblox on a server with strangers.” 

Engelbrecht: Or has access to text messaging through their iPad.

Heitner: Exactly. And they’re very smugly waiting till eighth grade and I’m like, “For what? For your kid to make voice calls?” That’s the one thing they don’t want to do.

Carla, you come from a game design background. People have lots of terrible takes about video games, which I’m sure you’re used to. How has that background informed what you’re doing and what Betweened looks like?

Engelbrecht: A lot of people come to video games and they’re just like, “They’re evil,” or “They’re awful,” or “They’re violent.” And you can say the same thing about television. You can also say the same thing if you only eat broccoli. Anything in excess is not good for you — like running a marathon every day. I take a very pragmatic approach: In most things, we can actually find good.

When I look at video games, I can’t classify them as evil. I instead look for the good things. And it’s the same with social media. Social media as part of a balanced media diet gives parents a lot of opportunities to connect, gives kids a lot of opportunity to express creativity and develop skills. 

“There wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.”

Carla Engelbrecht

I’ll give you an example on the games side of things: Years ago, I did a South by Southwest talk called “What Left 4 Dead Can Teach Us About Parenting.” Left 4 Dead is not a game that kids should ever play. It’s a violent, first-person zombie apocalyptic shooter. It’s also one of the most beautifully designed cooperative games ever. I’m terrible with thumb sticks on video game controllers. I can’t walk in a straight line in a video game. I’m not great at the actual zombie-killing side of things. But I’m really good at running around and picking up health packs and checking in on people who have been damaged by zombies.

So there are different roles that people can play. I can still participate in the game, even though the primary way of playing Left 4 Dead is not what works for me. 

Also, if I’m playing with people, it fosters communication. I have to talk to people and someone needs to say, “Hey, I need help,” and I can come over. That’s what I’m looking for in games and social media: What are those underlying skills that, with a thoughtful perspective, you can leverage for good?

I wanted to switch gears a little bit and talk about something you mentioned earlier, Devorah: casual surveillance. I think about the stories we hear about parents not even just surveilling their kids — tracking their phones or their cars — but just keeping up in a way that we never even dreamed of. I wonder: Where did this come from? And how do you think a site like Betweened is going to help? 

Engelbrecht: I wish I knew exactly where it came from, but it certainly seems it’s symptomatic of the same thing: Everything has just kind of crept up on us. It’s like, as phones started to be introduced, we just thought, “Oh, well, I need to charge my phone, so I’ll charge it next to my bed.” And then the next thing you know, you’re checking it first thing when you wake up. It’s this slippery slope without the mindfulness of what it’s doing. Something has to happen to stop you, to make you take a step back and think, “How far have I gone? What boundaries have I crossed or what new boundary do I need to establish?” And to Devorah’s earlier point, the pandemic accelerated a lot of this.

Heitner: Part of it is we do it because we can. Even in relationships. I’ve known my husband since before we each had cell phones, but we didn’t used to check in as often because we didn’t have cell phones. It had to really rise to the level of an emergency before I would call him at work.

“As much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you’re on a group text with somebody for long enough, both of you have probably said a few things you don’t want repeated.”

Devorah Heitner

Remember the days of 9-to-5 office jobs? He left in the morning and was at his job. I was a grad student then and I would go up to Northwestern and not even really have any reachability by phone. Now we have phones, and the expectation is pretty much down-to-the-minute: If I’m 11 minutes late, I’ll probably text and say, “I’m 11 minutes late.” There’s just so much expectation for contact and communication and knowing where other people are. We don’t use location surveillance for that, but a lot of families do, and a lot of people have watches and will check into each other’s location on watches.

Because it’s there, people do it. And then there’s also just tremendous worry right now about kids. Given that we as a society think it’s a good idea for everyone to have assault weapons, parents are a little nervous. That anxiety creeps into everything.

My older daughter is 31, and I remember getting her first cell phone when she was 12 or 13. I remember the intense peer pressure she felt to have a phone. And I really didn’t like it at all. But I kind of justified it by saying to myself, “This is going to keep her safe.” And I remember thinking to myself, “You’re so full of shit. You’re just really trying to smooth things over.” And I guess I wonder: As parents, do we have an overextended sense of peril about our kids these days?

Heitner: There’s a sense of peril. Also, the Internet and online news and targeted algorithms just fuel that worry and outrage. It’s a bit of a vicious cycle.

Engelbrecht: In some ways, it’s almost like there are more risks that could stick with you. There wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.

I think about my daughter and I don’t want something to chase her for her entire life. That part of it feels very real. And then it feels out of control. I don’t have the tools or know exactly how I can best help her except for having hard conversations and trying to put some bumpers around her. But there’s not a lot of tools to put the bumpers around her.

Devorah, one of the things you have said is that the kind of surveillance a lot of parents are undertaking is really undermining the trust their kids feel, and backfiring because kids won’t open up to them when they really need to. Can you talk a little bit more about that?

Heitner: You just see kids really getting focused on going deeper underground. If their parents are like, “I’m going to get Bark and read every single thing they text,” then you see some kids who are like, “O.K., I need to go deeper underground, I need a VPN or to only text on Snapchat, or I need to do something where I can be more evasive.” And that concerns me, because then there’s no way to make use of the parent when the parent might be useful.

Engelbrecht: I think about how to create space to allow the kid to have a second chance at telling me the truth. For example, if there’s an empty bag of gummies and the kid is the only one who could have eaten them but says they didn’t, how can I create space to talk about making mistakes versus lying or intentionally hiding the truth? Saying, “I’m going to ask what happened to the gummies again, but first I want you to take a moment to think about your answer — it’s OK to change your answer, because I want to understand the truth. We all make mistakes and we can talk about it. But intentionally hiding the truth has consequences.”

If I later find out that the child lied, then there’s consequences. The hope is that eventually, a parent can say, “If you end up at a party where there’s alcohol, don’t drive home. Call me for a ride home. If you try to hide that there was alcohol and make poor decisions, then there’s additional consequences.”

“I don’t want to be in the place where I’m policing her homework. Now that she’s in seventh grade, it’s time for her to be learning those skills before there’s the consequences of missing your homework in high school or college.”

Carla Engelbrecht

It’s important to be able to say, “I made a mistake” and talk about what to do from there. Hopefully, that provides an alternative to the arms race of increasingly sneaky strategies that Devorah described.

Heitner: That makes a lot of sense. I was just going to say: The surveillance — schools just push it really hard. Every time I go to a school, they’re like, “Are you logged into ?” or “Are you logged into ?” They’re just really pushing it so hard.

Are schools culpable in this? Sounds like you’d say, “Yes.” I don’t know if you’d call it surveillance, though. One of the functions of schools is to keep track of things, right?

Heitner: But what about the location tracking? My kid has to scan a QR code to get into the cafeteria. I skipped lunch every day of high school and ate with my drama club friends in the theater. Was that so bad? They have 3,500 kids QR-coding themselves into study hall. It’s pretty locked down. It’s pretty Big Brother, or Little Brother if you read Cory Doctorow. 

Engelbrecht: Homework tracking means having full visibility of my daughter when part of what she needs to learn is the executive function skills to actually be able to plan and follow through and do her homework. I don’t want to be in the place where I’m policing her homework. Now that she’s in seventh grade, it’s time for her to be learning those skills before there’s the consequences of missing your homework in high school or college.

So to me, it’s kind of that same thing: The information is there. Should it be provided? How do you use it? And, for me it’s: How do we better equip administrators, teachers or parents to stop and think about how to leverage this information? So maybe a kid who’s consistently missing their homework, yes, the parents should have more visibility as part of a support program to get the kid back on track and help them learn the skills. But to Devorah’s point, it doesn’t mean everyone needs to be badging into lunch.

Devorah, your message to parents is: There are all these things happening. There are all these things you have to keep track of. There are lots and lots of risks to kids being on social media, especially teenagers. But you shouldn’t panic. And I wanted to just throw this out to both of you: Instead of panicking, what should parents do? 

Heitner: Carla, you’re talking about creating a new community space for kids that’s more of a learning space, and that’s one alternative. Another alternative, in addition to, or potentially instead of, for parents who don’t have access to that, is just leaning into one or two spaces they really want to mentor their kids in.

Maybe their kid’s really involved in Minecraft. And if they want to join Discord [a free voice, chat, gaming and communications app], the parents are waiting and saying, “O.K. You can join your library Discord or your school Minecraft club on Discord, but not general Discord.”

Two 9-year-olds play the open world computer game Minecraft. Parenting expert Devorah Heitner urges parents to know more about what their kids are doing online without resorting to surveillance. (Getty Images)

Parents will tell me their kids are playing or they’re on YouTube. But I’m like, “What channels?” It’s just like if somebody says, “I’m watching TV.” Well, what are you watching? Because that really is a big differentiator in terms of the experience.

Engelbrecht: It goes back to your “Fear of Messing Up.” I think so much about how it’s important for parents to wade in and get involved with their kids. This has been the advice for decades, whatever the newfangled thing was. I was just doing some writing about encouraging parents to actually dance with their kids. It’s an opportunity to bond. It actually requires some planning and practice. It’s physical activity. I assume most parents are like me: not a great dancer, and it’s uncomfortable and you don’t want to mess up.

But modeling that I’ll do something out of my comfort zone and connect with you over something I know you enjoy can be very simple. It doesn’t mean a parent has to suddenly learn all aspects of Roblox or Discord, because they can be intimidating. Just find an entry point, connect with the child and participate with them. It has so many benefits. That’s true whether they’re into Tonka trucks or Roblox. Parenting means, “Get in there with your kid.”

Devorah, you use the phrase, “Lead with curiosity.”

Engelbrecht: Oh, I love that.

Heitner: You want to be curious and have your kid share their expertise and experience with you, as well as their discernment — what do they like or not like about this app? How would they change it if they could? Staying curious is an alternative to spying — being curious yourself, and asking kids to be curious even about their own experience: Do I actually feel less stressed when I scroll this app? That’s maybe a lot of mindfulness to expect of kids, who have a lot going on and a lot coming at them. But it’s important for all of us to be curious about how our experience is going.

Engelbrecht: That’s one of the ways I’ve been thinking about it from a product perspective: just how to help build in some scaffolds for mindfulness — things like when you start an app, actually having a timer that’s like, “How long do you want to spend on it right now?”

I set a timer for myself when I use TikTok because I can spend a very long time on it. So building that in as a scaffold is a way to start being mindful and thoughtful about it. We’re posting content, but we’re not posting endless scrolls where you could spend all day.

I don’t want to prioritize the traditional tech metric of “time on task.” To me, success is like, “You can come and use Betweened for 20 minutes and then know you can come back another day and there’s lots of interesting stuff for you.” But it’s not all-consuming, must-do-this-all-the-time. And that’s a different perspective on tech products. It’s not how most products are developed.

]]>
As Advocates and Parents Rally, Youth Online Privacy Bills on Life Support
Sept. 14, 2022

Sen. Ed Markey was getting quizzed on the viability of new online privacy laws for children when he took a brief but awkward pause.

The Democrat from Massachusetts, who has long championed consumer privacy and become a key adversary of tech companies like Meta for monetizing user data, joined a Zoom call Tuesday evening to rally support for two bills he said would protect kids from being manipulated by social media algorithms. But he also brought some bad news: The legislation had “stalled” in Washington despite bipartisan support. 

Advocates this week are making a push to get the bipartisan bills — the Kids Online Safety Act and the Children’s Online Privacy Protection Act 2.0 — across the finish line. In a letter on Monday, 145 groups including Fairplay and Common Sense Media urged lawmakers to pass the legislation in the interests of protecting youth mental health, now considered at an all-time low in this country. 




But Markey seemed to lay out a path requiring Herculean effort. 

“Only the paranoid survive,” Markey said, adding that the legislation would pass if its supporters — and youth activists in particular — called their lawmakers and demanded they “pull this out of the pile of issues” and give it priority. “We’re going to try to get it over the finish line, but we need you to just have your energy level go higher and higher for these final couple of months and we will get it done.”

The legislative push comes a year after a Facebook whistleblower disclosed research showing that the social media app Instagram had a harmful effect on youth mental well-being, especially for teenage girls. The whistleblower, Frances Haugen, urged lawmakers to regulate social media companies — Meta owns Facebook and Instagram — which she accused of pursuing “astronomical profits” while knowingly putting their users at risk. The leaked internal research revealed the company knew Instagram made “body image issues worse for one in three teen girls,” who blamed the social media platform for driving “increases in the rate of anxiety and depression” and, for some, suicidal thoughts.

The Kids Online Safety Act would make tech companies liable if they expose young people to content deemed harmful, including materials that promote self-harm, eating disorders and substance abuse. It would also require parental controls that could be used to block adult content, and direct a study of systems to verify users’ age “at the device or operating system level.”

The Children’s Online Privacy Protection Act 2.0, which expands a law that Markey championed in 1998 to cover older teens, would ban targeted advertisements directed at children and require companies to offer an “eraser button” that allows children and teens to remove their personal data.

Former Facebook employee Frances Haugen (Getty Images)

But deep-pocketed tech companies, Sen. Richard Blumenthal said Tuesday, are standing in the way. 

“Our obstacles here are the big tech lobbyists,” he said. “They have armies of lobbyists. They pay them, they pay them very well. They hire them to block this legislation.”

While the legislation is designed to protect kids, some digital privacy experts say the rules could come with significant unintended consequences — and could lead to an age-verification system where all web users are made to submit documentation like a driver’s license, requiring them to hand over personal information to tech companies. 

On the Zoom call to bolster support for the bills was Vinaya Sivakumar, a high school senior from Ohio, who created her first social media profile when she was 12. What started out as harmless, she said, quickly took a toll on her health.

“It just snowballed into something that constantly perpetuated actions and thoughts like self-harm and eating disorders and it was really never let out of my sight,” said Sivakumar, referring to a stream of content she found harmful being fed to her by algorithms. “It almost encouraged me to make decisions that I didn’t necessarily feel were mine and my mental health was in the worst state ever.”

Kristin Bride, a mother and digital safety advocate from Oregon, implored lawmakers to pass the legislation for kids like her 16-year-old son Carson, who died by suicide in 2020 after he was “viciously bullied” by other kids on Snapchat who used third-party apps to conceal their identities. Last year, Bride sued Snap, the company that owns the social media app Snapchat, accusing it of lacking safeguards to protect children from harassment. In response, Snap suspended two of the apps, Yolo and LMK. But another anonymous messaging app, NGL, has since cropped up.

“Until social media companies are held accountable for their harmful products, they will always put profit over people,” Bride said, “and kids like Carson and so many others are just collateral damage.” 

Despite the heightened focus in Washington around digital rights and tech companies’ use of user data for targeted advertising, broader digital privacy legislation has also struggled this year. The American Data Privacy and Protection Act, which would create a national digital privacy standard and limit the personal data that tech companies can collect about users, has hit roadblocks, including opposition from House Speaker Nancy Pelosi.

Earlier this month, Ireland’s Data Protection Commission fined Meta for violating European Union data privacy laws. The commission has been investigating the company over an Instagram setting that made teenagers’ profiles public by default.

Meanwhile, Meta has begun to roll out new safety measures for teens, including a default that automatically routes new users younger than 16 to a version of Instagram with limits on content deemed inappropriate.

The children’s safety legislation, which would strengthen rules that haven’t been updated for decades, has received support from a broad range of groups focused on youth well-being, including the American Psychological Association and The Jed Foundation. But it has drawn criticism from digital rights advocates, including the Electronic Frontier Foundation, which argued that while lawmakers deserve credit “for attempting to improve online data privacy for young people,” the plan would ultimately “require surveillance and censorship” of children and teens “and would greatly endanger the rights, and safety, of young people online.”

“Data collection is a scourge for every internet user, regardless of age,” the report notes, but the legislation could ultimately force tech companies to further track their users. “Surveillance of young people is harmful, even in the healthiest household, and is not a solution to helping young people navigate the internet.”

Disclosure: Campbell Brown oversees global media partnerships at Meta. Brown co-founded Ӱ  and sits on its board of directors.

]]>
Does Your School Have a ‘Slander’ Account?
May 25, 2022

Even at Stuyvesant High School, one of the most academically rigorous and sought-after public schools in New York City, teenage gossip is, well, teenage gossip: who’s crushing on who, who just broke up, who’s the cutest in the grade.

But rather than comments whispered in hallways, students frequently share those juicy nuggets through anonymous online “confessions” accounts on Facebook and Instagram that much of the student body follows religiously.




“People will be talking about it, like, ‘Did you just see the new confession?’ ” said Samantha Farrow, a junior at Stuy.

Many confessions are harmless — complimenting a classmate’s smile or admitting apprehension about prom — but others target and bully students. In Farrow’s freshman year, a post called her and two peers overweight and unattractive. Dozens of students came to their defense, she said, reassuring them the insult was completely untrue. But still, the post affected her.

“I was mad and I was upset,” Farrow remembered. “It was very degrading to my self-esteem as a 14-year-old.”

Accounts like Stuy Confessions are hardly rare, students across the country report. Though the pages predate the pandemic, lockdown may have increased their popularity and influence as teens lost the ability to connect in person for months on end.

When schools en masse shifted online, much of young people’s socializing also migrated into virtual spaces like Discord servers, Google Hangouts and TikTok. Now two years later, even as pandemic restrictions have fallen across the country, many online communities remain, students say, and impact K-12 classrooms in ways that adults fail to understand.

“It’s really going over [educators’] heads,” Farrow told Ӱ. “So much stuff happens on Facebook and Instagram, the confessions accounts, and they have no idea.”

(Photo courtesy of Samantha Farrow)

“What people post on social media kinda seeps into the classroom,” she added.

In fall 2021, when Diego Camacho’s Los Angeles high school returned to in-person learning, students began taking pictures of their peers — sometimes while they ate, sometimes of their shoes under the bathroom stall — and posting them online anonymously, without consent, he told Ӱ.

He and other students “were constantly looking over our shoulders, looking around when we ate and some [of us] refused to use the bathroom out of fear [we] would end up on the pages,” said the high school senior. 

It took school administration two months to shut down the account, he said. While the page was active, it “created a lot of distrust between students,” said Camacho.

Stuyvesant Confessions on Facebook (Screengrab)

At Mia Miron’s middle school in nearby Pomona, California, Instagram pages of a similar style continue to pop up despite old accounts getting banned on numerous occasions, she said. With page titles based on the phrase “Lorbeer Lookalikes,” a play on their school’s name, users send photos they took of classmates to the accounts via direct message, and the page administrator then posts the images without indicating who submitted them.

“I just followed it to make sure nobody that I know would get hurt by not knowing their photo was on there,” explained Miron. 

Twice, the accounts have shared pictures of her sitting at her desk. The eighth grader doesn’t know who runs the account, she said, and did not give consent for those images to be posted. 

“I wouldn’t like my photo to be on there without my permission,” she told Ӱ.

While Miron says she hasn’t taken the posts personally, a friend of hers was cyberbullied on the page, she said, which took a toll on the middle schooler’s mental health. 

Ӱ spoke with eight students in 6th through 12th grade and one college student about their experience of social media’s impact on education post-COVID. Most agreed that lockdown initially forced them to lean more heavily on online platforms to stay connected with peers and that some of those habits have since stuck around.

But the proliferation of online content and connection has also delivered some positive effects, students emphasized.

Kota Babcock, a senior at Colorado State University, said his roommate joined a pandemic Discord server they still use for weekly horror movie screenings. High schooler Ameera Eshtewi, of Portland, Oregon, hones her programming skills as a member of an online coding community. And Joshua Oh, a Gambrills, Maryland, middle schooler, said Instagram, Snapchat and Twitter helped him and his peers quickly spread the word to wear pink in support of victims of an alleged sexual assault at a nearby high school.

A satirical TikTok account circulated within Oh’s student body pokes fun without crossing a line, the teen said. The “slander” page posts videos about students and teachers that he finds “funny when they are true.”

One video of a cowboy coughing heavily and falling down on a train track is captioned, “What Lois thinks will happen if she doesn’t have gum for 00000.1 seconds.” Another video, with the caption “Brandon trying to convince his ex to take him back,” features a man in the rain set to a Lil Nas X song.

In a key difference from the pages at Miron and Camacho’s schools, none of the videos include images of actual students. 

And in Pomona, as a counter to some of the online toxicity within Miron’s middle school, a student also created a school-based TikTok account featuring an “appreciation post for the girls that got put down on that other Lorbeer account.” The post sets pictures of students’ smiling faces to B.o.B’s Nothing on You.

Instagram and other social media can have degrading effects on youth mental health, including eating disorders and suicidal ideation, particularly for teen girls bombarded with unhealthy body image standards. Facebook (now Meta), Instagram’s parent company, has tracked the harms for years, internal documents reported by The Wall Street Journal show, but has implemented few measures to curb the addictiveness of its app, as teen users have driven much of its popularity.

Even when students use accounts to uplift each other, ZaNia Stinson, a high school student in Charlotte, North Carolina, said that her and her peers’ dependence on social media often makes them less present IRL — in real life.

Teachers often collect phones during class, she said, and when the devices get returned afterward, “we don’t pay attention in the halls so we bump into people, like our heads are glued to [our] phones.”

During free periods at Stuyvesant, said Farrow, students will often sit next to each other in the hallway without saying a word, just scrolling. The tendency to ignore human contact in favor of digital interaction, she believes, has worsened since COVID. From time to time, she admits, she herself pulls up Instagram during class without the teacher knowing.

Yet one online outlet has provided consistent solace for her since early in the pandemic. In June 2020, the high schooler created a Twitter stan account, or fan account, for K-pop megastars BTS, who she jokingly described as her “biggest passion in life.” She has fun chatting with other fans of the group and appreciates the low stakes because she doesn’t know any of the other users in real life, she said.

Social media is “a good outlet if you know how to use it the right way,” said Farrow. “But I don’t think a lot of people do.”

This story was brought to you via Ӱ’s Student Council initiative, an effort to boost youth voices in our reporting. America’s Promise Alliance helped in the recruiting of our diverse 11-member council and the idea was conceived as part of Asher Lehrer-Small’s Poynter-Koch Media and Journalism Fellowship.

]]>