AI Tools and Student Privacy: 9 Tips for Teachers

Wed, 01 Jan 2025

This article was originally published in Chalkbeat.

Since the release of ChatGPT to the public in November 2022, the number of AI tools has skyrocketed, and there are now many advocates for the potential changes AI can cause in education.

But districts have not been as fast in providing teachers with training. As a result, many are experimenting without any guidance.

To learn about how teachers and other educators can protect student data and abide by the law when using AI tools, Chalkbeat consulted documents and interviewed specialists from school districts, nonprofits, and other groups. Here are nine suggestions from experts.




Consult with your school district about AI

Navigating the details of each tool's privacy policy can be challenging for a teacher. Some districts list tools that they have vetted or with which they have contracts.

Give preference to these tools, if possible, and check whether your district has any recommendations about how to use them. When a tool is under contract with a school or a district, the company is supposed to protect students' data and follow federal and state law. Checking with your school's IT or education technology department is also a good option.

It is also essential to find out whether your school or district has guidelines or policies for the general use of AI. These documents usually address privacy risks and ethical questions.

Check for reviews of AI platforms' safety

Several organizations review ed-tech tools and provide feedback on their safety.

Be careful when platforms say they comply with laws like the Family Educational Rights and Privacy Act, or FERPA, and the Children's Online Privacy Protection Rule. According to the law, the school is ultimately responsible for children's data and must be aware of any information it shares with a third party.

Study the AI platform's privacy policy and terms

The privacy policy and the terms of use should provide some answers about how a company uses the data it collects from you. Make sure to read them carefully, and look for some of the following information:

  • What information does the platform collect?
  • How does the platform use the collected data? Is it used to determine which ads it will show you? Does it share data with any other company or platform?
  • For how long does it keep the collected data?
  • Is the data it collects used to train the AI model?

The list of questions that Common Sense Media uses for its privacy evaluations is publicly available.

You should avoid signing up for platforms that collect large amounts of data or that are not clear in their policies. One potential red flag: vague claims about "retaining personal information for as long as necessary" and "sharing data with third parties to provide services."

Bigger AI platforms can be safer

Big companies like OpenAI, Google, Meta, and others are under more scrutiny: NGOs, reporters, and politicians tend to investigate their privacy policies more frequently. They also have bigger teams and resources that allow them to invest heavily in compliance with privacy regulations. For these reasons, they tend to have better safeguards than small companies or start-ups.

You still have to be careful. Most of these platforms are not explicitly intended for educational purposes, making them less likely to create specific policies regarding student or teacher data.

Use the tools as an assistant, not a replacement

Even though these tools provide better results when you input more information, try to use them for tasks that don't require much information about your students.

AI tools can help suggest questions to ask about a book, set up document templates, such as an Individualized Education Program plan or a behavioral assessment, or create assessment rubrics.

But even tasks that seem mundane can increase risks. For example, providing the tool with a list of students and their grades on a specific assignment and asking it to sort the list alphabetically could violate student privacy.

Turn on maximum privacy settings for AI platforms

Some tools allow you to adjust your privacy settings. Look online for tutorials on the most private settings for the tool you are using and how to activate them. Some platforms, for example, allow users to stop their data from being used to train AI models.

Doing this does not necessarily make AI tools completely safe or compliant with student privacy regulations.

Never input personal information into AI platforms

Even if you take all the steps above, do not input student information. Restricted information can include:

  • Personal information: a student's name, Social Security number, education ID, names of parents or other relatives, address and phone number, place of birth, or any other information that can be used to identify a student.
  • Academic records: reports about absences, grades, and student behavior at school, student work, and teachers' feedback on and assessments of student work.

This may be harder than it sounds.

If teachers upload student work to a platform to get help with grading, for example, they should remove all identification, including the student's name, and replace it with an alias or random number that can't be traced back to the student. It's also wise to ensure the students haven't included any personal information, like their place of birth, where they live or personal details about their families, friends, religious or political inclination, sexual orientation, and club affiliations.
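The alias step above can be sketched in a few lines of code. This is a minimal illustration, not a vetted tool: the function name, the sample essay, and the simple name-matching approach are all hypothetical, and real student work would need more thorough redaction than a single string replacement.

```python
# Sketch: swap student names for random, untraceable aliases before any
# text is shared with an AI tool. The alias map should stay on
# school-managed storage and never be uploaded alongside the work.
import secrets


def pseudonymize(texts_by_student):
    """Replace each student's name with a random alias in their own work.

    Returns the redacted texts keyed by alias, plus the name-to-alias map
    (which must be kept private so aliases can't be traced back).
    """
    alias_map = {}   # name -> alias; keep this local and confidential
    redacted = {}
    for name, text in texts_by_student.items():
        alias = alias_map.setdefault(name, f"Student-{secrets.token_hex(4)}")
        # Remove the student's name wherever it appears in the submission.
        redacted[alias] = text.replace(name, alias)
    return redacted, alias_map


essays = {"Jane Doe": "My name is Jane Doe and I live near the school."}
safe_essays, mapping = pseudonymize(essays)
```

Only `safe_essays` would ever leave the teacher's machine; `mapping` is what lets the teacher match AI feedback back to the right student afterward.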

One exception is platforms that are approved by the school or district and hold contracts with them.

Be transparent with others about using AI

Communicate with your school supervisors, principal, parents, and students about when and how you use AI in your work. That way, everyone can ask questions and bring up concerns you may not know about.

It is also a good way to model behavior for students. For example, if teachers ask students to disclose when they use AI to complete assignments, being transparent in turn about their own use of AI can foster a better classroom environment.

If uncertain, ask AI platforms to delete information

In some states, the law says platforms must delete users' information if they request it. And some companies will delete it even if you aren't in one of these states.

Deleting the data may be challenging and may not solve all of the problems caused by misusing AI. Some companies take a long time to respond to deletion requests or find loopholes to avoid deleting the data.

The tips listed above come from guidance published by the American Federation of Teachers; a report by the U.S. Department of Education's Office of Educational Technology; and the criteria used by Common Sense Media to carry out its privacy evaluations.

Additional help came from Calli Schroeder, senior counsel and global privacy counsel at the Electronic Privacy Information Center; Brandon Wilmart, director of educational technology at Moore Public Schools in Oklahoma; and Anjali Nambiar, education research manager at Learning Collider.

This story was originally published by Chalkbeat, a nonprofit news site covering educational change in public schools. Sign up for their newsletters.

As Advocates and Parents Rally, Youth Online Privacy Bills on Life Support

Wed, 14 Sep 2022

Sen. Ed Markey was getting quizzed on the viability of new online privacy laws for children when he took a brief but awkward pause.

The Democrat from Massachusetts, who has long championed consumer privacy and become a key adversary of tech companies like Meta for monetizing user data, joined a Zoom call Tuesday evening to rally support for two bills he said would protect kids from being manipulated by social media algorithms. But he also brought some bad news: The legislation had "stalled" in Washington despite bipartisan support.

Advocates this week are making a push to get the bipartisan bills, the Kids Online Safety Act and the Children's Online Privacy Protection Act 2.0, across the finish line. In a letter on Monday, 145 groups including Fairplay and Common Sense Media urged lawmakers to pass the legislation in the interests of protecting youth mental health, now considered at an all-time low in this country.




But Markey seemed to lay out a path requiring Herculean effort. 

"Only the paranoid survive," Markey said, adding that the legislation would pass if its supporters, and youth activists in particular, called their lawmakers and demanded they "pull this out of the pile of issues" and give it priority. "We're going to try to get it over the finish line, but we need you to just have your energy level go higher and higher for these final couple of months and we will get it done."

The legislative push comes a year after a Facebook whistleblower disclosed research showing that the social media app Instagram had a harmful effect on youth mental well-being, especially for teenage girls. The whistleblower, Frances Haugen, called for regulation of social media companies like Meta, which owns Facebook and Instagram and which she accused of pursuing "astronomical profits" while knowingly putting its users at risk. Leaked internal research revealed the company knew Instagram made "body image issues worse for one in three teen girls," who blamed the social media platform for driving "increases in the rate of anxiety and depression" and, for some, suicidal thoughts.

The Kids Online Safety Act would make tech companies liable if they expose young people to content deemed harmful, including materials that promote self-harm, eating disorders and substance abuse. It would also require parental controls that could be used to block adult content and to study systems to verify users' age "at the device or operating system level."

The Children's Online Privacy Protection Act 2.0, which expands a law that Markey championed in 1998 to cover older teens, would ban targeted advertisements directed at children and require companies to offer an "eraser button" that allows children and teens to remove their personal data.

Former Facebook employee Frances Haugen (Getty Images)

But deep-pocketed tech companies, Sen. Richard Blumenthal said Tuesday, are standing in the way. 

"Our obstacles here are the big tech lobbyists," he said. "They have armies of lobbyists. They pay them, they pay them very well. They hire them to block this legislation."

While the legislation is designed to protect kids, some digital privacy experts say the rules could come with significant unintended consequences, and could lead to an age-verification system in which all web users are made to submit documentation like a driver's license, requiring them to hand over personal information to tech companies.

On the Zoom call to bolster support for the bills was Vinaya Sivakumar, a high school senior from Ohio, who created her first social media profile when she was 12. What started out as harmless, she said, quickly took a toll on her health.

"It just snowballed into something that constantly perpetuated actions and thoughts like self-harm and eating disorders and it was really never let out of my sight," said Sivakumar, referring to a stream of content she found harmful being fed to her by algorithms. "It almost encouraged me to make decisions that I didn't necessarily feel were mine and my mental health was in the worst state ever."

Kristin Bride, a mother and digital safety advocate from Oregon, implored lawmakers to pass the legislation for kids like her 16-year-old son Carson, who died by suicide in 2020 after he was "viciously bullied" by other kids on Snapchat who used third-party apps to conceal their identities. Last year, Bride sued Snap, the company that owns the social media app Snapchat, and accused it of lacking safeguards to protect children from harassment. In response, Snap suspended two of the apps, Yolo and LMK. But another anonymous messaging app, NGL, has since cropped up.

"Until social media companies are held accountable for their harmful products, they will always put profit over people," Bride said, "and kids like Carson and so many others are just collateral damage."

Despite the heightened focus in Washington around digital rights and tech companies' use of user data for targeted advertising, broader digital privacy legislation has also struggled this year. The American Data Privacy and Protection Act, which would create a national digital privacy standard and limit the personal data that tech companies can collect about users, has hit roadblocks, including opposition from House Speaker Nancy Pelosi.

Earlier this month, Ireland's Data Protection Commission fined Meta for violating European Union data privacy laws. The commission has been investigating the company over an Instagram setting that automatically sets the profiles of teenagers as public by default.

Meanwhile, Meta has begun to roll out new safety features for teens, including one that automatically routes new users younger than 16 to a version with limits on content deemed inappropriate.

The children's safety legislation, which would strengthen rules that haven't been updated for decades, has received support from a broad range of groups focused on youth well-being, including the American Psychological Association and The Jed Foundation. But it has drawn criticism from digital rights advocates, including the Electronic Frontier Foundation, which argued that while lawmakers deserve credit "for attempting to improve online data privacy for young people," the plan would ultimately "require surveillance and censorship" of children and teens "and would greatly endanger the rights, and safety, of young people online."

"Data collection is a scourge for every internet user, regardless of age," the report notes, but the legislation could ultimately force tech companies to further track their users. Surveillance of young people, the report adds, causes harm even in the healthiest household, and is not a solution to helping young people navigate the internet.

Disclosure: Campbell Brown oversees global media partnerships at Meta. Brown co-founded The 74 and sits on its board of directors.
