
California online safety law requires tech companies to take steps to protect children’s mental health

Research shows that more and more young Americans are dealing with mental health issues, and technology is partly to blame. A new California law is forcing tech companies to do more to protect children’s privacy and data online. The measure could pave the way for similar laws elsewhere. (Photographic illustration by Alexia Faith/Cronkite News)

PHOENIX — The word “crisis” dominates headlines about children’s mental health these days, with experts and advocates pointing the finger at one factor in particular: social media.

Following recent reports about the impact of platforms like Instagram on the well-being of teens, several groups have sued tech companies, and in September California enacted a first-in-the-nation law requiring companies to do more to protect children’s privacy and data online.

Dr. Jenny Radesky, a behavioral and developmental pediatrician who studies the intersection between technology and child development, has seen firsthand what this youth crisis looks like.

“The media is now so good at interacting with our psychology that you can imagine that sometimes they’ll play to our strengths, but other times they’ll play to our weaknesses,” said Radesky, an assistant professor at the University of Michigan Medical School.

“We see it with tons of referrals to the medical center for everything from eating disorders to suicide attempts. There is no doubt that there is a mental health crisis that existed before the pandemic – and it is only getting worse.”

A U.S. surgeon general advisory, published in December, warns that more and more young Americans are facing mental health issues. The COVID-19 pandemic is to blame, it said, but so is technology, as young people are “bombarded with messages through the media and popular culture that erode their self-esteem – telling them that they aren’t handsome enough, popular enough, smart enough or rich enough.”

Tech companies tend to prioritize engagement and profit over protecting users’ health, the advisory found, using methods that can increase the time children spend online and, in turn, contribute to anxiety, depression, eating disorders and other problems.

During the pandemic, the time young people spent in front of screens for activities not related to schoolwork rose from an average of 3.8 hours per day to 7.7 hours per day.

Radesky is particularly concerned about social media apps that use algorithms to continuously deliver content to the user. Her concern deepens when a child’s viewing habits or online behaviors reveal something about them to the platform. A user who constantly interacts with violent videos may signal to the app that they are somewhat impulsive, for example.

She noted that TikTok and Instagram use such algorithms, while on platforms like Twitch and Discord users have to search for content.

“Automated systems don’t always detect when they’re serving something that could potentially go — for a child, a teenager, or even an adult — into territory that’s not in their best interest,” Radesky said.

“We need the digital ecosystem to respect that children need space and time away from technology and to engage with positive and hopeful content.”

As the nation searches for remedies, California has passed a law that could serve as a model for other states.
The bipartisan legislation was sponsored by Assemblymembers Buffy Wicks, D-Oakland, and Jordan Cunningham, R-San Luis Obispo. It prohibits companies that provide online services from accessing children’s personal information, collecting or storing location data of younger users, profiling a child, and encouraging children to provide personal information.

A working group will determine how best to implement the policies by January 2024.

The measure, billed as a first in the United States, was modeled on a similar law passed last year in the United Kingdom, where the government imposed 15 standards that tech companies, particularly those that collect children’s data, must follow.

Common Sense Media, a San Francisco nonprofit that advocates for safe and responsible media use by children, supported the California measure. Irene Ly, the organization’s policy counsel, called it a first step toward forcing tech companies to adopt changes that make the internet safer for children.


Ly said the companies make “intentional design choices” to boost engagement, such as automatically playing videos as users scroll and using algorithms to serve targeted content, and argued that companies are more than capable of making changes that protect young users.

“It’s high time companies made some of these simple and necessary changes, like giving young users the most protective privacy settings by default and not automatically tracking their precise location,” Ly said.

Ly said protecting privacy goes hand in hand with protecting mental health, given that teens are particularly vulnerable to the influences of online content.

“They won’t develop critical thinking, or the ability to distinguish between what’s an advertisement and what’s content, until they’re older. That makes them really ill-equipped to assess what they see and the impact it may have on them.”

Ly cited an April report from Fairplay, a watchdog group focused on marketing to children, which found that Instagram’s algorithm was promoting eating disorder accounts that had attracted 1.6 million unique followers.

“Algorithms profile children and teens to provide images, memes and videos encouraging restrictive diets and extreme weight loss,” the report said. “And in turn, Instagram promotes and recommends child and adolescent eating disorder content to half a million people around the world.”

The report caught the attention of members of Congress, who demanded answers from Meta, the parent company of Instagram, and its CEO, Mark Zuckerberg.

The Social Media Victims Law Center subsequently filed a lawsuit against Meta on behalf of Alexis Spence, a California teenager who developed an eating disorder, as well as anxiety and depression, when she was just 11 years old.

The lawsuit alleges that Alexis was steered toward Instagram pages promoting anorexia, negative body image and self-harm, and argues that Instagram’s algorithm is designed to be addictive and specifically targets preteens.

It is one of many similar lawsuits against tech companies filed after Frances Haugen, a former product manager at Meta, leaked internal documents in 2021 suggesting the company was aware of the harmful content its algorithms were pushing.

In a September 2021 statement, Meta said it has taken steps to reduce harm to young people, including introducing new resources for people struggling with body image issues, updating policies to remove graphic content related to suicide, and launching an Instagram feature that lets users shield themselves from unwanted interactions to reduce bullying.

“We have a long history of using our research…to inform changes to our apps and provide resources for the people who use them,” the company said.

And in a Facebook post last year, Zuckerberg said: “The reality is that young people are using technology. … Tech companies need to create experiences that meet their needs while keeping them safe. We are deeply committed to doing cutting-edge work in this area.”

Dylan Hoffman is an executive director for TechNet, a network of technology executives representing about 100 companies. Although the organization supports protecting children online, it had some concerns about the new California measure, he said.

One provision requires companies to estimate the ages of child users “with a reasonable level of certainty,” and Hoffman worries that those verification steps could affect adults seeking legal content.

“The bill defines children to mean anyone under the age of 18, which could create problems,” he said, noting that TechNet had lobbied to narrow the bill’s definition of “children” to users under 16. “What does it mean for companies to identify the age of their users? Are they required to more strictly and rigorously verify the age and identity of their users?”

That, he said, “could have a number of consequences,” not only for children’s access but for adults’ access as well.

Radesky hopes that as conversations continue about the pros and cons of using social media, the media will portray children’s mental health as an issue that everyone should address, not just parents.

“Hopefully in the future, as the press continues to cover this … they really start to focus more on changing the tech environment and what tech companies can do better,” she said.

A federal measure calling on tech companies to implement new safeguards for children was introduced in Congress earlier this year. But with the Kids Online Safety Act still pending, Radesky noted that California’s measure will serve as a test case for companies and for young people.

“You’re going to have this group of kids in California to watch to see: How well is tech doing this? It’s all about enforcement and tech companies really listening to their kid design teams,” she said.

Ultimately, Radesky added, businesses also need to start thinking of these laws not as regulations, but “more like cleaning up that area of the neighborhood that’s filled with trash.”


