Social media titans add teen safety features. Will they work?

Growing concern about the effects of social media on teens has led three social media giants, Snapchat, Instagram, and TikTok, to implement new parental control options. But is it enough to change online behaviors?

A phone in New York shows the Snapchat logo, Aug. 9, 2017. Snapchat, the third most used app by American teens, is one of the social media platforms taking strong measures to increase practical safety and mental health safety for its teen users.

Richard Drew/AP

August 11, 2022

As concerns about social media's harmful effects on teens continue to rise, platforms from Snapchat to TikTok to Instagram are bolting on new features they say will make their services safer and more age appropriate. But the changes rarely address the elephant in the room: the algorithms pushing endless content that can drag anyone, not just teens, into harmful rabbit holes.

The tools do offer some help, such as blocking strangers from messaging kids. But they also share some deeper flaws, starting with the fact that teenagers can get around limits if they lie about their age. The platforms also place the burden of enforcement on parents. And they do little or nothing to screen for inappropriate and harmful material served up by algorithms that can affect teens' mental and physical well-being.

"These platforms know that their algorithms can sometimes be amplifying harmful content, and they're not taking steps to stop that," said Irene Ly, privacy counsel at the nonprofit Common Sense Media. The more teens keep scrolling, the more engaged they get, and the more engaged they are, the more profitable they are to the platforms, she said. "I don't think they have too much incentive to be changing that."

Take, for instance, Snapchat, which on Tuesday introduced new parental controls in what it calls the "Family Center," a tool that lets parents see who their teens are messaging, though not the content of the messages themselves. One catch: both parents and their children have to opt into the service.

Nona Farahnik Yadegar, Snap's director of platform policy and social impact, likens it to parents wanting to know who their kids are going out with.

If kids are headed out to a friend's house or are meeting up at the mall, she said, parents will typically ask, "Hey, who are you going to meet up with? How do you know them?" The new tool, she said, aims to give parents "the insight they really want to have in order to have these conversations with their teen while preserving teen privacy and autonomy."

These conversations, experts agree, are important. In an ideal world, parents would regularly sit down with their kids and have honest talks about social media and the dangers and pitfalls of the online world.

But many kids use a bewildering variety of platforms, all of which are constantly evolving, and that stacks the odds against parents expected to master and monitor the controls on multiple platforms, said Josh Golin, executive director of the children's digital advocacy group Fairplay.

"Far better to require platforms to make their platforms safer by design and default instead of increasing the workload on already overburdened parents," he said.

The new controls, Mr. Golin said, also fail to address myriad existing problems with Snapchat. These range from kids misrepresenting their ages to "compulsive use" encouraged by the app's Snapstreak feature to cyberbullying made easier by the disappearing messages that still serve as Snapchat's claim to fame.

Ms. Farahnik Yadegar said Snapchat has "strong measures" to deter kids from falsely claiming to be over 13. Those caught lying about their age have their accounts immediately deleted, she said. Teens who are over 13 but pretend to be even older get one chance to correct their age.

Detecting such lies isn't foolproof, but the platforms have several ways to get at the truth. For instance, if a user's friends are mostly in their early teens, it's likely that the user is also a teenager, even if they said they were born in 1968 when they signed up. Companies use artificial intelligence to look for such age mismatches. A person's interests might also reveal their real age. And, Ms. Farahnik Yadegar pointed out, parents might discover their kids were fibbing about their birth date if they try to turn on parental controls but find their teens ineligible.

Child safety and teen mental health are front and center in both Democratic and Republican critiques of tech companies. States, which have been much more aggressive about regulating technology companies than the federal government, are also turning their attention to the matter. In March, several state attorneys general launched a nationwide investigation into TikTok and its possible harmful effects on young users' mental health.

TikTok is the most popular social media app among U.S. teenagers, according to a new report out Wednesday from the Pew Research Center, which found that 67% say they use the Chinese-owned video-sharing platform. The company has said that it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. It says features such as a screen-time management tool help young people and parents moderate how long children spend on the app and what they see. But critics note such controls are leaky at best.

"It's really easy for kids to try to get past these features and just go off on their own," said Ms. Ly of Common Sense Media.

Instagram, which is owned by Facebook parent Meta, is the second most popular app with teens, Pew found, with 62% saying they use it, followed by Snapchat with 59%. Not surprisingly, only 32% of teens reported ever having used Facebook, down from 71% in 2014 and 2015, according to the report.

Last fall, former Facebook employee turned whistleblower Frances Haugen exposed internal research from the company concluding that the social network's attention-seeking algorithms contributed to mental health and emotional problems among Instagram-using teens, especially girls. That revelation led to some changes; Meta, for instance, scrapped plans for an Instagram version aimed at kids under 13. The company has also introduced new parental control and teen well-being features, such as nudging teens to take a break if they scroll for too long.

Such solutions, Ms. Ly said, are "sort of getting at the problem, but basically going around it and not getting to the root cause of it."

This story was reported by The Associated Press.