Social media platforms are giving parents more controls. But do they help?

As concerns about the harmful effects of social media on teens continue to mount, platforms from Snapchat to TikTok to Instagram are rolling out new features they say will make their services safer and more age-appropriate. But the changes rarely address the elephant in the room: the algorithms that push endless content, which can drag anyone, not just teens, down harmful rabbit holes.

The tools do offer some help, such as blocking strangers from messaging kids. But they also share some deeper flaws, starting with the fact that teens can get around the restrictions by lying about their age. The platforms also put the enforcement burden on parents. And they do little or nothing to screen for the inappropriate and harmful material served up by algorithms, which can affect teens’ mental and physical well-being.

“These platforms know that their algorithms can sometimes amplify harmful content, and they don’t take steps to stop that,” said Irene Ly, policy counsel at the nonprofit Common Sense Media. The more teens keep scrolling, the more engaged they become, and the more engaged they are, the more profitable they are for the platforms, she said. “I don’t think they have too much incentive to change that.”

Take Snapchat, for example, which on Tuesday introduced new parental controls in what it calls the “Family Center,” a tool that lets parents see who their teens are messaging, though not the content of the messages themselves. One catch: both parent and teen have to sign up for the service.

Nona Farahnik Yadegar, Snap’s director of platform policy and social impact, compares it to parents wanting to know who their kids are hanging out with.

When kids head to a friend’s house or meet up at the mall, parents will usually ask, “Hey, who are you going to meet? How do you know them?” The new tool, she said, aims to give parents “the insight they really want to have to have these conversations with their teen while preserving teens’ privacy and autonomy.”

These conversations, experts agree, are important. In an ideal world, parents would regularly sit down with their children and have honest conversations about social media and the dangers and pitfalls of the online world.

But many kids use a bewildering array of platforms, all of them constantly evolving, which raises the odds that parents will have to master and monitor controls across multiple platforms, said Josh Golin, executive director of the children’s digital advocacy group Fairplay.

“It would be far better to require platforms to make their platforms safer by design and by default instead of adding to the workload of already overburdened parents,” he said.

The new controls, Golin said, also fail to address many of Snapchat’s existing problems, from kids misrepresenting their ages to the “compulsive use” encouraged by the app’s Snapstreak feature to cyberbullying made easier by the disappearing messages that remain Snapchat’s claim to fame.

Farahnik Yadegar said Snapchat has taken “strong measures” to prevent children from falsely claiming to be over 13. Those caught lying about their age will have their accounts deleted immediately, she said. Teens who are over 13 but pretend to be even older get one chance to correct their age.

Detecting such lies isn’t foolproof, but platforms have several ways to ferret out the truth. If a user’s friends are mostly in their early teens, for example, the user is probably also a teenager, even if they said they were born in 1968 when they signed up. Companies also use artificial intelligence to look for age mismatches, and a person’s interests can reveal their real age. In addition, Farahnik Yadegar pointed out, parents sometimes discover that their kids fudged their birthdates when they try to turn on parental controls, only to find their teens ineligible.

Child safety and teen mental health have been central to both Democratic and Republican criticism of tech companies. States, which have been far more aggressive than the federal government in regulating tech companies, are also turning their attention to the issue. In March, a group of state attorneys general launched a nationwide investigation into TikTok and its possible harmful effects on young users’ mental health.

TikTok is the most popular social app among U.S. teens, according to a Pew Research Center report released Wednesday, which found that 67% say they use the Chinese-owned video-sharing platform. The company has said it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. It says features such as a screen-time management tool help young people and parents moderate how long kids spend on the app and what they see. But critics note that such controls are leaky at best.

“It’s really easy for kids to get around these features and just keep going on their own,” said Ly of Common Sense Media.

Instagram, owned by Facebook parent Meta, is the second most popular app among teens, Pew found, with 62% saying they use it, followed by Snapchat at 59%. Notably, only 32% of teens said they use Facebook, down from 71% in 2014 and 2015, according to the report.

Last fall, Frances Haugen, a former Facebook employee turned whistleblower, revealed internal company research concluding that the social network’s attention-seeking algorithms contributed to mental health and emotional problems among teens who use Instagram, especially girls. The revelation led to some changes: Meta, for example, shelved plans for an Instagram version for children under 13. The company has also introduced new parental controls and teen well-being features, such as nudging teens to take a break if they scroll for too long.

Such solutions, Ly said, are “sort of tackling the problem, but actually going around it and not getting to the root of it.”

Copyright 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.
