Social Media Verdict Sparks Fear Laws for Kids Will Bring Censorship

Minutes after jurors delivered a multimillion-dollar verdict at the landmark social media addiction trial in Los Angeles last week, bereaved parents and supporters celebrating the victory over Big Tech faced tough questions about what comes next.

On the courthouse steps, they were asked to weigh in on possible age-verification requirements amid fears that some proposed regulations could backfire with unintended consequences. Julianna Arnold, whose 17-year-old daughter Coco died in 2022 after a man she met on Instagram gave her fentanyl, walked up to the microphone. “They’re still new, and we’re figuring out how they would work,” she said. “I think they have value, but I’d really still like to see these companies do what’s needed to design their platforms safely for kids.”

The question clashed with the celebratory mood, but days later, Illinois Gov. JB Pritzker was on social media himself, promoting his state’s “Children’s Social Media Safety Act,” a bill that would mandate strict age-verification rules for users in the state.

Commentaries in The New York Times and Techdirt sounded the alarm, warning that the hard-fought momentum around protecting kids online should actually scare the hell out of anyone who truly values free speech. Parents across the country who cheered the verdict, confident in their belief social media exerts a far stronger pull on kids’ attention than concerns like unlimited television, suddenly found themselves cast as clueless suckers who fell for another moral panic.

When The Daily Show tackled the issue Monday night, Jon Stewart pressed online privacy expert Cindy Cohn on whether it’s a “false choice” to say laws aimed at protecting kids online can’t coexist with the First Amendment. Stewart compared social media to “secondhand smoke” pumped out by reckless algorithms that create a “toxic soup” drowning out real speech. Cohn, who leads the Electronic Frontier Foundation, said it’s not a “regulate or not” choice, but a choice about who gets the power. She’d rather see a sweeping consumer privacy law than regulations that could leave a partisan government official — “a second Brendan Carr,” in her words — effectively policing social media. (As chair of the Federal Communications Commission and a vocal ally of President Trump, Carr previously pressured ABC to temporarily take Jimmy Kimmel off the air.)

For critics of Big Tech who also fear censorship, the challenge ahead is coming into focus: Is it possible to regulate how platforms deliver content to children without infringing on free expression, and without giving platforms an incentive to gather even more personal data in order to assess their liability?

‘Parents Are Correctly Frustrated’

India McKinney, director of federal affairs at EFF, tells Rolling Stone that any new laws singling out children would inherently make the problem worse. “They have to identify the population to know who gets special protections,” she argues, and she sees no good “privacy-protected way” to do that. Worse, she says, if companies face liability for content shown to minors that’s later linked to harms like depression or sexual exploitation, they’ll have every incentive to collect and store more user data — not less.

“Parents are correctly frustrated. There’s this feeling these billionaires do not have our best interests at heart and are doing stuff to make themselves richer at our expense. That’s a completely understandable perspective. But the First Amendment still exists,” McKinney warns. She worries that the leading piece of federal legislation backed by the families and legal advocates now suing the social media companies — the latest Kids Online Safety Act (KOSA) now stalled in the Senate — could have a “chilling effect” on important, even life-saving content, for some of social media’s most vulnerable users. She fears the apps might try to minimize their exposure to lawsuits by throttling helpful information about reproductive rights, LGBTQ issues, and other potentially sensitive but vital topics in teenagers’ feeds. So-called “duty of care” language in bills, including the Senate version of KOSA, says platform design must mitigate harms to minors, including “sexual exploitation.” Speaking with Stewart, Cohn said she worries new laws could “create instant liability anytime somebody gets mad that their kids saw trans content.”

“Even climate science could be one of the things they seek to block because there have been studies that say fears about a warming planet are negatively impacting teenagers’ mental health,” McKinney says. “Is that what you’re actually trying to block your kids from seeing?”

Laura Marquez-Garrett sees it differently. She’s one of the lawyers for the 20-year-old California woman who won the $6 million verdict against Meta and Google last week. As both a parent and member of the LGBTQ+ community, she didn’t support the first version of KOSA proposed in 2021, finding it too restrictive, she says.

Speaking with Rolling Stone, Marquez-Garrett says she was a first-year college student in 1995 when she found the Pacific Center for Human Growth online and met other queer teens in supportive chat rooms. She wants to safeguard access to the same type of resources for today’s kids and believes the revised Senate version of KOSA does that. It has updated language saying platforms would not be held liable for anything that a minor “deliberately and independently” searches for online. It also states it has no power to “limit the scope or alter the meaning” of Section 230 of the Communications Decency Act, the law that allows platforms to host massive amounts of user speech without being crushed by lawsuits.

Importantly, Marquez-Garrett says, the current Senate version of KOSA places non-profit organizations, schools, and libraries outside its scope. As a result, leading online resources for LGBTQ youth, including the Trevor Project and PFLAG, would not face any potential liability under the proposed law for the information, services, and platforms they provide, she says.

“When I was in college, I controlled my experience. I was able to turn off and do my homework at night. When I got into chat rooms, they didn’t have my physical location, or my photo, or my real name. That’s what’s so scary about Snapchat, right? We have kids who are like, ‘I didn’t know those predators would know where my house was,’” Marquez-Garrett says. “It was a very different world. Now, the social media companies have convinced everyone they can’t live without them. That’s addiction, not utility. They say, ‘Well, we’re where all the teens are.’ Yeah, well that’s because they’re addicted. Get rid of the digital nicotine, and we will see something else emerge.”

Marquez-Garrett, who got her law degree from Harvard and worked at a large corporate firm for 20 years before finding her way to the Social Media Victims Law Center, also disputes the other major criticism of KOSA: that it will lead to age verification and the mass surveillance of all social media users.

“It’s a red herring,” she says, pointing to the section of KOSA that states nothing in the bill “shall be construed to require the affirmative collection of any personal data with respect to the age of users that a covered platform is not already collecting in the normal course of business.” She says the platforms will only be liable for what they know about their users based on the data they already collect. And it’s enough, she says. “Meta, Snap, TikTok, and Google already estimate the age of users with incredible accuracy. It’s what they rely on internally. We want to make them rely on what they rely on already,” she argues.

McKinney isn’t swayed. “They want to wordsmith what the law says and doesn’t say, and it’s ignoring how companies will react and what that looks like. There is a chilling effect. That’s actually the point of the legislation,” McKinney says. “You might still be able to go to Trevorproject.org, but if you’re [on a social media platform], do you have to scan a photo of your ID first, and now your search history is associated with your name? What is that going to mean for things like sexual health information?”

‘No Scientific Consensus’

Some critics of KOSA and the thousands of personal injury lawsuits that have piled up against the social media companies have another gripe: They simply don’t believe “addiction” to social media exists. On its website, EFF says “there’s no scientific consensus that online platforms cause mental health disorders, nor agreement on how to measure so-called ‘addictive’ behavior online.” But research on the topic is growing. A study published last year in JAMA tracked 4,285 children and found that 31.3 percent of adolescents had “increasing addictive use trajectories for social media” over four years. High or increasing addictive use trajectories were associated with elevated risks of suicidal behaviors or ideation compared with low addictive use, the study found.

At the landmark trial in Los Angeles, Dr. Anna Lembke, medical director of Stanford University’s addiction-medicine program and author of the best-selling book Dopamine Nation, told jurors she’s diagnosed and treated more patients with social media addiction than she can recall. She said adolescents’ brains are “especially vulnerable” to addiction because the prefrontal cortex is not fully integrated with midbrain systems that regulate behavior. “There’s a lack of communication between the brakes and the accelerator,” she testified, explaining why teens take risks and struggle to anticipate consequences.

Social media, Dr. Lembke added, has effectively “drugified” connection and validation, and “the younger the exposure, generally speaking, the greater the risk.” Jurors ultimately found that Instagram and YouTube were designed to be addictive and that they harmed the plaintiff, who was identified during the trial by her initials, K.G.M.

Don’t Feed the Algorithm

Either way, according to EFF, the best way to tackle the problem of social media algorithms affecting children’s mental health is a comprehensive consumer privacy bill that restricts data collection for all users and bans online behavioral advertising across the board. McKinney says the move would give users more power to opt out and delete their data while stripping platforms of a key incentive to scoop up personal preferences in the first place. Over time, companies would have less engagement data to sell targeted content to kids and feed the algorithms many consider predatory and exploitative.

Trending Stories

“We don’t want these companies trying to figure out who’s a kid,” McKinney says, warning that approach would only deepen surveillance and make Big Tech even more powerful.

“It’s certainly politically popular to talk about how we all need to protect children, but it misses the bigger picture. What about seniors who are at huge risk of fraud and scams online? We all deserve protections,” she says. “We believe in anonymous speech. We believe in anonymous browsing, and that means anonymous speech for teenagers, too. I don’t want the government deciding what my kids get to see.”




www.rollingstone.com