As concern grows over social media, U.S. lawsuits stack up

* Surge in mental health problems worst among girls

* Lawyers zero in on algorithm designs, whistleblower leaks

* Others see platforms as scapegoat for society's woes

By Avi Asher-Schapiro
LOS ANGELES, Feb 8 (Thomson Reuters Foundation) – At about the time her daughter reached the age of 12, American health executive Laurie saw her once confident, happy child turning into someone she barely recognized. At first, she thought a bad case of adolescent angst was to blame.

Initially, her daughter had trouble sleeping and grappled with episodes of self-loathing and anxiety, but by the time she was 14, she had started cutting herself and was having suicidal thoughts.

Without Laurie knowing, she had been sneaking off with her confiscated smartphone and spending hours online at night, trawling through posts about self-harm and eating disorders on social media platforms.

“One day she said to me: ‘Mom, I’m going to hurt myself badly if I don’t get help,'” Laurie said as she described the mental health crises that have plagued her daughter for the last two years, disrupting her education and devastating the family’s finances.

She asked to use only her first name in order to protect her daughter’s identity.

Paying for her daughter’s care – therapists, a psychiatrist, and multiple residential treatment facilities across the country – has nearly bankrupted Laurie, who recently sold her house in California and moved to a cheaper home in another state.

In August, she filed a lawsuit on behalf of her daughter against the social media platforms she blames for the ordeal: Instagram, Snapchat and TikTok.

The case is one of dozens of similar U.S. lawsuits arguing that, when it comes to children, social media is a dangerous product – like a car with a faulty seat-belt – and that tech companies should be held to account and made to pay for the resulting harms.

“Before (she used) social media, there was no eating disorder, there was no mental illness, there was no isolation, there was no cutting, none of that,” Laurie told the Thomson Reuters Foundation about her daughter, who is identified as C.W. in the suit.
Don Grant, a psychologist who specializes in treating children with mental health issues linked to digital devices, said Laurie’s predicament is increasingly common.

“It’s like every night, kids all over the country sneak out of their houses and go to play in the sewers under the city with no supervision. That’s what being online can be like,” he said.

“You think just because your kids are sitting in your living room they’re safe – but they’re not.”

Facebook’s parent company Meta Platforms Inc, Snap Inc, which owns Snapchat, and TikTok declined to comment on individual lawsuits, but said they prioritized children’s safety online.

Meta executives, under criticism over internal data showing its Instagram app damaged the mental health of teenagers, have highlighted the positive impacts of social media, and their efforts to better protect young users.
ASBESTOS, TOBACCO, SOCIAL MEDIA?
Laurie is represented by the Social Media Victims Law Center, a firm co-founded by veteran trial lawyer Matt Bergman, who won hundreds of millions of dollars suing makers of the building material asbestos for concealing its links to cancer in the 1990s and early 2000s.

Bergman decided to turn his attention to social media after former Facebook employee Frances Haugen leaked thousands of internal company documents in 2021 that showed the company had some knowledge of the potential harm its products could cause.

“These companies make the asbestos industry look like a bunch of Boy Scouts,” Bergman said.

Facebook has said the Haugen papers have been mischaracterized and taken out of context, and that Wall Street Journal articles based on them “conferred egregiously false motives to Facebook’s leadership and employees”.

Bergman’s firm has signed up more than 1,200 clients, including Laurie, over the past year, taking out television ads asking families who worry about their children’s social media use to get in touch via a toll-free hotline.

In addition to more than 70 cases involving child suicide, the firm has collected over 600 cases linked to eating disorders. Dozens more accuse social media firms of failing to prevent sex trafficking on their platforms, or stem from accidental deaths after children attempted viral stunts allowed to spread online.

In late 2022, 80 similar federal suits from 35 different jurisdictions were consolidated and are now being considered by the U.S. District Court for the Northern District of California.

Laurie’s suit is part of a similar bundle of cases filed in California state courts.
HIDING BEHIND SECTION 230
None of these cases – nor any of the others filed by Bergman – has yet been heard by a jury, and it is not clear whether any ever will be.

First, he has to get past Section 230 of the Communications Decency Act, a provision that gives technology companies some legal immunity for content published on their platforms by third parties.

Courts routinely cite the provision when dismissing lawsuits against social media firms, preventing the cases from moving on to trial.

In October, for example, a court in Pennsylvania blocked a lawsuit against TikTok brought on behalf of a child who died after suffocating themselves while attempting a so-called blackout challenge that was widely shared on the video-sharing site.

When it was enacted in the 1990s, Section 230 was intended to shield the nascent tech industry from being crushed under waves of lawsuits, providing space for companies to experiment with platforms that encouraged user-generated content.
Laura Marquez-Garrett, a lawyer with the Social Media Victims Law Center who is taking the lead on Laurie’s case, said she believed her cases could be won if a court agreed to hear them.

“The moment we get to litigate … and move forward, it’s game over,” she said.

Bergman and Marquez-Garrett are part of a growing cohort of lawyers who think Section 230 is no longer tenable, as political pressure builds on the issue.

President Joe Biden has voiced support for “revoking” Section 230, and politicians in both parties have proposed legislation that would scrap or tweak the provision. But so far, no reform packages have gained traction, shifting the focus of reform efforts to litigation.

“We aren’t talking about small companies experimenting with new technology; we’re talking about huge companies who have built harmful products,” Bergman said.

Bergman and his team say the harm to their clients is not primarily about harmful speech that happened to be posted online, but can be attributed directly to design decisions made by the tech companies.

His lawsuits focus on the design of algorithms that maximize the amount of time children spend online and push them towards harmful content; on friend-recommendation features that can introduce children to predatory adults; and on the lax controls offered to parents who want to restrict access.

“These lawsuits are about specific design decisions social media platforms have made to maximize profit over safety,” Bergman said.

Asked by the Thomson Reuters Foundation to comment on the company’s product designs, Meta sent an emailed statement from its global head of safety, Antigone Davis, who said the company takes children’s safety seriously.

“We want teens to be safe online. We’ve developed more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram, and age verification technology that helps teens have age-appropriate experiences,” the statement read.

A Snap spokesperson did not comment directly on the pending litigation, but said in a statement that “nothing is more important to us than the wellbeing of our community.”

“We curate content from known creators and publishers and use human moderation to review user generated content before it can reach a large audience, which greatly reduces the spread and discovery of harmful content,” the statement added.

‘FOR PARENTS EVERYWHERE’

Laurie’s lawsuit – which was filed in late August in the Superior Court of Los Angeles – alleges that TikTok, Meta and Snap are “contributing to the burgeoning mental health crisis perpetrated upon the children and teenagers of the United States.”

“I’m doing this for parents everywhere,” she said.

A sharp increase in depression and suicide among U.S. teenagers coincided with a surge in social media use about a decade ago, though a slew of research has reached mixed conclusions about a possible link.

Bergman is not the first lawyer to try to bring a tech firm to court for building an allegedly harmful product.

Carrie Goldberg, a New York-based lawyer, helped to popularize the notion that social media software is essentially like any other consumer product – and that harms it causes in the real world should open up manufacturers to lawsuits.

In 2017, she sued the dating app Grindr on behalf of Matthew Herrick, a man who was stalked and threatened online by an ex-boyfriend, but could not get Grindr to block his harasser.

Goldberg argued that Grindr’s decision to make it difficult to kick harassers off the app should open the company up to some liability as designer of the product, but the court disagreed – ruling that Grindr merely facilitated communications, and was therefore protected under Section 230.

“I couldn’t get in front of a jury,” Goldberg recalled, saying that if such cases were allowed to proceed to trial, they would likely succeed.

A lot has changed in the last five years, she said: the public has become less trusting of social media companies, and courts have started to entertain the notion that lawyers should be able to sue tech platforms in the same way as providers of other consumer products or services.

In 2021, the 9th U.S. Circuit Court of Appeals ruled that Snap could potentially be held liable for the deaths of two boys who died in a high-speed car accident that took place while they were using a Snapchat filter that their families say encouraged reckless driving.

In October, the U.S. Supreme Court decided to hear a case against Google that accuses its YouTube video platform of materially supporting terrorism through its algorithmic recommendation of videos by the Islamic State militant group.

Legal experts said that case could set an important precedent for how Section 230 applies to the content recommendations that platforms’ algorithms make to users – including those made to children such as Laurie’s daughter.

“The pendulum has really swung,” Goldberg said. “People no longer trust these products are operating in the public good, and the courts are waking up.”

Outside the United States, the balance has shifted still further, and is beginning to be reflected in both consumer lawsuits and regulation.