Source: Reuters
Friday 27 October 2023 18:57:53
"Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens," according to the complaint filed in the Oakland, California federal court. "Its motive is profit."
Children have long been an appealing demographic for businesses, which hope to attract them as consumers at ages when they may be more impressionable and to solidify brand loyalty.
For Meta, younger consumers may help secure more advertisers who hope children will keep buying their products as they grow up.
"At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That's just not true," he posted in October 2021 on his Facebook page.
In Tuesday's cases, Meta could face civil penalties of $1,000 to $50,000 for each violation of various state laws -- an amount that could add up quickly given the millions of young children and teenagers who use Instagram.
Much of the focus on Meta stemmed from a whistleblower's release of documents in 2021 that showed the company knew Instagram, which began as a photo-sharing app, was addictive and worsened body image issues for some teen girls.
The lawsuit by the 33 states alleged that Meta has strived to ensure that young people spend as much time as possible on social media, despite knowing that they are susceptible to the need for approval of their content in the form of "likes" from other users.
"Meta has been harming our children and teens, cultivating addiction to boost corporate profits," said California Attorney General Rob Bonta, whose state includes Meta's headquarters.
States also accused Meta of violating a law banning the collection of data from children under age 13, and of deceptively denying that its social media was harmful.
"Meta did not disclose that its algorithms were designed to capitalize on young users' dopamine responses and create an addictive cycle of engagement," the complaint said.
Dopamine is a type of neurotransmitter that plays a role in feelings of pleasure.
According to the complaint, Meta's refusal to accept responsibility extended to last year, when it distanced itself from the suicide of a 14-year-old girl in the UK who had been exposed on Instagram to content about suicide and self-injury.
A coroner rejected a Meta executive's claim that such content was "safe" for children, finding that the girl likely binged on harmful content that normalized the depression she had felt before killing herself.
States also alleged Meta is seeking to expand its harmful practices into new areas, including its virtual reality Horizon Worlds platform and its WhatsApp and Messenger apps.
By suing, authorities are seeking to patch holes left by the U.S. Congress' inability to pass new online protections for children despite years of discussions.
Colorado Attorney General Philip Weiser said the whistleblower's revelations showed that Meta knew how Facebook and Instagram were harming children.
"It is very clear that decisions made by social media platforms, like Meta, are part of what is driving mental health harms, physical health harms, and threats that we can't ignore," he said.