According to Reuters, the attorneys general of 33 U.S. states, including California and New York, have jointly sued Meta Platforms (formerly Facebook) and its Instagram unit, accusing the company's social media platforms of being addictive and harming teenagers' mental health.
In the complaint filed on Tuesday, the attorneys general said Meta repeatedly misled the public, concealed the dangers of its platforms, and deliberately induced children and teenagers to become hooked on social media. "Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens. Its motive is profit," the complaint said.
The states said research has shown that children's use of Meta's social media platforms is strongly associated with "depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes."
Meta said it was "disappointed" by the lawsuit. "Instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path," the company said.
Eight other U.S. states and Washington, D.C., filed similar lawsuits against Meta on Tuesday, bringing the total number of authorities taking action to 42.
IT House noted that Meta's shares fell 0.6 percent on the Nasdaq.
The complaint filed by the 33 states alleges that Meta has sought to ensure young people spend as much time as possible on its social media platforms, even though the company knows they are susceptible to the need for approval from other users of their content. "Meta has not disclosed how its algorithms exploit young users' dopamine responses to create an addictive cycle of engagement," the complaint said. Dopamine is a neurotransmitter associated with pleasure.
The complaint also alleges that Meta refused to accept responsibility, even distancing itself last year from the death of a 14-year-old girl in the UK who took her own life after viewing content about suicide and self-harm on Instagram. A coroner rejected a Meta executive's claim that such content was "safe" for children, finding instead that the girl had likely become addicted to harmful content that normalized the depression she felt before her death.
The states also accused Meta of attempting to extend its harmful practices into virtual reality, including its Horizon Worlds platform and its WhatsApp and Messenger apps.
Through the lawsuits, the authorities are attempting to fill a gap left by Congress, which has failed to pass new online child-protection legislation despite years of debate.