Sydney Times


Instagram - California sues Facebook parent Meta over alleged harm to young people

Written by News Aggregator

*Reprinted and shared from the LA Times in the public interest (Editor)

https://www.latimes.com/california/story/2023-10-24/facebook-parent-meta-sued-instagram-young-people

California and other states on Tuesday sued Facebook parent company Meta over allegations that it “designed and deployed harmful features” on the main social network and its platform Instagram.

“Our bipartisan investigation has arrived at a solemn conclusion: Meta has been harming our children and teens, cultivating addiction to boost corporate profits,” California Atty. Gen. Rob Bonta said in a statement. “With today’s lawsuit, we are drawing the line. We must protect our children and we will not back down from this fight.”

The 233-page lawsuit, filed in a federal court in Northern California, alleges the social media giant violated consumer protection laws and a federal law aimed at safeguarding the privacy of children under 13 years old. Bonta co-led a bipartisan coalition of 33 attorneys general filing the federal lawsuit against Meta. Eight attorneys general are also filing lawsuits against Meta on Tuesday in state courts, according to Bonta’s office.

In 2021, a bipartisan group of state attorneys general, including from California, Tennessee and Nebraska, announced they were investigating Meta’s promotion of its social media app Instagram to children and young people. Advocacy groups, lawmakers and even parents have criticized Meta, alleging the platform hasn’t done enough to combat content about eating disorders, suicide and other potential harms.

As part of the investigation, the state attorneys general looked at Meta's strategies for compelling young people to spend more time on its platform. The lawsuit alleges that Meta failed to address the platform's harmful impact on young people.

Meta said it’s committed to keeping teens safe, noting it rolled out more than 30 tools to support young people and families.

“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” a Meta spokesperson said in a statement.


Scrutiny over Meta’s potential damage to the mental health of young people intensified in 2021 after Frances Haugen, a former Facebook product manager, disclosed tens of thousands of internal company documents.

Some of those documents included research that showed Facebook is “toxic for teen girls,” worsening body image issues and suicidal thoughts, the Wall Street Journal reported in 2021. Meta said its research was “mischaracterized,” and teens also reported Instagram made them feel better about other issues such as loneliness and sadness.

The amount of time teens spend on social media has been a growing concern, especially as platforms use algorithms to recommend content they think users want to view. In 2022, attorneys general across the country started investigating TikTok’s potential harm to young people as well.

In 2021, executives from the social media company, including Instagram head Adam Mosseri, testified before Congress. Instagram then paused its development of a kids’ version of the app and rolled out more controls so parents could limit the amount of time teens spend on it. Social media apps like Instagram require users to be at least 13 years old, but children have lied about their age to access the platform.

The photo- and video-sharing app Instagram is popular among U.S. teens, according to a Pew Research Center survey released this year. About 62% of teens reported using Instagram in 2022. Google-owned YouTube, TikTok and Snapchat are also commonly used by teens.

Families in various states have sued Meta, blaming Instagram for worsening eating disorders and increasing suicidal thoughts among teenage girls. However, those legal actions have been impeded because Section 230 of the 1996 Communications Decency Act shields online platforms from being held liable for content posted by users. In California, tech companies and industry groups have also sued to stop new state laws aimed at protecting child safety and promoting transparency about content moderation from taking effect. While other lawsuits are still ongoing, Bonta said it’s possible the latest legal actions could help families receive monetary relief.

About the author

News Aggregator
