Meta and YouTube’s Oversight: Years of Proof Support Fine for Big Tech Addiction


Meta and YouTube Found Negligent; Ordered to Pay $6 Million in Damages

New Delhi: A jury in Los Angeles delivered a groundbreaking verdict on Wednesday, finding that Meta and YouTube were negligent in the design of their social media platforms. The court ordered both companies to pay $6 million in damages.

This unprecedented ruling was predicated on a wealth of internal corporate documents, research studies, and employee communications that accumulated through years of whistleblower revelations and extensive litigation discovery.

YouTube, a subsidiary of Google, was assigned 30% of the overall liability. Meta accounted for the remaining 70% of the jurors’ allocation, and, notably, a substantial portion of the internal documentary evidence scrutinized during the trial came from Meta.

Collectively, these documents paint a comprehensive picture of how much these companies knew about the impact of their products on younger audiences.

Concerning Research Findings

The earliest and most expansive body of evidence emerged in September 2021 when Frances Haugen, a former Facebook employee, provided internal documents to the Wall Street Journal.

The resultant “Facebook Files” series unveiled that Meta had undertaken over three years of research into Instagram’s effects on adolescent users, consistently revealing detrimental consequences, particularly for teenage girls.

Among the troubling revelations was a 2019 internal presentation stating, “We exacerbate body image issues for one in three teen girls.”

A March 2020 presentation indicated that 32% of adolescent females felt worse about their bodies after using Instagram, according to the Journal’s reporting.

Alarmingly, 13% of British teens and 6% of American teens with suicidal thoughts traced their feelings back to Instagram.

Furthermore, 17% of teenage girls reported that the platform exacerbated their eating disorders. The research drew on diverse methodologies (focus groups, diary studies, online surveys, and extensive questionnaires) and covered tens of thousands of participants.

One study involving over 50,000 individuals from ten nations, including India, disclosed that 48% of teenage girls frequently compared their appearances to others on Instagram.

Meta’s researchers even flagged the Explore page—an algorithmically curated content space—as particularly detrimental for younger users, stating, “Aspects of Instagram exacerbate each other to create a perfect storm.”

Teens described their usage patterns in what the documents referred to as “an addict’s narrative,” expressing a desire to reduce their time spent on the platform but finding it difficult to disengage.

Meta’s Strategy to Attract Younger Users

The trial revealed a separate category of internal documents detailing Meta’s strategies devised to attract and retain young users.

Plaintiffs’ attorneys presented communications indicating that company executives were focused on engaging children and teens. One document stated starkly, “If we want to win big with teens, we must entice them during their tween years.”

Another memo disclosed that 11-year-olds were four times more likely to return to Instagram than to competing platforms, even though the platform required users to be at least 13 to create an account.

An internal review from 2015 reported the presence of four million children under 13 on Instagram, while a 2017 communication indicated a deliberate focus on users younger than that age.

It was not until late 2019 that Instagram implemented a birthdate requirement for account creation. Moreover, internal documents revealed that Instagram set a daily user engagement target of 40 minutes for 2023, with plans to augment it to 46 minutes by 2026, as per court filings.

Findings on Parental Controls

Among the pivotal documents disclosed during the trial was an unpublished Meta research project, dubbed “Project MYST,” conducted in collaboration with the University of Chicago.

This study surveyed 1,000 teenagers and their parents, concluding that conventional parental controls—such as time limits and content restrictions—had minimal effect on adolescent social media usage.

The research further established that children who faced adverse life experiences, such as familial instability or bullying, were especially susceptible to compulsive usage.

During his testimony, Instagram head Adam Mosseri claimed he could not recall the specifics of Project MYST but acknowledged that he had approved the research. The findings, however, were never made public, nor were any warnings disseminated to parents or teenagers.

Unheeded Warnings and Responses

In November 2023, a second whistleblower, Arturo Béjar, testified before the U.S. Senate Judiciary Subcommittee. Béjar, a former Facebook engineering director turned consultant for Instagram, recounted warnings he issued to senior executives—including Zuckerberg—regarding the prevalence of harmful user experiences on the platforms.

Béjar presented internal survey data revealing that 51% of Instagram users reported a negative experience within the preceding week, including alarming statistics about unwanted sexual advances among users aged 13 to 15. Notably, only 2% of reported harmful posts were removed.

On the same day Frances Haugen testified before the Senate (October 5, 2021), Béjar emailed Zuckerberg with corroborating data but received no response.

He described safety features later introduced by Meta as “a placebo,” asserting they were merely designed to appease the media and regulators.

Internal communications from Meta employees echoed these concerns, with one individual likening Instagram to “a drug,” and another warning that the platform was inciting a “reward deficit disorder” due to excessive usage.

In response to the ruling, both Meta and Google said they intend to appeal, stating that they disagree with the verdict.


In recent months, both companies have taken steps to enhance protections for young users. Meta has initiated “Teen Accounts” on Instagram, placing users under 18 in private accounts with restricted messaging and content filters.

Beginning in October 2025, those under 16 will also require parental permission to change these settings. YouTube, in turn, began using AI-driven age-estimation technology in July 2025 to limit access to age-inappropriate content. The efficacy of these measures, however, is not yet clear.

Source link: Hindustantimes.com.


Reported By

Souvik Banerjee

I’m Souvik Banerjee from Kolkata, India. As a Marketing Manager at RS Web Solutions (RSWEBSOLS), I specialize in digital marketing, SEO, programming, web development, and eCommerce strategies. I also write tutorials and tech articles that help professionals better understand web technologies.