Meta and Google found liable: the big Internet platforms will never again seem untouchable

Verdict against Meta and Google

The adjective ‘historic’ is generally overused. But in this case, it is quite likely that the recent verdict by a California jury, finding Meta and Google liable for using their design features to create addiction among minors, will still be studied years from now.

The plaintiff, K.G.M., suffered serious mental health problems after becoming addicted, while still a child, to the social media platforms of the defendants, Meta and YouTube. This is the first social media addiction lawsuit to be tried on the theory of liability for defective products.

With this ruling, the perception of invulnerability surrounding the major technology firms is further weakened. As Sacha Haworth, Executive Director of The Tech Oversight Project (an organisation working to rein in Big Tech), put it: “This verdict is an earthquake that shakes the predatory business model of these companies to its very foundations. After years of manipulation by companies like Google and Meta, new evidence and testimonies have exposed, and proven, the damage that young people and their parents have been reporting for years. These products were deliberately designed to cause harm, create addiction in millions of young people, and bring about lifelong mental health problems”.

This verdict against Meta and Google joins a similar one in New Mexico (United States) that drew less attention, and it opens the door to further rulings in the dozens of cases now being tried in the United States, especially in California.

The ruling against Meta and Google rips apart their excuses

The trials have produced an unprecedented public record of compromising internal company documents, which reveal the firms’ disregard for the well-being of minors. This evidence has been compiled by The Tech Oversight Project.

Instagram, for example, developed an explicit commercial strategy to hook several generations of American families on its platform, deliberately using teens as a gateway to recruit parents, younger siblings and grandparents.

One presentation suggested that Instagram could “use school networks as a recruitment lever” and position itself as “integral to managing school relationships, especially during transition periods” such as graduation or changing schools. Meta executives tracked the “total time teens spent on the platform”, prioritised growth among adolescents, created high school directories, and ran outreach campaigns in schools to increase engagement on specific campuses. One employee wrote: “We need to optimise […] to be able to look at your phone under your desk in the middle of chemistry class :)”.

In addition, a 2015 document set out the company’s goal of increasing the time ten-year-olds spent on Instagram, and internal records showed that Instagram tracked the online behaviour of children as young as eight.

As for YouTube, as early as 2012 an internal document admitted that the company did not effectively measure the well-being of users and described its goal as “addiction”.

In 2023, Google specifically aimed YouTube Shorts at teens, despite internal research identifying the “two biggest challenges to teenage well-being” posed by the feature: recommendations of low-quality content “that can transmit and normalise unhealthy beliefs or behaviours”, and “prolonged and involuntary use” that displaces “valuable activities such as spending time with friends or sleeping”.

In Europe as well

Technology platforms have also received bad news this week on this side of the Atlantic.

The European Commission is investigating Snapchat on suspicion that it might “be exposing minors to attempted recruitment for criminal purposes and to information about sales of illegal products, such as drugs, or subject to age restrictions, such as e-cigarettes and alcohol”.

And a preliminary investigation by the Commission accuses porn platforms PornHub, XVideos, XNXX and Stripchat of breaching the Digital Services Act by allowing minors to access pornography.

What has happened with these websites confirms what the EADT has long been denouncing: the platforms have no serious will to establish effective age verification systems. Beyond legal action in specific cases, political initiatives, such as the Spanish Government’s plan to prohibit social media access for children under 16, will be insufficient unless they are accompanied by concrete and verifiable technical measures.

But the tipping point has already been reached: the question is no longer whether to set limits on the platforms’ relationship with minors, but how to do so most effectively.