Dozens of states have filed a lawsuit against Meta in the United States for allegedly harming young people’s mental health. Meta owns Facebook and Instagram – two of the most used social media platforms. The federal lawsuit, alongside other parallel lawsuits, alleges that Meta has knowingly designed harmful features on Instagram and Facebook that lead children and teenagers to become addicted to the apps. The states further allege that Meta has been collecting data from children under 13 years of age without consent from their parents.
“Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its Social Media Platforms,” the lawsuit said. “It has concealed the ways in which these Platforms exploit and manipulate its most vulnerable consumers: teenagers and children. And it has ignored the sweeping damage these Platforms have caused to the mental and physical health of our nation’s youth. In doing so, Meta engaged in, and continues to engage in, deceptive and unlawful conduct in violation of state and federal law.”
“Kids and teenagers are suffering from record levels of poor mental health, and social media companies like Meta are to blame,” said New York Attorney General Letitia James. “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
In a statement, a Meta spokesperson said: “We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families. We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”
So what are these 30-plus tools, one may ask? Some of them include setting teens’ accounts to private when they join, which limits their exposure to sensitive and unsolicited content. Others include age verification technology, parental supervision tools, and features such as Take a Break and Quiet Mode that prompt teenagers to step away from the apps. Meta also shares expert resources when users search for or post about suicide, self-injury, eating disorders, or body image issues. It has also created a Family Centre page with supervision tools that serves as an educational hub.
The lawsuit claims that Meta exploits young users for profit by designing its business model to maximise their time and attention, deploying manipulative features that harm them. It further alleges that the platforms are built to take advantage of young users’ particular vulnerabilities.
According to the lawsuit, these manipulative features include recommendation algorithms engineered to keep users on the platform; compulsive “likes” and other social-comparison tools; incessant alerts and notifications that pull users back to the apps during school hours and at night; filters and body-altering technology that promote body dysmorphia; and presentation formats such as “infinite scroll” that prevent young users from self-regulating and disengaging from Meta’s sites.
Features such as Quiet Mode and Take a Break appear to be Meta’s attempts at undoing this damage.
In October 2023, public officials in New York also proposed new state legislation that would target this issue by restricting algorithms aimed at young users. Among other provisions, the legislation would give the attorney general’s office new enforcement powers over social media companies.
The Meta spokesperson stated that the company draws on research and feedback from parents, teens, experts, and academics to inform its practices.
However, at this juncture it is pertinent to acknowledge the, albeit limited, positives of social media. It can benefit young users in some ways, for example by giving them social support and helping them stay connected with peers, especially in times of distress. Even this can be problematic in some scenarios, such as when unsolicited, incorrect, or harmful advice is shared among friends. Experts say much more research is needed to understand exactly what effects social media has on young users, how it produces them, and why young users are affected in particular.
The lawsuit claims that Meta’s own internal research shows its long-standing awareness that its products harm young users. This research was kept private until it was leaked by a whistleblower and publicly reported, revealing that Meta has known for years about the serious harms its platforms inflict on young users.
One of the more subtle ways in which Meta’s lack of governance has harmed mental health is by desensitizing users to mental illness itself. Meme culture in particular has normalised depression, anxiety, and similar labels as casual, joking terms. This has eroded their significance and invalidated the experiences of people who genuinely suffer from those illnesses. Any revamp of Meta’s platforms must also reflect on the language users adopt. Rather than saying “I’m so depressed/I have depression” when they are not clinically diagnosed, alternative language such as “I feel heaviness/sadness” could be encouraged.
Meme culture has effectively bypassed regulation altogether. While Meta may already censor searches about suicide, self-harm, and the like, countless memes along the lines of “omg life is so difficult, I should just kill myself” still circulate on its platforms. The joking tone of these posts allows darker and more disturbing content to keep floating around despite the company’s safeguards.
The Meta lawsuit is ongoing, and it must address these issues by the end of its run. A platform that serves children as young as 12 and 13 must not be a space where they are prematurely exposed to harmful content; it must be a safe, controlled space for recreation. The lawsuit, and the claim that Meta has known about the repercussions of its actions for years, reflects poorly on such platforms. At this juncture we must ask: who are these platforms truly made for?