Facebook's parent company Meta disabled only a small fraction of the accounts flagged in the more than one million reports of underage users it received on Instagram since early 2019, a lawsuit filed by 33 US states reportedly said.
The newly unsealed legal complaint accused the tech giant of keeping it an "open secret" that it had millions of users under the age of 13, and alleged that Instagram "routinely continued to collect" their personal information, such as location, without parental permission.
The complaint stated that within the company, Meta’s actual knowledge that millions of Instagram users were under the age of 13 was an “open secret” that was routinely documented, rigorously analyzed and confirmed, and zealously protected from disclosure to the public, according to a New York Times report.
Last month, attorneys general from 33 states, including New York’s AG Letitia James, filed a lawsuit against Meta alleging that the tech giant designed harmful features contributing to the country’s youth mental health crisis.
The lawsuit alleged Meta created addictive and "psychologically manipulative" features targeting young people while falsely assuring the public that the platform was safe to use.
“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem,” Ms James said.
Meta's spokesperson responded to the lawsuit, saying that the company was committed to providing teens with "safe, positive experiences online," and that it had already introduced "over 30 tools to support teens and their families," such as age verification and restrictions on content promoting harmful behaviours.
“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the spokesperson added.
However, a significant portion of the evidence provided by the states was obscured from public view via redactions in the initial filing.
The newly unsealed complaint, filed last week, provided fresh insights into the lawsuit, including the accusation that Instagram "coveted and pursued" underage users for years and that Meta "continually failed" to make effective age-checking systems a priority.
The lawsuit reportedly argued that Meta chose not to build effective systems to detect and exclude underage users because it viewed them as a crucial next-generation demographic it needed to capture.
It also accused the tech giant of "automatically" ignoring some reports of under-13 users and allowing them to continue using the platform, even though the company knew of such cases through its internal reporting channels.
The company responded that the now publicly revealed complaint “mischaracterizes our work using selective quotes and cherry-picked documents.”
It said verifying the ages of its users was a "complex" challenge, especially for younger people, who are unlikely to have IDs or licences.
Meta recently said it supports federal legislation requiring app stores to get parents’ approval whenever their teens under 16 download apps.
“With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase,” the company said.
“Parents can decide if they want to approve the download. They can also verify the age of their teen when setting up their phone, negating the need for everyone to verify their age multiple times across multiple apps,” it said.
The tech giant maintains that the best way to support young people is a "simple, industry-wide solution" in which all apps are held to the same standard.
“By verifying a teen’s age on the app store, individual apps would not be required to collect potentially sensitive identifying information,” Meta recently said.