Platformer - The states sue Meta over child safety
Here’s this week’s free column — a look at a major new lawsuit against Meta related to child safety, and whether moves like this can meaningfully address a mental health crisis among young people in the United States. Do you value independent reporting on social networks and the trust and safety teams that keep them running? If so, your support would mean a lot to us. Upgrade your subscription today and we’ll email you first with all our scoops — like our recent interview with a former Twitter employee whom Elon Musk fired for criticizing him. ➡️
The states sue Meta over child safety

Everyone agrees there's a teen mental health crisis. Is this how you fix it?

I.

On Tuesday, 41 states and the District of Columbia sued Meta, alleging that the company hurts children by violating their privacy and misleading them about the potential harms they may experience from using its products. While a handful of states have pursued aggressive action against Meta and other platforms in an effort to prevent harm to minors, Tuesday's lawsuit represents the largest collective action against a social network on child safety grounds to date. Here are Cristiano Lima and Naomi Nix at the Washington Post:

While the scope of the legal claims varies, they paint a picture of a company that has hooked children on its platforms using harmful and manipulative tactics. A 233-page federal complaint alleges that the company engaged in a “scheme to exploit young users for profit” by misleading them about safety features and the prevalence of harmful content, harvesting their data and violating federal laws on children’s privacy. State officials claim that the company knowingly deployed changes to keep children on the site to the detriment of their well-being, violating consumer protection laws.

We’ll get to the complaint itself. But first, some relevant context.

II.

The AGs’ lawsuit arrives nearly two years after the attorneys general began investigating Meta over child safety concerns. The investigation was prompted in large part by documents leaked by whistleblower Frances Haugen, particularly internal research showing that a minority of teenage girls reported a negative effect on their mental health after using Instagram.

After Congress failed to pass a single piece of internet legislation during the post-2016 backlash to Big Tech, efforts shifted to the state level, where in 39 states a single party controls both legislative chambers and the governor’s office. That makes it much easier to pass legislation, for better and for worse. So does framing regulation as an effort to protect children.

The lawsuit filed Tuesday represents the culmination of efforts to refocus the attention of lawmakers intent on regulating tech toward child safety. It is an issue in need of urgent attention at all levels of government: the Centers for Disease Control and Prevention reported this year that 57 percent of US teen girls “felt persistently sad or hopeless” in 2021, the highest reported level in a decade.

The relationship between social networks and teenage mental health remains controversial. In May, US Surgeon General Vivek Murthy issued an advisory arguing that social media can put some teens at risk of serious harm. (The advisory noted that social media can have a positive impact on teens as well.) And in August, I wrote here about conflicting research on teens and social media, observing both that the effects on teenage mental health found in these studies are typically small and that the studies are often designed in ways that obscure the larger impact of social media. (I think regularly about the observation Prof. Sonia Livingstone of the London School of Economics made to the BBC: “This reminds me of a conference I went to that asked, 'what difference did half a century of television make?' How can there be one answer?")

Over time, I have become more persuaded that social networks can be harmful to young people: in particular, to certain groups of young people (those with existing mental health issues, victims of bullying) and in particular circumstances (using social networks for more than three hours per day). But my views are also shaped by my own experience growing up gay online, where I used the then-nascent social web to find connection and community that I struggled to locate elsewhere. That history makes me skeptical of regulations that would make it harder for LGBT teens and other minority groups to find and speak to one another online, which is the express point of some (bipartisan!) legislation proposed this year.

III.

There are two key questions to ask about the lawsuit against Meta. One — and the one that will be explored in court — is whether Meta broke the law. The other is whether the harms alleged in the lawsuit can be addressed in a way that makes young people meaningfully safer.

The complaint runs 233 pages and is heavily redacted in places. For that reason, I haven’t read the full thing, and we may learn more about the evidence uncovered by the AGs as the case moves forward.

On the first question, the plaintiffs put Meta’s alleged wrongdoing into two buckets. The first and biggest bucket concerns the company’s “scheme to exploit young users for profit,” which basically amounts to Meta making apps and then building features designed to get people to use those apps a lot. It “designs and deploys features to capture young users’ attention and prolong their time on its Social Media Platforms”; its “Recommendation Algorithms encourage compulsive use, which Meta does not disclose”; its “use of disruptive audiovisual and haptic notifications interfere[s] with young users’ education and sleep.”

There are certainly some worthwhile conversations to be had here. Section B-7 alleges that “Meta promotes Platform features such as visual filters known to promote eating disorders and body dysmorphia in youth.” Perhaps those ought to be disabled for younger users? Section B-3 argues that “Meta’s Recommendation Algorithms encourage compulsive use, which Meta does not disclose.” Do we want to set China-style limits on the amount of time young people can spend using certain kinds of apps each day?

Still, many of the complaints here boil down to Meta sending out a lot of push notifications. This can be annoying, and in fact I have shut off most of the notifications the company wants to send me. But it’s not clear that it’s illegal, even if some young people find themselves hooked on these notifications and experiencing harm as a result.

The second bucket seems potentially more persuasive, since it alleges violations of an actual law: the Children’s Online Privacy Protection Act, or COPPA, which requires platforms to obtain verifiable parental consent before collecting personal information from users younger than 13. The Supreme Court has consistently struck down laws that require age verification of users on free speech grounds. So in practice, platforms “verify” users’ ages by showing them a box that says something like, “hey, you’re 13 or older, right?” The 11-year-old clicks “yes,” and as far as platforms are concerned, they are then COPPA-compliant.

“Meta does not obtain — or even attempt to obtain — verifiable parental consent before collecting the personal information of children on Instagram and Facebook,” the lawsuit alleges.
Instead, the AGs say, Meta collects personal information about children, including names, addresses, email addresses, location data and photographs.

Because under-13s are technically banned from Facebook and Instagram, Meta argues it is in compliance: it doesn’t need parental consent because kids shouldn’t be on the platform at all. (The company also takes steps to identify under-13 users and delete their accounts, though the AGs argue it isn’t trying hard enough.)

“We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families,” Meta told me in an email. “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”

Of course, kids are on Facebook and Instagram anyway: in part because Meta has openly courted them through products like Messenger Kids, and in part because using internet services without your parents’ permission has become a cherished hallmark of American childhood.

Due to redactions in the complaint, it’s difficult to assess how strong the AGs’ evidence against Meta is here. But every platform of even modest size has at least some under-13 users, and most platforms do very little to identify them, because for legal liability reasons you might be better off not knowing they are there. That’s why it’s relatively easy for me to imagine that, between them, the attorneys general will be able to muster some evidence that among the billions of people who have signed up for Meta’s services there was some substantial number of children, some significant percentage of whom did not run it by mom and dad first, and that this will result in a settlement large enough for the AGs not to feel embarrassed when they announce it at a news conference, and small enough that Meta continues to operate more or less as it did before the lawsuit was ever filed.

And that leads us to the second and ultimately much more important question: can this lawsuit make young people safer?

I suppose there’s a world where a judge agrees with everything in the complaint and forces Meta to redesign its services from scratch, at least for young people. (Particularly if the redacted sections of the complaint contain compelling evidence of harm to teens.) But it’s hard to see how a complaint about ranking algorithms, likes, and augmented reality filters can withstand a First Amendment challenge. And it’s even harder to see how the government can address a society-wide mental health crisis at the level of app design.

This isn’t to say Meta — or any of the other large platforms that teenagers use for hours every day, which go conspicuously unmentioned here — should be let off the hook. Platforms should conduct and release more research on how social media use can lead to mental health harms, and take steps to acknowledge and address them. That won’t solve the teen mental health crisis either. But I imagine it would be more effective than this lawsuit.

Musk says

Over the weekend, Elon Musk took aim at Wikipedia, a service he has been bad-mouthing since at least 2019. Previously, Musk’s major gripe seemed to be that Tesla’s founding CEO, Martin Eberhard, has a Wikipedia page that lists him as a co-founder of the company. Now Musk is asking why the site keeps asking people for donations.
He also offered it $1 billion to change its name to Dickipedia.

Well, since he asked: the crowdsourced encyclopedia is run by a nonprofit called the Wikimedia Foundation, and its finances are public. The foundation’s 2022 annual report says that revenue was $154,686,521 and total expenses were $145,970,915. Its financials are audited by KPMG. Remember third-party auditors? X’s resigned this summer with $500,000 in outstanding invoices.

You can read the financial report for a full breakdown of those expenses, which include $2,704,842 for internet hosting. Jaime Crespo, a senior database administrator at the site, posted a thorough breakdown of that figure on Bluesky.
Crespo politely declined to be interviewed. But a Wikipedia spokesperson had this comment: “We are grateful that generous individuals from all over the world give every year to keep Wikipedia freely available and accessible. The majority of our funding comes from donations ($11 is the average) from people who read Wikipedia. We are not funded by advertising, we don’t charge a subscription fee, and we don’t sell user data. This model is core to our values and our projects, including Wikipedia. It preserves our independence by reducing the ability of any one organization or person to influence the content on Wikipedia.”

In summary, the Wikimedia Foundation asks for money to keep Wikipedia up and running. Imagine if Twitter had asked its users for $11 donations rather than relying on a billionaire like Musk. It might still be alive!

— Zoe Schiffer
Those good posts

For more good posts every day, follow Casey’s Instagram stories.

Talk to us

Send us tips, comments, questions, and platform safety ideas: casey@platformer.news and zoe@platformer.news. Become a sponsor.

By design, the vast majority of Platformer readers never pay anything for the journalism it provides. But you made it all the way to the end of this week’s edition — maybe not for the first time. Want to support more journalism like what you read today? If so, click here.