How We Don’t Regulate Social Media in 2024
Fractionally Legal v16 Looks at the Long Strange Trip of Social Media Regulation
Before we start, I just wanted to share an update to my post from February when I wrote about the government accusing Google of being a monopoly. To the surprise of absolutely no one who does not work for Google, the Court found that Google was an illegal monopoly in internet search. As I wrote at the time, the real action is in the remedy—what will Google need to do to not be a monopoly? From the perspective of my clients in adtech, and to a lesser extent some of my AI clients who are trying to build search products using AI, how Google is forced to operate over the next few years is going to determine a lot about what those markets and products look like. I’ll write more about this in the coming weeks.
And now, our main story...
Like most Generation Xers (1977 represent!), I’ve been “online” since the mid-1990s and on social media since about 2002. Things started out slowly—the first site I was on was Friendster, followed by MySpace. Those were novelties and not all that relevant. Then Facebook came along (2004) and really perfected the formula. Having a Facebook presence felt important. And it was fun. From there, we had Twitter (2007), Instagram (2010), Snapchat (2011), TikTok (2016), and Threads (2023). Social media became less fun and more essential as it became the de facto way most Americans got their news and culture. A whole generation of Americans born since about 1990 (some Millennials (born 1980 to 1995ish), all of Generation Z (1996 to 2012ish), and now Generation Alpha (2013 to ?)) knows social media as a fact of life.
I’ve been mostly off social media for the last few years, aside from doomscrolling and the occasional post about my kids, both Generation Alpha. That might change, but more on that another time.
I suspect that my kids will get most of their news and information from social media. Which makes it strange, and a little irresponsible, that we’ve never gotten around to actually regulating social media in any real way. In fact, the last time the United States updated its laws governing the internet was in 1996, before most of Generation Z was even born and before we had what came to be known as social media.
But change is on the horizon. I’ve written before about New York State’s efforts to regulate social media with the SAFE for Kids Act and the New York Child Data Protection Act, which did become law. And now we have the federal Children and Teens' Online Privacy Protection Act (“COPPA 2.0”) and the Kids Online Safety Act (“KOSA”), which passed the Senate but seem to be blocked in the House of Representatives.
I won’t prognosticate about what it would take to get COPPA 2.0 and KOSA through the House; if they do pass, however, I think they would survive the “strict scrutiny” standard under which the courts would evaluate the constitutionality of the laws. I’m assuming that the social media companies will argue that COPPA 2.0 and KOSA are content-based restrictions subject to that very high standard. Under that standard, the laws need to be narrowly tailored to meet a compelling interest, and keeping kids safe online and preventing social media addiction is that compelling interest. So I think we will get some regulation that stands up to industry lawsuits, assuming the House can pass it.
But COPPA 2.0 and KOSA are actually on the periphery of social media regulation, and the fact that we don’t even think we can pass laws to keep kids safe online is pretty damning. The more interesting question is how we got to this place where it's so hard to regulate social media. To understand how we got here, it's important to understand a little history.
Social Media Regulation Exists, It Just Sucks.
If I’m talking to someone on the phone, or even on a conference call, or I leave nasty messages on 3 million people’s voicemail, the phone company is not responsible. That has been the law for decades, and it makes sense, since the phone companies are really just conduits for people to create their own content and can’t possibly police their networks (we leave that to the actual police, who surveil phones for all types of reasons). Broadcasters and newspapers face a very different standard: they are totally responsible for what is on their networks and in their publications, and they get sued and fined often for that content. Remember Gawker? The Federal Communications Commission actually has an “indecency” rule for broadcasters under which, for example, you can litigate for 10 years over a fine for showing Janet Jackson’s breast.
Social media has a different scheme. Those companies can do all the content curation they want of your social media feed (like a broadcaster or a publisher) but are not responsible for what users post (like a telephone company). Why? It's sort of convoluted, but the idea was that you wanted these innovative social media companies to do some type of moderation without actually being responsible for what is on the platform. But content moderation is expensive, so the social media companies got a sweet deal: moderate if you want, and if you screw up, we will give you a liability shield. Also, in the early days, there was a real difference between the internet and traditional media, which still broadcast and published somewhere other than the internet, so doing things differently on “the internet” made sense. All of that is the genesis of the liability shield in Section 230 of the Communications Decency Act.
But is the liability shield of Section 230 still good policy, to the extent it ever was? After all, neither of its original premises holds today. Content moderation is no longer hard, and social media companies are no longer small, innovative upstarts. Moderation can be automated, and the authenticity of images (and users' ages), for example, could be verified pretty easily if the companies just spent the money to build out the systems. Of course, why spend that money when you have the liability shield? And there is no longer any distinction between broadcast media and internet platforms in terms of how they are delivered: media is far more likely to be consumed through the internet than through bunny ears on a television set or a broadsheet bought at the newsstand or delivered to your door.
So now we have a system where one set of content producers and moderators, the social media companies, has an exemption from all the laws governing the others, broadcasters and publishers, for no discernible reason. No wonder one is thriving while the other is withering. Which is to say that the entire way we think about regulating the internet needs to be rethought. That is what happens when you don’t update your technology laws in almost 30 years.
Regulation by Enforcement
In the absence of any real regulation, we have what we call “regulation by enforcement,” which means that regulators are using old laws that might not apply to go after businesses and hoping that in doing so, they can establish some sort of regulatory scheme. We see that in crypto a lot: there are no federal crypto regulations, so regulators sue crypto companies under U.S. securities laws and make the rules that way. We also see that in social media.
Because the federal government has no social media regulations and can’t seem to pass any, some states are passing laws and then trying to enforce them, and some 200 school districts are suing the social media companies to compensate them for the damage their businesses cause to kids’ mental health.
Let's take a look at both those efforts at “regulation by enforcement.”
Using what I’ll call the “Tobacco Model,” school districts, generally represented by private lawyers (or by in-house lawyers where the district has a highly capable legal department, as in my home of New York City, where my two Alphas are enrolled in public school), are suing the social media companies for forcing them to spend money addressing the youth mental health crisis. I call this the “Tobacco Model” because of the suits states and localities brought against the cigarette companies for their extra healthcare costs, which led to a multi-billion-dollar settlement. Here, the idea is that the social media companies have caused a mental health crisis among kids, forcing the localities to spend extra money dealing with it.
Those cases were filed in 2022 and are proceeding as a multidistrict litigation, meaning that all the suits filed in the plaintiffs’ home jurisdictions are now consolidated in the Northern District of California (San Francisco), where they are in discovery. There are now over 500 cases filed (plus a parallel state court action), and unless they settle or the companies prevail on a motion for summary judgment, there will be a trial on a few of the earliest cases. As far as I can tell, the first trial is scheduled for October 2025 in a so-called “bellwether” case, an early test case whose outcome is expected to signal how all the other cases will go. When you have 500 cases, you get through a few trials, and then the cases either settle or are dismissed en masse. Billions will probably move to these school districts, but what we really need are changes to the social media companies’ business models, and that will be harder to win. Should be interesting, and I’ll keep you informed.
The second kind of “regulation by enforcement” is a little different. When former President Trump was kicked off Twitter (now X) and Facebook for inciting the riot/insurrection of January 6, 2021, Texas and Florida each passed a law that prevented the platforms from moderating, basically not allowing the platforms to kick anyone off. Those laws were challenged by the tech companies and their industry associations on First Amendment grounds, all the way up to the Supreme Court. The Supreme Court reiterated its long-held position that social media is no different from traditional media in its editorial choices: just as the government could never tell a newspaper which articles to run and which to ignore, the government has no right to tell a social media network what to carry. That is all well and good, but the issue is that the social media platforms do too little editing, not too much. At least it's nice to know that if they want to be socially responsible, the Constitution is not standing in the way. Another similar case was dismissed on standing grounds without reaching the merits.
Regulation by enforcement is a boon to lawyers since someone needs to sort through the uncertainty and bring and defend the lawsuits and regulatory actions. But it’s a really bad way to regulate. Let’s hope we get some real leadership and clarity in this important area, at least before my Alphas are my age.
Keep building, keep thinking,
Jesse
Hi, and welcome to my newsletter! I’m Jesse Strauss, Your Fractional General Counsel. I’m a lawyer with a private practice based in New York City, helping clients in the United States and globally with their US legal needs. My expertise spans various areas, including raising funding rounds, employment issues, negotiating master service agreements, intellectual property, compliance, legal process management, and dispute resolution. My focus is on founding and nurturing great companies from seed to exit. Discover more at www.yourfractionalgc.com and book a complimentary 30-minute consultation at Contact Your Fractional GC.