
What to Know About the Supreme Court Arguments on Social Media Laws

Both Florida and Texas passed laws regulating how social media companies moderate speech online. The laws, if upheld, could fundamentally alter how the platforms police their sites.


By David McCabe

McCabe reported from Washington.

Social media companies are bracing for Supreme Court arguments on Monday that could fundamentally alter the way they police their sites.

After Facebook, Twitter and YouTube barred President Donald J. Trump in the wake of the Jan. 6, 2021, riots at the Capitol, Florida made it illegal for technology companies to ban from their sites a candidate for office in the state. Texas later passed its own law prohibiting platforms from taking down political content.

Two tech industry groups, NetChoice and the Computer & Communications Industry Association, sued to block the laws from taking effect. They argued that the companies have the right to make decisions about their own platforms under the First Amendment, much as a newspaper gets to decide what runs in its pages.

So what’s at stake?

The Supreme Court’s decision in those cases — Moody v. NetChoice and NetChoice v. Paxton — is a big test of the power of social media companies, potentially reshaping millions of social media feeds by giving the government influence over how and what stays online.

“What’s at stake is whether they can be forced to carry content they don’t want to,” said Daphne Keller, a lecturer at Stanford Law School who filed a brief with the Supreme Court supporting the tech groups’ challenge to the Texas and Florida laws. “And, maybe more to the point, whether the government can force them to carry content they don’t want to.”

If the Supreme Court says the Texas and Florida laws are constitutional and they take effect, some legal experts speculate that the companies could create versions of their feeds specifically for those states. Still, such a ruling could usher in similar laws in other states, and it is technically complicated to accurately restrict access to a website based on location.

Critics of the laws say the feeds to the two states could include extremist content — from neo-Nazis, for example — that the platforms previously would have taken down for violating their standards. Or, the critics say, the platforms could ban discussion of anything remotely political by barring posts about many contentious issues.

What are the Florida and Texas social media laws?

The Texas law prohibits social media platforms from taking down content based on the “viewpoint” of the user or expressed in the post. The law gives individuals and the state’s attorney general the right to file lawsuits against the platforms for violations.

The Florida law fines platforms if they permanently ban from their sites a candidate for office in the state. It also forbids the platforms from taking down content from a “journalistic enterprise” and requires the companies to be upfront about their rules for moderating content.

Proponents of the Texas and Florida laws, which were passed in 2021, say that they will protect conservatives from the liberal bias that they say pervades the platforms, which are based in California.

“People the world over use Facebook, YouTube, and X (the social-media platform formerly known as Twitter) to communicate with friends, family, politicians, reporters, and the broader public,” Ken Paxton, the Texas attorney general, said in one legal brief. “And like the telegraph companies of yore, the social media giants of today use their control over the mechanics of this ‘modern public square’ to direct — and often stifle — public discourse.”

Chase Sizemore, a spokesman for the Florida attorney general, said the state looked “forward to defending our social media law that protects Floridians.” A spokeswoman for the Texas attorney general did not provide a comment.

What are the current rights of social media platforms?

They now decide what does and doesn’t stay online.

Companies including Meta’s Facebook and Instagram, TikTok, Snap, YouTube and X have long policed themselves, setting their own rules for what users are allowed to say while the government has taken a hands-off approach.

In 1997, the Supreme Court ruled that a law regulating indecent speech online was unconstitutional, differentiating the internet from mediums where the government regulates content. The government, for instance, enforces decency standards on broadcast television and radio.

For years, bad actors have flooded social media with misleading information, hate speech and harassment, prompting the companies to come up with new rules over the last decade that include forbidding false information about elections and the pandemic. Platforms have banned figures like the influencer Andrew Tate for violating their rules, including against hate speech.

But there has been a right-wing backlash to these measures, with some conservatives accusing the platforms of censoring their views — and even prompting Elon Musk to say he wanted to buy Twitter in 2022 to help ensure users’ freedom of speech.

What are the social media platforms arguing?

The tech groups say that the First Amendment gives the companies the right to take down content as they see fit, because it protects their ability to make editorial choices about the content of their products.

In their lawsuit against the Texas law, the groups said that just like a magazine’s publishing decision, “a platform’s decision about what content to host and what to exclude is intended to convey a message about the type of community that the platform hopes to foster.”

Still, some legal scholars are worried about the implications of allowing the social media companies unlimited power under the First Amendment, which is intended to protect the freedom of speech as well as the freedom of the press.

“I do worry about a world in which these companies invoke the First Amendment to protect what many of us believe are commercial activities and conduct that is not expressive,” said Olivier Sylvain, a professor at Fordham Law School who until recently was a senior adviser to the Federal Trade Commission chair, Lina Khan.

How does this affect Big Tech’s liability for content?

A federal law known as Section 230 of the Communications Decency Act shields the platforms from lawsuits over most user content. It also protects them from legal liability for how they choose to moderate that content.

That law has been criticized in recent years for making it impossible to hold the platforms accountable for real-world harm that flows from posts they carry, including online drug sales and terrorist videos.

The cases being argued on Monday do not challenge that law head-on. But the Section 230 protections could play a role in the broader arguments over whether the court should uphold the Texas and Florida laws. And the state laws would indeed create new legal liability for the platforms if they take down certain content or ban certain accounts.

Last year, the Supreme Court considered two cases, directed at Google’s YouTube and Twitter, that sought to limit the reach of the Section 230 protections. The justices declined to hold the tech platforms legally liable for the content in question.

What comes next?

The court will hear arguments from both sides on Monday. A decision is expected by June.

Legal experts say the court may rule that the laws are unconstitutional, but provide a road map on how to fix them. Or it may uphold the companies’ First Amendment rights completely.

Carl Szabo, the general counsel of NetChoice, which represents companies including Google and Meta and lobbies against tech regulations, said that if the group’s challenge to the laws fails, “Americans across the country would be required to see lawful but awful content” that could be construed as political and therefore covered by the laws.

“There’s a lot of stuff that gets couched as political content,” he said. “Terrorist recruitment is arguably political content.”

But if the Supreme Court rules that the laws violate the Constitution, it will entrench the status quo: Platforms, and no one else, will determine what speech gets to stay online.

Adam Liptak contributed reporting.

David McCabe covers tech policy. He joined The Times from Axios in 2019. More about David McCabe


Elonis v. U.S.

This First Amendment activity applies the landmark Supreme Court case Elonis v. U.S. to a teen conflict posted on Facebook.

The First Amendment provides that “Congress shall make no law . . . abridging the freedom of speech [.]”

Elonis v. U.S. (2015) is the first time that the Supreme Court of the United States has agreed to hear a case involving the constitutionality of prosecuting potential threats in a social media context. This is a relatively new and rapidly developing area of law. The Court’s decision may have far-reaching consequences for the development of First Amendment law, in general, and for students and others who use social media, in particular.

Most students and a majority of adults use some form of social media, including Facebook, Twitter, LinkedIn, etc. The growth of social media has often blurred the lines between professional and personal conduct. 

What’s Different About This Activity?

  • Landmark Case with Teen Scenario
  • Realistic Court Simulation
  • Ready in 30 Minutes
  • Every Learning Style Involved
  • Centerpiece is Jury Deliberations
  • Challenges Media Stereotypes

Issues also have developed as statements made on social media are taken out of context. This is especially true when the individual who makes a statement cannot control who else views the statement and/or how others interpret it. For instance, several court cases have arisen over the authority of schools to discipline students for comments about teachers and school administrators that the students made outside of school on their own personal social media sites. 

There is common concern that comments made on social media sites may be misconstrued if they are taken out of context. On the other hand, there are legitimate concerns that authorities must protect against cyberbullying, harassment, and threats that are made on social media. As a result, when they are drafting laws, state and the federal lawmakers struggle with how to balance First Amendment free speech rights with the interests of individuals who want to be free from harassment, fear, and intimidation on the Internet.

How to Use These Resources

Download the agenda and complete activity package (doc).

  • While waiting for the program to start, participants review the agenda and read the Elonis v. U.S. facts and case summary and fictional scenario.
  • Students volunteer to be attorneys for each side – four student attorneys for Elonis and four student attorneys for the U.S. They will work with attorney coaches to review the talking points (doc) they will present to the host Judge. These are suggested points, not a script, for the debate. Student attorneys are encouraged to add their own arguments.
  • After the closing arguments are presented for both sides, all other students are jurors who deliberate during the open floor debate. They use the worksheets (doc) they have prepared to present their arguments.

The host Judge, attorney coaches, and student attorneys watch as the jurors debate in open court. The program moderator facilitates the arguments and makes sure that everyone has the opportunity to speak. As the debate comes to a close, the moderator asks for a show-of-hands vote. Because of time constraints, the verdict does not have to be unanimous.

DISCLAIMER: These resources are created by the Administrative Office of the U.S. Courts for educational purposes only. They may not reflect the current state of the law, and are not intended to provide legal advice, guidance on litigation, or commentary on any pending case or legislation.



Supreme Court hears cases involving free speech rights on social media

Amna Nawaz

Saher Khan

Ian Couzens


The Supreme Court heard arguments in highly consequential cases navigating First Amendment protections on social media. Tech companies are taking on state laws, decrying conservative censorship online. A decision could fundamentally change the use of speech on the internet. Amna Nawaz discussed the hearing with Supreme Court analyst Marcia Coyle.

Read the Full Transcript

Notice: Transcripts are machine and human generated and lightly edited for accuracy. They may contain errors.

Amna Nawaz:

The Supreme Court heard arguments today in a highly consequential case navigating First Amendment protections on social media.

Tech companies are taking on state laws, decrying conservative censorship online. A decision here could fundamentally change the use of speech on the Internet.

The Supreme Court is wading into a digital age First Amendment battle. Do social media companies have the right to decide what appears on and what's removed from their own platforms? That is the question at the heart of two major cases heard today by the justices.

A decision here could give government the power to change what millions of people see online. After sites like Twitter and Facebook removed former President Donald Trump following the January 6 attack on the U.S. Capitol, Texas and Florida passed laws restricting how these platforms moderate and remove content and users from their Web sites.

But tech industry groups sued the states.

Alan Gura, Vice President for Litigation, Institute for Free Speech: Whether it happens to a conservative group or to a liberal group or to any other kind of group, OK, people in America should be able to access the modern public square to express themselves. It does tend to be conservative groups that are more under the thumb of some of these social media sites.

Alan Gura with the Institute for Free Speech filed an amicus brief with conservative activist group Moms for Liberty in support of the states.

Alan Gura:

Moms for Liberty had a problem.

The teachers union, their sort of traditional political adversary, went to Facebook and put pressure on Facebook and said, look, these are the people who are promoting disinformation. And, instead, Moms' chapters saw all kinds of posts blocked, things that were very innocuous, things like, are you ready to run for school board, or questions about, hey, does anybody know what curriculum is being used by a school district?

David Greene, Electronic Frontier Foundation:

I think we can all agree that content moderation as a process is really problematic. I don't think the right solution to that is to give the government the ability to impose its own editorial viewpoints on private actors. I think that's a dangerous power to hand the government.

David Greene is with the Electronic Frontier Foundation and filed a brief opposing the states.

David Greene:

Social media sites have a First Amendment right to curate and edit their sites according to their own curatorial and editorial philosophies and policies. That is a right that others in their position have, whether they be art curators or parade organizers.

But are tech companies publishers? Gura and the states don't think so.

Alan Gura:

Whose speech is it? And nobody thinks that your speech is the company's speech. Obviously, it's your speech: if I pick up the phone and talk to you today, it won't be AT&T's speech, and AT&T can't unplug me because they don't like my politics.

That back-and-forth is what the justices themselves navigated today.

Marcia Coyle was in the courtroom and joins us now.

Marcia, it's great to see you.

Marcia Coyle:

Good to see you.

Amna Nawaz:

These are big issues here, free speech and content moderation and social media platforms.

How did the justices seem to be navigating and examining these issues today?

Marcia Coyle:

Well, I think it's a difficult one. Just as you said, on many levels, they're having trouble.

And — but they asked good questions. Most of the arguments focused on, as one of your speakers just said, whether social media platforms fall into a category of newspaper publishers, where they can pretty much determine how they use the content they have, or are they more like common carriers, such as a telegraph or anything that carries a message from point A to B, but doesn't do anything else?

So they also struggled with language. Justice Alito asked at one point, well, what is content moderation? Is it just another way of saying censorship? And there were other words, too, that created problems. So this is a difficult case for them on more levels than just determining which category to put social media platforms into.

Amna Nawaz:

I mean, the concerns around censorship online have long been more of a conservative issue. Did we hear questions from the conservative justices that seemed to align with that view or to challenge it?

Marcia Coyle:

No, not at all.

And it seemed as though, as they struggled with the categories of newspapers versus common carriers, that they weren't focused at all on politics or ideology. This is clearly an attempt to become very familiar with what social media does, what these platforms do. And that's one of the problems that they're having in the case.

They don't know how broadly these laws sweep. Justice Barrett, for example, pointed out, well, some say these laws could cover Venmo, Uber, e-mail.

Amna Nawaz:

Not just limited to social media platforms.

Marcia Coyle:

Exactly, e-mail, direct messaging. And they don't know. In fact, as they asked the lawyers, they said, well, they might — it might cover them.

And why don't they know? Because the way the case came to the Supreme Court, there was no trial below on the merits to flesh all this out through discovery.

Amna Nawaz:

Well, I want to ask you about the arguments on both sides of the debate here.

And we did speak earlier with Jameel Jaffer of The Knight Institute, who argues that actually both sides of the debate have some merit to their arguments. Take a listen.

Jameel Jaffer, Knight First Amendment Institute:

Everybody involved in it claims to be a champion of free speech and the First Amendment. You have the social media platforms claiming that they are speakers and editors here and that these laws are a form of censorship of their First Amendment-protected activity.

And on the other side, then you have the states arguing that these laws are intended to protect the free speech rights of social media platforms' users. The truth is that everybody has a point. You need to find a way of accounting for all of the First Amendment interests in play here.

Amna Nawaz:

Marcia, for an issue as core as free speech, we're talking about the First Amendment here, and as broad and influential as these social media platforms, what are the implications of a decision like this?

Marcia Coyle:

Well, it depends on who wins and who loses. If the platforms lose, they claim that they will have to put all kinds of speech on their platforms, the good, the bad, and the ugly.

And as for their desire, their rules to try to get a handle on hate speech, on bullying: they will have to also put up pro-bullying and pro-hate speech. They just will not be able to exercise the editorial discretion they have. On the other hand, the states don't think that there's going to be a parade of horribles; they say there are other ways to deal with that kind of bad speech.

Amna Nawaz:

So does all of this say to you that the justices are more likely to try to keep this as narrow as possible?

Marcia Coyle:

Yes, it does.

In fact, I think, because they don't know how broadly the law sweeps, they did talk — I will say, though, that it seemed to me they were more inclined to view platforms as closer to newspapers and publishers than to common carriers.

But because they don't know how broad the law sweeps, they did talk about keeping injunctions that are in place right now that temporarily keep the laws on hold, but sending the cases back to the lower court in order to flesh out a lot of these issues.

Amna Nawaz:

We should mention, too, this is one of a handful of cases the justices are considering about social media.

Precedent here is hard, right? A lot of it predates the Internet era. What should we understand about why the justices are taking up these cases and how they're viewing them?

Marcia Coyle:

Amna, I think this is just the inevitability of how things have changed and there are challenges and they come to the court. So I'm not surprised that they're getting more and more into this and having more and more cases come to them.

Just this term, not only do we have the two cases from Florida and Texas, but there are two additional cases that they already heard arguments in that really involve public officials and how they use their Web sites and whether they can block commenters on their Web sites.

So, I think we're going to see these cases come in a variety of situations. And it's a new world for the justices. For many of them, it's a new world.

Amna Nawaz:

A lot to make sense of at the Supreme Court. We're so glad you're here to help us do it all.

Marcia Coyle, thank you so much. Great to see you.

Marcia Coyle:

My pleasure, Amna.


Amna Nawaz serves as co-anchor of PBS NewsHour.

Saher Khan is a reporter-producer for the PBS NewsHour.


Supreme Court wades into social media wars over free speech


WASHINGTON — Justice Elena Kagan drew laughter in February when she remarked in court that she and her eight colleagues on the Supreme Court are not "the nine greatest experts on the internet."

But that hasn't stopped the justices from taking up a new series of high-stakes cases on the role of social media in society, all of which raise different free speech questions and could have broad repercussions.

The cases on the docket reflect how social media has become a contentious battleground in society as a whole, with the rules of the road yet to be fully defined.

“Collectively, they are likely to have quite dramatic effect on the digital public sphere,” said Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University.

Noting that the Supreme Court gets to decide what cases it takes up, he added that the justices “plainly want to be part of this debate.”

The court hears the first round of arguments in those cases on Tuesday, as the justices weigh a recurring question that first came to prominence when then-President Donald Trump blocked critics from following him on Twitter: When a public official blocks someone, does that violate the Constitution’s First Amendment?

The cases feature public officials with significantly lower profiles than Trump: members of a school district board of trustees in Southern California and a city manager in Michigan.

The way the issue has arisen at all levels of government reflects how elected officials increasingly use social media to interact with voters.

The Supreme Court’s ruling on whether the officials were acting in their official capacities when posting to social media, meaning they can be sued for violating free speech rights under the Constitution's First Amendment, will have a broad impact in guiding how lower courts handle such cases.

Later in the court's term, which runs until June, the justices will hear oral arguments over the constitutionality of Republican-backed laws in Florida and Texas that seek to prevent social media companies from banning users for contentious rhetoric. Both laws were enacted at a time when Republicans were outraged at what they perceived as anti-conservative bias in moderation decisions.

In those cases it is the social media companies themselves, via trade groups, arguing that their free speech rights to choose what content to allow on their platforms would be violated.

Finally, the court will consider claims that the Biden administration has unlawfully put pressure on social media platforms to remove content with which it disagrees, a form of coercion dubbed "jawboning" — on issues such as criticism of the government response to the pandemic.

Again, the case raises free speech claims, on this occasion brought by states and individuals asserting the First Amendment right of users to be able to post their chosen content without government interference.

Earlier this year the court heard two other cases concerning Twitter and Google. Then, the court sidestepped a major ruling that could have limited the liability protections that platforms enjoy for content posted by users.

Daphne Keller, an expert on internet law at Stanford Law School, said the justices in those earlier cases, as shown by Kagan's quip made at oral argument, realized they had taken up an issue that was "frankly too complicated for their first foray into this area."

As such, the justices "recognized what huge unintended consequences a careless ruling could have," she added.

The Supreme Court itself, unlike some high courts in other countries, has no social media presence and none of the justices have accounts in their professional capacities. In 2020, Kagan admitted to being an anonymous lurker on Twitter, while former Justice Stephen Breyer said more than a decade ago that he was a Twitter and Facebook user.

The average age of the justices has, however, decreased substantially in the last six years, with four new justices all now in their 50s replacing much older members of the court.

But lawyers point out that the new cases are squarely on an issue the justices do know about — free speech rights — and do not hinge on a deep understanding of technology.

"Many of these cases talk about how social media is the modern public square to try to convey to the justices, who may not be as familiar with social media, just how important this is in the present day and age," said Jenin Younes, a lawyer at the New Civil Liberties Alliance who represents individual plaintiffs in the jawboning case.

"I do think it can be hard for some people who don't use it as much to grasp that this is where a lot of public discourse takes place," she added.

Ironically, the cases reach the Supreme Court as some industry experts have started to predict that the age of social media may already be over.

Anger on the right aimed at social media companies has also been blunted somewhat by Elon Musk's takeover of Twitter, which he has renamed X.

Conservative outrage that led to the Florida and Texas laws being enacted in 2021 was fueled in part by the decisions of Twitter, Facebook and others to ban Trump after his effort to overturn the 2020 presidential election results ended in his supporters storming the U.S. Capitol on Jan. 6, 2021.

In contrast with the previous Twitter management, Musk has allied himself with conservative critics of the platform and allowed various banned users, including Trump, to return, while abandoning efforts to limit the spread of disinformation.

Jaffer stressed the importance of the justices cutting through the politics of the day and coming up with legal rules that will apply fairly no matter who is in power.

“It’s not that easy to come up with the right answers even for those thinking about it every day,” he said. “These are hard questions.”


Lawrence Hurley covers the Supreme Court for NBC News.

The Supreme Court Cases That Could Redefine the Internet

What does freedom of speech actually mean on social media? We’re about to find out.


In the aftermath of the January 6 attack on the U.S. Capitol, both Facebook and Twitter decided to suspend lame-duck President Donald Trump from their platforms. He had encouraged violence, the sites reasoned; the megaphone was taken away, albeit temporarily. To many Americans horrified by the attack, the decisions were a relief. But for some conservatives, it marked an escalation in a different kind of assault: It was, to them, a clear sign of Big Tech’s anti-conservative bias.

That same year, Florida and Texas passed bills to restrict social-media platforms’ ability to take down certain kinds of content. (Each is described in this congressional briefing.) In particular, they intend to make political “deplatforming” illegal, a move that would have ostensibly prevented the removal of Trump from Facebook and Twitter. The constitutionality of these laws has since been challenged in lawsuits—the tech platforms maintain that they have a First Amendment right to moderate content posted by their users. As the separate cases wound their way through the court system, federal judges (all of whom were nominated by Republican presidents) were divided on the laws’ legality. And now they’re going to the Supreme Court.

On Friday, the Court announced it would be putting these cases on its docket. The resulting decisions could be profound: “This would be—I think this is without exaggeration—the most important Supreme Court case ever when it comes to the internet,” Alan Rozenshtein, a law professor at the University of Minnesota and a senior editor at Lawfare, told me. At stake are tricky questions about how the First Amendment should apply in an age of giant, powerful social-media platforms. Right now, these platforms have the right to moderate the posts that appear on them; they can, for instance, ban someone for hate speech at their own discretion. Restricting their ability to pull down posts would cause, as Rozenshtein put it, “a mess.” The decisions could reshape online expression as we currently know it.

Read: Is this the beginning of the end of the internet?

Whether or not these particular laws are struck down is not what’s actually important here, Rozenshtein argues. “What’s much, much more important is what the Court says in striking down those laws—how the Court describes the First Amendment protections.” Whatever they decide will set legal precedents for how we think about free speech when so much of our lives take place on the web. Rozenshtein and I caught up on the phone to discuss why these cases are so interesting—and why the decision might not fall cleanly along political lines.

Our conversation has been condensed and edited for clarity.

Caroline Mimbs Nyce: How did we get here?

Alan Rozenshtein: If you ask the companies and digital-civil-society folks, we got here because the crazy MAGA Republicans need something to do with their days, and they don’t have any actual policy proposals. So they just engage in culture-war politics, and they have fastened on Silicon Valley social-media companies as the latest boogeyman. If you ask conservatives, they’re going to say, “Big Tech is running amok. The liberals have been warning us about unchecked corporate power for years, and maybe they had a point.” This really came to a head when, in the wake of the January 6 attack on the Capitol, major social-media platforms threw Donald Trump, the president of the United States, off of their platforms.

Nyce: Based on what we know about the Court, do we have any theories about how they’re going to rule?

Rozenshtein: I do think it is very likely that the Texas law will be struck down. It is very broad and almost impossible to implement. But I think there will be some votes to uphold the Florida law. There may be votes from the conservatives, especially Justices Samuel Alito and Clarence Thomas, but you might also get some support from some folks on the left, in particular Justices Ketanji Brown Jackson and Sonia Sotomayor—not because they believe conservatives are being discriminated against, but because they themselves have a lot of skepticism of private power and big companies.

But what’s actually important is not whether these laws are struck down or not. What’s much, much more important is what the Court says in striking down those laws—how the Court describes the First Amendment protections.

Nyce: What are the important things for Americans to consider at this moment?

Rozenshtein: This would be—I think this is without exaggeration—the most important Supreme Court case ever when it comes to the internet.

The Supreme Court in 1997 issued a very famous decision in a case called Reno v. ACLU. And this was a constitutional case about what was called the Communications Decency Act. This was a law that purported to impose criminal penalties on internet companies and platforms that transmitted indecent content to minors. So this is part of the big internet-pornography scare of the mid-’90s. The Court said this violates the First Amendment because to comply with this law, platforms are going to have to censor massive, massive, massive amounts of information. And that’s really bad. And Reno v. ACLU has always been considered the kind of Magna Carta of internet–First Amendment cases, because it recognized the First Amendment is really foundational and really important. The Court has recognized this in various forms since then. But, in the intervening almost 30 years, it’s never squarely taken on a case that deals with First Amendment issues on the internet so, so profoundly.

Even if the Court strikes these laws down, if it does not also issue very strong language about how platforms can moderate—that the moderation decisions of platforms are almost per se outside the reach of government regulation under the First Amendment—this will not be the end of this. Whether it’s Texas or Florida or some blue state that has its own concerns about content moderation of progressive causes, we will continue to see laws like this.

This is just the beginning of a new phase in American history where, rightly, it is recognized that because these platforms are so important, they should be the subject of government regulation. For the next decade, we’ll be dealing with all sorts of court challenges. And I think this is as it should be. This is the age of Big Tech. This is not the end of the conversation about the First Amendment, the internet, and government regulation over big platforms. It’s actually the beginning of the conversation.

Nyce: This could really influence the way that Americans experience social media.

Rozenshtein: Oh, it absolutely could, in very unpredictable ways. If you believe the state governments, they’re fighting for internet freedom, for the freedom of users to be able to use these platforms, even if users express unfriendly or unfashionable views. But if you listen to the platforms and most of the tech-policy and digital-civil-society crowd, they’re the ones fighting for internet freedom, because they think that the companies have a First Amendment right to decide what’s on the platforms, and that the platforms only function because companies aggressively moderate.

Even if the conservative states are arguing in good faith, this could backfire catastrophically. Because if you limit what companies can do to take down harmful or toxic content, you’re not going to end up with a freer speech environment. You’re going to end up with a mess.


Social Media

Written by Deborah Fisher, published on October 24, 2023, last updated on May 8, 2024


Several First Amendment cases have arisen over the use of social media. Courts have examined how far the government can go in regulating speech on social media without violating the First Amendment and the liability of social media companies in spreading terrorist content.

Though they are private businesses and not government entities, U.S. social media platforms have nonetheless been at the center of a number of free speech disputes.

Social media is a method of internet-based communication in which users create communities and share information, videos and personal messages with each other. Some of the most popular social media platforms are Facebook, YouTube, X (formerly called Twitter), Instagram, TikTok and Snapchat.

Users of social media can create accounts, share information, and follow, like and block other users. Social media companies that control the platforms can customize some of what users see through algorithms.

A user has to agree to a platform’s rules, which often allow the social media company to remove or block accounts. 

First Supreme Court speech case involving social media came in 2017

The first free speech case to reach the Supreme Court that involved social media was in 2017 when the court struck down a state law prohibiting a convicted sex offender from using the platforms. Noting the power of social media, the court called it "the modern public square" and said that these “websites can provide perhaps the most powerful mechanisms available to a private citizen to make his or her voice heard.”

Since then the Supreme Court has examined, or is in the process of examining:

  • When a government official or government entity with a social media account can block, under the public forum doctrine, others from seeing posts or responding to posts on those accounts;
  • Whether states can pass laws that regulate or restrict a social media company’s content moderation activities, such as when a social media company chooses to remove individual posts or entire user accounts;
  • Whether the federal government is violating free speech rights when it works with social media companies to remove posts that it considers disinformation or a national security threat;
  • Whether social media companies can be held liable when their algorithms recommend content by terrorist groups to others; and
  • What level of proof is necessary to show that threats published or sent on social media can result in a criminal conviction of stalking.

Some have warned that the use of disguised social media accounts by Russia, China and others to spread disinformation and sow division and confusion among Americans is a new type of “information warfare” that threatens national security and warrants some type of government intervention. Others counter that such intervention could lead to improper government censorship and limit the right of Americans to receive information through social media platforms.

Other cases have arisen in lower courts over the extent of the government's ability to limit the speech of government employees or public school students on social media.

Court strikes down law barring sex offenders from social media

In its first Supreme Court case involving social media in 2017, the court considered a North Carolina law intended to restrict sex offenders from cultivating new victims on social media platforms.

Authorities had charged Lester Packingham under the law after he posted a message on Facebook thanking God after the dismissal of a traffic ticket. The problem for Packingham was that he had been convicted of taking indecent liberties with a 13-year-old when he was a 21-year-old college student and was a convicted sex offender.

The Supreme Court reversed the lower courts that had upheld the charges against Packingham. It explained in Packingham v. North Carolina that the law was overbroad and would criminalize a wide range of legal speech activities.

Can government officials block users, delete comments?

Government officials routinely use social media to communicate policy, advocate positions, introduce new legislation and for other communication.

However, once a government entity or government official creates a forum that allows people to comment on posts, the government may run into First Amendment hurdles if the entity or official tries to shut down or silence opposing viewpoints.

In 2024, the Supreme Court looked more closely at when a government official might be violating free speech rights by deleting users’ comments or blocking them. In Lindke v. Freed, the court established a new test to determine when such an official was engaging in state action versus private action. The court explained that a government official engages in state action on social media if (1) he or she had “actual authority to speak on behalf of the State on a particular matter,” and (2) if he or she “purported to exercise that authority in the relevant posts.”

Justice Amy Coney Barrett, in explaining the test, said that “the line between private conduct and state action is difficult to draw.” But she noted that “the distinction between private conduct and state action turns on substance, not labels.”

A few years earlier, in 2021, another case had reached the Supreme Court — this one involving President Donald Trump and his Twitter account. In 2019, the 2nd U.S. Circuit Court of Appeals had ruled in Knight First Amendment Institute v. Trump that Trump violated the First Amendment by removing from the “interactive space” of his Twitter account several individuals who were critical of him and his governmental policies.

The appeals court agreed with a lower court that the interactive space associated with Trump’s Twitter account, “@realDonaldTrump,” is a  designated public forum  and that blocking individuals because of their political expression constitutes  viewpoint discrimination .

The Supreme Court in April 2021 vacated the decision and sent it back to the 2nd Circuit with instructions to dismiss it as moot — Trump was no longer president. Also, after the Jan. 6, 2021, attack on the U.S. Capitol, Twitter had eliminated his account over concern that his comments were being interpreted as encouragement to commit violence, barring him from the platform.

Circuit courts split over states regulating social media content

An emerging issue is how far social media companies themselves can go in removing content and users. Some Republicans have complained that posts by conservative leaders and journalists are being blocked, or their accounts removed, because of their views, similar to what happened to Trump.


Florida Gov. Ron DeSantis speaks at Miami’s Freedom Tower, on Monday, May 9, 2022. A Florida law intended to punish social media platforms like Facebook and Twitter for blocking or removing conservative viewpoints is an unconstitutional violation of the First Amendment, a federal appeals court ruled Monday, May 23, 2022, dealing a major victory to companies who had been accused by DeSantis of discriminating against conservative thought. (AP Photo/Marta Lavandier, File)

Two states, Florida and Texas, passed similar, though not identical, laws to reduce such blocking, saying the social media companies were discriminating by not allowing certain people to use their platforms based on their political views.

The laws are on hold while the U.S. Supreme Court considers if they violate the First Amendment.

Trade associations for the social media companies argue that content moderation, which includes removing posts or users, is a form of editorial judgment protected by the First Amendment as their own speech. Just as the government can’t force a newspaper to publish something, the government can’t force social media companies to allow certain content on their platforms, they argue.

The Supreme Court took the case after two U.S. Circuit Courts in 2022 reached different conclusions related to a state’s ability to pass laws regulating a social media company’s content-moderation activities.

In NetChoice v. Attorney General of Florida, the 11th Circuit Court of Appeals upheld an injunction preventing the Florida law from going into effect, saying the “Stop Social Media Censorship Act” would likely be found to violate the First Amendment. The Florida law sought to prohibit social media companies from “deplatforming” political candidates, from prioritizing or deprioritizing any post or message by or about a candidate and from removing anything posted by a “journalistic enterprise” based on content.

The 11th Circuit held that social media companies are private enterprises that have a First Amendment right to moderate and curate the content they disseminate on their platforms.

A few months later, the 5th U.S. Circuit Court of Appeals took an opposite view of the similar Texas law and vacated a preliminary injunction by a local court that had prevented it from being enforced.

In NetChoice v. Paxton, the court supported Texas’s view that social media companies function as “common carriers,” like a phone company, and as such can be regulated with anti-discrimination laws. The court said it rejected “the idea that corporations have a freewheeling First Amendment right to censor what people say” on their platforms.

The Texas law “does not regulate the (p)latforms’ speech at all; it protects other people’s speech and regulates the (p)latform’s conduct,” the court said.

The Supreme Court is expected to rule in 2024.

Court will consider federal coercion to remove social media posts

Another case regarding government control of social media sites is under consideration by the Supreme Court in 2024. This case stems from lawsuits by the states of Missouri and Louisiana alleging that federal officials and agencies coerced social media companies to remove certain posts because they believed the posts were disinformation or could harm national security.

The court in Murthy v. Missouri will consider whether the actions by officials in the Biden administration were sufficient for the removal of posts to be considered state action. Stated another way, the case concerns whether the social media companies — because of the “encouragement” and alleged coercion by the federal government agencies — engaged in state action sufficient to trigger constitutional claims.

Are social media companies liable for promoting terrorist content?

In another set of cases decided in 2023, the Supreme Court examined:

  • Whether Section 230 of the Communications Decency Act that shields internet service providers from being legally liable for user-created content also shields them when their algorithms promote or recommend content to others; and
  • Whether a social media company provides substantial assistance to terrorists by allowing them to operate on their platforms and can be held liable for aiding and abetting them in their attacks by not taking more aggressive action to remove their content from their platforms.

The court declined to rule in Gonzalez v. Google (2023) on whether targeted recommendations by a social media company’s algorithms would fall outside the liability shield of Section 230 of the Communications Decency Act.

Instead, the court said that its ruling in Twitter v. Taamneh on the same day “is sufficient to acknowledge that much (if not all) of plaintiffs’ complaint seems to fail.” In Twitter, the court found that social media companies’ hosting of terrorist content, their lack of action in removing that content and their algorithms recommending it were not enough to show that they aided and abetted the terrorists in an attack in Istanbul that killed 39 people.

In both cases, families of Americans killed in ISIS attacks had argued that by allowing ISIS to post videos and other content to communicate the terrorists’ messages and to radicalize new recruits, the social media platforms aided and abetted terrorist attacks that killed their relatives, allowing the families to seek damages under the Anti-Terrorism Act.

Courts have examined online stalking, ‘true threats’ on social media

Online stalking also has been at the center of First Amendment cases in which courts have had to decide whether repeated unwanted and threatening communications to a person over social media are “true threats” unprotected by the First Amendment.

In legal parlance, a true threat is a statement that is meant to frighten or intimidate one or more specified persons into believing that they will be seriously harmed by the speaker or by someone acting at the speaker’s behest.

In Elonis v. United States (2015), a case involving posts on Facebook, the U.S. Supreme Court reversed a criminal conviction under a federal statute criminalizing the transmission of threats because the jury had been instructed that it needed to find only that a reasonable person would view the speech as threatening, without considering the mental state of the speaker. The court did not, however, state what standard of proof was necessary to determine the speaker’s intent. Must it be objective, a reasonable person considering the facts and context? Or must it be subjective, proving the speaker’s understanding of the effect of the messages when sending them?

The court provided additional guidance on what constitutes a “true threat” in a different stalking case involving a man who made posts about a female musician on Facebook. In Counterman v. Colorado (2023), the U.S. Supreme Court vacated the man’s stalking conviction, sending it back to the lower court for reconsideration. The court ruled that the First Amendment’s protection of free speech requires prosecutors to show that the speaker was aware of the threatening nature of his communications.

New laws, regulations restrict use of TikTok

In 2023 and 2024, states and the federal government started banning access to TikTok on government devices. Some states have also passed legislation that would effectively bar TikTok use in their state. The bans arise from concerns that a Chinese company owns the popular video platform and that the Communist government of China could harvest data on Americans and use the platform against America’s interests. The concerns largely center on privacy and national security.

Courts have upheld the government’s right to bar the use of TikTok on government devices and even on public university Wi-Fi networks. However, questions have arisen about how far regulations can go in barring TikTok use more broadly.

For example, Montana became the first U.S. state to attempt to ban TikTok in a law passed in May 2023. A judge temporarily halted the law from going into effect after TikTok filed a lawsuit arguing that the ban constituted prior restraint on speech, which is unconstitutional under the First Amendment. The state says it plans to appeal.

Social media posts have gotten public employees, students in trouble


Former Mississippi high school rapper Taylor Bell is flanked by his attorneys, Wilbur Colom, left, and Scott Colom, right, after the May 12 oral argument in his First Amendment case before federal appeals judges in New Orleans. Bell was punished for a rap song he created and posted on social media off-campus. (Photo Credit/Frank LoMonte, with permission to republish by Frank LoMonte)

Many government employees have faced discipline for Facebook posts about their bosses or co-workers, or for comments related to core functions of their jobs that their employers viewed as inappropriate. Courts have reached different conclusions, based on the circumstances, on when such discipline violates the free speech rights of public employees.

For example, the 6th U.S. Circuit Court of Appeals in December 2022 upheld the termination of a Maryville, Tenn., police officer over Facebook posts that were critical of the county sheriff. The court reasoned that the local police department had an interest in maintaining a good working relationship with the sheriff’s department and that this interest trumped the officer’s free-speech rights.

Student speech cases have also arisen out of social media posts, including for such communications that occurred off-campus.

For example, in Bell v. Itawamba School District (2015), the 5th U.S. Circuit Court of Appeals determined that public school officials could punish a student for a rap song he created off-campus and posted on Facebook and YouTube. The video referenced two teachers at the school who allegedly had engaged in sexually inappropriate behavior with female students.

However, in Mahanoy Area School District v. B.L. (2021), the U.S. Supreme Court said that a cheerleader’s vulgar post to Snapchat after not making the varsity squad did not meet the substantial disruption test used in student speech cases, and that the student’s free speech rights protected her from school discipline.

This article was published in April 2023 and updated on Feb. 16, 2024. Deborah Fisher is the director of the Seigenthaler Chair of Excellence in First Amendment Studies at Middle Tennessee State University. Parts of this article were contributed by David L. Hudson Jr., a law professor at Belmont University who publishes widely on First Amendment topics.

The Supreme Court Will Set an Important Precedent for Free Speech Online


Do social media sites have a First Amendment right to choose which information they publish on their websites?

That’s the question the Supreme Court will address this term when it reviews two laws from Texas and Florida that would force businesses such as Facebook and YouTube to carry certain content that they do not want to feature. Under the guise of “prohibiting censorship,” these laws seek to replace the private entities’ editorial voice with preferences dictated by the government.

The court’s decision will define the public’s experience on the internet: How much control will the government have over public debate? How vigorous will our online conversations be if platforms feel pressured to avoid controversial topics? What art, news, opinion, and communities will we discover on the platforms that are so central to how we communicate when the government mandates what content and which speakers must appear?

To enable online speech and access to information, the ACLU has long urged social media companies to exercise great caution when deciding whether and how to remove or manage lawful posts. On the very largest platforms, free expression values are best served if companies choose to preserve as much political speech as possible, including the speech of public figures. But, regardless of what platforms ought to permit as a matter of corporate policy, the government can’t constitutionally mandate what they ultimately choose.

Source: American Civil Liberties Union

Moreover, platforms have no choice but to prioritize some content over others — something always has to come first. They make decisions to remove, demote, or hide lawful content to minimize speech that the business does not want to be associated with, that puts off their consumers or advertisers, and that is of little interest or value to their users. And they don’t all make the same decisions, reflecting their different editorial choices. Facebook, for example, prohibits nudity while Twitter, now X, allows it.

Motivated by a perception that social media platforms disproportionately silence conservative voices, some states have sought to regulate platforms’ curatorial decisions. The Florida law at issue before the Supreme Court prohibits social media companies from banning, in any way limiting the distribution of posts by, or prioritizing posts by or about political candidates; it also prohibits taking any action to limit distribution of posts by “journalistic enterprises.” The Texas law bars larger social media platforms from blocking, removing, or demonetizing content based on the users’ views.

The government’s desire to have private speakers distribute more conservative — or for that matter, progressive, liberal, or mainstream — viewpoints is not a permissible basis for regulating the editorial decisions of private platforms. Choosing what not to publish and how to prioritize what is published is protected expression. In deciding what books to release or sell, publishers and booksellers are unquestionably exercising their free speech rights, as are curators of an art exhibit, and editors deciding what op-eds to publish in a newspaper. The government can’t make the decision for them.

This is why in the lower courts’ review of these laws, the ACLU submitted two friend-of-the-court briefs arguing that it is unconstitutional to force social media and other communications platforms to publish unwanted content.

This has long been settled law. For example, in a case called Miami Herald v. Tornillo, the Supreme Court held that a law requiring newspapers that published criticisms of political candidates to also publish any reply by those candidates was unconstitutional. The law had forced private publishers to carry the speech of political candidates, whether they liked it (or agreed with it) or not. As the Supreme Court explained in striking down the law, a government-mandated “right of access inescapably dampens the vigor and limits the variety of public debate.”

The Supreme Court’s established precedent for protecting editorial discretion applies to online platforms as well. Private speech on the internet should receive at least as much First Amendment protection as print newspapers and magazines do. And social media platforms, in combining multifarious voices, exercise their First Amendment rights while also creating the space for the free expression of their users.

These entities shouldn’t be required to publish, and their users shouldn’t be forced to contend with, speech that doesn’t fit the expressive goals of the platform or of the community of users. Nor should platforms be required to avoid certain topics entirely because they don’t want to publish or distribute all viewpoints on those topics. Under the guise of “neutrality,” if these laws go into effect, we will be confronted by a lot more distracting, unwanted, and problematic content when using the internet.

For example, a platform should be able to publish posts about vaccination without having to present the views of a political candidate recommending that people drink bleach to combat COVID-19. Similarly, a platform should be able to welcome posts about anti-racism without having to host speech by neo-Nazis. And a social media site should be able to host speakers questioning the scientific basis for climate change or affirming the existence of God without having to publish contrary viewpoints. If people want any of this material, they can seek it out. But the government cannot force it upon either the platforms or the public that relies on them.

Social media and other online platforms are vital to online speech, enabling us to discuss ideas and share perspectives. Given their significant role, the major platforms should facilitate robust debate by erring on the side of preserving the public’s speech. And if they remove protected content, they should offer clarity upfront as to why and, at a minimum, stick to their own rules. Platforms should also offer opportunities for appeals when they inevitably get things wrong. But the government can’t force platforms to carry the speech or promote the viewpoints that it prefers, any more than it could require a bookstore to stock books it did not want to sell.

Ultimately, users should have as much control as possible over what expression they can access. Even if we think the major platforms could be doing a better job, a government-mandated point of view would be a cure worse than the disease.



Supreme Court Rules Cheerleader's F-Bombs Are Protected By The 1st Amendment

Nina Totenberg


The U.S. Supreme Court sided with students in a case involving a cheerleader who dropped F-bombs on Snapchat while complaining about her school. (Mark Tenally/AP)

In a victory for student speech rights, the Supreme Court on Wednesday ruled that a former cheerleader's online F-bombs about her school are protected speech under the First Amendment.

But in an 8-1 vote, the court also declared that school administrators do have the power to punish student speech that occurs online or off campus if it genuinely disrupts classroom study. The justices concluded that a few swearwords posted online off school grounds, as in this case, did not rise to that level of disruption.

At issue in the case was a series of F-bombs issued in 2017 on Snapchat by Brandi Levy, then a 14-year-old cheerleader who failed to win a promotion from the junior varsity to the varsity cheerleading team at her Pennsylvania school.

"I was really upset and frustrated at everything," she said in an interview with NPR in April. So she posted a photo of herself and a friend flipping the bird to the camera, along with a message that said, "F*** the school. ... F*** cheer, F*** everything."

Suspended from the team for what was considered disruptive behavior, Levy — and her parents — went to court. They argued that the school had no right to punish her for off-campus speech, whether it was posted online while away from school or spoken out loud at a Starbucks across the street from school.

A federal appeals court agreed with her, declaring that school officials have no authority to punish students for speech that occurs in places unconnected to the campus.

On Wednesday, the Supreme Court ruled for Levy while at the same time declaring that schools may in fact punish some speech, especially if it is harassing, bullying, cheating or otherwise disruptive.

Writing for the majority, Justice Stephen Breyer said that while "public schools may have a special interest in regulating some off-campus student speech," the justifications offered for punishing Levy's speech were simply insufficient. "To the contrary," said Breyer, the speech that Levy uttered "is the kind of pure speech to which, were she an adult, the First Amendment would provide strong protection."


Breyer's decision harkened back to a 1969 case that involved students suspended for wearing black armbands to school to protest the Vietnam War. The court ruled then that students do have free speech rights under the Constitution, as long as the speech is not disruptive to the school.

On Wednesday, the high court reinforced that decision, concluding that while Levy's post was less than admirable, it did not meet the test of being disruptive. In his majority opinion, Breyer noted that her post did not target any individual and did not even name her school; her comments, he said, were made on her personal cellphone over the weekend, off campus and to her friends.

Breyer went on to establish some general guardrails for school districts to follow in the future. Parents, not schools, he said, generally have the responsibility for disciplining students off campus. Indeed, were the school to have the power to discipline off-campus speech as a general matter, it would mean that everything a student said 24 hours a day would be subject to punishment by school authorities.

Instead, Breyer said, school authorities have an interest in protecting unpopular student expression, especially when it occurs off campus. After all, he added, "America's public schools are the nurseries of democracy."

"It's a huge victory for students' speech rights," said David Cole, legal director for the American Civil Liberties Union, which represented Levy. "It means that when students leave school every day, they don't have to carry the schoolhouse on their backs."

But Michael Levin, counsel for Pennsylvania's Mahanoy Area School District, also claimed victory, contending that schools could easily operate under these rules. "The Supreme Court ruled clearly that school districts had the right under the Constitution to regulate off-campus speech in a wide variety of situations," he said.

Joie Green, superintendent for the Mahanoy Area School District, however, was not so sure, noting that in this case Levy had signed a contract to follow the team rules, and she didn't. "All the school did was support the coach's rules," Green said. "Where is the line drawn?"

Gregory Garre, the former solicitor general who represented the National School Boards Association in the case, said he saw Wednesday's decision as a win for both sides — a victory for Levy on the facts of her case but also a clear rejection of the notion that off-campus speech is out of bounds for school discipline.

"The court took a common-sense approach here," Garre said. "Just because speech originates off campus, particularly in a special context of social media, doesn't mean that it can't substantially disrupt the campus and the classroom."

Yale law professor Justin Driver, author of The Schoolhouse Gate, a book about these issues, called the decision incredibly significant.

"It's the first time in more than 50 years that a public school student has prevailed in a free speech case at the Supreme Court," Driver pointed out. "Public school students should be dancing in the streets."

"At the same time," Driver said, "Justice Breyer's opinion for the court left many significant questions unanswered. And this suggests that the court is going to have another off-campus student speech case somewhere down the line."

But Garre noted that Breyer, whose future on the court is the subject of much scrutiny, still wrote for a near-unanimous court. "This well could end up being one of Justice Breyer's more significant opinions, whether he ends up stepping down this year or in future years," Garre said.

In a concurring opinion, Justice Samuel Alito wrote: "If today's decision teaches any lesson, it must be that the regulation of many types of off-premises student speech raises serious First Amendment concerns, and school officials should proceed cautiously before venturing into this territory."

In a statement, the National School Boards Association said that "while the school district lost on the facts of this particular case, it represents a win for schools, as well as students, who can still be protected from off-campus student speech that bullies, harasses, threatens, disrupts, or meets other circumstances outlined by the Court."

In a dissent, Justice Clarence Thomas wrote that the school was right to suspend Levy because students like her "who are active in extracurricular programs have a greater potential, by virtue of their participation, to harm those programs."

Thomas has long taken the position that students generally do not have free speech rights.

Free Speech Supreme Court Cases

The First Amendment to the U.S. Constitution provides that the government must not “abridge the freedom of speech, or of the press.” Free speech has long been considered one of the pillars of a democracy. Explaining its importance, Justice Oliver Wendell Holmes, Jr. declared that “the best test of truth is the power of the thought to get itself accepted in the competition of the market.” A faith in this marketplace of ideas continues to buttress First Amendment law.

Since the First World War, the Supreme Court has grappled with how far the government can go in restricting speech. This often requires asking a threshold question: what is “speech” for First Amendment purposes? The Supreme Court has found that speech may extend beyond the spoken and written word into the area of expressive conduct, in which actions send a symbolic message. For example, burning a flag or wearing a black arm band has received First Amendment protection. Cases involving campaign financing have shown that sometimes even certain uses of money are considered speech.

The distinction between content-based and content-neutral laws has played a key role in free speech cases. Content-based laws regulate speech based on its substance, while content-neutral laws generally control the time, place, and manner of speech. The government bears a heavy burden in defending content-based restrictions, since they are subject to strict scrutiny. In contrast, content-neutral regulations are reviewed under a form of intermediate scrutiny, which means that they are more likely to survive a challenge.

Below is a selection of Supreme Court cases involving free speech, arranged from newest to oldest.

Author: Sonia Sotomayor

The First Amendment prohibits government officials from wielding their power selectively to punish or suppress speech, including through private intermediaries.

Author: Amy Coney Barrett

When a government official posts about job-related topics on social media, this speech is attributable to the government only if the official possessed actual authority to speak on the government’s behalf and purported to exercise that authority when they spoke on social media.

Author: Neil Gorsuch

The First Amendment prohibits a state from forcing a website designer to create expressive designs speaking messages with which the designer disagrees.

Author: Elena Kagan

Although true threats of violence are outside the bounds of First Amendment protection, the First Amendment still requires proof that the defendant had some subjective understanding of the threatening nature of their statements. However, the state only needs to prove recklessness, which means that the defendant consciously disregarded a substantial risk that their communications would be viewed as threatening violence.

A distinction between on-premises and off-premises signs was facially content neutral under the First Amendment and thus not subject to strict scrutiny.

Author: Brett Kavanaugh

When a private entity operates public access channels on a cable system, it is not performing a traditional, exclusive public function, and it is not transformed into a state actor by opening its property for speech by others. Thus, it is not subject to First Amendment constraints on its editorial discretion.

Author: Samuel A. Alito, Jr.

The state's extraction of agency fees from non-consenting public-sector employees violates the First Amendment.

Author: John Roberts

A ban on voters wearing a political badge, political button, or anything bearing political insignia inside a polling place on Election Day violated the Free Speech Clause.

Author: Stephen Breyer

When an employer demotes an employee out of a desire to prevent the employee from engaging in protected political activity, the employee is entitled to challenge that unlawful action under the First Amendment and Section 1983 even if the employer's actions are based on a factual mistake about the employee's behavior.

Author: Clarence Thomas

Since content-based laws target speech based on its communicative content, they are presumptively unconstitutional and may be justified only if the government proves that they are narrowly tailored to serve compelling state interests. Speech regulation is content-based if a law applies to particular speech because of the topic discussed or the idea or message expressed.

Congress may regulate campaign contributions to protect against corruption or the appearance of corruption, but it may not regulate contributions simply to reduce the amount of money in politics, or to restrict the political participation of some in order to enhance the relative influence of others.

A public employee's sworn testimony outside the scope of their ordinary job duties is entitled to First Amendment protection.

Author: Anthony Kennedy

There is no general exception to the First Amendment for false statements. This comports with the common understanding that some false statements are inevitable if there is to be an open and vigorous expression of views in public and private conversation, expression that the First Amendment seeks to guarantee.

Author: Antonin Scalia

Video games qualify for First Amendment protection. Like protected books, plays, and movies, they communicate ideas through familiar literary devices and features distinctive to the medium.

Speech in aid of pharmaceutical marketing is a form of expression protected by the Free Speech Clause.

The First Amendment can serve as a defense in state tort claims, including claims for intentional infliction of emotional distress. Whether the First Amendment prohibits holding a defendant liable for their speech turns largely on whether the speech is of public or private concern, as determined by all the circumstances of the case.

The government may not suppress political speech on the basis of the speaker's corporate identity. No sufficient governmental interest justifies limits on the political speech of non-profit or for-profit corporations.

The placement of a permanent monument in a public park is a form of government speech and is therefore not subject to scrutiny under the Free Speech Clause.

A principal may restrict student speech at a school event when that speech is reasonably viewed as promoting illegal drug use.

When public employees make statements pursuant to their official duties, the Constitution does not insulate their communications from employer discipline.

When plaintiffs challenge a content-based speech restriction, the government has the burden to prove that the proposed alternatives will not be as effective as the challenged statute.

Author: Sandra Day O’Connor

While a state may ban cross burning carried out with the intent to intimidate, a provision in a statute treating any cross burning as prima facie evidence of intent to intimidate rendered the statute unconstitutional.

Author: William Rehnquist

The governmental interest in preventing the actual or apparent corruption of federal candidates and officeholders constitutes a sufficiently important interest to justify contribution limits.

A canon of judicial conduct that prohibits a candidate for a judicial office from announcing their views on disputed legal or political issues violates the First Amendment.

A law's reliance on community standards to identify what material is harmful to minors did not by itself render the statute substantially overbroad for First Amendment purposes.

Viewpoint-based funding decisions may be sustained when the government is the speaker, or when the government uses private speakers to transmit information pertaining to its own program. It does not follow that viewpoint-based restrictions are proper when the government does not speak or subsidize transmittal of a message that it favors but instead expends funds to encourage a diversity of views from private speakers.

Although the First Amendment applies in the subsidy context, Congress has wide latitude to set spending priorities. Also, when the government is acting as patron rather than sovereign, the consequences of imprecision in its decision-making criteria are not constitutionally severe.

Author: John Paul Stevens

Although the government has an interest in protecting children from potentially harmful materials, a law cannot pursue this interest by suppressing a large amount of speech that adults have a constitutional right to send and receive if less restrictive alternatives would be at least as effective in achieving the law's legitimate purposes.

In determining whether the government is acting to preserve the limits of the forum that it has created so that the exclusion of a class of speech is legitimate, there is a distinction between content discrimination and viewpoint discrimination. Content discrimination may be permissible if it preserves the purposes of the limited forum. Viewpoint discrimination is presumed impermissible when directed against speech that is otherwise within the forum's limitations. Also, the guarantee of neutrality toward religion is respected when the government, following neutral criteria and even-handed policies, extends benefits to recipients whose ideologies and viewpoints, including some that are religious, are broad and diverse.

Author: Harry Blackmun

Speech cannot be financially burdened, any more than it can be punished or banned, simply because it might offend a hostile mob.

Areas of speech that can be regulated because of their constitutionally proscribable content still cannot be made vehicles for content discrimination unrelated to their distinctively proscribable content. However, when the basis for the content discrimination consists entirely of the very reason why the entire class of speech at issue is proscribable, no significant danger of idea or viewpoint discrimination exists. Another valid basis for according differential treatment to even a content-defined subclass of proscribable speech is that the subclass happens to be associated with particular secondary effects of the speech, so the regulation is justified without reference to the content of the speech.

Nude dancing at adult entertainment establishments is expressive conduct within the outer perimeters of the First Amendment, but only marginally so. The enforcement of a public indecency law to prevent totally nude dancing does not violate the First Amendment guarantee of freedom of expression.

The government may make a value judgment favoring childbirth over abortion and implement that judgment by the allocation of public funds. In so doing, the government has not discriminated on the basis of viewpoint; it has merely chosen to fund one activity to the exclusion of another.

Author: William Brennan

The government generally has a freer hand in restricting expressive conduct than in restricting the written or spoken word. However, it may not proscribe particular conduct because it has expressive elements.

Even in a public forum, the government may impose reasonable restrictions on the time, place, or manner of protected speech, provided that the restrictions are justified without reference to the content of the regulated speech, they are narrowly tailored to serve a significant governmental interest, and they leave open ample alternative channels for communication of the information.

Author: Byron White

Educators do not offend the First Amendment by exercising editorial control over the style and content of student speech in school-sponsored expressive activities, so long as their actions are reasonably related to legitimate pedagogical concerns.

Author: Warren Burger

The use of an offensive form of expression may not be prohibited to adults making what the speaker considers a political point, but it does not follow that the same latitude must be permitted to children in a public school.

A broad ban on all editorializing by every broadcast station that receives public funds exceeds what is necessary to protect against the risk of government interference or to prevent the public from assuming that editorials by public broadcasting stations represent the official view of the government.

When a public employee speaks as an employee on matters only of personal interest, a federal court is generally not the appropriate forum to review the wisdom of a personnel decision taken by a public agency allegedly in reaction to the employee's behavior.

States are entitled to greater leeway in the regulation of pornographic depictions of children. The standard of Miller v. California for determining what is legally obscene is not a satisfactory solution to the child pornography problem.

A public employee does not forfeit their First Amendment protection when they arrange to communicate privately with their employer, rather than expressing their views publicly.

Author: Per Curiam

Restrictions on individual contributions to political campaigns and candidates did not violate the First Amendment. However, restrictions on independent expenditures in campaigns, limits on expenditures by candidates from their personal or family resources, and limits on total campaign expenditures violated the First Amendment. Also, any appointee exercising significant authority pursuant to the laws of the United States is an “Officer of the United States” and must be appointed in the manner prescribed by the Appointments Clause.

Author: Lewis Powell

So long as they do not impose liability without fault, states may define for themselves the appropriate standard of liability for a publisher or broadcaster of defamatory falsehood that injures a private individual and whose substance makes substantial danger to reputation apparent.

States have a legitimate interest in regulating commerce in obscene material and its exhibition in places of public accommodation, including adult theaters.

The guidelines for the trier of fact in an obscenity case are whether the average person applying contemporary community standards would find that the work, taken as a whole, appeals to the prurient interest; whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.

The First Amendment does not relieve a newspaper reporter of the obligation to respond to a grand jury subpoena and answer questions relevant to a criminal investigation. Therefore, the First Amendment does not afford a reporter a constitutional testimonial privilege for an agreement that they make to conceal facts relevant to a grand jury's investigation of a crime or to conceal the criminal conduct of their source or evidence of it.

Author: John Marshall Harlan II

A state could not make the simple public display of a single four-letter expletive a criminal offense.

It is the purpose of the First Amendment to preserve an uninhibited marketplace of ideas in which truth will ultimately prevail. The right of the viewers and listeners, rather than the right of the broadcasters, is paramount.

Freedoms of speech and press do not permit a state to forbid advocacy of the use of force or of law violation except when such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action.

Author: Abe Fortas

A student may express their opinions, even on controversial subjects, if they do so without materially and substantially interfering with the requirements of appropriate discipline in the operation of the school, and without colliding with the rights of others. However, conduct by a student that materially disrupts classwork or involves substantial disorder or invasion of the rights of others is not immunized by the constitutional guarantee of freedom of speech.

Author: Thurgood Marshall

When a public employee's false statements concerned issues that were currently the subject of public attention and did not interfere with the performance of their duties or the general operation of their employer, they were entitled to the same protection as if the statements had been made by a member of the general public.

Author: Earl Warren

When speech and non-speech elements are combined in the same course of conduct, a sufficiently important governmental interest in regulating the non-speech element can justify incidental limitations on First Amendment freedoms. A government regulation is sufficiently justified if it is within the constitutional power of the government, it furthers an important or substantial governmental interest, the governmental interest is unrelated to the suppression of free expression, and the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest.

A public figure who is not a public official may recover damages for defamatory falsehood substantially endangering their reputation on a showing of highly unreasonable conduct constituting an extreme departure from the standards of investigation and reporting ordinarily maintained by responsible publishers. (In a concurrence, Justice Warren advised applying the New York Times standard.)

Author: Arthur Goldberg

Allowing unfettered discretion to local officials in the regulation of the use of the streets for peaceful parades and meetings is an unwarranted abridgment of freedom of speech.

A state cannot award damages to a public official for defamatory falsehood related to their official conduct unless they prove actual malice, which means that the statement was made with knowledge of its falsity or with reckless disregard of whether it was true or false.

Obscenity is not within the area of constitutionally protected freedom of speech or press under the First Amendment.

Author: Tom C. Clark

Expression by means of motion pictures is included within the free speech and free press guaranty of the First Amendment. A state may not place a prior restraint on the showing of a motion picture on the basis of a censor's conclusion that it is sacrilegious.

Author: Felix Frankfurter

In the face of a history of tension and violence and its frequent obligato of extreme racial and religious propaganda, a state legislature was not without reason in seeking ways to curb false or malicious defamation of racial and religious groups, made in public places and by means calculated to have a powerful emotional impact on those to whom it was presented.

Author: Fred M. Vinson

Courts must ask whether the gravity of the evil, discounted by its improbability, justifies such invasion of free speech as is necessary to avoid the danger.

When a speaker passes the bounds of argument or persuasion and undertakes incitement to riot, the police are not powerless to prevent a breach of the peace.

Author: William O. Douglas

Freedom of speech, although not absolute, is protected against censorship or punishment unless it is shown likely to produce a clear and present danger of a serious substantive evil that rises far above public inconvenience, annoyance, or unrest.

Author: Robert H. Jackson

The action of a state in making it compulsory for children in public schools to salute the flag and pledge allegiance violates the First Amendment. The issue does not turn on the possession of particular religious views or the sincerity with which they are held. (This decision overturned Gobitis.)

Author: Frank Murphy

There are certain well-defined and narrowly limited classes of speech, the prevention and punishment of which have never been thought to raise any constitutional problem. These include the lewd and obscene, the profane, the libelous, and the insulting or fighting words, which by their very utterance inflict injury or tend to incite an immediate breach of the peace.

Author: Owen Josephus Roberts

When a clear and present danger of riot, disorder, interference with traffic on the public streets, or other immediate threat to public safety, peace, or order appears, the power of the state to prevent or punish is obvious.

Author: Charles Evans Hughes

In determining the extent of the constitutional protection for the freedom of the press, it has been generally considered that it is the chief purpose of the guaranty to prevent previous restraints upon publication.

Author: Louis Brandeis

No danger flowing from speech can be deemed clear and present unless the incidence of the evil apprehended is so imminent that it may befall before there is opportunity for full discussion.

Author: Edward Terry Sanford

The government cannot reasonably be required to defer taking measures against revolutionary utterances advocating the overthrow of organized government until they lead to actual disturbances of the peace or imminent danger of the government's destruction. (This case is also significant for applying the First Amendment to the states via the Fourteenth Amendment.)

Author: Oliver Wendell Holmes, Jr.

It is only the present danger of immediate evil or an intent to bring it about that warrants Congress in setting a limit to the expression of opinion when private rights are not concerned.

The delivery of a speech in such word and such circumstances that the probable effect will be to prevent recruiting, and with that intent, is not protected because of the fact that the purpose to oppose the war and obstruct recruiting, and the expressions used in that regard, were but incidental parts of a general propaganda of socialism and expressions of a general and conscientious belief.

The First Amendment, while prohibiting legislation against free speech as such, was not intended to give immunity to every possible use of language.

Words that, ordinarily and in many places, would be within the freedom of speech protected by the First Amendment may become subject to prohibition when of such a nature and used in such circumstances as to create a clear and present danger that they will bring about the substantive evils that Congress has a right to prevent.


The Evolving Free-Speech Battle Between Social Media and the Government


By Isaac Chotiner

An attendee uses Facebook Live to record U.S. President Joe Biden speaking.

Earlier this month, a federal judge in Louisiana issued a ruling that restricted various government agencies from communicating with social-media companies. The plaintiffs, which include the attorneys general of Missouri and Louisiana, argued that the federal government was coercing social-media companies into limiting speech on topics such as vaccine skepticism. The judge wrote, in a preliminary injunction, “If the allegations made by plaintiffs are true, the present case arguably involves the most massive attack against free speech in United States’ history. The plaintiffs are likely to succeed on the merits in establishing that the government has used its power to silence the opposition.” The injunction prevented agencies such as the Department of Health and Human Services and the F.B.I. from communicating with Facebook, Twitter, or other platforms about removing or censoring content. (The Biden Administration appealed the injunction and, on Friday, the Fifth Circuit paused it. A three-judge panel will soon decide whether it will be reinstated as the case proceeds.) Critics have expressed concern that such orders will limit the ability of the government to fight disinformation.

To better understand the issues at stake, I recently spoke by phone with Genevieve Lakier, a professor of law at the University of Chicago Law School who focusses on issues of social media and free speech. (We spoke before Friday’s pause.) During our conversation, which has been edited for length and clarity, we discussed why the ruling was such a radical departure from the way that courts generally handle these issues, how to apply concepts like free speech to government actors, and why some of the communication between the government and social-media companies was problematic.

In a very basic sense, what does this decision actually do?

Well, in practical terms, it prevents a huge swath of the executive branch of the federal government from essentially talking to social-media platforms about what they consider to be bad or harmful speech on the platforms.

There’s an injunction and then there’s an order, and both are important. The order is the justification for the injunction, but the injunction itself is what actually has effects on the world. And the injunction is incredibly broad. It says all of these defendants—and we’re talking about the President, the Surgeon General, the White House press secretary, the State Department, the F.B.I.—may not urge, encourage, pressure, or induce in any manner the companies to do something different than what they might otherwise do about harmful speech. This is incredibly broad language. It suggests, and I think is likely to be interpreted to mean, that, basically, if you’re a member of one of the agencies or if you’re named in this injunction, you just cannot speak to the platforms about harmful speech on the platform until, or unless, the injunction ends.

But one of the puzzling things about the injunction is that there are these very significant carve-outs. For example, my favorite is that the injunction says, basically, “On the other hand, you may communicate with the platforms about threats to public safety or security of the United States.” Now, of course, the defendants in the lawsuit would say, “That’s all we’ve been doing. When we talk to you, when we talk to the platforms about election misinformation or health misinformation, we are alerting them to threats to the safety and security of the United States.”

So, read one way, the injunction chills an enormous amount of speech. Read another way, it doesn’t really change anything at all. But, of course, when you get an injunction like this from a federal court, it’s better to be safe than sorry. I imagine that all of the agencies and government officials listed in the injunction are going to think, We’d better shut up.

And the reason that specific people, jobs, and agencies are listed in the injunction is because the plaintiffs say that these entities were communicating with social-media companies, correct?

Correct. And communicating in these coercive or harmful, unconstitutional ways. The presumption of the injunction is that if they’ve been doing it in the past, they’re probably going to keep doing it in the future. And let’s stop continuing violations of the First Amendment.

As someone who’s not an expert on this issue, I find striking the idea that you could tell the White House press secretary that he or she cannot get up at the White House podium and say that Twitter should take down COVID misinformation. Does this injunction raise issues on two fronts: freedom of speech and separation of powers?

Technically, when the press secretary is operating as the press secretary, she’s not a First Amendment-rights holder. The First Amendment limits the government, constrains the government, but protects private people. And so when she’s a private citizen, she has all her ordinary-citizen rights. Government officials technically don’t have First Amendment rights.

That said, it’s absolutely true that, when thinking about the scope of the First Amendment, courts take very seriously the important democratic and expressive interests in government speech. And so government speakers don’t have First Amendment rights, but they have a lot of interests that courts consider. A First Amendment advocate would say that this injunction constrains and has negative effects on really important government speech interests.

More colloquially, I would just say the irony of this injunction is that in the name of freedom of speech it is chilling a hell of a lot of speech. That is how complicated these issues are. Government officials using their bully pulpit can have really powerful speech-oppressive effects. They can chill a lot of important speech. But one of the problems with the way the district court approaches the analysis is that it doesn’t seem to be taking into account the interest on the other side. Just as we think that the government can go too far, we also think it’s really important for the government to be able to speak.

And what about separation-of-powers issues? Or is that not relevant here?

I think the way that the First Amendment is interpreted in this area is an attempt to protect some separation of powers. Government actors may not have First Amendment rights, but they’re doing important business, and it’s important to give them a lot of freedom to do that business, including to do things like express opinions about what private citizens are doing or not doing. Courts generally recognize that government actors, legislators, and executive-branch officials are doing important business. The courts do not want to second-guess everything that they’re doing.

So what exactly does this order say was illegal?

The lawsuit was very ambitious. It claimed that government officials in a variety of positions violated the First Amendment by inducing or encouraging or incentivizing the platforms to take down protected speech. And by coercing or threatening them into taking down protected speech. And by collaborating with them to take down protected speech. These are the three prongs that you can use in a First Amendment case to show that the decision to take down speech that looks like it’s directly from a private actor is actually the responsibility of the government. The plaintiffs claimed all three. What’s interesting about that district-court order is that it agreed with all three. It says, Yeah, there was encouragement, there was coercion, and there was joint action or collaboration.

And what sort of examples are they providing? What would be an example of the meat of what the plaintiffs argued, and what the judge found to violate the First Amendment?

A huge range of activities—some that I find troubling and some that don’t seem to be troubling. Public statements by members of the White House or the executive branch expressing dissatisfaction with what the platforms are doing. For instance, President Biden’s famous statement that the platforms are killing people. Or the Surgeon General’s warning that there is a health crisis caused by misinformation, and his urging the platforms to do something about it. That’s one bucket.

There is another bucket in which the platforms were going to agencies like the C.D.C. to ask them for information about the COVID pandemic and the vaccine—what’s true and what’s false, or what’s good and what’s bad information—and then using that to inform their content-moderation rules.

Very different and much more troubling, I think, are these e-mails that they found in discovery between White House officials and the platforms in which the officials more or less demand that the platforms take down speech. There is one e-mail from someone in the White House who asked Twitter to remove a parody account that was linked to President Biden’s granddaughter, and said that he “cannot stress the degree to which this needs to be resolved immediately”—and within forty-five minutes, Twitter takes it down. That’s a very different thing than President Biden saying, “Hey, platforms, you’re doing a bad job with COVID misinformation.”

The second bucket seems full of the normal give-and-take you’d expect between the government and private actors in a democratic society, right?

Yeah. Threats and government coercion on private platforms seem the most troubling from a First Amendment perspective. And traditionally that is the kind of behavior that these cases have been most worried about.

This is not the first case to make claims of this kind. This is actually one of dozens of cases that have been filed in federal court in recent years alleging that the Biden Administration or members of the government had put pressure on or encouraged platforms to take down vaccine-skeptical speech and speech about election misinformation. What is unusual about this case is the way that the district court responded to these claims. Before this case, courts had, for the most part, thrown these cases out. I think this was largely because they thought that there was insufficient evidence of coercion, and coercion is what we’re mostly worried about. They have found that this kind of behavior only violates the First Amendment if there is some kind of explicit threat, such as “If you don’t do X, we will do Y,” or if the government actors have been directly involved in the decision to take down the speech.

In this case, the court rejects that and has a much broader test, where it says, basically, that government officials violate the First Amendment if they significantly encourage the platforms to act. And that may mean just putting pressure on them through rhetoric or through e-mails on multiple occasions—there’s a campaign of pressure, and that’s enough to violate the First Amendment. I cannot stress enough how significant a departure that is from the way courts have looked at the issue before.

So, in this case, you’re saying that the underlying behavior may constitute something bad that the Biden Administration did, that voters should know about it and judge them on it, but that it doesn’t rise to the level of being a First Amendment issue?

Yes. I think that this opinion goes too far. It’s insufficiently attentive to the interests on the other side. But I think the prior cases have been too stingy. They’ve been too unwilling to find a problem—they don’t want to get involved because of this concern with separation of powers.

The platforms are incredibly powerful speech regulators. We have largely handed over control of the digital public sphere to these private companies. I think there is this recognition that when the government criticizes the platforms or puts pressure on the platforms to change their policies, that’s some form of political or democratic oversight, a way to promote public welfare. And those kinds of democratic and public-welfare concerns are pretty significant. The courts have wanted to give the government a lot of room to move.

But you think that, in the past, the courts have been too willing to give the government space? How could they develop a better approach?

Yeah. So, for example, the e-mails that are identified in this complaint—I think that’s the kind of pressure that is inappropriate for government actors in a democracy to be employing against private-speech platforms. I’m not at all convinced that, if this had come up in a different court, those would have been found to be a violation of the First Amendment. But there need to be some rules of the road.

On the one hand, I was suggesting that there are important democratic interests in not having too broad a rule. But, on the other hand, I think part of what’s going on here—part of what the facts that we see in this complaint are revealing—is that, in the past, we’ve thought about this kind of government pressure on private platforms, which is sometimes called jawboning, as episodic. There’s a local sheriff or there’s an agency head who doesn’t like a particular policy, and they put pressure on the television station, or the local bookseller, to do something about it. Today, what we’re seeing is that there’s just this pervasive, increasingly bureaucratized communication between the government and the platforms. The digital public theatre has fewer gatekeepers; journalists are not playing the role of leading and determining the news that is fit to print or not fit to print. And so there’s a lot of stuff, for good or for ill, that is circulating in public. You can understand why government officials and expert agencies want to be playing a more significant role in informing, influencing, and persuading the platforms to operate one way or the other. But it does raise the possibility of abuse, and I’m worried about that.

That was a fascinating response, but you didn’t totally answer the question. How should a court step in here without going too far?

The traditional approach that courts have taken, until now, has been to say that there’s only going to be a First Amendment violation if the coercion, encouragement, or collaboration is so strong that, essentially, the platform had no choice but to act. It had no alternatives; there was no private discretion. Because then we can say, Oh, yes, it was the government actor, not the platform, that ultimately was responsible for the decision.

I think that that is too restrictive a standard. Platforms are vulnerable to pressure from the government that’s a lot less severe. They’re in the business of making money by disseminating a lot of speech. They don’t particularly care about any particular tweet or post or speech act. And their economic incentives will often mean that they want to curry favor with the government and with advertisers by being able to continue to circulate a lot of speech. If that means that they have to break some eggs, that they have to suppress particular kinds of posts or tweets, they will do that. It’s economically rational for them to do so.

The challenge for courts is to develop rules of the road for how government officials can interact with platforms. It has to be the case that some forms of communication are protected, constitutionally O.K., and even democratically good. I want expert agencies such as the C.D.C. to be able to communicate to the platforms. And I want that kind of expert information to be constitutionally unproblematic to deliver. On the other hand, I don’t think that White House officials should be writing to platforms and saying, “Hey, take this down immediately.”

I never thought about threatening companies as a free-speech issue that courts would get involved with. Let me give you an example. If you had told me four years ago that the White House press secretary had got up and said, “I have a message from President Trump. If CNN airs one more criticism of me, I am going to try and block its next merger,” I would’ve imagined that there would be a lot of outrage about that. What I could not have imagined was a judge issuing an injunction saying that people who worked for President Trump were not allowed to pass on the President’s message from the White House podium. It would be an issue for voters to decide. Or, I suppose, CNN, during the merger decision, could raise the issue and say, “See, we didn’t get fair treatment because of what President Trump said,” and courts could take that into account. But the idea of blocking the White House press secretary from saying anything seems inconceivable to me.

I’ll say two things in response. One is that there is a history of this kind of First Amendment litigation, but it’s usually about private speech. We might think that public speech has a different status because there is more political accountability. I don’t know. I find this question really tricky, because I think that the easiest cases from a First Amendment perspective, and the easiest reason for courts to get involved, is when the communication is secret, because there isn’t political accountability.

You mentioned the White House press secretary saying something in public. O.K., that’s one thing. But what about if she says it in private? We might think, Well, then the platforms are going to complain. But often regulated parties do not want to say that they have been coerced by the government into doing something against their interests, or that they were threatened. There’s often a conspiracy of silence.

In those cases, it doesn’t seem to me as if there’s democratic accountability. But, even when it is public, we’ve seen over the past year that government officials are writing letters to the platforms: public letters criticizing them, asking for information, badgering them, pestering them about their content-moderation policies. And we might think, Sure, people know that that’s happening. Maybe the government officials will face political accountability if it’s no good. But we might worry that, even then, if the behavior is sufficiently serious, if it’s repeated, it might give the officials too much power to shape the content-moderation policies of the platforms. From a First Amendment perspective, I don’t know why that’s off the table.

Now, from a practical perspective, you’re absolutely right. Courts have not wanted to get involved. But that’s really worrying. I think this desire to just let the political branches work it out has meant that, certainly with the social-media platforms, it’s been like the Wild West. There are no rules of the road. We have no idea what’s O.K. or not for someone in the White House to e-mail to a platform. One of the benefits of the order and the injunction is that it’s opening up this debate about what’s O.K. and what’s not. It might be the case that the way to establish rules of the road will not be through First Amendment-case litigation. Maybe we need Congress to step in and write the rules, or there needs to be some kind of agency self-regulation. But I think it’s all going to have to ultimately be viewed through a First Amendment lens. This order and injunction go way too far, but I think the case is at least useful in starting a debate. Because up until now we’ve been stuck in this arena where there are important free-speech values that are at stake and no one is really doing much to protect them. ♦


Supreme Court Sides With Cheerleader in First Amendment Social Media Case

The justices in an 8-1 decision sided with a former cheerleader who was suspended from her squad for posting a profanity-laced message after school to a social media site.

Supreme Court Acts on First Amendment Case


The Supreme Court on Wednesday sided with a student cheerleader in a major free-speech case, ruling that the school's disciplinary action against the student for her off-campus social media post violated her First Amendment rights.

In an 8-1 decision, the justices ruled in favor of Brandi Levy, a former cheerleader at Mahanoy Area High School in Pennsylvania. In 2017, Levy, who was a freshman at the time, didn't make the varsity cheerleading team and, over the weekend, posted a profanity-laced message to the social media platform Snapchat from a convenience store. She was later reprimanded and suspended from the junior varsity team for the comments.

"We must decide whether the Court of Appeals for the Third Circuit correctly held that the school's decision violated the First Amendment. Although we do not agree with the reasoning of the Third Circuit panel's majority, we do agree with its conclusion that the school's disciplinary action violated the First Amendment," Justice Stephen Breyer wrote in the majority opinion. Justice Clarence Thomas was the lone dissenter in the case.


The 3rd U.S. Circuit Court of Appeals in Philadelphia had pointed to a 1969 Supreme Court case, Tinker v. Des Moines Independent Community School District, which held that students do not "shed their constitutional rights to freedom of speech or expression at the schoolhouse gate," though it allowed schools to discipline students for speech causing "substantial disruption."

But the high court's ruling was narrowly tailored to Levy's situation and left open questions about other off-campus student speech and how schools might regulate it, especially as students increasingly communicate online.

"We do not now set forth a broad, highly general First Amendment rule stating just what counts as 'off campus' speech and whether or how ordinary First Amendment standards must give way off campus to a school's special need to prevent, e.g., substantial disruption of learning-related activities or the protection of those who make up a school community," Breyer wrote.

"We leave for future cases to decide where, when, and how these features mean the speaker's off-campus location will make the critical difference," he added. "This case can, however, provide one example."

The court's majority argued that the leeway given to schools to regulate student speech doesn't always "disappear when a school regulates speech that takes place off campus," noting that speech regulation could be needed in scenarios concerning serious bullying or harassment, threats to teachers or students, failure to comply with rules on curriculum or school activities and "breaches of school security devices."

Groups like the American Civil Liberties Union and teachers unions applauded Wednesday's decision, arguing that it offers students, particularly students of color, protections both for their speech and against bullying from their peers. But President Joe Biden's administration had sided with the school district.

"We all want our students – Black and white, Native and newcomer, Hispanic and Asian alike – to attend safe public schools that are free of harassment and bullying. We also believe that students, just like educators, have a right to freedom of speech under the First Amendment," National Education Association president Becky Pringle said in a statement. "The Supreme Court's decision in Mahanoy strikes the right balance."


Supreme Court debates student free speech rights on social media

Justices consider teenager's challenge to school punishment for vulgar Snapchat.

The U.S. Supreme Court wrestled openly on Wednesday with how to balance the free speech rights of American teenagers on social media with the needs of schools to maintain order and discipline in classrooms and on playing fields.

"I'm frightened to death of writing a standard," Justice Stephen Breyer said in a nod to the case's significant stakes for schools, parents and students.

The justices heard oral arguments in an appeal by a Pennsylvania school district seeking to reverse a sweeping lower court decision that said what students say off campus after hours is strictly off-limits for punishment by school officials.

They were wary of that stark standard even as most were also uneasy with the idea of schools policing student speech unfettered, after hours.

"If schools are going to have any authority... outside of school," Justice Samuel Alito said, "there has to be a clear rule. That’s what I’m looking for."

The case involves a former teenage cheerleader in the Mahanoy Area School District who was given a one-year suspension from the squad in 2017 after a coach learned of an expletive-laden Snapchat message posted to teammates online over a weekend.

Brandi Levy, then a 14-year-old freshman, told ABC News she was venting frustration to friends about not making the cut to cheer for the school’s varsity squad. "I was upset. I was angry," she said in an interview. "I said, 'F school, F cheer, F softball, F everything.'"

Levy sued the school and won twice in lower federal courts. She was reinstated to the team and graduated last spring.

"If they would have just taken her aside and said, 'Watch; be careful.' But the action they took, I think reached above and beyond where they should be," said Larry Levy, Brandi's father, who brought the case with help from the ACLU.

The school contends Levy’s social media post violated an agreed-upon team code of conduct and that the punishment was reasonable and justified. Attorneys for the district argue that schools need to be able to respond to student speech that is disruptive, even if it takes place off school property.

"Off-campus speech on social media can be disruptive," attorney Lisa Blatt told the justices. "The speaker’s location is irrelevant."

In the landmark 1969 decision Tinker v Des Moines, the Supreme Court said that students don’t give up their First Amendment rights at the schoolhouse gate but that conduct that "materially and substantially interferes" with education or the rights of other students can be regulated by schools.

The court has not addressed the matter of school-related speech made off campus, after hours on social media. Over nearly two hours of arguments, the justices openly wrestled with how and where to draw the line.

"Can you punish the student for cursing at home at her parents? Can you punish her for cursing on the way to school?" Justice Sonia Sotomayor pondered aloud. "If you can’t punish for that, you can [punish] for her doing it on the internet?" she said skeptically.

Justice Clarence Thomas raised one of the challenges of the digital age: "Aren’t we at a point that if it’s on social media, where you posted it on social media doesn’t really matter?" he said.

Justice Stephen Breyer wondered openly whether court precedent authorized any school punishment for the type of posting made by Levy. "It says, ‘School, you cannot punish this unless there’s substantial disruption or infringement on others. You can’t, unless," he said. "Here pretty clearly it didn’t satisfy."

Justice Brett Kavanaugh was openly sympathetic to Levy, saying outright that he was "bothered" as a "judge and coach and parent, too," by the punishment.

"She’s competitive. She cares. She blew off steam like other kids do," Kavanaugh said. "It didn’t seem the punishment was tailored to the offense…. A year suspension from the team just seems excessive to me."

Justices Elena Kagan and Amy Coney Barrett, however, raised the legitimate stakes for schools in potential scenarios of cheating on academic assignments and student threats of violence that might take place electronically, off campus.

Kagan suggested that school authority might not need to be limited geographically under court precedent but instead "as what’s necessary for a school’s learning environment."

Echoing that view, Justice Samuel Alito said he’s "concerned" about cyberbullying. "Is there anything a school could do about that?" he said.

ACLU attorney David Cole, representing Brandi Levy, said communities across the country already have anti-bullying and anti-harassment legislation that is constitutional and enforceable.

Cole urged the Court to make clear that schools simply cannot police students’ social media on the basis of a potential "disruption," even if they can legally impose clear codes of student conduct and impose penalties accordingly.

"If you’re at a convenience store on the weekend, Tinker does not apply. End of story," Cole argued. "Here they set forth conditions [for team membership]. She agreed, and she did not violate any conditions."

The justices are expected to hand down a decision in the case by the end of June 2021.


Supreme Court seems favorable to Biden administration over efforts to combat social media posts

Listen to the oral arguments as U.S. Supreme Court takes up the Murthy v. Missouri case, a dispute between Republican-led states and the Biden administration over how far the federal government can go to combat controversial social media posts on topics including COVID-19 and election security.

App logos for Facebook, left, and X, formerly known as Twitter, are seen on a mobile phone in Los Angeles, Saturday, March 16, 2024. (AP Photo/Paula Ulichney)

WASHINGTON (AP) — The Supreme Court seemed likely Monday to side with the Biden administration in a dispute with Republican-led states over how far the federal government can go to combat controversial social media posts on topics including COVID-19 and election security in a case that could set standards for free speech in the digital age.

The justices seemed broadly skeptical during nearly two hours of arguments that a lawyer for Louisiana, Missouri and other parties presented accusing officials in the Democratic administration of leaning on the social media platforms to unconstitutionally squelch conservative points of view.

Lower courts have sided with the states, but the Supreme Court blocked those rulings while it considers the issue.

Several justices said they were concerned that common interactions between government officials and the platforms could be affected by a ruling for the states.

In one example, Justice Amy Coney Barrett expressed surprise when Louisiana Solicitor General J. Benjamin Aguiñaga questioned whether the FBI could call Facebook and X (formerly Twitter) to encourage them to take down posts that maliciously released someone’s personal information without permission, the practice known as doxxing.

“Do you know how often the FBI makes those calls?” Barrett asked, suggesting they happen frequently.

Justice Brett Kavanaugh also signaled that a ruling for the states would mean that “traditional, everyday communications would suddenly be deemed problematic.”

The case Monday was among several the court is considering that affect social media companies in the context of free speech. Last week, the court laid out standards for when public officials can block their social media followers. Less than a month ago, the court heard arguments over Republican-passed laws in Florida and Texas that prohibit large social media companies from taking down posts because of the views they express.

The cases over state laws and the one that was argued Monday are variations on the same theme: complaints that the platforms are censoring conservative viewpoints.

The states argue that White House communications staffers, the surgeon general, the FBI and the U.S. cybersecurity agency are among those who coerced changes in online content on social media platforms.

Aguiñaga put the situation in stark terms, telling the justices that “the record reveals unrelenting pressure by the government to coerce social media platforms to suppress the speech of millions of Americans.”

He said that calls merely encouraging the platforms to act also could violate speech rights, responding to a hypothetical situation conjured by Justice Ketanji Brown Jackson, about an online challenge that “involved teens jumping out of windows at increasing elevations.”

Jackson, joined by Chief Justice John Roberts, pressed the Louisiana lawyer about whether platforms could be encouraged to remove such posts.

“I was with you right until that last comment, Your Honor,” Aguiñaga said. “I think they absolutely can call and say this is a problem, it’s going rampant on your platforms, but the moment that the government tries to use its ability as the government and its stature as the government to pressure them to take it down, that is when you’re interfering with the third party’s speech rights.”

Justice Samuel Alito appeared most open to the states’ arguments, at one point referring to the government’s “constant pestering of Facebook and some of the other platforms.” Alito, along with Justices Neil Gorsuch and Clarence Thomas, would have allowed the restrictions on government contacts with the platforms to go into effect.

Justice Department lawyer Brian Fletcher argued that none of the actions the states complain about come close to problematic coercion and that the federal government would lose its ability to communicate with the social media companies about antisemitic and anti-Muslim posts, as well as on issues of national security, public health and election integrity.

The platforms are large sophisticated actors with no reluctance to stand up to the government, “saying no repeatedly when they disagree with what the government is asking them to do,” Fletcher said.

Justice Elena Kagan and Kavanaugh, two justices who served in the White House earlier in their careers, seemed to agree, likening the exchanges between officials and the platforms to relationships between the government and more traditional media.

Kavanaugh described “experienced government press people throughout the federal government who regularly call up the media and berate them.”

Later, Kagan said, “I mean, this happens literally thousands of times a day in the federal government.”

Alito, gesturing at the courtroom’s press section, mused that whenever reporters “write something we don’t like,” the court’s chief spokeswoman “can call them up and curse them out and say...why don’t we be partners? We’re on the same team. Why don’t you show us what you’re going to write beforehand? We’ll edit it for you, make sure it’s accurate.”

Free speech advocates said the court should use the case to draw an appropriate line between the government’s acceptable use of the bully pulpit and coercive threats to free speech.

“We’re encouraged that the Court was sensitive both to the First Amendment rights of platforms and their users, and to the public interest in having a government empowered to participate in public discourse. To that end, we hope that the Court resolves these cases by making clear that the First Amendment prohibits coercion but permits the government to attempt to shape public opinion through the use of persuasion,” Alex Abdo, litigation director of the Knight First Amendment Institute at Columbia University, said in a statement.

A panel of three judges on the New Orleans-based 5th U.S. Circuit Court of Appeals had ruled earlier that the Biden administration had probably brought unconstitutional pressure on the media platforms. The appellate panel said officials cannot attempt to “coerce or significantly encourage” changes in online content. The panel had previously narrowed a more sweeping order from a federal judge, who wanted to include even more government officials and prohibit mere encouragement of content changes.

A divided Supreme Court put the 5th Circuit ruling on hold in October, when it agreed to take up the case.

A decision in Murthy v. Missouri, 23-411, is expected by early summer.


Supreme Court wary of restricting government contact with social media platforms in free speech case

By Melissa Quinn

Updated on: March 18, 2024 / 8:43 PM EDT / CBS News

Washington — The Supreme Court on Monday appeared wary of limiting the Biden administration's contacts with social media platforms in a closely watched dispute that tests how much the government can pressure social media companies to remove content before crossing a constitutional line from persuasion into coercion.

The case, known as Murthy v. Missouri, arose out of efforts during the early months of the Biden administration to push social media platforms to take down posts that officials said spread falsehoods about the pandemic and the 2020 presidential election. 

A U.S. district court judge said White House officials, as well as some federal agencies and their employees, violated the First Amendment's right to free speech by "coercing" or "significantly encouraging" social media sites' content-moderation decisions. The judge issued an injunction restricting the Biden administration's contacts with platforms on a variety of issues, though that order has been on hold.

During oral arguments on Monday, the justices seemed skeptical of a ruling that would broadly restrict the government's communications with social media platforms, raising concerns about hamstringing officials' ability to communicate with platforms about certain matters.

"Some might say that the government actually has a duty to take steps to protect the citizens of this country, and you seem to be suggesting that that duty cannot manifest itself in the government encouraging or even pressuring platforms to take down harmful information," Justice Ketanji Brown Jackson told Benjamin Aguiñaga, the Louisiana solicitor general. "I'm really worried about that, because you've got the First Amendment operating in an environment of threatening circumstances from the government's perspective, and you're saying the government can't interact with the source of those problems."


Justice Amy Coney Barrett warned Aguiñaga that one of the proposed standards for determining when the government's actions cross the line into unlawful speech suppression — namely when a federal agency merely encourages a platform to remove problematic posts — "would sweep in an awful lot." She questioned whether the FBI could reach out to a platform to encourage it to take down posts sharing his and other Louisiana officials' home addresses and calling on members of the public to rally.

Aguiñaga said the FBI could be encouraging a platform to suppress constitutionally protected speech.

The legal battle is one of five that the Supreme Court is considering this term that stand at the intersection of the First Amendment's free speech protections and social media. It was also the first of two that the justices heard Monday that involves alleged jawboning, or informal pressure by the government on an intermediary to take certain actions that will suppress speech.

The second case raises whether a New York financial regulator violated the National Rifle Association's free speech rights when she pressured banks and insurance companies in the state to sever ties with the gun rights group after the 2018 shooting in Parkland, Florida. Decisions from the Supreme Court in both cases are expected by the end of June.

The Biden administration's efforts to stop misinformation

The social media case stems from the Biden administration's efforts to pressure platforms, including Twitter, now known as X, YouTube and Facebook, to take down posts it believed spread falsehoods about the pandemic and the last presidential election.

The challenge, brought by five social media users and two states, Louisiana and Missouri, claimed their speech was stifled when platforms removed or downgraded their posts after strong-arming by officials in the White House, Centers for Disease Control and Prevention, FBI and Department of Homeland Security.

The challengers alleged that at the heart of their case is a "massive, sprawling federal 'Censorship Enterprise,'" through which federal officials communicated with social media platforms with the goal of pressuring them to censor and suppress speech they disfavored.

U.S. District Judge Terry Doughty found that seven groups of Biden administration officials violated the First Amendment because they transformed the platforms' content-moderation decisions into state action by "coercing" or "significantly encouraging" their activities. He limited the types of communications agencies and their employees could have with the platforms, but included several carve-outs.

The U.S. Court of Appeals for the 5th Circuit then determined that certain White House officials and the FBI violated free speech rights when they coerced and significantly encouraged platforms to suppress content related to COVID-19 vaccines and the election. It narrowed the scope of Doughty's order but said federal employees could not "coerce or significantly encourage" a platform's content-moderation decisions.

The justices in October agreed to decide whether the Biden administration impermissibly worked to suppress speech on Facebook, YouTube and X. The high court temporarily paused the lower court's order limiting Biden administration officials' contact with social media companies.

In filings with the court, the Biden administration argued that the social media users and states lack legal standing to even bring the case, but said officials must be free "to inform, to persuade, and to criticize."

"This case should be about that fundamental distinction between persuasion and coercion," Brian Fletcher, principal deputy solicitor general, told the justices. 

Fletcher argued that the states and social media users were attempting to use the courts to "audit all of the executive branch communications with and about social media platforms," and said administration officials' public statements are "classic bully pulpit exhortations."

But Aguiñaga told the justices that the platforms faced "unrelenting pressure" from federal officials to suppress protected speech.

"The government has no right to persuade platforms to violate Americans' constitutional rights," he said. "And pressuring platforms in backrooms shielded from public view is not using the bully pulpit at all. That's just being a bully."

The oral arguments

Several of the justices questioned whether the social media users who brought the suit demonstrated that they suffered a clear injury traceable to the government or could show that an injunction against the government would correct future injuries caused by the platforms' content moderation, which must be shown to bring a challenge in federal courts.

"I have such a problem with your brief," Justice Sonia Sotomayor told Aguiñaga. "You omit information that changes the context of some of your claims. You attribute things to people that it didn't happen to. ... I don't know what to make of all this because I'm not sure how we get to prove direct injury in any way."

Aguiñaga apologized and said he takes "full responsibility" for any aspects of their filings that were not forthcoming.

Justice Elena Kagan asked Aguiñaga to point to the piece of evidence that most clearly showed that the government was responsible for his clients having material taken down.

"We know that there's a lot of government encouragement around here," she said. "We also know that the platforms are actively content moderating, and they're doing that irrespective of what the government wants, so how do you decide that it's government action as opposed to platform action?"

The justices frequently raised communications between the federal government and the press, which often involve heated discussions.

Justice Samuel Alito referenced emails between federal officials and platforms, some of which he said showed "constant pestering" by White House employees and requests for meetings with the social media sites.

"I cannot imagine federal officials taking that approach to the print media, our representatives over there," he said, referencing the press section in the courtroom. "If you did that to them, what do you think the reaction would be?"

Alito speculated that the reason federal officials felt free to pressure the platforms was that the government has Section 230, a key legal shield for social media companies, and possible antitrust action "in its pocket," which he called "big clubs available to it."

"It's treating Facebook and these other platforms like they're subordinates," Alito said. "Would you do that to the New York Times or the Wall Street Journal or the Associated Press or any other big newspaper or wire service?"

Fletcher conceded that officials' anger is "unusual," but said it's not odd for there to be a back-and-forth between White House employees and the media.

Kavanaugh, though, said that he "assumed, thought, experienced government press people throughout the federal government who regularly call up the media and berate them." He also noted that "platforms say no all the time to the government."

Chief Justice John Roberts — noting that he has "no experience coercing anybody" — said the government is "not monolithic, and that has to dilute the concept of coercion significantly." Roberts said one agency may be attempting to coerce a platform one way, while another may be pushing it to go the other direction.

The NRA's court fight

In the second case, the court considered whether the former superintendent of the New York State Department of Financial Services violated the NRA's free speech rights when she pushed regulated insurance companies and banks to stop doing business with the group.

Superintendent Maria Vullo, who left her post in 2019, had been investigating two insurers involved in NRA-endorsed affinity programs, Chubb and Lockton, since 2017 and determined they violated state insurance law. The investigation found that a third, Lloyd's of London, underwrote similar unlawful insurance products for the NRA.

Then, after the Parkland school shooting in February 2018, Vullo issued guidance letters that urged regulated entities "to continue evaluating and managing their risks, including reputational risks" that may arise from their dealings with the NRA or similar gun rights groups.

Later that year, the Department of Financial Services entered into consent decrees with the three insurance companies it was investigating. As part of the agreements, the insurers admitted they provided some unlawful NRA-supported programs and agreed to stop providing the policies to New York residents. 

The NRA then sued the department, alleging that Vullo privately threatened insurers with enforcement action if they continued working with the group and created a system of "informal censorship" that was designed to suppress its speech, in violation of the First Amendment.

A federal district court sided with the NRA, finding that the group sufficiently alleged that Vullo's actions "could be interpreted as a veiled threat to regulated industries to disassociate with the NRA or risk DFS enforcement action."

But a federal appeals court disagreed and determined that the guidance letters and a press release couldn't "reasonably be construed as being unconstitutionally threatening or coercive," because they "were written in an even-handed, nonthreatening tone" and used words intended to persuade, not intimidate.

The NRA appealed the decision to the Supreme Court, which agreed to consider whether Vullo violated the group's free speech rights when she urged financial entities to sever their ties with it.

"Allowing unpopular speech to form the basis for adverse regulatory action under the guise of 'reputational risk,' as Vullo attempted here, would gut a core pillar of the First Amendment," the group, which is represented in part by the American Civil Liberties Union, told the court in a filing.

The NRA argued that Vullo "openly targeted the NRA for its political speech and used her extensive regulatory authority over a trillion-dollar industry to pressure the institutions she oversaw into blacklisting the organization."

"In the main, she succeeded," the organization wrote. "But in doing so, she violated the First Amendment principle that government regulators cannot abuse their authority to target disfavored speakers for punishment."

Vullo, though, told the court that the insurance products the NRA was offering its members were unlawful, and noted that the NRA itself signed a consent order with the department after she left office, when it found the group was marketing insurance products without the proper license from the state.

"Accepting the NRA's arguments would set an exceptionally dangerous precedent," lawyers for the state wrote in a Supreme Court brief. "The NRA's arguments would encourage damages suits like this one and deter public officials from enforcing the law — even against entities like the NRA that committed serious violations."

The NRA, they claimed, is asking the Supreme Court to give it "favored status because it espouses a controversial view," and the group has never claimed that it was unable to exercise its free speech rights.

Melissa Quinn is a politics reporter for CBSNews.com. She has written for outlets including the Washington Examiner, Daily Signal and Alexandria Times. Melissa covers U.S. politics, with a focus on the Supreme Court and federal courts.



NRA Ruling Eyed for Clues in Supreme Court Social Media Case

By Lydia Wheeler and Kimberly Strawbridge Robinson


Lawyers are reading between the lines of the US Supreme Court’s decision to revive the National Rifle Association’s free speech fight against a New York regulator to see what the implications may be for a similar dispute involving the Biden administration.

Thursday’s ruling has bolstered hopes for some that the court will soon stop the government from trying to limit conservative speech online, while others expect the justices to use the NRA ruling as a reference point in deciphering when government actions violate the First Amendment.

A unanimous court came down very clearly on the idea that the government can't do through third parties what the Constitution prohibits it from doing directly, said Adam Candeub, a communications and antitrust law professor at Michigan State University.

“That was very encouraging,” he said.

In NRA v. Vullo, the court said the gun rights lobby had sufficiently alleged that New York state officials coerced private parties to cut ties with the group following the 2018 shooting at Marjory Stoneman Douglas High School in Parkland, Florida.

“A government official can share her views freely and criticize particular beliefs, and she can do so forcefully in the hopes of persuading others to follow her lead,” Justice Sonia Sotomayor wrote for the court.

“What she cannot do, however, is use the power of the State to punish or suppress disfavored expression,” she said.

That’s the part of the opinion attorneys are pointing to as an indication of how the court might rule in Murthy v. Missouri. The justices are expected to issue an opinion by the end of June on whether the Biden administration improperly coerced social media companies to take down what government officials saw as Covid-19 and election misinformation.

“I’m cautiously optimistic about what it forecasts for Murthy,” said Jenin Younes, litigation counsel for the New Civil Liberties Alliance, which represented four of the five individuals who sued the federal government alongside Louisiana and Missouri.

Government officials are entitled to try and persuade companies to do the right thing but they can’t use the bully pulpit to censor speech, Younes said.

“That was our argument and the court seemed to sort of agree with that,” she added.

Jawboning Claims

Other free speech experts, though, say the Vullo decision provides very little for any reading of tea leaves.

Both cases involve claims of “jawboning,” the use of informal pressure to influence speech decisions, said the Foundation for Individual Rights and Expression’s Chief Counsel Bob Corn-Revere.

In Vullo, a New York state insurance regulator is accused of threatening to take enforcement actions against banks and insurance companies if they did not stop doing business with the NRA, in an effort to stifle the group’s pro-gun message.

In Murthy, the White House, the Surgeon General’s Office, the Centers for Disease Control and Prevention, and the FBI are accused of pressuring some of the most popular social media platforms to take down posts they saw as misleading or to deplatform the users who made them.

Corn-Revere said the justices made clear the multi-factored test that courts should apply in analyzing these claims, including looking at the tone of the interactions and whether there were threats of an adverse action.

But in clarifying the test, the court emphasized the fact-specific nature of that inquiry, Corn-Revere said. So there’s clarity about what factors to consider but not how they should apply, he said.

“It’s hard to make predictions because the factual records are very different,” Corn-Revere said.

One factor the government pointed to in March during the arguments in Vullo was whether the officials had made any threats.

In Vullo, the regulator “threatened adverse action in the form of an enforcement action so that” the insurance companies “would comply with a specific instruction to cut ties with all gun groups,” Justice Department lawyer Ephraim McDowell said during arguments.

“In Murthy, there was no threat at all,” he said. “There were just talks about legislative reforms, but they were not connected to any specific instruction.”

Because of that, some attorneys expect the court will try to find some sort of middle ground.

Devin Watkins, an attorney at the libertarian Competitive Enterprise Institute, said the justices are likely to cite their ruling in Vullo to create a clear line between threats and the kinds of encouragement that are allowed, and then send Murthy back to the US Court of Appeals for the Fifth Circuit.

“I doubt either side is probably going to win in the end,” he said.



Hate speech and disinformation in South Africa’s elections: big tech makes it tough to monitor social media


Guy Berger, Professor Emeritus, Rhodes University

Disclosure statement

Guy Berger has received funding from the thinktank Research ICT Africa, where he is a Distinguished Research Fellow.

Rhodes University provides funding as a partner of The Conversation AFRICA.


There’s a growing global movement to ensure that researchers can get access to the huge quantity of data assembled and exploited by digital operators.

Momentum is mounting because it’s becoming increasingly evident that data is power. And access to it is the key – for a host of reasons, not least transparency, human rights and electoral integrity.

But there’s currently a massive international asymmetry in access to data.

In the European Union and the US, some progress has been made. For example, EU researchers studying risks have a legal right of access. In the US too, some companies have taken voluntary steps to improve access.

The situation is generally very different in the global south.

The value of data access can be seen vividly in the monitoring of social media during elections. South Africa is a case in point. A powerful “big data” analysis was recently published about online attacks on women journalists there, raising the alarm about escalation around – and after – the election on 29 May.

A number of groups working with data are attempting to monitor hate speech and disinformation on social media ahead of South Africa’s national and provincial polls. At a recent workshop involving 10 of these initiatives, participants described trying to detect co-ordinated “information operations” that could harm the election, including via foreign interference.

But these researchers can’t get all the data they need because the tech companies don’t give them access.

This has been a concern of mine since I first commissioned a handbook about harmful online content – Journalism, Fake News & Disinformation: Handbook for Journalism Education and Training – six years ago. My experience since then includes overseeing a major UN study called Balancing Act: Countering Digital Disinformation While Respecting Freedom of Expression .

Over the years, I’ve learnt that to dig into online disinformation, you need to get right inside the social media engines. Without comprehensive access to the data they hold, you’re left in relative darkness about the workings of manipulators, the role of misled punters and the fuel provided by mysterious corporate algorithms.

Looking at social media in the South African elections, the researchers at the recent workshop shared how they were doing their best with what limited data they had. They were all monitoring text on social platforms. Some were monitoring audio, while a few were looking at “synthetic content” such as material produced with generative AI.

About half of the 10 initiatives were tracking followers, impressions and engagement. Nearly all were checking content on Twitter; at least four were monitoring Facebook; three covered YouTube; and two included TikTok.

WhatsApp was getting scant attention. Though most messaging on the service is encrypted, the company knows (but doesn’t disclose) which registered user is bulk sending content to which others, who forwards this on, whether group admins are active or not, and a host of other “metadata” details that could help monitors to track dangerous trajectories.

But the researchers can’t do the necessary deep data dives. They’ve set out the difficult data conditions they work under in a public statement explaining how they are severely constrained in their access to data.

One data source they use is expensive (and limited) packages from marketing brokers (who in turn have purchased data assets wholesale from the platforms).

A second source is from analysing published posts online (which excludes in-group and WhatsApp communications). Using scraped data is limited and labour-intensive. Findings are superficial. And it’s risky: scraping is forbidden in most platforms’ terms of use.

None of the researchers covering South Africa’s elections have direct access to the platforms’ own application programming interfaces (APIs). These gateways provide a direct pipeline into the computer servers hosting data. This major resource is what companies use to profile users, amplify content, target ads and automate content moderation. It’s an essential input for monitoring online electoral harms.

In the EU, the Digital Services Act enables vetted researchers to legally demand and receive free, and potentially wide-ranging, API access to search for “systemic risks” on the platforms.

It’s also more open in the US. There, Meta, the multinational technology giant that owns and operates Facebook, Instagram and WhatsApp, cherry-picked 16 researchers for the 2020 elections (only five of those projects have published their findings). The company has subsequently outsourced the judging of Facebook and Instagram access requests (from anywhere worldwide) to the University of Michigan.

One of the South African researchers tried that channel, without success.

Other platforms such as TikTok are still making unilateral decisions, even in the US, as to who has data access.

Outside the EU and the US, it’s hard even to get a dialogue going with the platforms.

The fightback

Last November, I invited the bigger tech players to join a workshop in Cape Town on data access and elections in Africa. There was effectively no response.

The same pattern is evident in an initiative earlier this year by the South African National Editors’ Forum. The forum suggested a dialogue around a human rights impact assessment of online risks to the South African elections. They were ignored.

Against this background, two South African NGOs – the Legal Resources Centre and the Campaign for Free Expression – are using South Africa’s expansive Promotion of Access to Information Act to compel platforms to disclose their election plans.

But the companies have refused to respond, claiming that they do not fall under South African jurisdiction. This has led to appeals being launched to the country’s Information Regulator to compel disclosures.

Further momentum for change may also come from Unesco, which is promoting international Guidelines for the Governance of Digital Platforms. These highlight transparency and the issue of research access. Unesco has also published a report that I researched titled Data Sharing to Foster Information as a Public Good.

In the works is an incipient African Alliance for Access to Data, now involving five pan-African formations. This coalition (I’m interim convenor) is engaging the African Union on the issues.

But there’s no guarantee yet that all this will lead the platforms to open up data to Africans and researchers in the global south.



US Supreme Court boosts NRA in free speech fight with New York official


Reporting by John Kruzel in Washington; Editing by Will Dunham


Australia drops legal fight against X over church stabbing videos

Elon Musk’s social media platform welcomes decision as a victory for freedom of speech.


Australia’s internet watchdog has ended a legal battle to force Elon Musk’s X to remove a graphic video of a church stabbing in Sydney.

The eSafety commissioner, Julie Inman Grant, said on Wednesday that she had decided to drop the case to “achieve the most positive outcome for the online safety of all Australians, especially children”.


“Our sole goal and focus in issuing our removal notice was to prevent this extremely violent footage from going viral, potentially inciting further violence and inflicting more harm on the Australian community. I stand by my investigators and the decisions eSafety made,” Inman Grant said in a statement.

“Most Australians accept this kind of graphic material should not be on broadcast television, which begs an obvious question of why it should be allowed to be distributed freely and accessible online 24/7 to anyone, including children.”

Inman Grant said she welcomed the opportunity for a merits-based review of her takedown notice by the country’s Administrative Appeals Tribunal.

X, formerly known as Twitter, welcomed the announcement.

“This case has raised important questions on how legal powers can be used to threaten global censorship of speech, and we are heartened to see that freedom of speech has prevailed,” the social media platform said.

X refused an eSafety notice to take down footage of the non-fatal stabbing of Assyrian Orthodox Bishop Mar Mari Emmanuel during a livestreamed sermon, arguing that blocking the content for users in Australia should be sufficient.

Prime Minister Anthony Albanese criticised Musk’s refusal to follow the notice, labelling him an “arrogant billionaire who thinks he’s above the law, but also above common decency”.

Australia’s Federal Court in April temporarily ordered X to hide the content worldwide – which the platform refused to do – but a judge last month denied an application to extend the order.

Police have charged a 16-year-old boy with “committing a terrorist act” in relation to the April 15 attack on Emmanuel, which authorities say was religiously motivated.

X claims partial victory after Australian eSafety commissioner drops lawsuit over video of violent attack on Christian bishop

'Freedom of speech is worth fighting for.'

The social media platform owned by tech titan Elon Musk and now known as X is celebrating a decision from the Australian eSafety commissioner to drop a lawsuit in connection with a violent attack on a Christian bishop earlier this year.

'My prophet': A religiously motivated attack on a Christian bishop

Back in April, Assyrian Orthodox Bishop Mar Mari Emmanuel, 53, was livestreaming a service held at Christ the Good Shepherd Church in Wakeley, just outside Sydney, when a young man suddenly ran to the altar and stabbed Bishop Emmanuel and Fr. Isaac Royel, as Blaze News previously reported.

The suspected assailant, a 16-year-old boy, appeared to be motivated by Islamic extremism, as Bishop Emmanuel has previously criticized the Islamic religion.

"If [Bishop Emmanuel] didn't get himself involved in my religion, if he hadn't spoken about my prophet, I wouldn't have come here. … If he just spoke about his own religion, I wouldn't have come," the suspect said during the attack, according to the livestream video.

Though the violent stabbing cost the bishop an eye, he has since forgiven his attacker and called him to Christian conversion: "This young man who did this act almost two weeks ago, I say to you, my dear, you are my son and you will always be my son. I will always pray for you. I will always wish you nothing but the best. I pray that my Lord and Savior, Jesus Christ of Nazareth, to enlighten your heart, enlighten your soul, your entire being — to realize there is only one God who art in heaven. ... That God is Jesus Christ of Nazareth."

'Threaten free speech everywhere': X refuses to censor video

Thanks to the wonders of social media, a video of the brutal attack quickly went viral around the globe — much to the chagrin of Australian eSafety Commissioner Julie Inman Grant, a woke American previously affiliated with Big Tech. Grant demanded that X remove the video "to protect Australians from" exposure to "this most extreme and gratuitous violent material."

The platform partially complied at first, censoring it in Australia. But when Grant and other Australian officials called for it to be suppressed across the globe, X stood firm.

"While X respects the right of a country to enforce its laws within its jurisdiction, the eSafety Commissioner does not have the authority to dictate what content X's users can see globally. ... Global takedown orders go against the very principles of a free and open internet and threaten free speech everywhere," said a statement from X's Global Government Affairs team.

Grant pressed on, filing a lawsuit in Australian federal court with a hearing scheduled for late June. However, the suit seemed doomed to fail after a judge denied Grant's request for an injunction against the video while the legal process continued.

'Welcome ... news': Commissioner retreats, drops lawsuit

On Wednesday, Grant announced that she was dropping the federal lawsuit altogether, preferring instead to focus on a separate case involving X. "Our sole goal and focus in issuing our removal notice was to prevent this extremely violent footage from going viral, potentially inciting further violence and inflicting more harm on the Australian community. I stand by my investigators and the decisions eSafety made," she said in a lengthy statement.

'He issued a dog whistle to 181 million users around the globe, which resulted in ... doxxing of my family members, including my three children.'

She also insisted that censoring the video was the right call and praised the platforms that did so, including Meta, Reddit, and TikTok. She then expressed disappointment that X didn't follow suit. "So it was a reasonable expectation when we made our request to remove extremely graphic video of an attack, that X Corp would take action in line with these publicly stated policies and practices," the statement continued.

She even hinted that the decision may have partially resulted from safety concerns, claiming in an interview that Elon Musk's recent comments about the issue have endangered her and her family.

"He issued a dog whistle to 181 million users around the globe, which resulted in death threats directed at me, which resulted in doxxing of my family members, including my three children," she said.

Apparently unfazed by the personal allegations, Musk has since reaffirmed his commitment to keep his platform free. "Freedom of speech is worth fighting for," he tweeted in connection with the story.

X's global leadership team likewise cheered Grant's decision to drop the lawsuit. "We welcome the news that the eSafety Commissioner is no longer pursuing legal action against X seeking the global removal of content that does not violate X’s rules," said X's Global Government Affairs division.

"This case has raised important questions on how legal powers can be used to threaten global censorship of speech, and we are heartened to see that freedom of speech has prevailed."

X filed a lawsuit in Australia's Administrative Appeals Tribunal seeking clarity on whether Grant was in the right to classify the stabbing video as a "class 1" example of "extreme violence material." Grant claims a ruling on the matter from the AAT will provide her with "operational certainty."

The AAT is expected to hear that case sometime next month.

Cortney Weil

Sr. Editor, News

