Freedom of Speech Examples: What’s Protected, What Isn’t, and Why

This article explains freedom of speech examples and the legal principles courts use to decide what the First Amendment protects. It is written for voters, journalists, students, and civic readers seeking clear, sourced explanations.

The guide summarizes long-standing Supreme Court tests and trusted primers, and it is informational only. For specific legal questions, consult the cited primary opinions and qualified counsel.

The First Amendment protects most political expression, but courts recognize categorical exceptions.
Incitement, true threats, obscenity, and defamation are assessed under established tests and factual context.
Online platforms may remove content even when speech remains constitutionally protected.

Freedom of speech examples: what this guide covers and how to use it

This guide collects freedom of speech examples and explains how courts apply established legal tests to decide what counts as protected expression. It summarizes core categories that commonly fall outside First Amendment protection while stressing that this is general information, not legal advice, and advising readers to consult counsel for specific situations.

The examples below are drawn from long-standing Supreme Court tests and practitioner summaries; they are meant to illustrate how judges analyze facts, not to predict outcomes in a given case. For an accessible primer that lays out the basic categories and case summaries, see the ACLU free speech guide in the linked resources.

Quick reference to primary cases and primers to consult

Use the case names and primers cited throughout this article to locate the full opinions and summaries.

Readers should use the examples here to test statements against the operative tests: whether speech is intended and likely to produce imminent lawless action, whether a reasonable recipient would view a statement as a true threat, whether material meets the Miller obscenity factors, and whether a public-figure defamation claim includes actual malice.

Where the guide summarizes a court test it cites the primary opinion text or a trusted practitioner summary so readers can check the authoritative language. The examples are framed for U.S. law and rely on established tests rather than experimental rulemaking.



First Amendment basics and the landmark cases that shape exceptions

The First Amendment broadly protects speech, but courts recognize categorical exceptions that are not protected, including incitement, true threats, obscenity, and defamation, as described in practitioner summaries such as the SCOTUSblog First Amendment primer and our educational freedom page.

Brandenburg v. Ohio established that advocacy of force is unprotected only when it is directed to inciting imminent lawless action and is likely to produce that action; consult the Brandenburg opinion for the Court's formulation, and the Oyez case page for Brandenburg v. Ohio for a summary.

The Miller test sets a three-part standard for obscenity: whether the material, taken as a whole, appeals to prurient interest; whether it is patently offensive under contemporary community standards; and whether it lacks serious literary, artistic, political, or scientific value. The Miller opinion states the three prongs in detail.

Defamation law for public officials and figures requires proof of actual malice, meaning the plaintiff must show the defendant knew a statement was false or acted with reckless disregard for the truth, as the New York Times Co. v. Sullivan opinion explains.

On threatening statements, the Court in Elonis held that a federal threat conviction requires more than proof that a reasonable recipient would perceive the message as threatening; the speaker's mental state also matters, and the opinion discusses mens rea and context (Elonis opinion).

Practitioner guides emphasize that these precedents set tests that courts apply case by case rather than bright-line rules, so factual context often determines the outcome (ACLU free speech guide).

The categorical tests courts use to decide what is not protected

Courts rely on a few operative tests when they find speech is unprotected. One is the Brandenburg test, which focuses on intent and the likelihood of imminent lawless action; compare statements to the opinion text (Brandenburg opinion) and the LII Wex summary of the Brandenburg test when evaluating alleged incitement.

True threats are assessed by asking whether a reasonable recipient would view the statement as a serious expression of intent to harm; courts may also examine the speaker's intent or state of mind, a point discussed in the Elonis opinion and subsequent commentary.

The Miller obscenity test instructs courts to evaluate three elements together: prurient appeal, patently offensive content under community standards, and lack of serious value, so whether material is obscene depends on a fact-intensive inquiry laid out in the Miller opinion.

Defamation involving public figures requires a showing of actual malice: the plaintiff must prove the speaker knew a statement was false or acted with reckless disregard for the truth. This requirement raises the bar for successful claims against criticism of public officials (New York Times v. Sullivan opinion).

These tests are tools for courts; practitioners and judges stress that no single factor decides a case, and legal outcomes often rest on a chain of contextual facts rather than a single categorical label (SCOTUSblog First Amendment primer).

How courts apply the tests in practice: context, audience, and timing

[Infographic: courthouse facade and law library shelf illustrating freedom of speech examples]

Judges weigh recurring factual considerations when applying the tests, including who the speaker is, who the audience is, the medium used, and when the statement was made. These factors affect findings on imminence and likelihood of harm (SCOTUSblog First Amendment primer).

Speaker status matters. Public figures face a higher bar for defamation because of the actual malice requirement, while private individuals may have lower thresholds for showing harm in some civil suits (New York Times v. Sullivan opinion).

Medium and timing can change how imminence is judged. A call to action made in the middle of a tense, ongoing protest may present a different legal picture than the same words posted hours later online; courts look at timing and the realistic chance of immediate unlawful behavior when applying Brandenburg (Brandenburg opinion).

Court analysis also considers how a reasonable recipient would understand the statement. Satire, parody, and clearly rhetorical hyperbole are often treated differently than expressions that include detailed planning or specific threats, and guides note context signals such as disclaimers or obvious irony (ACLU free speech guide).

Concrete freedom of speech examples: protected speech and unprotected speech examples

Everyday political criticism is generally protected, even when it is harsh or offensive. The First Amendment offers robust protection for debate about public issues, and practitioner summaries emphasize that most political expression remains protected speech (ACLU free speech guide).

Satire and parody typically count as protected speech when a reasonable audience would not take the statements as literal facts; courts and commentators treat clear rhetorical or artistic expression with caution before labeling it unprotected.

A provocative post becomes legally risky when, under established tests, it is intended to and likely to produce imminent lawless action or would be viewed by a reasonable recipient as a serious expression of intent to harm; context and timing are critical and counsel can help assess the specifics.

An unprotected incitement example under Brandenburg would be a speaker who explicitly urges a crowd to commit violence immediately, where the words are likely to produce that action; this scenario tracks the opinion's definition of imminent lawless action (Brandenburg opinion).

True threats can include direct, specific statements of intent to harm that a reasonable recipient would take seriously; courts examine both how the message reads to a recipient and the speaker's intent, as discussed in the Elonis opinion.

Obscenity examples are context dependent but generally involve material that appeals to prurient interests, is patently offensive by community standards, and lacks serious value, so a work with legitimate literary or political content is less likely to be judged obscene under Miller (Miller opinion).

Defamation examples: knowingly false factual assertions presented as fact about a public figure can lead to liability only if the plaintiff proves actual malice, which makes the truthfulness of the statement and the defendant's state of mind crucial to a claim (New York Times v. Sullivan opinion).

Borderline situations are common: rhetorical threats, ambiguous statements made in frustration, or artistic works with shocking content each require fact-specific analysis rather than categorical judgment, and courts look to context and audience understanding when deciding protection.

Digital age challenges: social media, AI content and platform moderation

Online context complicates tests like imminence and intent because posts can spread quickly and reach large audiences; practitioner primers note that algorithmic amplification and speed affect how courts and commentators think about audience reach and potential harm (SCOTUSblog First Amendment primer).

Platform moderation is a private action by companies and does not equal a government restriction under the First Amendment, so removal from a social media site is separate from whether speech is constitutionally protected, a point emphasized in free-speech guidance (ACLU free speech guide).

AI-generated content raises questions about authorship and intent. Courts have not resolved many of these issues, and commentators flag open questions about how traditional tests apply when content is produced or altered algorithmically rather than by a human speaker (SCOTUSblog First Amendment primer).

The speed of sharing online can make imminence harder to assess; a provocative post that spreads rapidly might raise different concerns about likely harm than one that appears only briefly in a small forum, and judges may weigh evidence of reach and context when applying Brandenburg and related tests (ACLU free speech guide).

Common mistakes, practical risks, and when to seek legal advice

A common mistake is assuming all hyperbole or angry words are categorically protected; while hyperbole and rhetorical statements often are protected, context can turn speech into a risk if intent or imminence is present (ACLU free speech guide).

Another frequent error is treating platform removal as a legal judgment; private moderation choices reflect platform rules and not constitutional protection, so appeals or policy disputes with platforms are different from litigation over government action (ACLU free speech guide).

Practical steps include preserving original posts and metadata, documenting context and timestamps, and avoiding public deletion or editing that can complicate evidence preservation when a serious claim is possible (ACLU free speech guide).


Get primary sources and legal guidance

For case-specific questions, consult the primary opinions cited above and seek qualified legal counsel to assess the facts and possible remedies.


Signs that speech may be high risk include explicit, context-specific threats, language that seeks to produce immediate unlawful action, or factual assertions about a public figure that the speaker knows are false; these are triggers to preserve evidence and contact an attorney (Brandenburg opinion).



Practical resources, references, and next steps

For primary texts, look up the named Supreme Court opinions and consult practitioner primers or our constitutional rights page. The key cases to search by name are Brandenburg v. Ohio, Miller v. California, New York Times Co. v. Sullivan, and Elonis v. United States; summaries are available on SCOTUSblog and the ACLU site, and the Brandenburg opinion is posted on Justia.

Before pursuing a defamation claim, confirm whether the subject is a public figure, since that status changes the legal standard and the need to prove actual malice; for research, use the case names and the primer sites to access authoritative language (New York Times v. Sullivan opinion).

If you face criminal threat charges or a high-risk civil claim, consult counsel promptly and preserve evidence. Legal advisors can assess how the specific facts map to the established tests and recommend next steps suited to the jurisdiction and facts (ACLU free speech guide).

Conclusion: assessing freedom of speech examples responsibly

Protected versus unprotected speech turns on established tests and careful fact-specific analysis; practitioner summaries stress that context, timing, and audience are often decisive when courts apply these tests (SCOTUSblog First Amendment primer).

Use primary opinions and trusted primers when evaluating statements, preserve evidence if a statement raises legal risk, and seek counsel for questions about criminal threats, defamation, or other serious claims. This article is informational and not a substitute for legal advice.

Frequently asked questions

What categories of speech fall outside First Amendment protection?
Courts commonly recognize incitement, true threats, obscenity, and defamation as categories that may fall outside First Amendment protection under established tests.

Does removal from a social media platform mean the speech was illegal?
No. Platform moderation is a private action and distinct from constitutional free-speech protections; removal does not mean the speech is illegal under the First Amendment.

When should I seek legal advice about a statement?
Seek legal counsel if a statement includes specific threats, appears intended to produce imminent unlawful action, or involves alleged defamatory factual claims about a public figure, and preserve evidence promptly.

Careful fact-checking and documentation are essential when evaluating potentially risky speech. Use the cited cases and practitioner primers for authoritative language and seek legal advice for case-specific issues.
