Why the question of limiting freedom of expression matters now
The phrase limiting freedom of expression captures a debate that often appears in news coverage and public discussion when content is removed online. To decide whether a removal is government censorship or private moderation, the law asks whether private action can be attributed to the government under the state action doctrine, a legal baseline that determines when the First Amendment applies to a nonstate actor (State Action Doctrine, Cornell Law).
Readers who care about civic rights and accurate reporting need clear standards. This article explains those standards, how courts frame the tests, and what evidence reporters or voters should seek when a removal is contested. It also provides a practical checklist you can use to evaluate a specific incident without leaping to conclusions.
The rest of the article is a roadmap: definitions and plain-language examples, the core judicial tests, how Section 230 shapes platform incentives, recent appellate cases, a short checklist for verification, common mistakes to avoid, three applied scenarios, and steps for reporting responsibly. After reading, you should be able to identify what documentary evidence would be needed to show that a removal amounts to government censorship.
If you plan to check a particular removal, consult the statutes, contracts, platform transparency reports, and court opinions cited later in this guide.
Scope of the issue
Content removals can affect civic debates, campaign communications, and individual expression. In many cases the practical question is not whether removal is controversial, but whether a government actor directed or controlled the action in a way courts treat as state action. That factual distinction matters for constitutional protections and for how journalists describe the removal.
Who this guide is for and how to use it
This guide is written for voters, local residents, journalists, and civic-minded readers who want to evaluate whether a content removal could be government censorship. Use the checklist section to gather primary evidence and the scenario vignettes to see how the tests are applied in practice. When the guide cites legal doctrine, it links to a primary legal overview or court opinion so readers can verify the source.
Definition and context: What counts as government censorship versus private moderation
Government censorship, in plain terms, is when a public actor with legal authority directs or compels the suppression or removal of speech, and that direction is legally attributable to the state so that constitutional limits apply. Private moderation is when a nonstate actor, such as an online platform, removes or limits content based on its own policies or business choices and those decisions do not meet the legal threshold for state action (State Action Doctrine, Cornell Law).
The First Amendment primarily restricts government actors. That means a private entity’s choice to moderate content is ordinarily not a First Amendment problem unless courts find sufficient government involvement or coercion to treat the private action as state action. Foundational cases and legal summaries explain the tests courts use to make that determination.
Everyday experience helps show the difference. An obvious example of private moderation is a platform removing posts that violate its published rules. An obvious example of government censorship is a law that requires an official agency to block speech and is enforced by the state. Between those clear cases sits a fact-specific zone where close collaboration, conditional funding, or contracts can raise legal questions.
When assessing any specific removal, the legal baseline requires documentary evidence. Official laws, written contracts, formal funding agreements, or court orders are the kinds of records that can change an ordinary moderation question into a state action issue.
Core legal framework: the state action tests courts use when assessing attribution
Court decisions apply a set of tests to decide when private conduct is treated as state action. Three well-known tests are the public function test, nexus or entwinement, and coercion. The public function test asks whether a private actor performed a function that is traditionally and exclusively done by the government. The entwinement and coercion tests look at the degree of government involvement or pressure that produced the specific action (Jackson v. Metropolitan Edison Co., Supreme Court opinion).
In short, private moderation is treated as government censorship only when courts attribute the private action to the state under one of these tests, and that attribution typically requires documentary evidence of government direction or significant control.
The nexus or entwinement approach examines whether the government and the private actor are so closely related in operation that the private action is essentially governmental. The coercion test asks whether the government used threats, orders, or other compulsion to secure the particular outcome. These inquiries are fact-specific and often require detailed documentary evidence and legal analysis to resolve.
Short examples help. If a private actor performs a traditional public function, like running elections under state law, the public function test may apply. If a government contract gives an agency direct control over how a platform must operate, or the government conditions money or privileges in a way that leaves the platform no realistic choice, courts may find entwinement or coercion. Legal overviews explain how courts balance these factors case by case.
How Section 230 shapes platform incentives and the boundary with censorship concerns
Section 230 of the Communications Decency Act provides broad statutory protection for many content moderation choices by shielding interactive computer services from most publisher liability for third-party content and for decisions to restrict access to content. That statutory rule affects how platforms manage risk and therefore influences their moderation policies (47 U.S.C. § 230, GovInfo).
Section 230 is statutory, and constitutional questions about state action are separate. In other words, Section 230 can protect a private platform against certain legal claims, but it does not convert private action into government action. Constitutional attribution under the state action doctrine remains a distinct judicial inquiry.
Because Section 230 reduces some publisher risk, platforms often create and enforce their own rules to manage content and community safety. Debates about limiting freedom of expression sometimes focus on whether platforms exercise too much editorial judgment, and those debates occur alongside legal debates about when government involvement could change the constitutional analysis.
Recent case law and enforcement: how courts treated state coercion and platform rules in the 2020s
Federal and appellate courts in the 2020s addressed laws that tried to require platforms to host or remove certain content and in several instances blocked or enjoined such laws on constitutional grounds. Those rulings show that when state statutes or regulations cross into coercive direction of platform choices, courts will apply First Amendment review (NetChoice v. Moody, Eleventh Circuit opinion; see also Lindke v. Freed).
NetChoice v. Moody is an example where an appellate court found problems with a state law that sought to control platforms’ content decisions. The ruling illustrates how courts analyze whether a law effectively compels private moderation choices and whether that compulsion triggers constitutional protections against government interference in speech.
These cases do not imply that every government request or statement about content is unlawful. Instead, they show that courts scrutinize the specific statutory terms, the means of enforcement, and the factual context to determine if the government crossed the line into coercion or joint action that would make private moderation subject to constitutional constraints.
A practical checklist: five questions reporters and voters can use to evaluate a removal
When a removal raises concern, ask five factual questions that align with the state action tests (State Action Doctrine, Cornell Law):

1. Is there an explicit legal directive or statute that required or authorized the removal?
2. Did the government provide material funding, technical control, or privileged access that effectively directed the action?
3. Is there a formal delegation of a public function to the platform?
4. Was there coercion or overt direction by a government official in this specific case?
5. Is there close entwinement or nexus that makes the private action traceable to the state?
For each question, primary sources matter. To confirm a directive look for text in statutes, regulations, or official orders. To confirm material funding or control look for contracts, grant agreements, or communications that specify required actions. To confirm delegation of function look for statutory language assigning public duties. To confirm coercion look for contemporaneous communications that show threats or orders. Platform transparency reports and court filings can be informative but should be matched to primary government records.
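To make the logic of the checklist concrete, the five questions can be sketched as a simple mapping from documented facts to the state action theories they might support. This is purely an illustrative reporting aid, not a legal test: the field names and the mapping below are assumptions for the sketch, and courts weigh these facts case by case.

```python
from dataclasses import dataclass

@dataclass
class RemovalEvidence:
    """One boolean per checklist question; all names are illustrative."""
    legal_directive: bool = False           # Q1: statute or order required/authorized the removal
    material_control: bool = False          # Q2: funding, technical control, or privileged access
    delegated_public_function: bool = False # Q3: formal delegation of a public function
    coercion: bool = False                  # Q4: threats or overt direction in this case
    entwinement: bool = False               # Q5: close operational nexus with the state

def possible_state_action_theories(ev: RemovalEvidence) -> list[str]:
    """Flag which state action tests the documented facts might support.

    A non-empty result only signals where to dig for primary records;
    an empty result suggests ordinary private moderation absent more evidence.
    """
    theories = []
    if ev.legal_directive or ev.coercion:
        theories.append("coercion / legal compulsion")
    if ev.delegated_public_function:
        theories.append("public function")
    if ev.material_control or ev.entwinement:
        theories.append("nexus / entwinement")
    return theories
```

For example, a documented threat of penalty flags the coercion theory, while an undocumented complaint flags nothing, which mirrors the article's point that rhetorical pressure alone does not establish state action.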
Public opinion polls show widespread concern about content removal and perceived bias, but polling is not proof of state action. Public concern is relevant for public debate and policy choice, yet legal attribution depends on documentary evidence and judicial analysis rather than on survey results (Pew Research Center report).
Common mistakes and pitfalls when people label moderation as censorship
A frequent error is confusing criticism or a government request with coercion. Government officials may call for removal or express displeasure, but absent material coercion, entwinement, or a legal mandate, courts typically treat such statements differently than a formal government order. Careful analysis looks for documentary evidence of control rather than rhetorical pressure (When Is Private Conduct State Action?, Knight First Amendment Institute).
Another common pitfall is over-relying on anecdotal removal claims or on public polling to establish state action. Anecdotes and polls can show public concern but cannot substitute for statutes, contracts, or records that demonstrate government direction. Reporters should seek contemporaneous documents and official records before asserting that a removal is government censorship.
Typical evidentiary gaps to watch for include missing written directives, lack of contemporaneous communication showing coercion, absence of contract terms that give control, or no statutory language delegating a public function. Where such gaps exist, describe the facts accurately and attribute claims to sources rather than asserting a legal conclusion.
Practical scenarios: three short case studies to apply the checklist
Scenario A, government request for removal. Imagine an official calls a platform and asks that a post be removed. If there is no statute, no binding legal order, and no threat of penalty, a court would likely treat the exchange as a request, not coercion. To evaluate it, seek call records, emails, and any follow-up that shows threat or direction, and check whether a law or regulation requires compliance.
Scenario B, contract-based platform control. In this vignette a local government signs a contract with a platform that requires content filtering for particular categories as a condition of a municipal service. Here the checklist asks whether the contract terms effectively required the platform to act like a state actor and whether the platform retained meaningful discretion. Look for the contract language and any oversight mechanisms that give the government operational control.
Scenario C, algorithmic amplification and close collaboration. Suppose a government agency and a platform jointly design algorithms to prioritize certain messages, accompanied by data sharing and coordinated testing. Such entwinement could resemble joint action. To assess it, request contracts, technical specifications, and communications that show direct collaboration and control rather than mere consultation.
How to report, verify, and cite a suspected instance of government censorship
Start by gathering primary records. Check statutes and regulations for any legal mandate. Request contracts and grant documents for terms that condition funding or require particular moderation steps. Seek official correspondence, orders, or memoranda that pertain to the disputed removal. Platform transparency reports and court filings can supply useful context, but primary government records are central to establishing state action (State Action Doctrine, Cornell Law).
When writing, use careful attribution. Useful templates include phrases like according to the statute, public filings show, the agency’s correspondence states, or the platform’s transparency report says. Avoid asserting that an action is government censorship unless the documentary evidence supports attribution or a court has so held.
For voter information about candidates and positions, campaign website content can be a primary source for a candidate's stated views on online speech and related policy priorities. When citing a candidate's statements, link directly to the campaign page or public filing that contains the claim, and attribute the view to the candidate rather than presenting it as fact.
Conclusion: key takeaways on limiting freedom of expression and where questions remain
The central legal point is straightforward: whether an action limits freedom of expression in a way that triggers the First Amendment depends on state-action attribution and the core tests courts use. Courts apply public function, entwinement, and coercion analyses to decide whether private moderation is legally governmental, and each case turns on specific facts and documents (State Action Doctrine, Cornell Law).
Open questions remain, including how courts will treat algorithmic amplification and whether conditional funding or close technical collaboration can meet the tests for entwinement or coercion. Those are areas to follow in future court opinions and legislative proposals. For any particular removal, consult statutes, contracts, platform notices, and court opinions to verify whether the legal line has been crossed.
- The First Amendment restricts government action; private platforms are generally not bound by it unless courts find that a private action is legally attributable to the state through tests like public function, entwinement, or coercion.
- Section 230 provides statutory protection from certain publisher liability for moderation, but it does not make every moderation decision immune from other legal scrutiny and it is distinct from constitutional questions about government attribution.
- To verify a specific removal, look for statutes or official orders, written contracts or funding agreements that condition actions, contemporaneous communications showing directives or threats, and platform transparency reports that document coordination.
References
- https://www.law.cornell.edu/wex/state_action
- https://lawreview.uchicago.edu/online-archive/blocking-suit-lower-court-applications-lindke-state-action-test
- https://supreme.justia.com/cases/federal/us/419/345/
- https://www.govinfo.gov/content/pkg/USCODE-2018-title47/html/USCODE-2018-title47-chap5-subchapII-sec230.htm
- https://www.ca11.uscourts.gov/opinions/pub/files/202214072.pdf
- https://supreme.justia.com/cases/federal/us/601/22-611/
- https://www.pewresearch.org/internet/2023/06/29/americans-views-about-social-media-content-moderation/
- https://knightcolumbia.org/content/when-is-private-conduct-state-action-tests-and-applications-for-digital-platforms
- https://www.freedomforum.org/state-action/

