Deplatforming
abstract. Deplatforming in the technology sector is hotly debated, and at times may even seem unprecedented. In recent years, scholars, commentators, jurists, and lawmakers have focused on the possibility of treating social-media platforms as common carriers or public utilities, implying that the imposition of a duty to serve the public would bar them from deplatforming individuals and content.
But, in American law, the duty to serve all comers was never absolute. In fact, the question of whether and how to deplatform—to exclude content, individuals, or businesses from critical services—has been commonly and regularly debated throughout American history. In the common law and the major infrastructural and utility sectors—transportation, communications, energy, and banking—American law has long provided rules and procedures for when and how to deplatform.
This Article offers a history and theory of the law of deplatforming across networks, platforms, and utilities. Historically, the American tradition has not been one of either an absolute duty to serve or an absolute right to exclude. Rather, it has been one of reasonable deplatforming—of balancing the duties to serve and the need to, in limited and justifiable cases, exclude. Theoretically, deplatforming raises common questions across sectors: Who deplatforms? What is deplatformed? When does deplatforming occur? What are permissible reasons for deplatforming? How should deplatforming take place? The Article uses the history of deplatforming to identify these and other questions, and to show how American law has answered them.
The history and theory of deplatforming show that the tension between service and exclusion is an endemic issue for common carriers, utilities, and other infrastructural services—including contemporary technology platforms. This Article considers ways in which past deplatforming practices can inform current debates over the public and private governance of technology platforms.
author. New York Alumni Chancellor’s Chair in Law, Vanderbilt University Law School. Thanks to Rebecca Allensworth, Margaret Blair, Ed Cheng, Evelyn Douek, Joshua Macey, Lev Menand, Morgan Ricks, Jim Rossi, J.B. Ruhl, Nelson Tebbe, and Shelley Welton. Thanks also to participants in the Networks, Platforms, and Utilities (NPU) Zoom Scholars workshop, Knight First Amendment Institute seminar, and faculty workshops at Boston College and Marquette University Law Schools for helpful comments and suggestions.
introduction
In January 2021, President Donald Trump and many of his supporters were banned from Twitter,1 Facebook,2 and other social-media services.3 The immediate reaction to the “great deplatforming,”4 as some have called it, varied from support,5 to objections,6 to claims of “incoherence.”7 Since that time, much of the discussion has focused on the possibility of treating social-media platforms as common carriers or public utilities.8 Proponents of this approach, including conservatives like Richard Epstein and Justice Clarence Thomas, emphasize that common carriers and public utilities have an obligation to serve all customers.9 The subtle implication, of course, is that President Trump and others who were deplatformed should be reinstated.
Since Elon Musk took over Twitter, controversies over deplatforming have only continued. Musk had promised that the social-media platform would be a home for free speech. But his Twitter soon suspended Kanye West’s account for posting a swastika,10 accounts that tracked the location of Musk’s private jet (despite Musk’s earlier promise not to ban them), and even accounts that had shared those tweets.11 Musk’s Twitter also announced a policy banning posts promoting competitor social-media platforms, although it soon reversed course.12
Nor has deplatforming been limited to individuals and content on social media. Cloud-infrastructure giant Amazon Web Services (AWS) deplatformed the conservative social-media network Parler,13 as did the Apple App Store and Google Play Store.14 Google has removed apps that secretly collect user data,15 and Apple has excluded apps that have not been updated.16 Amazon has shut down accounts of thousands of merchants seeking to sell goods on Amazon Marketplace.17 With these and many other examples in the news, scholars have raised a variety of issues about deplatforming individuals and content—including First Amendment analyses18 and the workability and desirability of existing procedures19—and have even proposed creating federal procedural rules for platforms.20
Deplatforming in the tech sector is thus hotly debated, and at times, it might even seem “unprecedented.”21 But in American law, the duty to serve all comers was never absolute.22 In fact, the question of whether and how to deplatform—to exclude content, individuals, or businesses from critical services—has been commonly and regularly debated throughout American history. In the common law and the major infrastructural and utility sectors—transportation, communications, energy, and banking—law has long provided rules and procedures for when and how to deplatform. And yet, despite the increasingly familiar argument that tech platforms are akin to common carriers, infrastructure, or public utilities,23 the practice of deplatforming across the common law and the traditional networks, platforms, and utilities sectors has gone unexamined.
This Article offers a history and theory of the law of deplatforming across networks, platforms, and utilities. Part I shows that there has been a long history of deplatforming in the common law and in the transportation, communications, energy, and banking sectors. These areas of law are generally considered the traditional “regulated industries,”24 or, as a new casebook calls them, “Networks, Platforms, and Utilities” (NPUs).25 In each of these sectors, firms excluded individuals or activities—even if the firm provided an essential service, held a government monopoly, or had a legal duty to serve the public. The American tradition has not been one of either an absolute duty to serve or an absolute right to exclude. Rather, it has been one of reasonable deplatforming: balancing the duties to serve and the need to, in limited and justifiable cases, exclude.
In the nineteenth century, for example, common-law courts required innkeepers and other common carriers to “accept all comers,” but they also created exceptions for persons who could be excluded from service, including thieves and belligerents.26 In the early twentieth century, the law grappled with excluding individuals and content from the postal system, telephone service, and broadcast communications.27 Over time, these rules migrated into state and federal NPU regulatory systems. Legal rules enable the government to ban certain individuals from the banking system, the energy sector, and air travel.28 Although the substantive and procedural rules vary somewhat from sector to sector, there are common approaches and dilemmas. This history shows that deplatforming is not unprecedented or unique to tech platforms; it is an inevitable, endemic issue that emerges in governing infrastructure industries. Far from ensuring that deplatformed individuals must be reinstated, the common-carrier and public-utility framework sanctions deplatforming under a limited, but significant, set of scenarios. The critical question, therefore, is not whether to permit deplatforming, but rather who decides the rules of deplatforming and what those rules should be.
Having presented the history and practice of deplatforming across NPUs, the Article then zooms out from particular sectors and explores theoretical questions about deplatforming in Part II. Understanding deplatforming requires considering (1) foundational issues, (2) the reasons for deplatforming, and (3) the process of deplatforming. Foundational issues address the basic questions: Who deplatforms (public or private)? What is deplatformed (conduct or an entity)? Why is the service important (essential, civic, or commercial reasons)? Are platforms liable for injuries? When does deplatforming take place (reactively, preemptively, or preventively)? And what justifies replatforming? The reasons for deplatforming can vary from sector to sector, but they have been remarkably common and stable throughout history: ensuring service provision (the failure to pay, capacity and congestion concerns, and/or service-quality degradation); preventing harms (injury to other users, society, and/or national security); and adhering to social regulations (public morality and, earlier in history, racial discrimination). There are also consistently impermissible reasons for deplatforming. The process of deplatforming has involved both ex ante measures (generally applicable rules and notice requirements) and ex post measures (case-by-case determinations and opportunities to challenge exclusion). Describing these factors helps to clarify the shape of reasonable deplatforming in American history and illuminates persistent dilemmas in deplatforming as well.
Part III then turns to the contemporary issue of tech platforms’ decisions to deplatform individuals and content. To start, it shows that neither private deplatforming (by firms) nor public deplatforming (by law) is novel. Contemporary private practices among big-tech platforms track the historical approach to reasonable deplatforming remarkably well. Recently proposed and subsequently passed regulation at the state level, however, differs from the traditional American approach in that it significantly reduces the scope for reasonable deplatforming. Importantly, some courts and commentators have said that a platform’s terms of service mean that it is not open to all comers and, therefore, is not a common carrier or public utility. The history of deplatforming reveals that this is incorrect: as we will see, terms of service existed even in the nineteenth and early twentieth centuries, and they foreclosed neither the duty to serve nor the power to exclude. The history and theory of deplatforming also contribute to specific debates, including over bots and anonymity, bans on crypto-mining, platform liability, and the boundary between content moderation and commercial nondiscrimination in cases where platforms deplatform other platforms. A brief conclusion follows.
In reviewing the history and theory of deplatforming, this Article makes several contributions. First, it provides an account of deplatforming—exclusions from service provision—across the common law and statutory laws of networks, platforms, and utilities. It is, of course, not a treatise, and as such, it does not offer a comprehensive accounting of every case or statute from every subsector. But, so far as I can tell, it is the first transsectoral account of this dynamic across NPU law. This history shows that deplatforming is and has been an endemic issue for infrastructural enterprises. Second, this Article identifies the justifications, mechanisms, and dilemmas that have characterized the practice of reasonable deplatforming in American history. Together, these analyses contribute, on their own terms, to the recent revival of the study of the law governing networks, platforms, and utilities.29 In particular, they show that despite a general view that common carriers and public utilities must accept all comers, there have always been exceptions—and the exceptions have followed consistent patterns. The theoretical analysis, combined with the historical examples, also provides a framework for thinking about designing deplatforming regimes. This is critical because the history of deplatforming also offers some cautionary tales. Careful and thoughtful design, not simplistic arguments about unconstrained rights to exclude or absolute duties to serve, should be the way forward for courts and policymakers. Third, this Article offers important lessons for tech platforms. History shows that deplatforming in the tech sector is nothing new and that the quest for a platform that does not deplatform is misguided. Moreover, the fact that tech platforms condition access on compliance with their terms of service does not mean they are not common carriers or public utilities. More broadly, this Article contributes to the literature that seeks to show that the legal framework that applies to networks, platforms, and utilities can provide helpful insights and guidance for regulating contemporary tech platforms. The American tradition of reasonable deplatforming may provide a guide for tech deplatforming—whether privately directed, imposed under the common law, or regulated by statute.
A few caveats are also in order. The first is about scope. One of the standard tools of NPU law is the exit restriction, under which a regulated firm is not permitted to shut down service to some set of customers.30 For example, during the heyday of the Interstate Commerce Commission, a railroad would have been prohibited from shutting down a rail line to a city without permission from the regulator. This dynamic is similar to deplatforming, but it is distinct. Exit restrictions are generally part of a system of route allocation or an exclusive franchise or monopoly provision. They are thus linked both to entry restrictions and to universal-service mandates within the service area. They are part of the structural regulatory rules that shape the industrial organization of the sector, rather than rules of behavior on the NPUs themselves. The history and purposes of exit restrictions thus differ from the tradition that I trace here. As a result, I do not treat exit restrictions in this Article.
A harder case, but also one that I cabin, is about platform design and accessibility. The question is whether the duty to serve all comers includes an obligation not merely to accept anyone seeking to use the service (subject to reasonable deplatforming rules) but also to design the service in a way that everyone can actually use it. Or, to put it differently, if you cannot access the platform because you require a special accommodation or design feature, must the platform adapt to offer that accommodation? It is, of course, the case that the scope of the duty to serve will necessarily shape the extent of the right to exclude.31 But with one notable exception,32 the cases and history offered here have comparatively little to say about this topic. This may be, in part, because some NPUs require no or minimal accommodations for use due to standardized parts or connections (e.g., electric grid, pipelines, water pipes, telephones, telegraphs), because technologies did not permit accommodation at the time (e.g., radio technology could not accommodate the deaf in the early twentieth century), and because NPU enterprises have an incentive to maximize their customer base. It is perhaps also partly because policies requiring accommodations for individuals have been a function of antidiscrimination and civil-rights laws,33 which apply far more broadly than to firms in NPU sectors as a condition on their duty to serve. This latter tradition is intertwined with but distinct from the tradition I trace here—the duty to serve within economic regulation, and its exceptions. An account of the intersections of these traditions would be welcome but is beyond the scope of this Article.
The second caveat is about terminology. Throughout the Article, I use “platform” interchangeably with infrastructure industries, regulated industries, and public utilities. For those who think only about contemporary tech platforms, this usage might seem odd. But in using the term this way, I align with others who also note the functional similarities between tech platforms and the traditional regulated industries (primarily, but not exclusively, the transportation, communications, energy, and banking sectors)—and likewise think that “platform” is a better (or at least, not worse) term than others one might devise.34 As a result, “deplatforming” is the exclusion or ejection of not only individuals or entities, but also content or particular behavior from a platform.35 The distinction between entity deplatforming and content deplatforming is explored throughout the Article, and expressly in Part II. In defining platforms and deplatforming this way, I intend for them to be broader than the common, casual usage, which focuses on social-media companies. Rather, I mean to expand the meaning of these terms to apply to NPUs generally. Doing so, as we shall see, illuminates a range of practices, possibilities, and perils.
Permanent Suspension of @realDonaldTrump, Twitter (Jan. 8, 2021), https://blog.twitter.com/en_us/topics/company/2020/suspension [https://perma.cc/6MY3-JLW3].
Nick Clegg, In Response to Oversight Board, Trump Suspended for Two Years; Will Only Be Reinstated if Conditions Permit, Facebook (June 4, 2021), https://about.fb.com/news/2021/06/facebook-response-to-oversight-board-recommendations-trump [https://perma.cc/WHN8-CC3X].
Sara Fischer & Ashley Gold, All the Platforms That Have Banned or Restricted Trump so Far, Axios (Jan. 11, 2021), https://www.axios.com/2021/01/09/platforms-social-media-ban-restrict-trump [https://perma.cc/T67B-9K6T].
Jen Patja Howell, Evelyn Douek, Quinta Jurecic & Jonathan Zittrain, The Lawfare Podcast: Jonathan Zittrain on the Great Deplatforming, Lawfare (Jan. 14, 2021, 12:00 PM), https://www.lawfareblog.com/lawfare-podcast-jonathan-zittrain-great-deplatforming [https://perma.cc/4QKH-FTUE]; Genevieve Lakier & Nelson Tebbe, After the “Great Deplatforming”: Reconsidering the Shape of the First Amendment, LPE Project (Mar. 1, 2021), https://lpeproject.org/blog/after-the-great-deplatforming-reconsidering-the-shape-of-the-first-amendment [https://perma.cc/8FPB-9SM6].
Zephyr Teachout, We’re All Better Off Without Trump on Twitter. And Worse Off with Twitter in Charge., Wash. Post (Jan. 14, 2021, 12:24 PM EST), https://www.washingtonpost.com/outlook/2021/01/14/trump-twitter-ban-big-tech-monopoly-private [https://perma.cc/CZU4-ZQ56]; Paul Waldman, Twitter’s Trump Ban Is Even More Important than You Thought, Wash. Post (Jan. 18, 2021, 12:58 PM EST), https://www.washingtonpost.com/opinions/2021/01/18/twitters-trump-ban-is-even-more-important-than-you-thought [https://perma.cc/9DPA-Y2JQ].
For objections, see, for example, Jessica Guynn, ‘They Want to Take Your Speech Away,’ Censorship Cry Unites Trump Supporters and Extremists After Capitol Attack, USA Today (Jan. 16, 2021, 3:03 PM ET), https://www.usatoday.com/story/tech/2021/01/15/censorship-trump-extremists-facebook-twitter-social-media-capitol-riot/4178737001 [https://perma.cc/FN9Z-PJEV]; Grace Curley, Every American Should Be Against Twitter’s Ban of Trump, Bos. Herald (Jan. 9, 2021, 4:02 PM), https://www.bostonherald.com/2021/01/09/curley-every-american-should-be-against-twitters-ban-of-trump [https://perma.cc/G9LY-3UQF]; and Suzanne Nossel, Banning Trump from Facebook May Feel Good. Here’s Why It Might Be Wrong, L.A. Times (Jan. 27, 2021, 3:15 AM PT), https://www.latimes.com/opinion/story/2021-01-27/facebook-donald-trump-oversight-board [https://perma.cc/YKF2-72KS].
Andrew Marantz, The Importance, and Incoherence, of Twitter’s Trump Ban, New Yorker (Jan. 15, 2021), https://www.newyorker.com/news/daily-comment/the-importance-and-incoherence-of-twitters-trump-ban [https://perma.cc/68BG-KCAJ].
See, e.g., Ganesh Sitaraman & Morgan Ricks, Tech Platforms and the Common Law of Common Carriers, 73 Duke L.J. (forthcoming); Adam Candeub, Bargaining for Free Speech: Common Carriage, Network Neutrality, and Section 230, 22 Yale J.L. & Tech. 391, 429-33 (2020); Eugene Volokh, Treating Social Media Platforms Like Common Carriers?, 1 J. Free Speech L. 377, 381-83 (2021); Christopher S. Yoo, The First Amendment, Common Carriers, and Public Accommodations: Net Neutrality, Digital Platforms, and Privacy, 1 J. Free Speech L. 463, 465-73 (2021). For an earlier treatment, see Adam Thierer, The Perils of Classifying Social Media Platforms as Public Utilities, 21 CommLaw Conspectus 249, 250 (2013).
See Tunku Varadarajan, The ‘Common Carrier’ Solution to Social-Media Censorship, Wall St. J. (Jan. 15, 2021, 12:39 PM ET), https://www.wsj.com/articles/the-common-carrier-solution-to-social-media-censorship-11610732343 [https://perma.cc/GU9T-8B2K] (interviewing Richard Epstein); Biden v. Knight First Amend. Inst., 141 S. Ct. 1220, 1224 (2021) (Thomas, J., concurring) (observing that tech platforms could potentially be subject to common-carrier obligations).
Charisma Madarang, Kanye Tweets Swastika, Elon Musk Suspends His Twitter Account, Rolling Stone (Dec. 2, 2022), https://www.rollingstone.com/music/music-news/kanye-west-swastika-elon-musk-twitter-1234640112 [https://perma.cc/8PVY-TU7N].
Mike Isaac & Kate Conger, Twitter Suspends Accounts of Half a Dozen Journalists, N.Y. Times (Dec. 15, 2022), https://www.nytimes.com/2022/12/15/technology/twitter-suspends-journalist-accounts-elon-musk.html [https://perma.cc/J4CL-NER3].
Chas Danner, Elon Musk Tried to Ban Leaving Twitter, N.Y. Mag. (Dec. 18, 2022), https://nymag.com/intelligencer/2022/12/elon-musks-twitter-bans-sharing-links-to-many-competitors.html [https://perma.cc/5FN4-6FUH].
John Paczkowski & Ryan Mac, Amazon Will Suspend Hosting for Pro-Trump Social Network Parler, BuzzFeed News (Jan. 9, 2021, 10:08 PM), https://www.buzzfeednews.com/article/johnpaczkowski/amazon-parler-aws [https://perma.cc/D2L5-BMMU].
Brian Fung, Parler Has Now Been Booted by Amazon, Apple, and Google, CNN (Jan. 11, 2021, 6:54 AM ET), https://www.cnn.com/2021/01/09/tech/parler-suspended-apple-app-store/index.html [https://perma.cc/DFD6-A7A8].
Byron Tau & Robert McMillan, Google Bans Apps with Hidden Data-Harvesting Software, Wall St. J. (Apr. 6, 2022, 2:27 PM ET), https://www.wsj.com/articles/apps-with-hidden-data-harvesting-software-are-banned-by-google-11649261181 [https://perma.cc/GQE8-TDDP].
Emma Roth, Apple App Store Appears to Be Widely Removing Outdated Apps: Wiping Apps that Haven’t Been Updated in a ‘Significant Amount of Time,’ Verge (Apr. 23, 2022, 6:44 PM EDT), https://www.theverge.com/2022/4/23/23038870/apple-app-store-widely-remove-outdated-apps-developers [https://perma.cc/V6AB-XDQV].
Jason Del Rey, Amazon Ousted Thousands of Merchants with No Notice—Showing the Danger of Relying on the Shopping Platform, Recode (Mar. 8, 2019, 9:23 AM EST), https://www.vox.com/2019/3/8/18252606/amazon-vendors-no-orders-marketplace-counterfeits [https://perma.cc/7MCN-326N].
See Evelyn Douek, Content Moderation as Systems Thinking, 136 Harv. L. Rev. 526, 531-32 (2022); Kate Klonick, The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression, 129 Yale L.J. 2418, 2453-54 (2020); Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598, 1631-35 (2018).
Danny Crichton, The Deplatforming of President Trump: A Review of an Unprecedented and Historical Week for the Tech Industry, TechCrunch (Jan. 9, 2021, 11:46 AM EST), https://techcrunch.com/2021/01/09/the-deplatforming-of-a-president [https://perma.cc/32EP-UWRA] (“From Twitter to PayPal, more than a dozen companies have placed unprecedented restrictions or outright banned the current occupant of the White House from using their services, and in some cases, some of his associates and supporters as well.”).
To be fair to Epstein, he does note that service had to be on “fair, reasonable and nondiscriminatory” terms, and suggests there be a “narrow” exception for “violence and threats of force.” But he does not discuss why President Trump’s tweets would fall on the permissible side of that line. See Varadarajan, supra note 9.
See supra notes 8-9 and accompanying text; see also Peter Swire, Should the Leading Online Tech Companies Be Regulated as Public Utilities?, Lawfare (Aug. 2, 2017, 9:00 AM), https://www.lawfareblog.com/should-leading-online-tech-companies-be-regulated-public-utilities [https://perma.cc/SN8U-5TQK] (analogizing tech companies to public utilities and criticizing that regulatory approach); K. Sabeel Rahman, The New Utilities: Private Power, Social Infrastructure, and the Revival of the Public Utility Concept, 39 Cardozo L. Rev. 1621, 1634-39 (2018) (evaluating public-utility regulation as a template to redress the private control of tech platforms).
See generally Richard J. Pierce, Jr. & Ernest Gellhorn, Regulated Industries in a Nutshell (4th ed. 1999) (describing traditionally regulated industries). I include banking, even though it was not usually taught or included in regulated-industries textbooks, because it shares many features with infrastructure and utilities. See, e.g., Morgan Ricks, Money as Infrastructure, 2018 Colum. Bus. L. Rev. 757 (arguing that bank regulation falls within the broader category of infrastructure regulation); Alan M. White, Banks as Utilities, 90 Tul. L. Rev. 1241 (2016) (applying public-utility law to banks).
For examples of recent scholarship on the topic of networks, platforms, and utilities law, see William J. Novak, New Democracy: The Creation of the Modern American State 108-45, 180-217 (2022) (centering public utilities and regulated industries in the heart of the creation of the American state); Ricks, Sitaraman, Welton & Menand, supra note 25; Dan Awrey & Joshua Macey, The Promise and Perils of Open Finance, 40 Yale J. on Reg. 1 (2023); Lina M. Khan, The Separation of Platforms and Commerce, 119 Colum. L. Rev. 973 (2019); Rahman, supra note 23; Ganesh Sitaraman, The Regulation of Foreign Platforms, 74 Stan. L. Rev. 1073 (2022).
The field of NPU law includes the common law of carriers and statutory regimes in the transportation, communications, energy, and banking sectors, in addition to some others. Inclusion in the category itself is perhaps best defined analogically, rather than formalistically, based on consideration of a variety of factors that include status as a service, network effects and economies of scale, and other features. For discussions on terminology and the scope of what counts as a network, platform, or utility, see Ricks, Sitaraman, Welton & Menand, supra note 25. Note that terminology has always been a problem in the field, which has had names ranging from the law of “public service corporations,” to public utilities, to regulated industries. See, e.g., Charles K. Burdick, The Origin of the Peculiar Duties of Public Service Companies. Part I, 11 Colum. L. Rev. 514, 515 n.8 (1911) (“This term ‘Public Service Company’ is not entirely satisfactory, but it is difficult to find a substitute which is not unwieldy.”). For others who also see tech platforms as related to utilities, see, for example, sources cited supra notes 8-9, 23, which describe similarities between platforms and utilities.
This is similar to, but not the same as, how experts define content moderation. Grimmelmann, for example, defines that term as “the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse.” James Grimmelmann, The Virtues of Moderation, 17 Yale J.L. & Tech. 42, 47 (2015).