Government Hacking
abstract. The United States government hacks computer systems for law enforcement purposes. As encryption and anonymization tools become more prevalent, the government will foreseeably increase its resort to malware.
Law enforcement hacking poses novel puzzles for criminal procedure. Courts are just beginning to piece through the doctrine, and scholarship is scant. This Article provides the first comprehensive examination of how federal law regulates government malware.
Part I of the Article considers whether the Fourth Amendment regulates law enforcement hacking. This issue has sharply divided district courts because, unlike a conventional computer search, hacking usually does not involve physical contact with a suspect’s property. The Article provides a technical framework for analyzing government malware, then argues that a faithful application of Fourth Amendment principles compels the conclusion that government hacking is inherently a search.
Part II analyzes the positive law that governs law enforcement hacking, answering fundamental criminal procedure questions about initiating a search, establishing probable cause and particularity, venue, search duration, and notice. A review of unsealed court filings demonstrates that the government has a spotty compliance record with these procedural requirements. The Article also argues for reinvigorating super-warrant procedures and applying them to law enforcement hacking.
Finally, Part III uses government malware to illuminate longstanding scholarly debates about Fourth Amendment law and the structure of surveillance regulation. Law enforcement hacking sheds new light on the interbranch dynamics of surveillance, equilibrium adjustment theories for calibrating Fourth Amendment law, and the interplay between statutory and constitutional privacy protections.
author. Cyber Initiative Fellow, Stanford University; Assistant Professor of Computer Science and Public Affairs, Princeton University (effective March 2018); J.D., Stanford Law School; Ph.D. candidate, Stanford University Department of Computer Science. The author currently serves as a Legislative Fellow in the Office of United States Senator Kamala D. Harris. All views are solely the author’s own and do not reflect the position of the United States government. This work draws upon conversations at the Federal Judicial Center Fourth Circuit Workshop, Federal Judicial Center Sixth Circuit Workshop, Federal Judicial Center Ninth Circuit Mid-Winter Workshop, Federal Judicial Center Workshop for United States Magistrate Judges, the Privacy Law Scholars Conference, and the Rethinking Privacy and Surveillance in the Digital Age event at Harvard Law School. The project benefits from the wisdom and feedback of countless colleagues, including Julia Angwin, Kevin Bankston, Dan Boneh, Ryan Calo, Cindy Cohn, Laura Donohue, Hanni Fakhoury, Nick Feamster, Ed Felten, Laura Fong, Jennifer Granick, James Grimmelmann, Marcia Hofmann, Orin Kerr, Mark Lemley, Whitney Merrill, John Mitchell, Ellen Nakashima, Paul Ohm, Kurt Opsahl, David Pozen, Chris Riley, Barbara van Schewick, Michael Shih, David Sklansky, Peter Swire, Elisabeth Theodore, Lee Tien, George Triantis, and Tyce Walters. The editors of the Yale Law Journal, led by Jeremy Aron-Dine, provided invaluable recommendations on the Article’s substance and organization. The author is especially grateful to the federal judges, attorneys, and law enforcement officers who informed this Article’s discussion of the law, policy, and technology issues associated with government hacking.
Jenna McLaughlin, FBI’s Secret Surveillance Tech Budget Is ‘Hundreds of Millions,’ Intercept (June 25, 2016, 10:49 AM), http://theintercept.com/2016/06/25/fbis-secret-surveillance-tech-budget-is-hundreds-of-millions [http://perma.cc/76J4-GFHD] (quoting Baker).
See Raphael Satter, How a School Bomb-Scare Case Sparked a Media vs. FBI Fight, Associated Press (Mar. 18, 2017), http://www.ap.org/ap-in-the-news/2017/how-a-school-bomb-scare-case-sparked-a-media-vs.-fbi-fight [http://perma.cc/5LZP-KBZ4]. See generally Cyber Div., Fed. Bureau of Investigation, Situation Action Background: Timberline School District (Oct. 29, 2014), in 2 Reporters Comm. for Freedom of the Press, FBI Document Production at RCFP-44, https://www.rcfp.org/sites/default/files/litigation/rcfpapfoia_2016-02-26_fbi_document_production_part_2_of_5.pdf [http://perma.cc/SF87-W6VM] (providing background on the bomb threat and the FBI’s decision to impersonate a member of the media).
Lacey 10th-Grader Arrested in Threats To Bomb School, Seattle Times (June 14, 2007, 4:01 PM), http://www.seattletimes.com/seattle-news/lacey-10th-grader-arrested-in-threats-to-bomb-school [http://perma.cc/4H35-N9KH].
See Office of the Inspector Gen., A Review of the FBI’s Impersonation of a Journalist in a Criminal Investigation, U.S. Dep’t Just. 9 (Sept. 2016), https://oig.justice.gov/reports/2016/o1607.pdf [http://perma.cc/Z3RN-WCVL].
Referring to these practices as “malware” is a source of some controversy. Compare United States v. Arterbury, No. 15-CR-182-JHP, 2016 BL 133752, at *6 (N.D. Okla. Apr. 25, 2016) (referring to FBI software as “malware”), with United States v. Matish, 193 F. Supp. 3d 585, 601-02 (E.D. Va. 2016) (objecting to characterization of a government program as “malware”). I use the term “malware” throughout this Article because, in the field of computer security, it is the common term for software that subverts a user’s device. The term is not intended as a criticism of government hacking. On the contrary, my view is that hacking can be a legitimate and effective law enforcement technique. I also use the term to promote consistency and avoid ambiguity. Government documents have referred to hacking with a wide variety of terms, including Network Investigative Technique (NIT), Computer and Internet Protocol Address Verifier (CIPAV), Internet Protocol Address Verifier (IPAV), Remote Access Search and Surveillance (RASS), Remote Computer Search, Remote Search, Web Bug, Sniffer, Computer Tracer, Internet Tracer, and Remote Computer Trace.
See United States v. Scarfo, 180 F. Supp. 2d 572, 574 (D.N.J. 2001). The Scarfo opinion provides only a summary of the FBI’s “Key Logger System,” recognizing it as protected from disclosure under the Classified Information Procedures Act. The details that are included suggest a design with both hardware and software components. Later in 2001, news reports confirmed that the FBI was developing sophisticated malware, euphemistically entitled “Magic Lantern” and the “Enhanced Carnivore Project.” See Bob Sullivan, FBI Software Cracks Encryption Wall, MSNBC (Nov. 20, 2001), http://www.nbcnews.com/id/3341694/ns/technology_and_science-security/t/fbi-software-cracks-encryption-wall [http://perma.cc/Y9D5-WVUA].
See Letter from Mythili Raman, Acting Assistant Att’y Gen., U.S. Dep’t of Justice, Criminal Div., to Judge Reena Raggi, Chair, Advisory Comm. on the Criminal Rules 1 (Sept. 18, 2013) (describing government hacking practices as “increasingly common situations”); see also Email from [Redacted] to CTCs, Re [Redacted] (Mar. 7, 2002), in 5 Elec. Frontier Found., CIPAV FOIA Release 1, https://www.eff.org/files/filenode/cipav/fbi_cipav-05.pdf [http://perma.cc/3BVV-G6YM] (“[W]e are seeing indications that [the Internet Protocol Address Verifier (IPAV)] technique is being used needlessly by some agencies . . . .”).
See Users [Start Date: 2013-01-01, End Date: 2016-12-31, Source: All users], Tor Project, https://metrics.torproject.org/userstats-relay-country.html?start=2013-01-01&end=2016-12-31&country=all&events=off [http://perma.cc/JNW4-LKNU]. See generally Roger Dingledine et al., Tor: The Second-Generation Onion Router (2004), http://svn.torproject.org/svn/projects/design-paper/tor-design.pdf [http://perma.cc/3QBZ-YGWW] (describing Tor).
Ivan Krstic, Behind the Scenes of iOS Security, YouTube (Aug. 16, 2016), https://www.youtube.com/watch?v=BLGFriOKz6U [http://perma.cc/N8HR-GU2N].
Android 7.1 Compatibility Definition Document, Google, http://source.android.com/compatibility/android-cdd.html [http://perma.cc/Q969-WM3C] (“[T]he data storage encryption MUST be enabled by default at the time the user has completed the out-of-box setup experience.”).
Messenger Secret Conversations: Technical Whitepaper, Facebook 3 (July 8, 2016), http://fbnewsroomus.files.wordpress.com/2016/07/secret_conversations_whitepaper-1.pdf [http://perma.cc/3QM6-EZU4]; WhatsApp Encryption Overview: Technical Whitepaper, WhatsApp 3 (July 6, 2017), http://www.whatsapp.com/security/WhatsApp-Security-Whitepaper.pdf [http://perma.cc/56Y9-UFW6].
See Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy: Hearing Before the S. Comm. on the Judiciary, 114th Cong. 1 (2015) (joint statement of Sally Quillian Yates, Deputy Att’y Gen. of the United States, and James Comey, Director of the FBI); Majority Staff, Going Dark, Going Forward: A Primer on the Encryption Debate, House Committee on Homeland Security (June 2016), http://homeland.house.gov/wp-content/uploads/2016/07/Staff-Report-Going-Dark-Going-Forward.pdf [http://perma.cc/86AK-42QD].
See Comput. Sci. & Telecomm. Bd., Nat’l Acad. of Scis., Eng’g & Med., Exploring Encryption and Potential Mechanisms for Authorized Government Access to Plaintext (2016); Harold Abelson et al., Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications, Mass. Inst. Tech. (July 7, 2015), http://dspace.mit.edu/handle/1721.1/97690 [http://perma.cc/E5NV-V74G]; Urs Gasser et al., Don’t Panic: Making Progress on the “Going Dark” Debate, Berkman Klein Ctr. for Internet & Soc’y at Harv. U. (Feb. 1, 2016), http://cyber.law.harvard.edu/pubrelease/dont-panic/Dont_Panic_Making_Progress_on_Going_Dark_Debate.pdf [http://perma.cc/6LFP-TCHW].
An archive of FBI documents released under the Freedom of Information Act includes a diverse range of requests for hacking assistance. See 10 Elec. Frontier Found., CIPAV FOIA Release 1-19, https://www.eff.org/files/filenode/cipav/fbi_cipav-10.pdf [http://perma.cc/T4P6-D9VJ]; 13 Elec. Frontier Found., CIPAV FOIA Release 1-20, https://www.eff.org/files/filenode/cipav/fbi_cipav-13.pdf [http://perma.cc/DD9Z-S7CN].
United States v. Pierce, No. 8:13CR106, 2014 WL 5173035 (D. Neb. Oct. 14, 2014); United States v. Pierce, No. 8:13CR106, 2014 U.S. Dist. LEXIS 108171 (D. Neb. July 28, 2014) (magistrate recommendation in same prosecution); In re Warrant To Search a Target Comput. at Premises Unknown, 958 F. Supp. 2d 753 (S.D. Tex. 2013); United States v. Scarfo, 180 F. Supp. 2d 572 (D.N.J. 2001).
Recent scholarship has emphasized jurisdictional and venue issues with law enforcement hacking. See, e.g., Devin M. Adams, The 2016 Amendments to Criminal Rule 41: National Search Warrants To Seize Cyberspace, “Particularly” Speaking, 51 U. Rich. L. Rev. 727 (2017); Susan W. Brenner, Law, Dissonance, and Remote Computer Searches, 14 N.C. J.L. & Tech. 43 (2012) (suggesting how to reconcile differing jurisdictional privacy standards in a remote computer search); Ahmed Ghappour, Searching Places Unknown: Law Enforcement Jurisdiction on the Dark Web, 69 Stan. L. Rev. 1075 (2017) (criticizing government hacking for intruding upon the sovereignty interests of other nations); Orin S. Kerr & Sean D. Murphy, Government Hacking To Light the Dark Web: What Risks to International Relations and International Law?, 70 Stan. L. Rev. Online 58 (2017) (responding to Ghappour); Zach Lerner, A Warrant To Hack: An Analysis of the Proposed Amendments to Rule 41 of the Federal Rules of Criminal Procedure, 18 Yale J.L. & Tech. 26 (2016) (reviewing proposed revisions to the federal warrant venue rules); Brian L. Owsley, Beware of Government Agents Bearing Trojan Horses, 48 Akron L. Rev. 315 (2015) (reviewing five district court opinions on government hacking, primarily related to venue issues).
Several articles have raised policy concerns about technical properties of government malware. See, e.g., Steven M. Bellovin et al., Lawful Hacking: Using Existing Vulnerabilities for Wiretapping on the Internet, 12 Nw. J. Tech. & Intell. Prop. 1 (2014) (describing policy considerations associated with a shift from conventional wiretapping to law enforcement hacking); Benjamin Lawson, Note, What Not to “Ware,” 35 Rutgers Comp. & Tech. L.J. 77 (2008) (categorizing types of government hacking); Steven M. Bellovin, Matt Blaze & Susan Landau, Insecure Surveillance: Technical Issues with Remote Computer Searches, Computer, Mar. 2016, at 14 (describing policy considerations associated with a shift from conventional wiretapping to law enforcement hacking).
Authors have also noted how hacking is a response to growing adoption of encryption and anonymization tools. See, e.g., Orin S. Kerr & Bruce Schneier, Encryption Workarounds, Geo. L.J. (forthcoming 2018) (reviewing law enforcement mechanisms for defeating encryption, including hacking); Stephanie K. Pell, You Can’t Always Get What You Want: How Will Law Enforcement Get What It Needs in a Post-CALEA, Cybersecurity-Centric Encryption Era?, 17 N.C. J.L. & Tech. 599 (2016) (arguing that broader deployment of security technology will require the government to either undermine security or resort to hacking); Susan Hennessey, The Elephant in the Room: Addressing Child Exploitation and Going Dark, Hoover Inst. (2017), http://www.hoover.org/sites/default/files/research/docs/hennessey_webreadypdf.pdf [http://perma.cc/6J6Q-JKZK] (arguing that law enforcement hacking is a legitimate response to child exploitation and encryption and reviewing litigation associated with the Playpen investigation).
A couple of articles have touched on Fourth Amendment considerations for government malware. See, e.g., Susan W. Brenner, Fourth Amendment Future, 81 Miss. L.J. 1229 (2012) (arguing that if the government remotely retrieves files from a suspect’s hard drive, it must obtain a warrant); Gus Hosein & Caroline Wilson Palow, Modern Safeguards for Modern Surveillance, 74 Ohio St. L.J. 1071, 1093-97 (2013) (briefly arguing that most government malware requires a warrant); see also ACLU et al., Challenging Government Hacking in Criminal Cases, ACLU (Mar. 2017), https://www.aclu.org/sites/default/files/field_document/malware_guide_3-30-17-v2.pdf [http://perma.cc/P5UE-ASLW] (reviewing possible challenges to government hacking under the Fourth Amendment and Federal Rules of Criminal Procedure).
The opinion in United States v. Scarfo generated a small body of commentary. See, e.g., Angela Murphy, Cracking the Code to Privacy: How Far Can the FBI Go?, 2002 Duke L. & Tech. Rev. 0002 (explaining the Scarfo case); Nathan E. Carrell, Note, Spying on the Mob: United States v. Scarfo—A Constitutional Analysis, 2002 U. Ill. J.L. Tech. & Pol’y 193 (2002) (reviewing Scarfo and arguing that keystroke monitoring should require a super-warrant); Neal Hartzog, Comment, The “Magic Lantern” Revealed: A Report of the FBI’s New ‘Key Logging’ Trojan and Analysis of Its Possible Treatment in a Dynamic Legal Landscape, 20 J. Marshall J. Computer & Info. L. 287 (2002) (arguing that Scarfo was rightly decided); Rachel S. Martin, Note, Watch What You Type: As the FBI Records Your Keystrokes, the Fourth Amendment Develops Carpal Tunnel Syndrome, 40 Am. Crim. L. Rev. 1271 (2003) (arguing that keystroke monitoring should require a super-warrant).
The debate about law enforcement hacking is not confined to the United States. The European Union, for example, is grappling with the same issue. See, e.g., Mirja Gutheil et al., European Parliament Directorate-Gen. for Internal Policies, Legal Frameworks for Hacking by Law Enforcement: Identification, Evaluation and Comparison of Practices (2017) (summarizing how six EU member states and three non-EU countries regulate law enforcement hacking and providing policy proposals).
This Article is focused exclusively on government hacking for law enforcement purposes. Hacking for national security purposes introduces further legal complications (under the Fourth Amendment and the Foreign Intelligence Surveillance Act), as well as numerous additional policy dimensions. The Article is also exclusively focused on hacking of domestic computer systems. The extraterritorial scope of the Fourth Amendment remains a subject of professional and scholarly debate. See generally United States v. Ulbricht, No. 14-cr-68 (KBF), 2014 U.S. Dist. LEXIS 145553, at *13-14 (S.D.N.Y. Oct. 10, 2014) (considering how the Fourth Amendment might apply to the search of a foreign server); Jennifer Daskal, The Un-Territoriality of Data, 125 Yale L.J. 326 (2015) (arguing that traditional Fourth Amendment concepts of territoriality are a poor fit for electronic data); David G. Delaney, Widening the Aperture on Fourth Amendment Interests: A Comment on Orin Kerr’s The Fourth Amendment and the Global Internet, 68 Stan. L. Rev. Online 9 (2015) (similar); Orin S. Kerr, The Fourth Amendment and the Global Internet, 67 Stan. L. Rev. 285 (2015) (summarizing territorial Fourth Amendment doctrine and applying it to international data searches).
This Article is also focused exclusively on federal hacking because the factual and legal issues are better developed for federal law enforcement. It is foreseeable that state, county, and municipal law enforcement agencies will also adopt hacking as an investigative technique. In some instances, they already have—commercial tools for breaking into smartphones, for example, are already widespread. Future work might consider the status of hacking under state law and whether certain hacking tools should be reserved for federal law enforcement.
See William Baude & James Y. Stern, The Positive Law Model of the Fourth Amendment, 129 Harv. L. Rev. 1823, 1829 (2016) (describing the threshold inquiry in modern Fourth Amendment law); Orin Kerr, Four Models of Fourth Amendment Protection, 60 Stan. L. Rev. 503, 528-29 (2007) (same); Daniel J. Solove, Fourth Amendment Pragmatism, 51 B.C. L. Rev. 1511, 1514 (2010) (same). This Article focuses exclusively on search doctrine rather than seizure doctrine, because seizures—which are grounded in personal freedom of movement and possessory interests in property—are a poor fit for electronic surveillance. See Brendlin v. California, 551 U.S. 249, 255 (2007) (emphasizing that seizures of persons involve “physical force or show of authority” (quoting Florida v. Bostick, 501 U.S. 429, 434 (1991))); United States v. Jacobsen, 466 U.S. 109, 113 (1984) (“A ‘seizure’ of property occurs when there is some meaningful interference with an individual’s possessory interests in that property.”). While some courts and scholars occasionally refer to forms of electronic surveillance (and especially copying data) as a species of seizure, categorization as a search is more consistent with current Fourth Amendment doctrine.
See, e.g., Florida v. Riley, 488 U.S. 445 (1989) (aerial surveillance of a greenhouse); United States v. Knotts, 460 U.S. 276 (1983) (location-tracking device on public roads); Smith v. Maryland, 442 U.S. 735 (1979) (numbers dialed on a telephone). While beyond the scope of this Article, it bears mentioning that the Fourth Amendment provides a limited privacy safeguard for otherwise unprotected electronic information that is held by a service provider; the service provider can assert its own constitutional privacy interest and contest the subpoena’s “reasonableness.” See Hale v. Henkel, 201 U.S. 43, 75-77 (1906) (holding that subpoenas must be reviewed for “reasonableness” under the Fourth Amendment); In re Horowitz, 482 F.2d 72, 75-79 (2d Cir. 1973) (reviewing constitutional limits on subpoenas); In re Application of the United States for an Order Pursuant to 18 U.S.C. § 2703(d), 830 F. Supp. 2d 114, 130-39 (E.D. Va. 2011) (suggesting that a surveillance order served on Twitter might be reviewed for reasonableness).
These four steps are borrowed from an influential Lockheed Martin paper on cybersecurity, which provides a valuable taxonomy of steps in the “intrusion kill chain.” See Eric M. Hutchins et al., Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains, Lockheed Martin Corp. (2011), http://www.lockheedmartin.com/content/dam/lockheed/data/corporate/documents/LM-White-Paper-Intel-Driven-Defense.pdf [http://perma.cc/MB7A-XUVB].
See, e.g., Criminal Complaint at 29-33, United States v. Hernandez, No. 1:17-mj-00661 (S.D. Ind. Aug. 1, 2017) (describing malware delivery by inducing the suspect to open a video shared on Dropbox); Amended Application for a Search Warrant at 26-28, In re Use of a Network Investigative Technique for a Comput. Accessing Email Accounts: weknow@hotdak.net, iama.skank@yandex.com, weknow@mail2actor.com, and skankcatcher@mail2actor.com, No. 6:17-mj-00519-JWF (W.D.N.Y. Feb. 13, 2017) (describing malware delivery by causing the suspect to execute a Microsoft Word macro); Application for a Search Warrant at 9, In re Network Investigative Technique (NIT) for E-mail Address 512SocialMedia@gmail.com, No. A-12-M-748 (W.D. Tex. Dec. 18, 2012) (proposing malware delivery via an email); Third Amended Application for a Search Warrant at 20, In re Network Investigative Technique (“NIT”) for Email Address texan.slayer@yahoo.com, No. 12-sw-05685-KMT (D. Colo. Dec. 11, 2012) (same); Application and Affidavit for Search Warrant at 13-14, In re Search of Any Comput. Accessing Elec. Message(s) Directed to Adm’r(s) of MySpace Account “Timberlinebombinfo” and Opening Message(s) Delivered to that Account by the Gov’t, No. MJ07-5114 (W.D. Wash. June 12, 2007) (proposing malware delivery via a social network message).
See Tor: Hidden Service Protocol, Tor, http://www.torproject.org/docs/hidden-services.html [http://perma.cc/9NEE-UL6E]. These services are often referred to as the “dark web.”
The government’s continued operation of criminal websites—especially child-pornography forums—has been a source of substantial controversy. See, e.g., Mike Carter, FBI’s Massive Porn Sting Puts Internet Privacy in Crossfire, Seattle Times (Aug. 27, 2016), http://www.seattletimes.com/seattle-news/crime/fbis-massive-porn-sting-puts-internet-privacy-in-crossfire [http://perma.cc/73L4-8E3Z].
See Application for a Search Warrant, In re Search of Computs. that Access the Website “Hidden Service A” Which Is Located at oqm66m6lyt6vxk7k.onion, No. 8:12MJ360 (D. Neb. Nov. 19, 2012) (malware delivery to visitors of the seized “Hidden Service A” service); Application for a Search Warrant, In re Search of Computs. that Access the Website “Hidden Service B” Which Is Located at s7cgvirt5wvojli5.onion, No. 8:12MJ359 (D. Neb. Nov. 19, 2012) (malware delivery to visitors of the seized “TB3” service); Application for a Search Warrant, In re Search of Computs. that Access the Website “Bulletin Board A” Located at http://jkpos24pl2r3urlw.onion, No. 8:12MJ356 (D. Neb. Nov. 16, 2012) [hereinafter Application for “Bulletin Board A” Search Warrant] (malware delivery to visitors of the seized “PedoBoard” service). See generally Kevin Poulsen, Visit the Wrong Website and the FBI Could End Up in Your Computer, Wired (Aug. 5, 2014, 6:30 AM), http://www.wired.com/2014/08/operation_torpedo [http://perma.cc/ZTG3-JZDM] (describing FBI techniques in “Operation Torpedo”). The government additionally sought and received at least one wiretap order authorizing interception of private user communications on a seized service. See United States v. Laurita, No. 8:13CR107, 2016 WL 4179365, at *3 (D. Neb. Aug. 5, 2016) (describing a wiretap order for private messages); United States v. Michaud, No. 3:15-cr-05351-RJB, 2016 WL 337263, at *1 (W.D. Wash. Jan. 28, 2016) (describing a wiretap order for a message board).
See Affidavit in Support of Application for a Search Warrant at 13-14, In re Search of Computs. that Access “Websites 1-23”, No. 8:13-mj-01744-WGC (D. Md. July 22, 2013) (malware delivery to visitors of twenty-three websites related to child pornography hosted on the seized Freedom Hosting platform); see also Kevin Poulsen, FBI Admits It Controlled Tor Servers Behind Mass Malware Attack, Wired (Sept. 13, 2013, 4:17 PM), http://www.wired.com/2013/09/freedom-hosting [http://perma.cc/9LRE-JPH3] (describing the FBI’s investigative technique).
See Affidavit in Support of Application for Search Warrant at 23-27, In re Search of Computs. that Access upf45jv3bziuctml.onion, No. 1:15-SW-89 (E.D. Va. Feb. 20, 2015) (proposing malware delivery to visitors of the seized “Playpen” service). As in the Operation Torpedo investigation, the government obtained a wiretap order to intercept private messages and chats between users. Application for an Order Authorizing Interception of Electronic Communications at 4-5, No. 1:15-ES-4 (E.D. Va. Feb. 20, 2015).
See Michaud, 2016 WL 337263, at *1, *5 (noting that “the FBI may have anticipated tens of thousands of potential suspects” in the Operation Pacifier investigation, because the Playpen website had over 200,000 registered users and 1,500 daily visitors); Transcript of Oral Argument at 18, 39, United States v. Tippens, No. CR16-5110RJB (W.D. Wash. Nov. 1, 2016) (noting that the investigation involved compromising 1,432 devices inside the United States and 7,281 devices in 120 other countries and territories).
See Order at 3-6, In re Application of the U.S. for an Order Authorizing the Surreptitious Entry into the Premises of Merchant Servs., No. 99-4061 (D.N.J. June 9, 1999) (search warrant in United States v. Scarfo, 180 F. Supp. 2d 572 (D.N.J. 2001), allowing law enforcement entry to install malware on a suspect’s computer).
See Ellen Nakashima, FBI Paid Professional Hackers One-Time Fee to Crack San Bernardino iPhone, Wash. Post (Apr. 12, 2016), http://www.washingtonpost.com/world/national-security/fbi-paid-professional-hackers-one-time-fee-to-crack-san-bernardino-iphone/2016/04/12/5397814a-00de-11e6-9d36-33d198ea26c5_story.html [http://perma.cc/G6VJ-ACFU] (describing the Apple-FBI dispute); see also In re Order Requiring Apple, Inc. to Assist in the Execution of a Search Warrant Issued by this Court, 149 F. Supp. 3d 341, 375-76 (E.D.N.Y. 2016) (holding that the FBI could not compel Apple to assist with bypassing the lock screen on a suspect’s iPhone).
See Affidavit in Support of Application for Search Warrant, In re Search of Computs. that Access Target E-Mail Accounts, No. 8:13-mj-01745-WGC (D. Md. Oct. 31, 2016) (targeting malware at users who access specific Tor Mail accounts); Affidavit in Support of Application for Search Warrant, In re Search of Computs. that Access the E-Mail Accounts Described in Attachment A, No. 8:13-mj-01746-WGC (D. Md. Oct. 31, 2016) (similar).
See Flash Ignores FF Proxy Settings, Mozilla (Apr. 30, 2010), http://bugzilla.mozilla.org/show_bug.cgi?id=562880 [http://perma.cc/J93J-HQ3H].
See Ken Buckler, Caffsec-malware-analysis, Google Code (Mar. 19, 2016), http://code.google.com/p/caffsec-malware-analysis/source/default/source [http://perma.cc/6X7P-6WFN] (providing the source code and a forensic analysis of the FBI software); Vlad Tsyrklevich, Annotation and Analysis of the Tor Browser Bundle Exploit (Apr. 6, 2014), https://tsyrklevich.net/tbb_payload.txt [http://perma.cc/8WBX-7TB6] (providing additional forensic analysis of the FBI software); see also Crash with Onreadystatechange and Reload, Mozilla (Apr. 3, 2013), https://bugzilla.mozilla.org/show_bug.cgi?id=857883 [http://perma.cc/NMD5-8MW3] (explaining the Firefox vulnerability).
See Kevin Poulsen, Feds Are Suspects in New Malware That Attacks Tor Anonymity, Wired (Aug. 5, 2013), http://www.wired.com/2013/08/freedom-hosting [http://perma.cc/AAQ2-6J54].
See NIT Forensic and Reverse Engineering Report, supra note 51, at 3 (calling the Adobe Flash interfaces Capabilities.os to determine the operating system, Capabilities.cpuArchitecture to determine the processor architecture, and Lib.current.loaderInfo.parameters.id to retrieve a unique session identifier).
See Analysis of the Tor Browser Bundle Exploit Payload, Vlad Tsyrklevich (Apr. 6, 2014), http://tsyrklevich.net/tbb_payload.txt [http://perma.cc/VV8L-DJL7] (calling the Windows Sockets interface gethostname() to determine the computer’s configured name and the interface gethostbyname() to determine the computer’s local IP address, then calling the IP Helper interface SendARP to determine the computer’s MAC address).
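For illustration only, the following Python sketch approximates the kind of host-identifying queries these forensic analyses describe (operating system, processor architecture, computer name, local IP address, and MAC address). It relies on standard-library calls that are merely analogous to the Flash and Windows Sockets interfaces referenced above; it is an assumption-laden approximation, not the government’s actual payload.

```python
# Illustrative sketch only: approximates, using Python standard-library calls,
# the categories of host-identifying information the payloads described above
# reportedly collected. This is not the government's code.
import platform
import socket
import uuid


def collect_host_identifiers() -> dict:
    hostname = socket.gethostname()                # analogous to gethostname()
    try:
        local_ip = socket.gethostbyname(hostname)  # analogous to gethostbyname();
                                                   # may return a loopback address
    except socket.gaierror:
        local_ip = None
    mac = uuid.getnode()                           # MAC address, analogous to the
                                                   # SendARP query; may be randomized
    return {
        "os": platform.system(),                   # analogous to Capabilities.os
        "arch": platform.machine(),                # analogous to Capabilities.cpuArchitecture
        "hostname": hostname,
        "local_ip": local_ip,
        "mac": f"{mac:012x}",
    }


if __name__ == "__main__":
    print(collect_host_identifiers())
```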
See, e.g., Kyllo v. United States, 533 U.S. 27, 34 (2001) (“[I]n the case of the search of the interior of homes—the prototypical and hence most commonly litigated area of protected privacy—there is a ready criterion, with roots deep in the common law, of the minimal expectation of privacy that exists, and that is acknowledged to be reasonable.”); United States v. U.S. Dist. Court (Keith), 407 U.S. 297, 313 (1972) (“[P]hysical entry of the home is the chief evil against which the wording of the Fourth Amendment is directed . . . .”).
See Florida v. Jardines, 569 U.S. 1, 5-12 (2013) (following Jones and applying it to a drug-sniffing dog on residential curtilage); United States v. Jones, 565 U.S. 400, 404-11 (2012) (noting a property-based conception of the Fourth Amendment, and applying it to the attachment of a GPS-tracking device); see also Entick v. Carrington, 19 Howell’s St Trials 1029 (CP 1765) (establishing government liability for trespasses to real and personal property). Whether this trespass test applies to purely electronic searches remains ambiguous. Jones, 565 U.S. at 426 (Alito, J., concurring) (“[T]he Court’s reliance on the law of trespass will present particularly vexing problems in cases involving surveillance that is carried out by making electronic, as opposed to physical, contact . . . .”).
See, e.g., United States v. Thomas, 726 F.3d 1086, 1092-93 (9th Cir. 2013) (suggesting that police dog contact with the outside of a toolbox, combined with a sniff test for the presence of drugs, could constitute a Fourth Amendment search). Similarly, manipulating or retaining a container could constitute sufficient interference with possessory interests to trigger Fourth Amendment seizure protections. E.g., State v. Kelly, 708 P.2d 820, 823-24 (Haw. 1985).
See, e.g., United States v. Andrus, 483 F.3d 711, 718-19 (10th Cir. 2007) (assessing appropriate Fourth Amendment analogies for computer systems, and concluding that “it seems natural that computers should fall into the same category as suitcases, footlockers, or other personal items that command[] a high degree of privacy” (alteration in original) (citation omitted)); see also United States v. Lifshitz, 369 F.3d 173, 190 (2d Cir. 2004) (“Individuals generally possess a reasonable expectation of privacy in their home computers.”); Trulock v. Freeh, 275 F.3d 391, 402-04 (4th Cir. 2001) (analogizing password-protected files on a shared computer to a locked footlocker); Guest v. Leis, 255 F.3d 325, 333 (6th Cir. 2001) (“Home owners would of course have a reasonable expectation of privacy in their homes and in their belongings—including computers—inside the home.”).
See United States v. Burgess, 576 F.3d 1078, 1087-90 (10th Cir. 2009) (suggesting that the automobile search exception to the warrant requirement may not apply to computers); Wertz v. State, 41 N.E.3d 276, 280-82 (Ind. Ct. App. 2015) (holding that the automobile search exception does not apply to electronic devices).
See United States v. Warshak, 631 F.3d 266, 283-88 (6th Cir. 2010) (holding that government access to stored email content is a Fourth Amendment search because, among other reasons, any other rule would be inconsistent with Fourth Amendment protection for real-time interception of email content). Although Warshak is only binding within the Sixth Circuit, a number of courts have cited the opinion with approval. See, e.g., United States v. Ackerman, 831 F.3d 1292, 1306 (10th Cir. 2016); United States v. Graham, 824 F.3d 421, 433 (4th Cir. 2016); United States v. Carpenter, 819 F.3d 880, 887 (6th Cir. 2016); Vista Marketing, LLC v. Burkett, 812 F.3d 954, 969 (11th Cir. 2016). No court has rejected the holding.
In the years following Warshak, the executive branch adopted the position that warrant protections are appropriate for stored content. See ECPA (Part 1): Lawful Access to Stored Content: Hearing Before the Subcomm. on Crime, Terrorism, Homeland Sec., and Investigations of the H. Comm. on the Judiciary, 113th Cong. 20 (2013) (statement of Elana Tyrangiel, Acting Assistant Att’y Gen. of the United States); A Response to Your Petition on ECPA, White House, http://petitions.obamawhitehouse.gov/petition/reform-ecpa-tell-government-get-warrant [http://perma.cc/X322-NF6R]. In 2013, the Department of Justice established a formal policy of obtaining a search warrant before accessing stored communications in criminal investigations. H.R. Rep. No. 114-528, at 9 (2016). At present, nearly every major online service requires a search warrant before disclosing customer content to a law enforcement agency. Nate Cardozo et al., Who Has Your Back? Protecting Your Data from Government Requests, Elec. Frontier Found. 8 (2015), http://www.eff.org/files/2015/06/18/who_has_your_back_2015_protecting_your_data_from_government_requests_20150618.pdf [http://perma.cc/M6GS-6ERL].
See, e.g., Daniel J. Solove, Digital Dossiers and the Dissipation of Fourth Amendment Privacy, 75 S. Cal. L. Rev. 1083, 1086-87 (2002) (“The Court’s current conception of privacy is as a form of total secrecy . . . . Since information maintained by third parties is exposed to others, it is not private, and therefore not protected by the Fourth Amendment.”). Courts and commentators have developed a range of terms for describing these doctrines, including the “third-party doctrine,” “metadata doctrine,” and “public movements doctrine.” Whatever the terminology, the underlying rationales are essentially shared.
See, e.g., United States v. Thompson, No. 15-3313, 2017 WL 3389368, at *4-9 (10th Cir. Aug. 8, 2017) (holding that the third-party doctrine precludes Fourth Amendment protection for retrospective cell-site location information); Graham, 824 F.3d at 424-38 (4th Cir. 2016) (same); Carpenter, 819 F.3d at 886-90 (6th Cir. 2016) (same); United States v. Davis, 785 F.3d 498, 505-18 (11th Cir. 2015) (same); In re Application of the U.S. for Historical Cell Site Data, 724 F.3d 600, 608-15 (5th Cir. 2013) (same); see also United States v. Wallace, 857 F.3d 685 (5th Cir.) (reaching the same conclusion for prospective cell-site location information, though the opinion was withdrawn and replaced because the case actually involved GPS information), withdrawn, 866 F.3d 605 (5th Cir. 2017). But see, e.g., Tracey v. State, 152 So. 3d 504, 511-26 (Fla. 2014) (reaching the opposite conclusion for prospective cell-site location information and reserving judgment on retrospective information). The Supreme Court is reviewing Carpenter in the current term, and, as discussed infra note 122 and accompanying text, is widely expected to find at least some measure of Fourth Amendment protection for geolocation records.
See 18 U.S.C. § 2703(c)(2) (2012) (authorizing law enforcement access to subscriber records and telephone metadata with a grand jury or administrative subpoena); id. § 2703(d) (establishing an intermediate court order for non-content records, including internet communications metadata and device geolocation); id. § 2709 (2012) (granting limited authority for administrative subpoenas, commonly referred to as national security letters, for subscriber records and telephone metadata).
In re Search of Premises Known As: Three Hotmail Email Accounts, No. 16-MJ-8036-DJW, 2016 WL 1239916, at *8 (D. Kan. Mar. 28, 2016) (“[E]very court . . . that has participated in this discussion [of Fourth Amendment protections for electronically stored content] agrees . . . individuals have a right to privacy with respect to email . . . .”); see, e.g., Quon v. Arch Wireless Operating Co., 529 F.3d 892, 906-08 (9th Cir. 2008) (concluding that the Fourth Amendment protects archived text messages); In re Search of Info. Associated with the Facebook Account Identified by the Username Aaron.Alexis that Is Stored at Premises Controlled by Facebook, Inc., 21 F. Supp. 3d 1, 6 (D.D.C. 2013) (assuming that the Fourth Amendment protects private content on a social network).
They do so by issuing a reverse Domain Name System query (which is free) to find the corresponding ISP and then subpoenaing the ISP. See, e.g., United States v. Acevedo-Lemus, No. SACR 15-00137-CJC, 2016 WL 4208436, at *3 (C.D. Cal. Aug. 8, 2016) (denying defendant’s motion to suppress evidence in a case where defendant’s name and address were discovered via a subpoena to Time Warner Cable); United States v. Darby, 190 F. Supp. 3d 520, 529 (E.D. Va. 2016) (subpoena to Verizon).
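A minimal sketch of that reverse Domain Name System step follows. The IP address shown is a reserved documentation address chosen purely for illustration, and in practice investigators may also consult WHOIS registration records; the sketch is not a description of any particular investigative tool.

```python
# Minimal sketch: given an IP address obtained from malware, a reverse DNS (PTR)
# lookup often reveals a hostname within the responsible ISP's domain, which the
# government can then subpoena for subscriber information.
import socket
from typing import Optional


def reverse_dns(ip_address: str) -> Optional[str]:
    try:
        hostname, _aliases, _addresses = socket.gethostbyaddr(ip_address)
        return hostname  # e.g., a hostname ending in the ISP's domain
    except (socket.herror, socket.gaierror):
        return None      # no PTR record published, or lookup failed


if __name__ == "__main__":
    # 198.51.100.7 is a reserved documentation address (TEST-NET-2);
    # it will typically have no PTR record.
    print(reverse_dns("198.51.100.7"))
```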
E.g., United States’ Response in Opposition to Defendant’s Motion To Suppress Evidence at 10-13, United States v. Schuster, No. 1:16-CR-051, 2017 WL 1154088 (S.D. Ohio Sept. 1, 2016) (arguing that there is no Fourth Amendment protection for government malware that reports an IP address); Government’s Opposition to Defendant’s Motion To Suppress Evidence at 8-12, United States v. Acevedo-Lemus, No. SACR 15-00137-CJC, 2016 WL 4208436 (C.D. Cal. July 11, 2016) (similar); see also United States v. Laurita, No. 8:13CR107, 2016 WL 4179365, at *6 (D. Neb. Aug. 5, 2016) (explaining the government’s withdrawal of a stipulation that using malware to obtain an IP address constitutes a Fourth Amendment search). The United States also adopted this position in an Eighth Circuit challenge to evidence arising from the Operation Torpedo investigation. Brief of Appellee at 27-28, United States v. Welch, No. 15-1993 (8th Cir. Aug. 12, 2015). The government’s invocation of this argument has not, however, been uniform. See, e.g., Government’s Reply Brief at 22 n.12, United States v. Horton, No. 16-3976, 2017 U.S. Dist. LEXIS 44757 (8th Cir. Jan. 9, 2017) (“[Defendant] argues . . . that he had a reasonable expectation of privacy in his IP address and the information stored on his computer, but we have not suggested otherwise.”); United States’ Response in Opposition to Defendant’s Supplemental Motion To Suppress at 3 n.2, United States v. Gaver, No. 3:16-CR-88 (S.D. Ohio Dec. 16, 2016) (withdrawing its earlier argument that obtaining an IP address was not a search); see also Orin Kerr, What’s Missing in the Government’s Briefs in the Playpen Warrant Cases, Wash. Post: Volokh Conspiracy (Feb. 20, 2017), http://www.washingtonpost.com/news/volokh-conspiracy/wp/2017/02/20/whats-missing-in-the-governments-briefs-in-the-playpen-warrant-cases [http://perma.cc/W9U5-QGUN] (noting that the United States did not advance this argument in three appeals arising from the Operation Pacifier investigation). The United States also suggested—but then dropped—a similar argument before the Supreme Court in the context of searching mobile phones incident to arrest. Compare Brief for Petitioner at 42, Riley v. California, 134 S. Ct. 2473 (2014) (No. 13-212) (suggesting that a suspect has no reasonable expectation of privacy in his mobile phone’s stored call log), aff’g United States v. Wurie, 728 F.3d 1 (1st Cir. 2013), with Reply Brief for Petitioner at 7, 15-16, Riley, 134 S. Ct. 2473 (No. 13-212) (clarifying that a suspect has a reasonable expectation of privacy in his mobile phone’s stored call log, but that a search of the call log incident to arrest would be “reasonable”).
See Laurita, 2016 WL 4179365, at *3 (describing a super-warrant application for an “Order Authorizing the Surreptitious Installation of Electronic Keyboard Keystroke and Computer Screen Capture Recording Devices To Collect Computer Keyboard Keystrokes and Computer Screen Captures . . .”); see also Memorandum from David Bitkower, Deputy Assistant Att’y Gen., U.S. Dep’t of Justice, Criminal Div., to Judge Reena Raggi, Chair, Advisory Comm. on Criminal Rules, at 9 (Dec. 22, 2014) (on file with author) (noting that the Wiretap Act, which is the statutory implementation of Berger v. New York’s super-warrant doctrine, applies to government malware that intercepts electronic communications).
See Email from [Redacted] to [Redacted] Re: IPAV (May 11, 2006), in 3 Elec. Frontier Found., CIPAV FOIA Release 1, https://www.eff.org/files/filenode/cipav/fbi_cipav-03.pdf [http://perma.cc/9GZR-RD2G] (“I think that you most likely were told that a simple IPAV would be used initially, in which case I would agree with your initial analysis. [Redacted, apparent description of additional hacking steps to provide contrast.] This clearly requires a search and therefore a warrant and/or consent.”); Email from [Redacted] to [Redacted] Re: [Redacted] (Aug. 24, 2005), in 14 Elec. Frontier Found., CIPAV FOIA Release 36, https://www.eff.org/files/filenode/cipav/fbi_cipav-14.pdf [http://perma.cc/7EA7-XRJM] (“I still think that use of [redacted] is consensual monitoring without need for process . . . . That said, I will try to contort my mind into a different position if you still think otherwise.”); Email from [Redacted] to [Redacted] (Aug. 23, 2005), in 3 Elec. Frontier Found., supra, at 44 (acknowledging that whether a search warrant is required “is a hotly debated issue, and as of yet there is no policy guidance issued”); Email from [Redacted] to [Redacted] Re: UCO Proposal (Dec. 8, 2004), in 1 Elec. Frontier Found., CIPAV FOIA Release 4, https://www.eff.org/files/filenode/cipav/fbi_cipav-01.pdf [http://perma.cc/WNX8-EKA8] (“We all know that there are IPAVs and then there are IPAVs. Of course the technique can be used in a manner that would require a court order. We need to know how/when to draw the line for obvious reasons.”); id. at 5 (“I don’t necessarily think a search warrant is needed in all [hacking] cases . . . .”); Email from [Redacted] to [Redacted] Re: UCO Proposal (Dec. 1, 2004), in 1 Elec. Frontier Found., supra, at 9 (“[T]he safest course is to secure a warrant, though one might arguably not be required . . . .”); Email from [Redacted] to [Redacted] Re: IPAVs (Aug. 4, 2004), in 1 Elec. Frontier Found., supra, at 41 (“There is an argument that at least the simplest IPAV is essentially akin to a [redacted] command and that under this principle may be used without a court order.”).
Application and Affidavit for Search Warrant, supra note 4, at 2 n.2 (“In submitting this request, the Government respectfully does not concede that . . . a reasonable expectation of privacy is abridged by the use of this communication technique, or that the use of this technique to collect a computer’s IP address, MAC address or other variables that are broadcast by the computer whenever it is connected to the Internet, constitutes a search or seizure.”).
See Email from [Redacted] to [Redacted] Re: UCO Proposal (Dec. 1, 2004), in 1 Elec. Frontier Found., supra note 93, at 9 (“According to guidance issued by DOJ CCIPS, DOJ has ‘consistently advised AUSAs and agnets [sic] proposing to use IPAVs to obtain a warrant to avoid the exclusion of evidence.’ This opinion is dated March 7, 2002, written by [redacted].”).
U.S. Dep’t of Justice, Online Investigations Working Grp., Online Investigative Principles for Federal Law Enforcement Agents 20 (1999) (noting that “agents must be careful to use [identifying] information-gathering tools only as conventionally permitted and not in a manner unauthorized by the system (as by exploiting design flaws . . . )”).
See Email from [Redacted] to [Redacted] Re: UCO Proposal (Dec. 8, 2004), in 1 Elec. Frontier Found., supra note 93, at 4 (“[I]t is my understanding that there is a disagreement on the status of the IPAV between what FBI/OGC says and what DOJ/CCIPS [sic]. If OGC will set out a policy on this, we will be glad to rely on it.”).
See Email from [Redacted] to [Redacted] Re: IPAV/CIPAV (Nov. 22, 2004), in 1 Elec. Frontier Found., supra note 93, at 24 (“He wants all [special agents] to know that [the Office of the General Counsel] expects a [search warrant] for all IPAV/CIPAV applications (no getting around [the Operational Technology Division] by going to another Division that currently doesn’t follow CCIPS guidance on this point).”).
Courts have not been consistent in their terminology for these practices. See, e.g., United States v. Skinner, 690 F.3d 772, 787 (6th Cir. 2012) (suggesting that “ping” data is cell-site location information and distinct from GPS data); United States v. Caraballo, 963 F. Supp. 2d 341, 346 (D. Vt. 2013), aff’d, 831 F.3d 95 (2d Cir. 2016) (using the same term to reference both cell-site location information and GPS data). Courts have also been spotty on the technical details of these practices. A panel of the Fifth Circuit, for example, recently misunderstood an instance of GPS tracking as an instance of prospective cell-site location information tracking. See United States v. Wallace, 857 F.3d 685 (5th Cir.) (analyzing the surveillance as prospective cell-site location information tracking, but later withdrawing and replacing the decision because the case actually involved GPS information), withdrawn, 866 F.3d 605 (5th Cir. 2017).
The third-party doctrine rationale is even further strained when the government collects location data directly, such as with a “cell-site simulator” device (commonly called an “IMSI catcher” or “Stingray”). See Brian L. Owsley, TriggerFish, StingRays, and Fourth Amendment Fishing Expeditions, 66 Hastings L.J. 183 (2014) (explaining cell-site simulator technology and surveying district court opinions). Given the relative paucity of case law on cell-site simulators—to date, not one federal appellate court has rigorously reviewed the technology—the discussion above emphasizes other mobile-phone tracking techniques. See United States v. Ellis, No. 13-CR-00818 PJH, 2017 WL 3641867, at *1-7 (N.D. Cal. Aug. 24, 2017) (determining that police use of a cell-site simulator is a Fourth Amendment search); United States v. Lambis, 197 F. Supp. 3d 606, 609-11, 614-16 (S.D.N.Y. 2016) (same); Jones v. United States, 168 A.3d 703, 711-13 (D.C. 2017) (same); State v. Andrews, 134 A.3d 324, 339-52 (Md. Ct. Spec. App. 2016) (same); see also United States v. Patrick, 842 F.3d 540, 543-44 (7th Cir. 2016) (noting a split in authority on the issue).
Id. at 778. While I have many reservations about the Skinner opinion, I find this part particularly objectionable, since it has the law backward. Making a privacy-protecting choice increases a person’s Fourth Amendment protection (i.e., his or her reasonable expectation of privacy). Electing to have a conversation indoors, for instance, results in higher privacy safeguards than holding the same conversation in public.
When seeking email metadata prospectively, law enforcement officers much more commonly serve a pen/trap order on the suspect’s email service (e.g., Google) and receive a real-time feed in response. Since this form of email surveillance does not implicate competing Fourth Amendment perspectives, I focus solely on ISP-based email surveillance. The ISP-based approach to email surveillance has also become less common owing to the adoption of secure email transfer and mobile devices.
See Steven M. Bellovin et al., It’s Too Complicated: How the Internet Upends Katz, Smith, and Electronic Surveillance Law, 30 Harv. J.L. & Tech. 1 (2016) (describing multiple possible approaches to the content/metadata distinction); Orin Kerr, Relative vs. Absolute Approaches to the Content/Metadata Line, Lawfare (Aug. 25, 2016, 4:18 PM), http://www.lawfareblog.com/relative-vs-absolute-approaches-contentmetadata-line [http://perma.cc/K2Q7-S2BW].
See, e.g., Protecting the Privacy of Customers of Broadband and Other Telecommunications Services, 31 FCC Rcd. 13911 ¶¶ 76, 181-83, 192-93 (2016), nullified by S.J. Res. 34, 115th Cong. (2017); see also Peter Whoriskey, Internet Provider Halts Plan To Track, Sell Users’ Surfing Data, Wash. Post (June 25, 2008), http://www.washingtonpost.com/wp-dyn/content/article/2008/06/24/AR2008062401033.html [http://perma.cc/2TNT-X79G] (describing one cable operator’s foray into deep packet inspection and the criticism that followed).
See United States v. Forrester, 512 F.3d 500, 509-11 (9th Cir. 2008) (email and IP metadata); In re Application of the United States for an Order, 396 F. Supp. 2d 45, 48 (D. Mass. 2005) (web and IP metadata); United States v. Allen, 53 M.J. 402, 409 (C.A.A.F. 2000) (web metadata); [Redacted], No. PR/TT [Redacted], at 58-62 (FISA Ct. [date redacted]) (email metadata); see also In re Certified Question of Law, No. FISCR 16-01, at 32 (FISA Ct. Rev. Apr. 14, 2016) (analyzing government collection of post-cut-through digits obtained from a telephone carrier). The analysis in the main text is about whether constitutional protections for metadata depend on the government’s vantage point when conducting surveillance. That analysis presumes that the metadata is functioning solely as metadata (e.g., for routing communications). A related—but distinct—question is whether the metadata and content categories are mutually exclusive, or whether metadata can function both as routing information and as content. On this question, courts have generally held that the metadata and content categories are not mutually exclusive. See In re Google Inc. Cookie Placement Consumer Privacy Litig., 806 F.3d 125, 135-39 (3d Cir. 2015) (noting that the distinction between content and metadata is contextual and reviewing opinions).
See, e.g., Glenn v. State, No. S17A0858, 2017 WL 4582629, at *5 (Ga. Oct. 16, 2017); State v. Green, 164 So. 3d 331, 344 (La. Ct. App. 2015); infra notes 128-131 (additional cases). But see State v. Moore, No. 2014-001669, 2017 WL 3723327, at *6-7 (S.C. Ct. App. Aug. 30, 2017) (Lockemy, C.J., concurring in part and dissenting in part) (recognizing in a divided panel decision that police removal and analysis of a mobile phone SIM card constitutes a Fourth Amendment search).
See United States v. Turner, 839 F.3d 429 (5th Cir. 2016); United States v. DE L’Isle, 825 F.3d 426 (8th Cir. 2016); United States v. Bah, 794 F.3d 617 (6th Cir. 2015); United States v. Alabi, 943 F. Supp. 2d 1201 (D.N.M. 2013), aff’d, 597 F. App’x 991 (10th Cir. 2015); United States v. Medina, No. 09-20717-CR, 2009 WL 3669636 (S.D. Fla. Oct. 24, 2009) (Torres, Mag. J.), adopted in part and rejected in part sub nom. United States v. Duarte, 2009 WL 3669537 (S.D. Fla. Nov. 4, 2009) (excluding the magstripe evidence on other grounds).
See State v. Hill, 789 S.E.2d 317 (Ga. Ct. App. 2016); see also Cyrus Farivar, Crook Who Left His Phone at the Scene Has “No Reasonable Expectation of Privacy,” Ars Technica (June 23, 2016, 3:24 PM), http://arstechnica.com/tech-policy/2016/06/crook-who-left-his-phone-at-the-scene-has-no-reasonable-expectation-of-privacy [http://perma.cc/8FAS-BQQT] (providing a partial transcript of the bench ruling in United States v. Muller, No. 2:15-cr-00205-TLN (E.D. Cal. June 23, 2016)).
See United States v. Jones, 565 U.S. 400, 416 (2012) (Sotomayor, J., concurring) (“I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on. I do not regard as dispositive the fact that the Government might obtain the fruits of GPS monitoring through lawful conventional surveillance techniques.”); id. at 964 (Alito, J., concurring in the judgment) (“I conclude that the lengthy monitoring that occurred in this case constituted a search under the Fourth Amendment.”).
This version of the third-party doctrine argument is slightly different from the version that plausibly permits warrantless mobile phone location tracking. This version centers on whether the government obtains data via a business that consensually carries the data as a third-party service provider. The version that applies to mobile phone tracking, by comparison, turns on whether the government obtains data via a technical capability that is consensually made available to a third-party service provider. The former argument is fairly similar to how the third-party doctrine usually applies to communications providers, while the latter argument is most similar to cases involving file sharing networks (where the suspect has intentionally enabled certain third-party remote access to their device) or law enforcement deception to gain entry onto private property (here the deception is the police posing as the wireless carrier and the private property is the mobile phone functionality).
To be technically precise, the government may learn a suspect Tor user’s IP address from communications metadata transmitted by his computer and through his ISP to the government, rather than from directly querying a software interface on the computer and reporting the results. But, in either technical design, government malware is the only reason why the communications metadata emanates from the suspect’s computer.
See, e.g., Illinois v. Caballes, 543 U.S. 405, 408-10 (2005) (sniff test by a trained narcotics dog during a vehicle stop is not a Fourth Amendment search); United States v. Jacobsen, 466 U.S. 109, 122-26 (1984) (narcotics field test on powder is not a Fourth Amendment search); United States v. Place, 462 U.S. 696, 706-07 (1983) (sniff test by a trained narcotics dog at an airport is not a Fourth Amendment search).
E.g., California v. Hodari D., 499 U.S. 621, 628-29 (1991) (rock of cocaine dropped while fleeing police pursuit); California v. Greenwood, 486 U.S. 35, 39-44 (1988) (garbage placed outside the home); Hester v. United States, 265 U.S. 57, 58-59 (1924) (illegal whiskey bottle abandoned during police pursuit); see also Farivar, supra note 115.
Riley v. California, 134 S. Ct. 2473, 2489-91 (2014) (comparing electronic device searches to physical searches and concluding that the former implicate substantially greater privacy interests); see also Jennifer Granick, SCOTUS & Cell Phone Searches: Digital Is Different, Just Security (June 25, 2014), https://www.justsecurity.org/12219/scotus-cell-phone-searches-digital [http://perma.cc/94RH-42EV] (arguing that Riley stands for a Fourth Amendment principle of greater protection for electronic information); Orin Kerr, The Significance of Riley, Wash. Post: Volokh Conspiracy (June 25, 2014), https://www.washingtonpost.com/news/volokh-conspiracy/wp/2014/06/25/the-significance-of-riley [http://perma.cc/Y3G3-W4YB] (providing a similar analysis).
See Florida v. Jimeno, 500 U.S. 248, 250-52 (1991) (describing the scope of Fourth Amendment consent). Several appellate courts have recognized a “consent once removed” exception to the Fourth Amendment warrant requirement, where initial consent to a law enforcement entry transfers to immediately subsequent law enforcement entry. See Callahan v. Millard County, 494 F.3d 891, 895-98 (10th Cir. 2007), rev’d on other grounds sub nom. Pearson v. Callahan, 555 U.S. 223 (2009); United States v. Bramble, 103 F.3d 1475, 1478 (9th Cir. 1996); United States v. Akinsanya, 53 F.3d 852, 855-56 (7th Cir. 1995). Whatever the vitality of the consent-once-removed doctrine, it does not provide a basis for warrantless law enforcement hacking. First, a suspect’s consent once removed has the same scope as the suspect’s initial consent. See, e.g., Bramble, 103 F.3d at 1478-79 (“When entering pursuant to the suspect’s ‘consent once removed,’ the additional backup officers are restricted to the scope of the consent originally given. Our holding does not authorize police to go beyond those areas consented to or to conduct general searches without first satisfying the ordinary requirements of consent, a warrant, or exigent circumstances which excuse the failure to obtain a warrant.” (citations omitted)). Second, the consent-once-removed exception only allows for a subsequent entry to effectuate a warrantless arrest; any additional warrantless search or seizure must be justified by a separate exception to the Fourth Amendment’s warrant requirement. See, e.g., State v. Henry, 627 A.2d 125, 132 (N.J. 1993) (applying the search incident to arrest and protective sweep exceptions).
See Email from [Redacted] to [Redacted] (Dec. 8, 2004), in 1 Elec. Frontier Found., supra note 93, at 5 (“Until a policy or directive is put in place, [the Data Intercept Technology Unit] has and will support any case that obtains a search warrant. Over the last six months it has not proven to be an obstacle to investigations.”).
See Paul Ohm, Probably Probable Cause: The Diminishing Importance of Justification Standards, 94 Minn. L. Rev. 1514, 1535-42 (2010) (arguing that probable cause develops early in online investigations); see also infra Section II.B (discussing the application of Fourth Amendment requirements to searches where a particular computer cannot be identified in advance).
An easy technical implementation of this investigative technique would be to generate a set of signatures for known contraband files, then check each file on a hacked device for whether the signature matches. A family of mathematical algorithms dubbed “cryptographic hashing” enables quickly computing these file signatures. See Richard P. Salgado, Fourth Amendment Search and the Power of the Hash, 119 Harv. L. Rev. F. 38, 43-46 (2006) (arguing that an examination of a seized device for file hashes that match known contraband would not constitute a Fourth Amendment search); see also Note, Data Mining, Dog Sniffs, and the Fourth Amendment, 128 Harv. L. Rev. 691, 705-12 (2014) (discussing a hypothetical “crime-sniffing algorithm”).
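To make the mechanics concrete, the following is a minimal Python sketch of the hash-matching approach described above. It assumes SHA-256 as the hash function and uses a placeholder signature set; the file paths, hash values, and function names are illustrative only and are not drawn from any actual law enforcement tool.

import hashlib
from pathlib import Path

# Hypothetical signature set: SHA-256 digests of known contraband files.
# The value below is only a placeholder (the digest of an empty file) so
# that the sketch runs; a real list would be supplied by investigators.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matching_files(root: Path) -> list[Path]:
    """Return every file under root whose digest appears in KNOWN_HASHES."""
    return [
        p for p in root.rglob("*")
        if p.is_file() and sha256_of_file(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for hit in matching_files(Path(".")):
        print(hit)

Because only fixed-length digests are compared against the signature set, the technique can in principle flag known files without exposing the contents of non-matching files, which is the feature driving the dog-sniff analogy in the authorities cited above.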
See Payton v. New York, 445 U.S. 573, 586 (1980) (noting that the “physical entry of the home is the chief evil against which the wording of the Fourth Amendment is directed”) (quoting United States v. U.S. Dist. Court, 407 U.S. 297, 313 (1972)). The government has previously applied for hacking warrants where the delivery stage involves entry into the suspect’s home or private office. E.g., United States v. Laurita, No. 8:13CR107, 2016 WL 4179365, at *3 (D. Neb. Aug. 5, 2016) (describing a search warrant that authorized entering a suspect’s home and installing malware); Order at 3-6, In re Application of the U.S. for an Order Authorizing the Surreptitious Entry into the Premises of Merchant Servs., No. 99-4061 (D.N.J. June 9, 1999) (search warrant allowing surreptitious entry to a suspect’s private office to install malware).
See, e.g., United States v. Contreras-Ceballos, 999 F.2d 432, 434-35 (9th Cir. 1993) (reviewing doctrine permitting law enforcement deception to gain entry to a suspect’s home). But see United States v. Phua, 100 F. Supp. 3d 1040, 1047-52 (D. Nev. 2015) (describing limits on the government’s ability to use deception, including where the government manufactures an exigency).
Continuing with the mail analogy, while phishing and watering hole delivery techniques are akin to a suspect consensually accepting a deceptive parcel sent by the government, the exploitation and execution steps are more analogous to a robot nonconsensually slipping out of the parcel and roving about the suspect’s home.
See Jonathan Mayer, The “Narrow” Interpretation of the Computer Fraud and Abuse Act: A User Guide for Applying United States v. Nosal, 84 Geo. Wash. L. Rev. 1644, 1645-46, 1656-57 (2016) (synthesizing substantive standards for authorization to access a computer system or information). The Computer Fraud and Abuse Act explicitly excepts law enforcement investigations from its regulatory scheme. 18 U.S.C. § 1030(f) (2012) (“This section does not prohibit any lawfully authorized investigative, protective, or intelligence activity of a law enforcement agency of the United States, a State, or a political subdivision of a State, or of an intelligence agency of the United States.”).
When the government remotely probes a business server, much more difficult line-drawing questions can arise. While investigating an online black market, for instance, federal investigators may have engaged in borderline hacking conduct. See Nik Cubrilovic, Analyzing the FBI’s Explanation of How They Located Silk Road, New Web Ord. (Sept. 7, 2014), https://www.nikcub.com/posts/analyzing-fbi-explanation-silk-road [http://perma.cc/E3K3 -4XPX] (collecting and technically analyzing government filings associated with locating the black market server). Some remote investigative practices involving business servers do remain easily identifiable as searches, such as entering a user’s cloud service account without their permission. See Memorandum from David Bitkower, supra note 92, at 5 (offering a government hacking scenario where investigators cannot serve a Stored Communications Act warrant on a service provider, so they log into the suspect’s account themselves).
Courts have, for instance, consistently concluded that government investigators may explore public file-sharing services without triggering Fourth Amendment safeguards. See, e.g., United States v. Hill, 750 F.3d 982, 986 (8th Cir. 2014); United States v. Ganoe, 538 F.3d 1117, 1127 (9th Cir. 2008); United States v. Perrine, 518 F.3d 1196, 1205 (10th Cir. 2008); see also United States v. King, 509 F.3d 1338, 1341-42 (11th Cir. 2007) (finding no search when the government monitored files that the defendant made available over a shared residential network on a military base). An early interagency report on computer searches also recognized a distinction between publicly and privately advertised data. U.S. Dep’t of Justice, Online Investigations Working Grp., supra note 96, at 20 (suggesting that using the public Unix “finger” command to collect identifying information would not constitute a search, but “exploiting design flaws” to collect the same information would be a search).
See United States v. Ahrndt, No. 3:08-CR-00468-KI, 2013 WL 179326, at *6-8 (D. Or. Jan. 17, 2013) (invalidating the warrantless search of an unprotected wireless network). In some scenarios, the police may be able to obtain consent from a person with authorized access to the private network. See United States v. Sawyer, 786 F. Supp. 2d 1352, 1355-57 (N.D. Ohio 2011).
See Advisory Comm. on Rules of Criminal Procedure, Criminal Rules Meeting, Jud. Conf. U.S. 10 (Apr. 7-8, 2014), http://www.uscourts.gov/rules-policies/archives/meeting-minutes/advisory-committee-rules-criminal-procedure-april-2014 [http://perma.cc/7GNJ-MFRP]; Email from [Redacted] to [Redacted] (July 2, 2007, 10:52 AM), in 8 Elec. Frontier Found., CIPAV FOIA Release 29, https://www.eff.org/files/filenode/cipav/fbi_cipav-08.pdf [http://perma.cc/8TDK-C2W5] (“It is just not well settled in the law that we can rely on the trespasser exception to the search requirement.”).
See 18 U.S.C. §§ 2510(21) & 2511(2)(i) (2012); Comput. Crime & Intellectual Prop. Section, Criminal Div., Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations, U.S. Dep’t Just. 177-79 (2009), https://www.justice.gov/sites/default/files /criminal-ccips/legacy/2015/01/14/ssmanual2009.pdf [http://perma.cc/3ZLW-QYQJ] (explaining the computer trespasser exception to the Wiretap Act).
Cf. Sam Zeitlin, Note, Botnet Takedowns and the Fourth Amendment, 90 N.Y.U. L. Rev. 746, 770-77 (2015) (concluding that if the government obtains information from a compromised “zombie” computer that is part of a “botnet,” it is conducting a Fourth Amendment search). If anything, it seems reasonably likely that a hacker would have installed additional security precautions, developing a stronger argument for a reasonable expectation of privacy.
A narrower trespasser exception, allowing the government solely to take steps to prevent an attack or identify the perpetrator, would mitigate this concern. But a narrower exception would have no doctrinal basis, and would require courts to rigorously parse and review each and every category of information that the government obtained.
See United States v. Stanley, 753 F.3d 114, 119-24 (3d Cir. 2014) (holding that a network trespasser does not have a Fourth Amendment interest in his or her network configuration as exposed to the victim’s network, but rejecting the argument that all information associated with the trespasser is exempt from protection).
As with all Fourth Amendment searches, exigent circumstances may excuse the warrant requirement. The Supreme Court has noted that these are highly fact-specific determinations, and require extraordinary justification. See Riley, 134 S. Ct. at 2494 (listing bomb detonation and child abduction as hypothetical exigencies where a warrantless mobile phone search might be permissible). The rationale most likely to be applicable to computer trespassers is destruction of evidence, since most electronic attacks do not implicate human life or safety. See Kentucky v. King, 563 U.S. 452, 460 (2011) (noting various exigencies that excuse a search warrant). In order for that justification to apply, though, investigators must have reasonable grounds to believe that the hacker is destroying evidence in his or her own computer system. See Ker v. California, 374 U.S. 23, 57 (1963) (Brennan, J., concurring) (“Our decisions in related contexts have held that ambiguous conduct cannot form the basis for a belief of the officers that an escape or the destruction of evidence is being attempted.”). While it is conceivable that some hackers will satisfy this standard—for instance, by issuing specific taunts—investigators will rarely have sufficient indicia that a hacker plans to purge data from his or her own computer. See United States v. Gorshkov, No. CR00-550C, 2001 WL 1024026, at *4 (W.D. Wash. May 23, 2001) (concluding in dicta that an exigency justified the warrantless remote copying of a hacking group’s data, because the FBI had just arrested two of the hackers and co-conspirators could delete the data). What’s more, even if an exigency justifies warrantless remote copying of data, investigators will usually have time to obtain a warrant authorizing examination of the data. See id. (noting that investigators obtained a warrant before examining the remotely-seized hacker data).
See, e.g., United States v. Winn, 79 F. Supp. 3d 904, 918-22 (S.D. Ill. 2015) (invalidating a smartphone search warrant that covered “any or all files contained on said phone” as insufficiently particularized); In re [Redacted]@gmail.com, 62 F. Supp. 3d 1100, 1104 (N.D. Cal. 2014) (suggesting that, at a minimum, the government must identify date restrictions and commit to returning or destroying nonresponsive evidence in a cloud service search); In re Search of Info. Associated with [Redacted]@mac.com, 25 F. Supp. 3d 1, 7-9 (D.D.C. 2014) (calling for online services to prescreen information made available to the government, according to specific times, keywords, parties, or other filtering criteria).
See, e.g., In re a Warrant for xxxxxxx@gmail.com, 33 F. Supp. 3d 386, 396-401 (S.D.N.Y. 2014) (holding that, in general, ex ante protocols for data searches are not required to satisfy the Fourth Amendment’s particularity standard).
Anticipatory warrants offer a comprehensive and coherent constitutional basis for identification malware. There are, to be sure, related lines of doctrine that could also be used to justify identification-malware warrants. Courts have long permitted location-tracking warrants, even though at the time of issuance officers do not know where the suspect will travel. See United States v. Karo, 468 U.S. 705, 718 (1984). More recently, courts have allowed DNA-based “John Doe” arrest warrants, where officers do not know the suspect’s identity at the time of issuance. See People v. Robinson, 224 P.3d 55, 71-76 (Cal. 2010).
See United States v. Grubbs, 547 U.S. 90, 96 (2006) (“Anticipatory warrants are, therefore, no different in principle from ordinary warrants. They require the magistrate to determine (1) that it is now probable that (2) contraband, evidence of a crime, or a fugitive will be on the described premises (3) when the warrant is executed.”); United States v. Garcia, 882 F.2d 699, 702-04 (2d Cir. 1989) (reviewing the doctrine, policy, and precedent that support anticipatory warrants).
See Grubbs, 547 U.S. at 96-97 (explaining that an anticipatory warrant requires probable cause both with respect to the triggering condition occurring and to finding evidence once the triggering condition is satisfied). In a controlled-delivery scenario, probable cause with respect to the triggering condition is easily satisfied—packages are usually delivered to their intended destination and recipient.
In the usual controlled-delivery fact pattern, investigators at least know the intended recipient and destination for a package. With a malware search, by contrast, officers cannot provide a name or address in advance. While that sort of ex ante ambiguity is rare in a controlled delivery, it has come up, and courts have sustained anticipatory warrants for unspecified addresses and individuals. See People v. Bui, 885 N.E.2d 506, 517-22 (Ill. App. Ct. 2008) (sustaining a controlled-delivery search warrant for “any other location” where a package was taken); State v. Morris, 668 P.2d 857, 861-63 (Alaska Ct. App. 1983) (sustaining a controlled-delivery search warrant for “whoever picks up said package” and “wherever the described package is taken”).
See, e.g., United States v. Bianco, 998 F.2d 1112, 1122-25 (2d Cir. 1993) (roving bug); United States v. Petti, 973 F.2d 1441, 1443-45 (9th Cir. 1992) (roving wiretap); see also Grubbs, 547 U.S. at 97 (“The Fourth Amendment, however, does not set forth some general ‘particularity requirement.’ It specifies only two matters that must be ‘particularly describ[ed]’ in the warrant: ‘the place to be searched’ and ‘the persons or things to be seized.’”).
See, e.g., United States v. Rigmaiden, No. CR 08-814-PHX-DGC, 2013 WL 1932800, at *18-19 (D. Ariz. May 8, 2013) (explaining that a warrant authorizing electronic surveillance need not be a “model of clarity,” and need only satisfy the Fourth Amendment’s basic requirements of a neutral and disinterested magistrate, probable cause, and particularity). As a matter of policy, greater clarity in warrant documentation is certainly preferable, but not a constitutional requirement.
18 U.S.C. §§ 2251, 2252, 2252A (2012); United States v. Williams, 553 U.S. 285, 292-304 (2008) (holding that an offer to provide or request to receive child pornography is categorically unprotected by the First Amendment); New York v. Ferber, 458 U.S. 747, 753-74 (1982) (holding that the distribution of child pornography is categorically unprotected by the First Amendment).
Affidavit of FBI Special Agent John Robertson in Support of Application for a Search Warrant at 12, No. 1:15-mj-00534-VVP (E.D.N.Y. June 10, 2015) (describing a warrant that authorized malware delivery from a seized child pornography website “each time any user or administrator logged [in]”); Application for “Bulletin Board A” Search Warrant, supra note 41, at 30 (“I request authority to use the NIT to investigate: . . . (2) any user who sends or views a private message on ‘Bulletin Board A’ during the period of this authorization.”).
The previous result under Rule 41 was that when the government knew which computer it was hacking, investigators would apply to a magistrate judge in the district where the computer was located. When the government was hacking a computer in a terrorism-related case, a magistrate judge in any district in which activities related to the terrorism may have occurred would suffice. But when the government wanted to hack a computer and did not know where the computer was located (e.g., when investigating a Tor user), a substantial majority of lower courts rightly concluded that there was no exceptional Rule 41 venue provision—textually or in principle. See, e.g., United States v. Adams, No. 6:16-CR-11-ORL-40GJK, 2016 WL 4212079, at *10 (M.D. Fla. Aug. 10, 2016); United States v. Werdene, No. 15-434, 2016 WL 3002376, at *11 (E.D. Pa. May 18, 2016). But see United States v. Laurita, No. 8:13CR107, 2016 WL 4179365, at *7 (D. Neb. Aug. 5, 2016); United States v. Eure, No. 2:16CR43, 2016 WL 4059663, at *4 (E.D. Va. July 28, 2016). There was a plausible argument that district court judges retained authority to issue these types of warrants under 18 U.S.C. § 3103, regardless of Rule 41.
Fed. R. Crim. P. 41(b)(6). See Daskal, supra note 29, at 355-59 (reviewing the proposed amendments); Lerner, supra note 28 (similar); see also Ghappour, supra note 28, at 1080-81 (criticizing the proposed amendments). The discussion above centers on Fed. R. Crim. P. 41(b)(6)(A), because it resolved an outstanding and difficult venue issue in malware-based investigations. The new amendment also added Fed. R. Crim. P. 41(b)(6)(B), which streamlines the warrant process for remotely accessing compromised devices in multi-district investigations.
The extra-territoriality provision for terrorism investigations still applies to law enforcement hacking. If the government is investigating “domestic terrorism or international terrorism,” it can apply for a hacking warrant in “any district in which activities related to the terrorism may have occurred.” Fed. R. Crim. P. 41(b)(3).
See United States v. Krueger, 809 F.3d 1109, 1117-26 (10th Cir. 2015) (Gorsuch, J., concurring) (detailing how the Federal Magistrates Act imposes warrant venue provisions); see also United States v. Arterbury, No. 15-CR-182-JHP, 2016 U.S. Dist. LEXIS 67091, at *7-8 (N.D. Okla. Apr. 25, 2016) (concluding that a magistrate’s issuance of a warrant to hack Tor users violated the Federal Magistrates Act).
Relatedly, when magistrate judges received new authority to preside outside their districts, that authority was also conferred by legislation. The statutory authority for magistrates to operate “at other places where that court may function” was added in 2005 in response to the displacement of federal courts following Hurricane Katrina. See Krueger, 809 F.3d at 1121 (Gorsuch, J., concurring).
Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001 (USA PATRIOT Act), Pub. L. No. 107-56, § 219, 115 Stat. 272, 291. While the provision addressing extra-district search warrants is framed as an amendment to Rule 41, it is nevertheless a congressional enactment.
District court judges are authorized to issue search warrants under 18 U.S.C. § 3102 (2012), and the Federal Rules of Criminal Procedure expressly provide district court judges with all of the powers of magistrate judges. Fed. R. Crim. P. 1(c). Courts have also understood that Rule 41 regulates warrants issued by district court judges, even though the text of the rule references magistrates. See, e.g., United States v. Golson, 743 F.3d 44, 51-53 (3d Cir. 2014); United States v. Glover, 736 F.3d 509, 515 (D.C. Cir. 2013).
See, e.g., United States v. Katzin, 732 F.3d 187, 198 (3d Cir. 2013) (“We thus have no hesitation in holding that the police must obtain a warrant prior to attaching a GPS device on a vehicle, thereby undertaking a search that the Supreme Court has compared to ‘a constable’s concealing himself in the target’s coach in order to track its movements.’” (quoting United States v. Jones, 565 U.S. 400, 406 n.3 (2012))).
See, e.g., Silverman v. United States, 365 U.S. 505, 511-12 (1961) (“This Court has never held that a federal officer may without warrant and without consent physically entrench into a man’s office or home, there secretly observe or listen, and relate at the man’s subsequent criminal trial what was seen or heard.”).
Compare 18 U.S.C. § 3122 (2012) (requiring only a self-certification of relevance to substantiate a pen/trap order), with U.S. Const. amend. IV (“[N]o Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”).
Courts have, in the past, authorized deviations from Rule 41’s time limit for subsequent forensic examination of seized computer data. See United States v. Kernell, No. 3:08-CR-142, 2010 U.S. Dist. LEXIS 32845, at *38-43 (E.D. Tenn. Mar. 31, 2010) (explaining the issue and collecting cases). That fact pattern is very different from government hacking, of course: police have long been authorized to inspect evidence after seizure. See, e.g., United States v. Tillotson, No. 2:08-CR-33, 2008 U.S. Dist. LEXIS 120701, at *14 (E.D. Tenn. Nov. 13, 2008) (“The subsequent analysis of the computer’s contents is not a search in the sense contemplated by Rule 41 . . . .”). And, at any rate, Rule 41 was explicitly amended to address the timing of post-seizure forensic examinations. Fed. R. Crim. P. 41(e)(2)(B) (clarifying that the Rule 41 time limits apply to “the seizure or on-site copying of the media or information, and not to any later off-site copying or review”).
Affidavit in Support of Application for a Search Warrant, supra note 42; Affidavit in Support of Application for Search Warrant, In re Search of Computs. that Access Target E-Mail Accounts, supra note 47; Affidavit in Support of Application for Search Warrant, In re Search of Computs. that Access the E-Mail Accounts Described in Attachment A, supra note 47.
See Email from [Redacted] to [Redacted] Re: [Redacted] (Nov. 20, 2006), in 8 Elec. Frontier Found., CIPAV FOIA Release, supra note 167, at 154, 154 (discussing how to maximize the duration of malware operation under one court order). The tracking device statute, 18 U.S.C. § 3117 (2012), empowers courts to issue warrants for “tracking devices”; its implementation in Rule 41(e)(2)(C) specifies a maximum of ten days for installation and forty-five days for operation. The DOJ has consistently argued that these tracking device provisions do not cover purely electronic location-tracking techniques, in a bid to avoid a warrant requirement for mobile phone location tracking. See, e.g., In re Application of the U.S. for an Order, 411 F. Supp. 2d 678, 681 (W.D. La. 2006). It would be incongruous for the DOJ to reverse that critical argument after a decade—and solely to extend a renewal clock in hacking cases. Moreover, identification malware does not itself locate a device in any conventional sense. Rather, it gives the government sufficient network and device configuration information to determine the owner’s identity through follow-up investigation. And even if some of the functionality of government malware could be characterized as a tracking device, much of the functionality could not. Only a subset of the malware’s operation would be covered by the longer time limit.
See Email from [Redacted] to [Redacted] Re: CIPAV Court Orders (Nov. 21, 2006), in 8 Elec. Frontier Found., CIPAV FOIA Release, supra note 167, at 149 (“One comment that has come in from my unit re the draft orders that should be forwarded to AUSA [redacted] is that he should also cite to the All Writs Act . . . .”). Courts invoke the All Writs Act, 28 U.S.C. § 1651, to compel third-party assistance with warrant execution. That includes assistance with ongoing electronic surveillance. See United States v. N.Y. Tel. Co., 434 U.S. 159, 171-78 (1977) (sustaining the use of a warrant, in conjunction with the All Writs Act, to compel a telephone company to prospectively provide call records). But the All Writs Act bears only on third-party assistance with an electronic search, not on the ongoing nature of the search itself. And even if there were any prospective search authority under the All Writs Act, it would be displaced by the more specific time limits imposed by Rule 41. See Pa. Bureau of Correction v. U.S. Marshals Serv., 474 U.S. 34, 40-43 (1985) (emphasizing that the All Writs Act is a “residual source of authority” that is overridden by more specific provisions).
Fed. R. Crim. P. 41(e)(2)(C) (setting out time limits for installation and operation of a location-tracking device pursuant to a warrant). Before federal and state rules were amended to address tracking devices, the ordinary law enforcement practice was to obtain a series of time-limited warrants (if they obtained warrants at all). See, e.g., State v. Jackson, 76 P.3d 217, 220-21 (Wash. 2003) (describing a ten-day tracking device warrant, followed by a second ten-day warrant).
See Jonathan Witmer-Rich, The Rapid Rise of Delayed Notice Searches, and the Fourth Amendment “Rule Requiring Notice,” 41 Pepp. L. Rev. 509, 561-70 (2014) (reviewing an unbroken history of search notice requirements); see also Akhil Reed Amar, Fourth Amendment First Principles, 107 Harv. L. Rev. 757, 802-03 (1994) (describing ex post notice as a central feature of Fourth Amendment warrants).
See, e.g., United States v. Freitas, 800 F.2d 1451, 1456 (9th Cir. 1986) (holding that, if a court issues a warrant for surreptitious entry of a home, the Fourth Amendment mandates ex post notice with minimum delay); see also Berger v. New York, 388 U.S. 41, 60 (1967) (describing notice as a “requirement” for “conventional warrants”).
See, e.g., United States v. Pangburn, 983 F.2d 449, 449-50 (2d Cir. 1993) (“Although we have required that seven days notice be given after covert entries for which search without physical seizure has been authorized, that notice requirement is grounded in [Rule 41] and is not compelled by the Constitution.”).
18 U.S.C. § 2518(8)(d) (2012) (requiring actual service of notice within ninety days of a wiretap’s conclusion); 18 U.S.C. § 3103a (2012) (granting general authority for delayed-notice search and seizure warrants); Fed. R. Crim. P. 41(f)(3) (permitting issuance of delayed-notice warrants where authorized by statute).
See Letter from William E. Moschella, Assistant Att’y Gen., to J. Dennis Hastert, Speaker, U.S. House of Representatives 7 (July 25, 2003), https://cdt.org/files/security/usapatriot /030725doj.pdf [http://perma.cc/XUE8-QS8K] (explaining that the delayed-notice search statute “requires law enforcement to give notice that a search warrant has been executed in all circumstances”); cf. In re Grand Jury Subpoena for [Redacted]@yahoo.com, No. 5:15-cr-90096-PSG, 2015 U.S. Dist. LEXIS 17379, at *1-2 (N.D. Cal. Feb. 5, 2015) (rejecting an indefinite gag order for electronic-data warrants and subpoenas).
The Stored Communications Act (SCA) does not statutorily require notice to a subscriber after the government executes a search warrant for content stored with a service provider. 18 U.S.C. § 2703(b)(1)(A) (2012). Courts disagree on whether the SCA expressly eliminates any notice requirement, or merely defers to the notice provisions of Rule 41. Compare United States v. Scully, 108 F. Supp. 3d 59, 84-85 (E.D.N.Y. 2015) (concluding that the SCA mandates notice only where investigators have not obtained a warrant), with In re Application of the U.S. for an Order, 665 F. Supp. 2d 1210, 1216-21 (D. Or. 2009) (holding that the SCA incorporates Rule 41, including its notice provisions). Furthermore, at the time the SCA was enacted, Congress (and the courts) believed that content stored with a third-party business was often exempt from Fourth Amendment protection. See United States v. Warshak, 631 F.3d 266, 288 (6th Cir. 2010) (concluding that the SCA violates the Fourth Amendment by not imposing a warrant requirement for content privately stored with third-party services). Based on a modern understanding, then, a warrant for content stored with a service provider must satisfy the notice requirements of the Fourth Amendment (to the extent they exist) and Rule 41 (to the extent they are not uniquely abrogated by the SCA). These notice requirements are both satisfied because the warrant is executed via a third party. See In re Application of the U.S. for an Order, 665 F. Supp. 2d at 1221-22 (holding that a warrant for stored content, executed via a third-party service provider, satisfies Rule 41’s notice requirements); id. at 1222-24 (same for Fourth Amendment’s notice requirement). Microsoft brought a Fourth Amendment challenge to the DOJ policy against notifying suspects whose stored content is searched. See Complaint for Declaratory Judgment, Microsoft Corp. v. U.S. Dep’t of Justice at 13-14, No. 2:16-cv-00538-JLR (W.D. Wash. Apr. 14, 2016). Microsoft agreed to dismiss the case when the DOJ adopted a new policy on gag orders for service providers; the policy does not require notice to defendants. See Memorandum from Rod J. Rosenstein, Deputy Att’y Gen., to the Heads of the Dep’t Law Enf’t Components, the Heads of the Dep’t Litigating Components, the Dir. of the Exec. Office for U.S. Att’ys, and All U.S. Att’ys (Oct. 19, 2017), https://assets.documentcloud.org/documents/4116081/Policy-Regarding-Applications-for -Protective.pdf [http://perma.cc/4X8C-5C4K].
18 U.S.C. § 2518(8)(d) (2012) (“[T]he issuing or denying judge shall cause to be served, on the persons named in the order or the application, and such other parties to intercepted communications as the judge may determine in his discretion that is in the interest of justice, an inventory [of the wiretap application and execution].”); Fed. R. Crim. P. 41(f)(2)(C) (“[T]he officer executing a tracking-device warrant must serve a copy of the warrant on the person who was tracked or whose property was tracked. Service may be accomplished by delivering a copy to the person who, or whose property, was tracked . . . .”).
Fed. R. Crim. P. 41(f)(1)(C) (allowing constructive notice by “leav[ing] a copy of the warrant and receipt at the place where the officer took the property”); see also Fed. R. Crim. P. 41(f)(2)(C) (allowing constructive notice of a tracking device warrant “by leaving a copy at the person’s residence or usual place of abode with an individual of suitable age and discretion who resides at that location and by mailing a copy to the person’s last known address”).
Fed. R. Crim. P. 41(f)(1)(C) (“For a warrant to use remote access . . . the officer must make reasonable efforts to serve a copy of the warrant and receipt on the person whose property was searched . . . . Service may be accomplished by any means, including electronic means, reasonably calculated to reach that person.”). The amendment is a clarification of the existing notice requirement, rather than a new notice requirement. The rule text prior to the amendment still imposed a constructive notice requirement, and the delayed-notice statute still applied. See Memorandum from David Bitkower, supra note 92, at 8 (acknowledging that, even without the hacking-specific notice amendment to Rule 41, the DOJ is still bound by the delayed-notice statute when it deploys malware); id. at 9 (suggesting that the hacking-specific notice provision is grounded in the Fourth Amendment).
See, e.g., Third Amended Application for a Search Warrant, supra note 37, at 24 (specifying that “the government may delay providing a copy of the search warrant and the receipt for any property taken until the time that a suspect has been identified and has been placed in custody”); id. at 13 (requesting delayed notice “because the investigation has not identified an appropriate person to whom such notice can be given”); Application for “Bulletin Board A” Search Warrant, supra note 41, at 35-36 (specifying that “the government may delay providing a copy of the search warrant and the receipt for any property taken for thirty (30) days after a user of an ‘activating’ computer that accessed ‘Bulletin Board A’ has been identified to a sufficient degree as to provide notice”); Application and Affidavit for Search Warrant, supra note 4, at 16 (specifying that “the FBI may delay providing a copy of the search warrant and the receipt for any property taken until no more than thirty (30) days after such time as the name and location of the individual(s) using the activating computer is positively identified”).
See Joseph Cox, FBI May Have Hacked Innocent TorMail Users, Vice: Motherboard (Jan. 21, 2016), http://motherboard.vice.com/en_us/article/wnx5px/fbi-may-have-hacked-innocent -tormail-users [http://perma.cc/UBV9-LAZ3]; Joseph Cox, Unsealed Court Docs Show FBI Used Malware Like ‘A Grenade’, Vice: Motherboard (Nov. 7, 2016), http://motherboard .vice.com/en_us/article/wnxbqw/unsealed-court-docs-show-fbi-used-malware-like-a -grenade [http://perma.cc/SV3B-NSLM]; Poulsen, supra note 53; Kevin Poulsen, If You Used This Secure Webmail Site, the FBI Has Your Inbox, Wired (Jan. 27, 2014), http://www.wired.com/2014/01/tormail/ [http://perma.cc/UY4T-EBFE].
See Joe Uchill, ACLU Questions How Tor Email Users Got FBI-Deployed Malware, Hill (Sept. 6, 2016), http://thehill.com/policy/cybersecurity/294618-aclu-why-did-email-service-users-get-fbi-deployed-malware [http://perma.cc/3D9M-N6LJ].
See, e.g., United States v. Torres, 751 F.2d 875, 882-85 (7th Cir. 1984) (holding that the four core protections of the Wiretap Act are mandated by the Fourth Amendment for video surveillance and that the Federal Rules of Criminal Procedure are sufficiently flexible to accommodate those super-warrant safeguards); United States v. Biasucci, 786 F.2d 504, 507-12 (2d Cir. 1986) (following Torres); United States v. Cuevas-Sanchez, 821 F.2d 248, 251-52 (5th Cir. 1987) (adopting Biasucci and Torres); United States v. Mesa-Rincon, 911 F.2d 1433, 1436-46 (10th Cir. 1990) (applying the four core protections of the Wiretap Act to video surveillance); United States v. Koyomejian, 970 F.2d 536, 538-42 (9th Cir. 1992) (following Cuevas-Sanchez); United States v. Falls, 34 F.3d 674, 679-83 (8th Cir. 1994) (following and applying Koyomejian); United States v. Williams, 124 F.3d 411, 416-20 (3d Cir. 1997) (assuming the correctness of Torres).
See, e.g., Joffe v. Google, Inc., 746 F.3d 920, 926-36 (9th Cir. 2013) (applying the Wiretap Act to wireless network interception); United States v. Councilman, 418 F.3d 67, 69-85 (1st Cir. 2005) (holding that email interception is covered under the Wiretap Act). If the government obtains solely real-time communications metadata in conjunction with a hack, it must comport with the pen register statute. 18 U.S.C. §§ 3121-3127 (2012). Since a warrant is substantively more rigorous than a pen/trap order, the only practical implication is that a federal investigation must be included in an annual Department of Justice pen/trap report. 18 U.S.C. § 3126 (2012).
See Luis v. Zang, No. 1:11-cv-884, 2013 WL 811816, at *4-9 (S.D. Ohio Mar. 5, 2013) (reviewing litigation on keyloggers and concluding that, if malware reports keystrokes to a remote party, it implicates the Wiretap Act); Shefts v. Petrakis, No. 10-cv-1104, 2012 U.S. Dist. LEXIS 130542, at *37-44 (C.D. Ill. Sept. 12, 2012) (holding that screen capture software that recorded email activity was covered by the Wiretap Act). Courts have generally not required that the transmission of recorded activity be precisely contemporaneous with the activity. See Williams v. Stoddard, No. PC 12-3664, 2015 R.I. Super. LEXIS 58, at *19-30 (R.I. Super. Ct. Feb. 11, 2015) (summarizing perspectives on wiretap timing).
See Riley v. California, 134 S. Ct. 2473, 2489 (2014) (“The term ‘cell phone’ is itself misleading shorthand; many of these devices are in fact minicomputers that also happen to have the capacity to be used as a telephone. They could just as easily be called cameras, video players, rolodexes, calendars, tape recorders, libraries, diaries, albums, televisions, maps, or newspapers.”).
See Berger, 388 U.S. at 59 (“[T]he conversations of any and all persons coming into the area covered by the device will be seized indiscriminately and without regard to their connection with the crime under investigation.”); id. at 65 (Douglas, J., concurring) (“The traditional wiretap or electronic eavesdropping device constitutes a dragnet, sweeping in all conversations within its scope—without regard to the participants or the nature of the conversations. It intrudes upon the privacy of those not even suspected of crime and intercepts the most intimate of conversations.”); see also United States v. Biasucci, 786 F.2d 504, 510 (2d Cir. 1986) (“[C]oncern with the indiscriminate nature of electronic surveillance led the Berger Court to require that a warrant authorizing electronic surveillance be sufficiently precise so as to minimize the recording of activities not related to the crimes under investigation.”); United States v. Torres, 751 F.2d 875, 885 (7th Cir. 1984) (“Television surveillance is identical in its indiscriminate character to wiretapping and bugging.”).
See Cardozo et al., supra note 74, at 13 (collecting business policies for handling government data demands, including annual transparency reports); Google, Way of a Warrant, YouTube (Mar. 27, 2014), http://www.youtube.com/watch?v=MeKKHxcJfh0 [http://perma.cc/YS53 -QUYA] (explaining that Google requires search warrants for user content, examines warrants for errors, narrows production for overbroad warrants, and notifies users of government demands); see also, e.g., Opening Brief of Appellant Facebook, Inc. at 3-9, In re 381 Search Warrants Directed to Facebook, Inc. and Dated July 23, 2013, No. 30207-13 (N.Y. App. Div. June 20, 2014) (describing Facebook’s challenge to search warrants from the New York County District Attorney for user content with questionable probable cause support and no date or content restrictions).
Imagine that the government hacks a user’s device and monitors his or her files. So far, courts have concluded that super-warrant doctrine does not apply. But, in the future, a user’s files will be automatically synced to remote services and other devices (e.g., Apple’s iCloud). Those synced files are plainly electronic communications under the Wiretap Act and the Berger doctrine. Would the government then be required to obtain a super-warrant for file monitoring?
See Ed Ferrera et al., Forrester Research Inc., Government Spying Will Cost US Vendors Fewer Billions than Initial Estimates 2 (2015) (estimating $47 billion in costs over three years); Daniel Castro & Alan McQuinn, Beyond the USA Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness, Info. Tech. & Innovation Found. 1 (June 2015), http://www2.itif.org/2015-beyond-usa-freedom-act.pdf [http://perma.cc/WX2U-SXCN] (estimating well over $35 billion in costs over three years).
See, e.g., Jonathon W. Penney, Chilling Effects: Online Surveillance and Wikipedia Use, 31 Berkeley Tech. L.J. 117-72 (2016) (concluding that surveillance disclosures chilled online activity); Alex Marthews & Catherine Tucker, Government Surveillance and Internet Search Behavior 40 (Mar. 15, 2017) (unpublished manuscript), http://ssrn.com/abstract=2412564 [http://perma.cc/BW5B-TY78] (finding that Google users’ search behavior changed as a result of the surveillance revelations in June 2013); see also Margot E. Kaminski & Shane Witnov, The Conforming Effect: First Amendment Implications of Surveillance, Beyond Chilling Speech, 49 U. Rich. L. Rev. 465, 466-67 (2015) (linking government surveillance to First Amendment interests). But see Sören Preibusch, Privacy Behaviors After Snowden, 58 Comm. ACM 48, 48 (2015) (concluding that surveillance disclosures led to a decrease—not an increase—in privacy behaviors).
Whether privacy protections should be initially imposed by Congress or the courts is a subject of scholarly debate. Compare Orin S. Kerr, The Fourth Amendment and New Technologies: Constitutional Myths and the Case for Caution, 102 Mich. L. Rev. 801, 806 (2004) (arguing that Congress should be the primary source of privacy rules), with David Alan Sklansky, Two More Ways Not To Think About Privacy and the Fourth Amendment, 82 U. Chi. L. Rev. 223, 224-33 (2015) (arguing that the courts should not wait for Congress to create privacy rules).
For example, Congress could amend the CFAA to read:
(f) This section does not prohibit any lawfully authorized investigative, protective, or intelligence activity of an intelligence agency of the United States.
(g) This section does not prohibit any lawfully authorized investigative or protective activity of a law enforcement agency of the United States, a State, or a political subdivision of a State, provided that the agency has complied with the procedure established in 18 U.S.C. § 2518.
See, e.g., Orin S. Kerr, Applying the Fourth Amendment to the Internet: A General Approach, 62 Stan. L. Rev. 1005 (2010) (arguing that Fourth Amendment protections for online communications generally track, and should continue to track, a content/noncontent distinction); Orin S. Kerr, An Equilibrium-Adjustment Theory of the Fourth Amendment, 125 Harv. L. Rev. 476 (2011) [hereinafter Kerr, An Equilibrium-Adjustment Theory] (arguing that the evolution of Fourth Amendment law has balanced—and should continue to balance—changes in criminal and government technical capabilities); Kerr, supra note 31 (arguing that the evolution of Fourth Amendment law reflects four distinct conceptions of constitutional privacy); Kerr, supra note 29 (describing how constitutional privacy protections have applied and should continue to apply to transborder data flows).
See, e.g., Baude & Stern, supra note 31 (recommending that Fourth Amendment law track statutory and common law privacy protections that apply to private actors); Daskal, supra note 29 (recommending that Fourth Amendment information privacy law abandon territoriality restrictions); Richard A. Epstein, Privacy and the Third Hand: Lessons from the Common Law of Reasonable Expectations, 24 Berkeley Tech. L.J. 1199 (2009) (recommending that constitutional privacy protections track Lockean social contract theory and social norms); Nita A. Farahany, Searching Secrets, 160 U. Pa. L. Rev. 1239 (2012) (recommending using intellectual property law as a metaphor for Fourth Amendment protections); Paul Ohm, The Fourth Amendment in a World Without Privacy, 81 Miss. L.J. 1309 (2012) (recommending a new balancing approach for Fourth Amendment protections); Jed Rubenfeld, The End of Privacy, 61 Stan. L. Rev. 101 (2008) (recommending reconceptualization of the Fourth Amendment as a right to security against state action); David Alan Sklansky, Too Much Information: How Not To Think About Privacy and the Fourth Amendment, 102 Cal. L. Rev. 1069 (2014) (recommending reconceptualization of the Fourth Amendment as a protection for personal sovereignty); Solove, supra note 31 (recommending a new framework for Fourth Amendment law that emphasizes procedure over coverage).
See, e.g., Orin S. Kerr, The Effect of Legislation on Fourth Amendment Protection, 115 Mich. L. Rev. 1117 (2017) (reviewing judicial approaches to how congressional enactments influence Fourth Amendment articulation, and recommending that courts should independently interpret constitutional privacy protections); Erin Murphy, The Politics of Privacy in the Criminal Justice System: Information Disclosure, the Fourth Amendment, and Statutory Law Enforcement Exemptions, 111 Mich. L. Rev. 485 (2013) (noting tendencies in congressional surveillance regulation, and arguing that privacy protections should evolve from an interbranch dialogue); John Rappaport, Second-Order Regulation of Law Enforcement, 103 Cal. L. Rev. 205 (2015) (explaining how the Fourth Amendment can regulate policy, not just line-level police officers); Daphna Renan, The Fourth Amendment as Administrative Governance, 68 Stan. L. Rev. 1039 (2016) (recommending implementation of administrative law strategies as a component of Fourth Amendment surveillance regulation).
See, e.g., Patricia L. Bellia & Susan Freiwald, Fourth Amendment Protection for Stored Email, 2008 U. Chi. Legal F. 121 (recommending against application of the third-party doctrine to stored email content); Maureen E. Brady, The Lost “Effects” of the Fourth Amendment: Giving Personal Property Due Protection, 125 Yale L.J. 946 (2016) (recommending constitutional privacy protection for location surveillance, based on reinvigoration of the Fourth Amendment’s “effects” language); Susan Freiwald, Cell Phone Location Data and the Fourth Amendment: A Question of Law, Not Fact, 70 Md. L. Rev. 681 (2011) (recommending Fourth Amendment protection for cellphone location data); Adam M. Gershowitz, The iPhone Meets the Fourth Amendment, 56 UCLA L. Rev. 27 (2008) (recommending limits on the search incident to arrest doctrine as applied to electronic devices); Stephen E. Henderson, Learning from All Fifty States: How To Apply the Fourth Amendment and Its State Analogs To Protect Third-Party Information from Unreasonable Search, 55 Cath. U. L. Rev. 373 (2006) (recommending limits on the third-party doctrine as applied to electronic surveillance); Renée McDonald Hutchins, Tied Up in Knotts? GPS Technology and the Fourth Amendment, 55 UCLA L. Rev. 409 (2007) (recommending Fourth Amendment scrutiny for GPS-based location tracking).
Id. In recent work, Kerr appears to have significantly walked back this theory, arguing that judicial deference and inaction are only warranted to the extent that technology remains in flux; once technology has stabilized, courts should independently articulate surveillance regulation. See Kerr, supra note 290, at 1149-57.
See, e.g., Swire, supra note 309, at 914 (“The regulated industry of law enforcement has a concentrated interest in reducing regulation—pushing for fewer warrants, less onerous reporting requirements, and so on.”). There are recent, noteworthy exceptions to this generalization. See Daphna Renan, Pooling Powers, 115 Colum. L. Rev. 211 (2015) (describing how federal agencies collaborate to enhance surveillance capabilities); Shirin Sinnar, Protecting Rights from Within? Inspectors General and National Security Oversight, 65 Stan. L. Rev. 1027 (2013) (describing how inspectors general can constrain national security surveillance).
Department of Justice Policy Guidance: Use of Cell-Site Simulator Technology, U.S. Dep’t Just. (Sept. 3, 2015), http://www.justice.gov/opa/file/767321/download [http://perma.cc/YTX2 -YWSA]; Justice Department Announces Enhanced Policy for Use of Cell-Site Simulators, U.S. Dep’t Just. (Sept. 3, 2015), https://www.justice.gov/opa/pr/justice-department-announces -enhanced-policy-use-cell-site-simulators [http://perma.cc/JXL9-LBRR].
Memorandum from Alejandro N. Mayorkas, Deputy Sec’y of Homeland Sec., to Component Chiefs, Department Policy Regarding the Use of Cell-Site Simulator Technology (Oct. 19, 2015), http://www.dhs.gov/sites/default/files/publications/Department%20Policy%20Regarding%20the%20Use%20of%20Cell-Site%20Simulator%20Technology.pdf [http://perma.cc/R2C3-FEGU].
See, e.g., United States v. Matish, 193 F. Supp. 3d 585, 621-22 (E.D. Va. 2016) (“The Court finds that due to the especially pernicious nature of child pornography and the continuing harm to the victims, the balance between any Tor user’s alleged privacy interests and the Government’s deployment of the NIT . . . weighs in favor of . . . [the] use of technology to counteract the measures taken by people who access child pornography online. The Government’s efforts to contain child pornographers, terrorists and the like cannot remain frozen in time . . . .” (footnote omitted)).
See, e.g., United States v. Michaud, No. 3:15-cr-05351-RJB, 2016 WL 337263, at *7 (W.D. Wash. Jan. 28, 2016) (accepting the government’s position uncritically in three sentences); Government’s Response to Second Motion to Suppress and Request for Franks Hearing at 17, Michaud, No. 3:15-cr-05351-RJB, 2016 WL 337263; United States’ Response to Defendant’s Motion to Suppress at 2, 6-7, Michaud, No. 3:15-cr-05351-RJB, 2016 WL 337263.
See, e.g., Matish, 193 F. Supp. 3d at 593-94 (asserting that Tor protects a user’s IP address without any explanation of how it does so, which is essential for evaluating whether obtaining a Tor user’s IP address constitutes a Fourth Amendment search); United States v. Werdene, 188 F. Supp. 3d 431, 444 (E.D. Pa. 2016) (wrongly claiming that the defendant’s “IP address was subsequently bounced from node to node within the Tor network”; in fact, Tor relays a user’s traffic through a circuit of intermediate nodes, and only the first node in the circuit ever learns the user’s IP address).
See, e.g., Richard Salgado, A Small Rule Change that Could Give the U.S. Government Sweeping New Warrant Power, Google Pub. Pol’y Blog (Feb. 18, 2015), http://publicpolicy.googleblog.com/2015/02/a-small-rule-change-that-could-give-us.html [http://perma.cc/PSB4 -UBVY].
Cosponsors: H.R.1110 – 115th Congress (2017-2018), Congress.gov, http://www.congress .gov/bill/115th-congress/house-bill/1110/cosponsors [http://perma.cc/4RZP-T4RX]; Cosponsors: S.406 – 115th Congress (2017-2018), Congress.gov, http://www.congress.gov/bill /115th-congress/senate-bill/406/cosponsors [http://perma.cc/KD2K-6GYL].
See, e.g., Azam Ahmed & Nicole Perlroth, Using Texts as Lures, Government Spyware Targets Mexican Journalists and Their Families, N.Y. Times (June 19, 2017), http://www.nytimes.com/2017/06/19/world/americas/mexico-spyware-anticrime.html [http://perma.cc/2L5K -57FF] (reporting how elements of the Mexican government appear to have targeted journalists with malware). See generally supra Section II.G (discussing risks and negative externalities associated with law enforcement hacking).
Users of Tor, Tor, http://www.torproject.org/about/torusers.html.en [http://perma.cc /85TF-Q266].
See H. Comm. on the Judiciary, 114th Cong., Encryption Working Group Year-End Rep. (2016), http://judiciary.house.gov/wp-content/uploads/2016/12/20161220EWGFINALReport.pdf [http://perma.cc/67JE-WBT3].
See, e.g., United States v. Kahler, No. 16-cr-20551, 2017 WL 586707, at *7 (E.D. Mich. Feb. 14, 2017) (“The Government argues that, despite using a software which exists only to veil the user’s IP address from prying eyes, the user has no reasonable privacy interest in his or her IP address. This argument has little to recommend it. If a user who has taken special precautions to hide his IP address does not suffer a Fourth Amendment violation when a law enforcement officer compels his computer to disclose the IP address . . . then it is difficult to imagine any kind of online activity which is protected by the Fourth Amendment.”).
See, e.g., Baude & Stern, supra note 31, at 1888-89 (proposing that Fourth Amendment regulation of privacy track statutory and common-law regulations of privacy); Kerr, supra note 290 (describing possible relationships between statutory privacy regulation—much of which restricts both private and government intrusions—and Fourth Amendment privacy regulation); Richard M. Re, The Positive Law Floor, 129 Harv. L. Rev. F. 313 (2016) (proposing that statutory and common-law regulations of privacy set a minimum for Fourth Amendment protections).
See Facebook, Inc. v. Power Ventures, Inc., 844 F.3d 1058, 1065-69 (9th Cir. 2016) (allowing a CFAA claim where the plaintiff expressly and completely revoked defendant’s authorization by letter); Craigslist Inc. v. 3Taps Inc., 964 F. Supp. 2d 1178, 1181-84 (N.D. Cal. 2013) (same); Weingand v. Harland Fin. Solutions, Inc., No. C-11-3109 EMC, 2012 WL 2327660, at *3 (N.D. Cal. June 19, 2012) (allowing a CFAA claim where the plaintiff arguably delineated the defendant’s authorization by verbal statement); Mayer, supra note 159, at 1654-56 (describing “without authorization” liability under CFAA).
See In re Warrant to Search a Target Computer at Premises Unknown, 958 F. Supp. 2d 753 (S.D. Tex. 2013) (denying and criticizing a malware warrant application); Stanford Ctr. for Internet & Soc’y, In Conversation: The Hon. Stephen W. Smith and Former Magistrate Judge Paul S. Grewal, YouTube (Nov. 9, 2016), https://www.youtube.com/watch?v=3 -fycsuHXpU [http://perma.cc/SBX4-U43A] (explaining that “many judges . . . don’t exactly know what they are being presented with” and that applications can include “a very anodyne term like ‘network investigative technique’”).
See James B. Comey, Going Dark: Encryption, Technology, and the Balances Between Public Safety and Privacy, Fed. Bureau Investigation (July 8, 2015), https://www.fbi.gov/news /testimony/going-dark-encryption-technology-and-the-balances-between-public-safety -and-privacy [http://perma.cc/K4FQ-B7VX] (“[E]ncryption as currently implemented poses real barriers to law enforcement’s ability to seek information . . . .”); Sally Quillian Yates, Deputy Attorney General Sally Quillian Yates Delivers Oral Testimony Before the Senate Judiciary Committee, U.S. Dep’t Just. (July 8, 2015), http://www.justice.gov/opa/speech /deputy-attorney-general-sally-quillian-yates-delivers-oral-testimony-senate-judiciary [http://perma.cc/32QD-GAYV] (“[E]ncryption has been designed so that the information is only available to the user and the providers are unable to comply with the court order or warrant.”).