Data Rights and Data Wrongs: Civil Litigation and the New Privacy Norms
abstract. Significant media and scholarly discussion has focused on the civil-liberties implications of government access to electronically stored personal data held by third parties. This Essay, however, argues that civil litigation between private parties in the data privacy and security space is also shaping important cybersecurity and privacy norms. Because no comprehensive data privacy law exists in the United States, litigants in these disputes must rely on conventional civil legal doctrines that are ill-suited to the legal questions raised by the mass collection of personal data. As a result, it is unclear how courts will resolve emerging questions, and we believe that the law will develop in uneven and unpredictable ways.
Introduction
In recent years, significant media and scholarly attention has focused on the emerging legal questions surrounding government access to private data held by, or accessible to, third parties. Two of the most prominent examples were the Department of Justice’s attempts in 2016 to force Apple to decrypt an iPhone belonging to the perpetrator of the December 2015 mass shooting in San Bernardino, California,1 and in 2018 to compel Facebook to break the encryption in its Messenger application.2 Last year, the Supreme Court adjudicated whether a warrant based on probable cause was required to access cell-site records created and maintained by third-party wireless service providers.3 The popular press has also reported on the legal debate surrounding what access the government should have to facial recognition technologies and databases run by providers such as Amazon, Microsoft, and Google. Yet although constitutional and civil-liberties scholars and the media have extensively examined these issues in the criminal context, few have paid attention to the many and varied ways that civil litigation between private parties in the data privacy and security space is shaping important cybersecurity and privacy norms.
This phenomenon is occurring for several reasons. To begin with, data collection and analytics are increasingly vital to operating a business—and are becoming integral to the way businesses deliver products and services to their customers. Consumers, for their part, are also using more devices and programs that produce data—data that is regularly stored and can be analyzed for a range of purposes.4 As the market grows for devices and services that collect data, and as the amount of data amassed by companies increases exponentially—and is shared with, sold to, or stored by third parties—courts and lawyers alike should expect civil litigants to seek access to this data during the normal course of discovery. In addition to geolocation data, which might be relevant in a wide variety of civil contexts, it is almost inevitable that data collected by wearable fitness technology, appliances, drones, or automated vehicles will become the type of information that is routinely sought in civil litigation. Relatedly, as we will see below, fundamental questions about what data companies can collect about individuals, what they can do with it, and the circumstances under which it can be disclosed to third parties will be resolved to a great extent through civil litigation.
This Essay discusses some of the ways these new legal questions will arise in civil litigation and the effects they may have on cybersecurity and privacy norms—norms that will, for better or worse, confront courts and litigants in criminal and civil-liberties cases. First, we briefly analyze how data incidents involving businesses may shape the development of privacy norms in the private sector. Second, we explore how, in the absence of comprehensive data privacy regulation, legal norms surrounding the collection, analysis, and sale of personal information will be formed in civil litigation using existing laws—laws that are ill-suited to the emerging complexity of data privacy disputes.
I. business-to-business litigation and our data
As readers of law journals and the popular press are well aware, data breaches have become a regular source of reputational and legal risk for companies. High-profile breaches such as the 2017 Equifax breach—which exposed the Social Security numbers, birthdates, and addresses of 145.5 million people5—garner significant media attention. Yet while there are state laws, discussed in the next section, that articulate a company’s notification obligations to individual consumers, there are, at least to date, no comparable state or federal laws defining the obligations of businesses that share information with each other, whether that information is personal information about individuals or confidential business information. In the absence of any governing legislation, companies typically manage the risks associated with data sharing through pre-contract due diligence or through contractual provisions that define data security obligations and procedures.6 Perhaps not surprisingly, as data sharing increases, more disputes between businesses will inevitably wind up in litigation when data mishaps occur.
A few recent examples highlight the core questions at issue in business-to-business litigation over data security. In 2017, Aetna settled a lawsuit for $17 million after 12,000 members received an envelope that, through a clear plastic window, revealed each member’s HIV status,7 a violation of the privacy provisions of the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA) as well as various state privacy laws.8 The mailings in question involved a third-party company, which arranged and actually mailed the envelopes, and a plaintiffs’ attorney.9 After the $17 million settlement, Aetna sued the third party, seeking indemnity, contribution, reimbursement, and damages for its purported negligence.10 The mailing company then filed an action against Aetna, arguing that Aetna, as a large and sophisticated insurance company, was responsible for ensuring that the mailing complied with applicable state and federal law.11 Among other things, the mailing company accused Aetna of transmitting to it far more personal information than was required to complete the mailing.12,13
Business-to-business litigation often directly follows a business-to-consumer data breach. For example, breaches involving credit card, debit card, and banking information are burdensome for financial institutions, which must deal with a wave of fraud claims, questions from consumers, and the possibility of losing customers. In the summer of 2018, JPMorgan Chase and its payment-processing arm, Paymentech, sued Houston-area hospitality chain Landry’s over a 2015 data breach involving credit card information.14 The breach was caused by a program installed on payment devices at Landry’s restaurants that read the information from the magnetic stripe of each card swiped, including the cardholder’s name, card number, expiration date, and CVV number.15 A similar breach hit Wendy’s in 2015, when attackers used compromised third-party vendor credentials to install malware on a small portion of its point-of-sale systems. In that case, issuing banks sued Wendy’s for the costs they incurred in compensating cardholders for fraudulent charges made on the compromised cards and for the cost of issuing new cards.16
Although on the surface these lawsuits involve traditional negligence and contract questions, at their heart they present two critical and fundamental issues. First, who is responsible for a data breach when information travels between two commercial parties? Second, what security standards will be expected of companies that process sensitive information, especially if that information is protected by a federal privacy statute? In a nutshell, what are—and what should be—the norms for securing data, and the penalties when those expectations are not fulfilled? As more companies process or store information in the cloud, we are likely to see more cases involving unauthorized access to sensitive personal information and valuable corporate information. These cases will—for better or for worse—build a body of case law that answers the questions posed above and that constitutes the data privacy norms for information shared between businesses. And, in so doing, these norms will create the “atmosphere” of an ecosystem that also informs how attorneys and courts frame arguments and decisions concerning how data is handled, shared, secured, and analyzed in criminal and civil rights litigation.
II. the wild west of data collection
The amount of data we produce each day is staggering. By 2025, the proliferation of data-producing devices and services means that each person with an internet-connected device will have at least one data interaction with a continually expanding universe of such devices every eighteen seconds—almost 5,000 interactions per day.17 Reports of Cambridge Analytica’s mass collection of Facebook user data18 have raised the public’s awareness of some of the policy issues posed by the rise of big data. As that episode demonstrated, even if the average person knows that companies or their cell phone carriers are collecting immense amounts of data about their lives, many may not know where that data winds up.19 They are often outraged when they find out. Even so, at present, data privacy in the United States is governed by a patchwork of federal and state laws that typically concern only certain classes of sensitive data or cover only certain entities. The result is a vacuum in which important privacy concerns go unaddressed by law.
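The per-day figure follows directly from the cited interval: a day contains 24 × 60 × 60 = 86,400 seconds, and one data interaction every eighteen seconds works out to 86,400 ÷ 18 = 4,800 interactions, consistent with the “almost 5,000” estimate above.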
Many cell-phone owners may know, for example, that their carriers are collecting geolocation data from their phones. What they may not know is that their carrier may also sell that real-time location data to third parties. According to recent reporting, geolocation data from AT&T, Sprint, and T-Mobile phones was accessed by 250 bounty hunters and related businesses through a company called CerCareOne.20 Although some of the largest carriers have recently decided to end the practice in light of this reporting,21 their decision does not answer the fundamental questions of what uses of such data are permissible and what consent is required from the person from whom the data is collected.
The rise of automobile-tracking technologies presents another challenge. Electronic toll-collection systems like E-ZPass have long been subject to criminal and civil subpoenas seeking evidence of a vehicle’s location at a particular time.22 Over the past few years, New York has implemented a cashless toll system that ensures that a record is created of every car that passes through a toll. Operated by the same contractor that administers E-ZPass, the New York system takes a photo of a driver’s license plate and sends that information to the relevant government entity, which then sends a bill to the address registered to the vehicle.23 But private companies gather this information as well. A Texas-based company, Digital Recognition Network (DRN), has taken 6.5 billion photographs of license plates that it geotags, stores, packages, and sells to automotive lenders, insurance companies, and vehicle-recovery professionals.24 Of significance to civil-liberties advocates, DRN also partners with a company called Vigilant Solutions, which provides vehicle-location data and image analytics to law enforcement agencies, which can then use the location data—notably, without the higher standard of proof and the procedural protections that a warrant requires.25
As with other privacy laws, legislators’ response to the intrusiveness of these technologies has been largely reactive and piecemeal. Sixteen states have enacted statutes limiting the use of license plate readers to law enforcement and related entities and requiring that the records be destroyed after a certain period of time.26 Although there is a federal statute that governs the disclosure of personal information gathered by state motor vehicle departments, it was passed in 1994, long before lawmakers contemplated companies like DRN.27 Not surprisingly, many pre-internet, decades-old laws simply do not account for the way data is generated, collected, and used today.
Other countries have taken steps to address existing gaps in data-privacy laws. Under the European Union’s General Data Protection Regulation (GDPR), for instance, an entity may collect personal information28 about a “data subject” only if it has a legal basis to do so—for example, the data subject’s consent.29 The regulation also contains provisions governing how data is processed, stored, and transferred, and it gives data subjects the right to request information about what data is collected and how it is used, to correct that information, and even to request the deletion of the data.30 The GDPR may be enforced through administrative sanctions or through a private right of action.31
The United States has no national privacy law like the GDPR. As a general matter, data privacy in the United States is governed by a series of federal and state laws that cover only certain classes of sensitive data or certain entities.32 The Federal Trade Commission (FTC) has used its authority under section 5 of the Federal Trade Commission Act33 to expand its focus on privacy issues over the past decade and, acting under that authority, has issued nonbinding guidance on online behavioral advertising.34 Beyond section 5 of the FTC Act, prominent federal privacy laws include the Health Insurance Portability and Accountability Act35 and related regulations,36 which govern private medical information; the Fair Credit Reporting Act,37 which regulates consumer credit information; and the Financial Services Modernization Act,38 which governs certain banks and financial institutions with respect to the collection and use of financial information. There are also federal laws that protect the privacy of written electronic communications,39 student education records,40 and personal information collected online from a child under thirteen years of age.41 While these federal laws provide some protections, they fall short of the GDPR’s comprehensive privacy protections.
At the state level, all fifty states and the District of Columbia42 have statutes that require consumer and regulator notification in the event of a data breach involving personally identifiable information, but these statutes vary from state to state in terms of what information is covered43 and under what circumstances notification is required.44 In addition, these statutes govern only a company’s obligations to make certain notifications after a breach has occurred, not the collection, storage, and transfer of data. And, like the federal statutes, they generally apply only to certain narrow classes of sensitive private information, such as Social Security numbers and bank-account and credit-card information. Apart from the California Consumer Privacy Act discussed below, which does not take effect until 2020 and may be significantly revised before then, there is no federal or state statute that specifically addresses the enormous amount of varied consumer data that modern companies collect.
Because there is no comprehensive framework governing data ownership and use, the scope of a data collector’s ability to collect and sell personal data in the United States is—and will continue to be—litigated using legal theories advanced before, and ultimately decided by, civil judges and juries. Examples of this phenomenon are not hard to find. In a recent civil suit, the City of Los Angeles alleged that the company that operates the Weather Channel app45 deceived consumers, under the pretext of providing weather information, into permitting the application to collect a massive amount of geolocation data that the company then shared with its parent company, IBM, and various third parties.46 The complaint alleges that the company has referred to itself as “a location data company powered by weather” capable of collecting more than one billion pieces of geolocation data per week.47 The data collected powers IBM’s “audience-derived location targeting platform,” JOURNEYfx.48 According to IBM’s own website, JOURNEYfx “uses one of the world’s largest continuous streams of first party location data—The Weather Channel—to find and reach relevant audiences. It leverages people’s real-world behaviors over time to shed light on their wants, needs, preferences, consumption habits, and anticipated future activities.”49
Because there are no rules governing the company’s collection of data or the manner in which disclosures should be made to consumers, the suit was brought under California’s Unfair Competition Law50 on the theory that the data collection constitutes an “unlawful, unfair or fraudulent business act or practice.”51 Specifically, the court will need to determine whether, as the City alleges, the disclosures in the privacy policy, which were accessible only after the app was installed and the user was prompted to turn on location services, failed to sufficiently describe the purpose for which the data was collected, and whether that conduct meets the definition of an unfair or fraudulent business practice or deceptive advertising.52 Rather than tailor the inquiry to the unique context of data collection, the court will almost certainly apply the same language it has applied in cases involving, for example, the terms of residential mortgages53 and the health claims of breakfast-food manufacturers.54 Even if this case produces a clear precedent in California for how disclosures should be made, state unfair and deceptive trade practices laws vary widely in terms of prohibited conduct, available remedies, and whether private rights of action or class actions are available.55 With dozens of separate state statutes and bodies of case law, it is easy to imagine a confusing mess of contradictory rules emerging. And while traditional federalism arguments about the benefits of state innovation and varying approaches may seem appealing, the pace of those developments will simply not match the pace of technological innovation and the emerging legal issues that result from it.
In 2018, the California legislature may have shown a way forward when it passed the California Consumer Privacy Act (CCPA).56 The statute, which goes into effect in 2020, requires companies subject to the Act to inform consumers that the company is collecting information, to allow consumers to opt out of the sale of their personal information, to provide consumers—at their request—information about how their data is used, and to delete a consumer’s information when asked to do so.57 Critically, it provides an expansive definition of personal data that reflects what companies are actually collecting.58 The CCPA could serve as a model law for other states,59 or perhaps even for a federal law, but given the current political climate and industry opposition, that is unlikely to happen in the near future. Over fifteen years elapsed between California’s passage of the first data breach notification law in 200260 and Alabama’s passage of the last,61 and there is still no federal law. It will be impossible to know how the CCPA will affect covered companies’ compliance decisions until after the statute takes effect, but the demand for consumer information is expected to grow significantly over the next decade as businesses become more data-driven.62 Lawyers interested in privacy rights would do well, therefore, to expect this gap in consumer protection to persist.
For privacy advocates, the stakes of the regulatory void in which these data collectors operate are far higher than, for example, the targeting of personalized advertisements and fast-food discounts based on geolocation information—a use that JOURNEYfx advertises.63 If recent news reports are accurate, the third-party data merchants who purchased geolocation data from cell phone carriers subsequently, and inappropriately, sold real-time location information to bounty hunters, bail bondsmen,64 and, in one case, a law enforcement officer who tracked phones without a warrant.65 It is with these private-sector actors, which process increasingly immense amounts of data, that many of the most pressing future privacy questions arise. Aside from the legislative process (which is often hampered by gridlock) and criminal litigation (which raises these issues infrequently, given the relative paucity of criminal cases as compared to civil filings), the only way for concerned individuals to object to the mass collection of data is through litigation using conventional civil law doctrines.
Conclusion
As we produce more data that companies can farm for value or lose, questions surrounding data ownership and liability following a data mishap will continue to become more pressing. When has an individual consented to a company’s collection of a particular type of data? What types of data can companies share with third parties, and under what circumstances? What recourse will an individual have if their data falls into unauthorized hands or those of the government? These questions increasingly touch on fundamental notions of privacy. Yet barring significant and comprehensive federal data legislation in the United States, these questions will be answered principally in the context of civil lawsuits—often lawsuits between businesses. And precisely because the outcomes of these lawsuits will shape norms and laws concerning privacy and security in the years to come, litigators would be well served by closely following these civil cases.
Joseph V. DeMarco is the founding partner of DeVore & DeMarco LLP, a boutique law firm specializing in the law of data privacy and security and in cybercrime prevention and response. From 1997 to 2007, he served as an Assistant United States Attorney for the Southern District of New York, where he led cybercrime investigations and prosecutions. Brian A. Fox is an associate attorney at DeVore & DeMarco LLP.