How “Notice and Consent” Fails to Protect Our Privacy
Blog Post
March 23, 2020
We’ve all seen it before: you enter a website and a banner pops up along the bottom, asking you to “click here” to accept the site’s terms of service and privacy policy. This practice is based on a framework for protecting individual privacy known as “notice and consent” (also called notice and choice, or transparency and choice). Put simply, notice and consent requires private entities to notify individuals and ask for their permission before collecting and utilizing their personal data. As we become increasingly aware of how corporations collect, share, and monetize our personal data at an alarming scale, we also need to recognize the persistent, structural shortcomings of the notice and consent framework in safeguarding privacy.
While some companies still believe that this framework should remain the main method of “protecting” consumer privacy, there is growing consensus among privacy experts, civil society organizations, legislators, and some FTC commissioners that it’s time to move beyond it. Notice and consent is too weak in practice to meaningfully shield individual privacy. Instead, we need comprehensive privacy legislation that will empower individuals with explicit user rights over their data, and provide strict limits on how private entities handle that data.
Notice and consent is inadequate both in informing individuals and in protecting their privacy. Few of us ever read the lengthy, legalistic policies attached to the products, apps, and services we use every day, which can stretch for pages on end. Indeed, artists taking aim at terms of service have found that policies for major services like Instagram could take over an hour to read—even assuming people without legal training could decipher them—and cut a darkly comic picture when printed out and displayed in their full length. Not only does reading all these notices waste time, but a 2015 study also revealed that privacy policies do not sufficiently disclose all possible data practices, and that they are sometimes deliberately silent about, for instance, which categories of information they collect.
Even if notices were both comprehensive and easy to understand, it is practically impossible for individuals to provide meaningful consent. The sheer number of entities collecting, using, and sharing personal data, along with the fact that people are not engaging directly with most of those entities, means that people cannot effectively weigh the costs and benefits of revealing information or permitting its use or transfer in most cases.
People also lack bargaining power with companies looking to use their data. Consumers either consent and gain access to a service, or they decline and have no access to that service at all, often with no equivalent alternative. Further, under notice and consent, even when individuals have a meaningful opportunity to consent to one company’s privacy policy, they are unable to engage in any meaningful way with third parties gathering data on the site. Long, deliberately vague notices don’t empower individuals but confuse them, and talk of consent makes little sense where there are no real choices.
We need a new approach. Congress must enact legislation that will provide individuals with rights over their information and limit how companies may use personal data. Adopting explicit use restrictions for data, as well as defining the rights of users over their own data, will protect privacy better than the current notice and consent framework does. Transparency alone is insufficient. Even if people knew what data was being collected and for what purposes, it’s unclear how an individual can change what a company is doing with their data.
Use restrictions, on the other hand, recognize that knowing what is happening is different from doing something to change what is happening. Creating distinct guidelines for what companies can and cannot do with data puts the burden on them to adopt better privacy-enhancing practices. Those collecting and processing data are also in a better position than individuals to see how these tools and practices can be harmful.
Privacy legislation should require companies to assess what data they actually need in order to provide the services they offer, and only permit them to collect this required data. A restriction on collecting unnecessary data would significantly limit companies’ data collection, reducing risks to individual privacy from the get-go. Further, limitations on how companies can use data, such as prohibiting “secondary uses” beyond the purpose for which the information was originally collected, would put the onus on companies to behave better with personal information.
A new privacy regime should also explicitly prohibit discrimination against individuals through data practices. Studies have shown that digital data can be used to discriminate against members of protected classes and perpetuate historical patterns of discrimination. Legislation should also limit the extent to which companies can sell users’ information; this can help protect personal data by limiting the number of companies and other organizations that have access to it, and providing users with greater control over their information. For example, the California Consumer Privacy Act, which went into effect January 1st of this year, provides individuals with a “right to opt out” of the sale of their information, and requires that businesses inform people of this right. Such explicit guidelines on the collection, use, and sale of data would also obligate companies to affirmatively assess their systems, enabling them to detect potential problems and preempt problematic data practices.
While use restrictions would reduce the onus on individuals to protect their data and transfer much of that responsibility to companies, it is also important that privacy legislation clarify users’ rights. User rights would specify what actions an individual can take to control how companies collect or use their data. Unlike notice policies, which only serve to inform individuals of what will happen to their data, granting rights to users would empower people to limit access to their own personal information and control how it may be used. As OTI has previously pointed out, giving users the rights to access, correct, delete, and port their data would not only inform them of which entities have their personal information, but would also allow them to make real choices about practices surrounding their data.
Several recent bills also provide some examples of what user rights can be included in legislation. The Consumer Online Privacy Rights Act of 2019 would provide individuals with specific data privacy rights they can assert against private entities, including the rights to opt-out of the transfer of data, port their data, and delete and/or correct collected data. The Privacy Bill of Rights, introduced in April 2019, also includes specific user rights like rights to access, correct, delete, and port data. The Online Privacy Act of 2019 introduces rights to access, correction, deletion, and porting, as well as a right for individuals to review automated decisions.
Under a notice and consent legal framework, individuals today have no real choice in how their personal information is managed. They should, however, be able to exercise autonomy with specific user rights to their personal data. Additionally, entities collecting, using, sharing, and monetizing personal information must be held responsible for handling that data in a way that respects individual privacy. It is time to move away from the false promise of the notice and consent framework, and adopt a new privacy paradigm that ensures people have specific rights over their data, as well as imposes meaningful data use restrictions on companies.