
The recent passage of the California Consumer Privacy Act of 2018 (CCPA) has made the collection and sharing of personal information (PI) more complex for any company that relies on consumer data for advertising, business insights, or operations. This paper outlines the requirements of the new law, set to go into effect in 2020, and discusses the opportunities it presents for businesses to change their processes and improve consumer trust.
The CCPA was passed on the heels of the General Data Protection Regulation (GDPR) in Europe. While this paper recognizes that the law is likely to change before the 2020 deadline, companies need to prioritize secure consumer data strategies while they have time. As the following paragraphs will show, there is already public demand for privacy protections, and legislators are willing to pass laws that regulate the collection and sale of personal data.
Businesses collect personal information for a variety of reasons, including to better understand a marketplace or to serve more targeted ads to consumers. While there have been abuses in data collection, as the recent Cambridge Analytica scandal illustrates, the vast majority of businesses operating in California use data for legitimate purposes. The CCPA establishes rules for the collection and sale of consumer information, makes companies liable for damages caused by data breaches and by failures to gain customer consent for the collection and use of personal data, and defines the processes for selling personal information.
OVERVIEW OF THE CALIFORNIA CONSUMER PRIVACY ACT
The state of California defines personal information as any data that identifies, relates to, describes, or is capable of being associated with an individual or household. Common categories of personal information include identifiers such as a name, mailing address, IP address, email address, or ID numbers (e.g., Social Security, driver’s license, or passport numbers). Electronic network activity such as browsing history or Google search history is also considered personal data, as are any identifiers of the computer a person is using. Geolocation data and any education data that is not publicly available are also categorized as protected personal data.
The CCPA gives consumers more control over their personal information by forcing businesses to disclose the information they collect about a consumer and the reasons they gather sensitive personal data. The law also requires that companies disclose the categories of third parties with which they share information, ensuring transparency in their data management. In addition, the CCPA grants the consumer the following rights:
- The right to delete personal information
- The right to opt out of the sale of personal information, with businesses prohibited from discriminating against those who opt out
- A prohibition on the sale of data belonging to those under the age of 16 unless the consumer “affirmatively authorizes” the use of their data
Digital privacy laws such as CCPA are pushing for a shift in the ownership of PI from corporations to consumers. The laws are an attempt to empower the consumer to share their PI as they see fit. This shift makes sense when viewed in light of previous privacy legislation in California.
PRIVACY LAW IN CONTEXT
The CCPA is likely the first of many state and federal digital privacy regulations in the United States (Colorado and Vermont have proposed similar legislation). Recognizing that these new laws are imperfect and may change over time does not give businesses a reason to ignore the legislation or to put off compliance until some indeterminate point in the future. Privacy laws are neither new nor novel. The state has recognized a person’s right to privacy since 1972, when voters amended the California Constitution to include the right of privacy among the “inalienable” rights of all people. The amendment established a legal and enforceable right of privacy for every Californian. Fundamental to this right of privacy is the ability of individuals to control the use, including the sale, of their personal information. The CCPA continues this commitment to personal privacy and recognizes that California’s laws are out of touch with current technology. The authors of the digital privacy legislation are clear:
“As the role of technology and data in the everyday lives of consumers increases, there is an increase in the amount of personal information shared by consumers with businesses. California law has not kept pace with these developments and the personal privacy implications surrounding the collection, use, and protection of personal information.”
The CCPA is an attempt to remedy this discrepancy and bring existing privacy laws up to date with technological advancements.
RISKS: FINES & TRUST
Failure to plan for privacy laws increases the risk of legal fines and civil lawsuits, and it can also jeopardize consumer trust. One example of a company that was unprepared for GDPR is Instapaper. The popular bookmarking app abruptly cut off services to the EU entirely when it was caught without a plan to meet GDPR requirements. Rather than risk a fine of up to €20 million, it blocked all traffic from consumers in the EU. Many other companies were forced to take similar actions. National Public Radio presented EU users with a text-only site if they declined the GDPR privacy message (forgoing any ad revenue). The U.S. television network A&E went a step further and blocked all EU traffic to its website because it could not update its ad display networks in time to be compliant with GDPR. A&E was not alone in preventing access: the Los Angeles Times, the New York Daily News, and the Chicago Tribune also surfaced a message stating that their sites were unavailable in most European countries for the same reason. Had these companies prepared a data strategy ahead of the law, there would have been no need for such drastic measures.
Blocking all consumers from California is not an option for most companies, and the penalties for CCPA violations are severe as well. Under the law, a company is liable for a fine of up to $7,500 per intentional violation. If a company’s data is breached, for example, it can be fined for every customer impacted, a sum that could quickly climb to millions of dollars.
On the other hand, when a company engages in transparent and secure data practices, it improves consumer trust. A recent study by IBM found that “when it comes to driving choice, cybersecurity trumps confidence in leadership and corporate social responsibility. It’s not just an under-the-hood operational function, it is part of how companies are judged in the consumer marketplace.” We can also take a lesson from Microsoft, which has already adopted GDPR protections across all of its markets in preparation for expanded privacy laws.
Microsoft has been pushing for federal digital privacy regulation since 2005, when then-Senior Vice President and General Counsel Brad Smith told members of the Congressional Internet Caucus that Americans were increasingly concerned about their online privacy and that U.S. laws should do more to protect them. In a blog post from May 2018, Microsoft provided the following reason for updating its privacy policy before national regulation was passed in the United States:
“Privacy is also the foundation for trust. We know that people will only use technology that they trust. Ultimately, trust is created when people are confident that their personal data is safe and they have a clear understanding of how and why it is used.”
The push towards consumer trust through data security has been a guiding principle at Microsoft for well over a decade, and other companies would do well to follow their example.
A NEW ERA FOR ADVERTISING
Digital advertising companies that rely on personal information to make their ads effective recognize that this law will force them to change their data processes. Dipayan Ghosh, writing in the Harvard Business Review, provides examples of firms that will need to develop new ways to handle consumer data. He notes that data aggregator firms such as Acxiom, Epsilon, Experian, and Oracle rely on the transfer of personal information between third parties. Ghosh concludes that such sales to ad networks, marketers, retailers, or any other type of business “are precisely the kinds of practices that are directly threatened by the consumer’s rights to deletion and to opt out of sale of data.” The exchange of personal information between companies is the linchpin of digital advertising revenue. Antonio García Martínez, Facebook alum and author of Chaos Monkeys, argues in a recent Wired article that companies like Google and Facebook will not feel the impact of the CCPA because they have a direct relationship with consumers, but even these companies risk a loss of consumer trust when personal information is mishandled. The coverage of Facebook’s Cambridge Analytica scandal is evidence that the public cares how their data is used and will hold social media companies responsible for data breaches, indiscretions, and mistakes.
New regulations, whether from California, Europe, or a yet-to-be-defined national US policy, will likely have the biggest impact on data aggregators. Businesses will be affected if their current data management practices do not evolve. A poll taken the day after GDPR took effect showed that only 30% of respondents chose to opt in to allow the use of their personal data. As consumers become data-savvy, they will choose to opt out of services that they do not trust with their personal information. In response, data aggregators will need to improve communication by being more transparent with consumers on the one hand, while offering incentives for personal information on the other.
BEYOND COMPLIANCE: A PATH FORWARD
As the companies that were caught unprepared by the launch of GDPR can attest, waiting to define your company’s data policies and processes is an expensive mistake. But legal compliance should be the bare minimum of a company’s data management plan. Forward-thinking companies will build the data processes required to use their data to drive critical decisions and improve customers’ experience. As Jennifer Belissent succinctly stated in a recent Forrester report: “Insights-driven companies … systematically use their data to deliver better customer experiences, improve operations, and create competitive differentiation — all of which adds to the bottom line.” For many companies, these new laws will be the impetus that gets them thinking about how they use, process, and protect data today.
Secure data exchange and collaboration across platforms and companies will deliver valuable consumer insights and will become a primary focus of operations for successful businesses. A quick review of data valuation statistics makes the point clear:
- In 2018, 48% of data executives reported working to commercialize their data, a 16% increase over 2017. (Forrester)
- Similarly, the International Data Corporation (IDC) predicts that “by 2020, half of the Fortune 500 will be connected to open and automated information exchanges to enable rapid provisioning of data services for governance and product development.” (IDC)
- In 2013, industry experts estimated that open data, freely available information often collected by governments, could generate between $3 trillion and $5 trillion in economic value annually. (McKinsey)
These figures will continue to increase in the coming years as companies learn how to leverage not only their internal data but external sources as well. However, data sharing and storage must be handled correctly to avoid legal fines and to develop a trusting relationship with consumers.
New legislation means that companies will need to update their data practices securely while enhancing their ability to weave data into everyday business practices. Having the data is not enough; it must be accessible in the right places at the right time. Current trends in data usage show that businesses need to move beyond thinking in terms of compliance and adopt strategies that allow for automated data sharing between platforms, collaboration between companies, and information exchange with customers if they want to survive in the modern marketplace.
This piece was written by Data Republic guest author David Rheams, PhD.