California earlier this year took a big first step toward narrowing the American-European tech regulatory divide with a comprehensive data privacy law. Tech companies operating in two of the largest markets in the world must now give their users some basic transparency and control over what data is collected about them and why. But data protection is more than just another compliance check-box. Data privacy rules are reshaping digital markets and informing the debate about competition policy in tech, as consumers increasingly demand better protection of their personal information.
Data protection and privacy, Brussels to Sacramento
In May 2018, the EU’s General Data Protection Regulation (GDPR) went into effect. It gave internet users in Europe the power of consent, more transparency, and increased control (including access and portability) over their personal data online. The GDPR also restricts unjustifiably broad collection of data and limits the use of data to the purpose for which it was collected.
The US has no national data privacy law. But in January 2020, California (the world’s fifth largest economy) put its own into effect. Due to the state’s size, the California Consumer Privacy Act (CCPA) could in effect become a new nationwide standard (unless a preempting federal law is enacted). The CCPA gives users the right to know what data is being collected and how it is being used, to opt out of the sale of their data, and to view and delete the data. Its security requirements are not as strict as the GDPR’s, but the CCPA creates a private right of action for suing companies responsible for data breaches.
Combined with similar laws passed or under consideration in other major jurisdictions, a worldwide patchwork of rules is emerging that gives users transparency into, and control over, what data is being collected and how it is being used. Tech companies complying with these rules have had to make changes to their online products and services, such as adding a pop-up for obtaining user consent or identifying new ways to generate revenue in a more data-restrictive environment. Companies have also had to make changes to their day-to-day operations, which can span from giving new responsibilities to an existing employee, to hiring outside advisers (like legal or tech support), all the way to creating entirely new positions (such as a data compliance officer).
The burden of compliance: big fish, small fish
Data protection laws generally address what would commonly be thought of as consumer protection concerns—privacy, transparency, personal autonomy—as opposed to the structural and market-wide concerns of competition laws. But data privacy and competition are becoming increasingly connected as consumers demand more protection of their data. Some, though, see the relationship as more conflicting than symbiotic.
The main criticism of privacy rules such as the GDPR and CCPA is that they put smaller players at a competitive disadvantage to larger ones.1https://www.wsj.com/articles/how-europes-new-privacy-rules-favor-google-and-facebook-1524536324 This is said to occur, for example, by helping incumbents with “direct relationships with users” if “wary consumers are more prone to trust recognized names with their information than unfamiliar newcomers.”2https://www.nytimes.com/2018/04/23/technology/privacy-regulation-facebook-google.html This argument assumes that the more transparent the data use and the more stringent the consent requirements, the more likely users are to be scared away from dealing with smaller, lesser-known tech players. Yet it could just as easily be said that the need to develop trust is a competitive opportunity, rather than a disadvantage, for smaller companies looking to distinguish themselves in a field rife with data breaches and suspect privacy practices (even among its largest players). Since this trust should benefit from more, not less, transparency and control in the use of personal data, the going-in assumption could just as well be that privacy rules improve, rather than weaken, the competitiveness of lesser-known, privacy-minded companies.
That assumes smaller tech companies are able to comply with the rules. Another common criticism of data protection laws concerns their technical challenges, in particular in obtaining user consent. For example, the Wall Street Journal ran an article within days of the GDPR going into effect suggesting that Google’s market share had increased at the expense of smaller advertising players who struggled to obtain the user consent needed to run a data-driven online advertising business.3https://www.wsj.com/articles/eus-strict-new-privacy-law-is-sending-more-ad-money-to-google-1527759001 But Google’s post-GDPR bump appears to have receded in the months that followed, as the rest of the industry learned how to navigate the new regulations.4https://adexchanger.com/privacy/privacy-regs-like-gdpr-hurt-competition-in-the-short-term-study-finds/; https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3477686 Similarly, publishers using Google’s ad services initially panicked in response to its plans to shift the responsibility of obtaining certain user consent onto them.5http://digiday.com/media/googles-gdpr-approach-raises-publisher-concerns/ But within half a year of the GDPR going into effect, the two sides made progress in easing the burdens,6https://digiday.com/media/google-publishers-gdpr-standards/ and little more has been said on the issue since.
So nearly two years in, there is little to indicate a widespread inability to comply with the technical requirements of GDPR—though some of the anecdotal hiccups from its initial months continue to reappear in criticisms of data protection rules today.
But technical challenges aside, the cost of compliance is a whole other issue. It is not difficult to imagine small tech players lacking the resources (personnel and money) to satisfy all the requirements set out by data privacy laws. Indeed, a common criticism lodged against these regulations is that their compliance costs are disproportionately borne by smaller players unable to bear them, who are then left at a competitive disadvantage to their deep-pocketed rivals.7https://www.ft.com/content/f77c3b3a-4c44-11e8-97e4-13afc22d86d4
Of course, it must be accepted that regulation will cost something. But as to whether data privacy rules in particular cost “too much”: two years in, GDPR compliance costs for individual companies are unfortunately not yet well studied. (It’s too early to say for the CCPA.) Anecdotal evidence exists on both sides of the argument, but it’s difficult to sift through and identify real trends. Yet clearly no cataclysmic shock has hit tech firms operating in Europe. The costs of complying with the GDPR and CCPA (the latter exempts companies based on size and industry) would seem to scale with the size of an organization and the scope of its data practices, so it seems reasonable to assume that the regulations will not impose costs that companies are unable to cover. So until good research emerges to suggest otherwise (please send it here), the burden of proof ought to lie with a well-funded tech industry.
Regulation and innovation: glass half empty or half full?
An especially vocal criticism of data privacy rules, and one with a direct bearing on competition policy, is that they stifle innovation. This view should not be taken lightly. A large body of research indicates that regulatory burden can stifle innovation, hamper entrepreneurship, and erect barriers to entry that impede upstarts.8https://www.tillvaxtanalys.se/download/18.62dd45451715a00666f214dd/1586366216495/Report_2010_14.pdf When it comes to the GDPR, the alarmist version of this can be summed up as follows: “Tech start-ups, video games makers and ad-tech businesses that specialise in data crunching for marketers are among those pulling products or services or their whole operations out of the EU as [a] result.”9https://www.ft.com/content/2afa6e6c-5ecb-11e8-9334-2218e7146b04 A more nuanced version is that “the GDPR is likely to hobble the short-term development of Europe’s artificial intelligence industry” because “[i]f data are the feedstock on which the algorithms gorge, then Europe may be rationing its most precious commodity.” By contrast, “Chinese AI companies, almost wholly unfettered by privacy concerns, will have a raw competitive edge when it comes to exploiting data.”10https://www.ft.com/content/f77c3b3a-4c44-11e8-97e4-13afc22d86d4 Critics have similarly said that the CCPA will stifle innovation in the tech sector.
Although it raises some bona fide competition concerns, this reasoning is flawed. As for the GDPR, any competition-related disadvantage such a policy might impose on Europe’s tech sector should diminish as other major markets pass comprehensive data privacy laws of their own. The CCPA may already go quite far in leveling the playing field, and laws in other major jurisdictions will have an effect as well.11https://www.consumersinternational.org/media/155133/gdpr-briefing.pdf But even if Europe and California were on their own in enforcing data protections, lowering the bar to match what less-regulated jurisdictions are doing would seem dubious if society values consumer protections and rights of privacy.
Another variant of the innovation-stifling argument is that data privacy rules will decrease venture funding of European startups, with the “smart money” redirected to companies operating in more laissez-faire environments. One study found that “EU technology firms, on average, experienced double-digit percentage declines in venture funding relative to their US counterparts after GDPR went into effect.”12https://voxeu.org/article/short-run-effects-gdpr-technology-venture-investment Since the study uses data from the more turbulent first five months of the GDPR, it would be interesting to see whether its observed “short-run” effects have been sustained in the long run as online players adapted to the new regulations, especially now that a similar data privacy law is in effect in California. Moreover, the study itself acknowledges that funding is only one consideration among many. One fundamental question that comes to mind: is society worse off if venture funding is diverted away from businesses that do not provide users with basic data protections?
Competition policy through data privacy: the visible hand
Research suggests that “social regulation” which seeks to benefit society by correcting market externalities or failures can improve innovation and enhance competition.13https://www.itif.org/files/2011-impact-regulation-innovation.pdf It is no mystery why representative democracies around the world are passing sweeping data protection laws: consumers increasingly value their privacy online.14https://www.forbes.com/sites/martyswant/2019/08/15/people-are-becoming-more-reluctant-to-share-personal-data-survey-reveals/#ddd2aac1ed15 So might data privacy rules not reflect a form of competition policy, one that nudges an invasive internet toward a new competitive paradigm in which companies compete and innovate on a more privacy-friendly basis, much as fuel-efficiency and safety regulations made American car makers more competitive against Japanese imports?15https://www.itif.org/files/2011-impact-regulation-innovation.pdf
Such a frame of thinking suggests that critics may be too pessimistic when they focus on tech companies as victims of data protection. It could just as easily be that data privacy rules reward the companies able to earn the trust of their consumers, or better yet, develop products and services that are not so dependent on exploiting user data in the first place. As one commentator summarized it: “GDPR is yet another moat for established companies. It may take them some time to adapt their data models and engineer systems for data deletion, but once they have done so, it becomes something every startup will have to implement in order to compete. Alternatively, start-ups that bake in privacy by design, have a substantial advantage over lumbering established companies that have to adjust their existing processes and may – in some cases – discover that their business models are fundamentally at odds with the regulations.”16https://digiday.com/media/winners-losers-gdpr/
A good example of this is what has happened in competition among web browsers. As covered in a previous post, increased consumer demand for data protection has caused the major web browsers to shift away from cross-site tracking of users with third-party cookies. Meanwhile, Brave, an upstart in the industry looking to differentiate itself as the “private web browser,” has rolled out a product that does not rely on tracking users. And data protection rules may be helping to open an even wider lane for it, as reflected in the company’s recent complaint to UK authorities that Google has violated the GDPR in its use and combination of user data across different products.17https://brave.com/competition-internal-external/ A similar rationale lies behind the support by another privacy-minded web browser (and search engine), DuckDuckGo, for a national data privacy law in the US.18https://www.judiciary.senate.gov/imo/media/doc/Weinberg%20Testimony.pdf
Data privacy and competition policy can not only co-exist, they can co-evolve. Two years in, Europe’s experience with its data protection rules looks nothing like the apocalyptic scene some were predicting. The internet in the EU has not gone dark. Tech companies operating there have survived by adapting. California’s new law, passed in the heart of the tech industry, will be another test of what data privacy rules can mean for digital markets and competition policy. And in the coming years, more experience with these rules, and hopefully some empirical research, will help us understand the role that data privacy (whether driven by markets or regulatory compliance) can play in weaning the digital economy off its dependence on personal data.