Blog

This is Not the End of Privacy

By Michael Simon, Attorney at XPAN Law Group.

We’ve all likely seen the same alarming headlines in the past few months.

It’s easy to be concerned after reading such headlines – and there are many, many more just as concerning. But it’s also easy to forget that it doesn’t have to be this way; the coronavirus pandemic does not have to mean the end of privacy. As I told SecureWorld in a recent podcast interview: “This is not the death knell of privacy. It is just the opposite.”

We need to ask the right questions, not “should we sacrifice our privacy?”

Privacy laws are not about prohibiting or even inhibiting business; they are about transparency for data subjects. Without transparency there can be no trust. And without trust, data subjects will have no confidence that the data they provide will not be used against their own interests.

Until a vaccine or a viable treatment option becomes widely available, we are going to need to find another way to combat the spread of COVID-19.  As World Health Organization (WHO) executive director Dr. Michael Ryan has recently explained (starting at 30:36), new technological solutions such as mass data analysis, contact tracing, location tracking, and immunity certificate verification offer perhaps the only way forward towards opening up the economy.

We could create systems that do not require trust. We could create systems that leverage what MIT professor and privacy expert Sandy Pentland calls “big brother data methods” that by design cause “fear and bewilderment” among users. We could create systems that leverage the legacy of companies with a long history of “extreme secrecy” that use massive amounts of data even though it is “unclear what exactly this data is, where it comes from, or how it’s being used.”

Such trustless systems will ultimately fail. As a spokesperson for the ACLU has wisely put it: “These systems also can’t be effective if people don’t trust them.” If people become convinced that their own cellphones will turn into tattletales, they might do the previously unthinkable: leave their cellphones at home – or worse.

Thus, the right question is not whether we should sacrifice our privacy. As the WHO’s Dr. Ryan goes on to state (at 32:51): “we want to ensure that all products that are developed are done in the most sensitive way possible and that we never step beyond the principles of individual freedoms rights for individuals and for society.”  The right question is how we can build systems that incorporate trust and solve this crisis effectively. 

Fortunately, our privacy laws and regulations are already premised upon transparency – upon creating trust. We can build, and in many ways have already begun to build, systems designed to incorporate the trust we will need.

The pathway to creating trust has always been there in the privacy laws – we just need to use it

We can start our analysis with the most pervasive of international privacy laws, the EU’s General Data Protection Regulation (GDPR). Recital 39, “Principles of Data Processing,” sets out a clear understanding of the centrality of transparency:

“Any processing of personal data should be lawful and fair. It should be transparent to natural persons that personal data concerning them are collected, used, consulted or otherwise processed and to what extent the personal data are or will be processed.”

The European Data Protection Board (EDPB) recently issued “Guidelines 04/2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak,” which emphasize how “data protection is indispensable to build trust, create the conditions for social acceptability of any solution, and thereby guarantee the effectiveness of these measures.” For these reasons, the EDPB recommends that systems created to combat the current crisis be designed to be as transparent as possible: “algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available for the widest possible scrutiny.”

Discussing how systems should be designed naturally leads us to Article 25, “Data protection by design and by default,” which requires proactive privacy measures, such as pseudonymization and data minimization, to be integrated into systems.
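
To make Article 25 less abstract, here is a minimal sketch – in illustrative Python, not any real app’s code – of what pseudonymization and data minimization can look like in practice. The key handling, field names, and record layout are all assumptions invented for the example:

```python
import hmac
import hashlib

# Hypothetical key; a real deployment would keep this in a key vault,
# never in source code.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-key"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash.

    Only the key holder can re-link the pseudonym to the person,
    which is what distinguishes pseudonymization from anonymization.
    """
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# Data minimization: keep only what the stated purpose actually requires.
ALLOWED_FIELDS = {"timestamp", "exposure_duration_minutes"}

def minimize(record: dict) -> dict:
    """Drop names, precise locations, device details – anything the
    stated purpose does not need."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

raw = {
    "user_id": "alice@example.com",
    "name": "Alice",
    "timestamp": "2020-04-20T14:02:00Z",
    "exposure_duration_minutes": 12,
    "gps": (40.7128, -74.0060),
}
stored = {"pseudonym": pseudonymize(raw["user_id"]), **minimize(raw)}
print(stored)  # no name, no GPS point, no direct identifier
```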

Turning to the California Consumer Privacy Act (CCPA), the preamble (Section 2) makes clear that the ultimate purpose of the Act is transparency: “(i) Therefore, it is the intent of the Legislature to further Californians’ right to privacy by giving consumers an effective way to control their personal information . . .”

On April 10, 2020, the Attorney General of California, Xavier Becerra, issued a press release “reminding consumers of their data privacy rights amidst the COVID-19 public health emergency.”  In addition to reminding California consumers to avoid email scams, protect their home networks, and make effective use of the privacy features of virtual meeting systems (something I have written on as well), Becerra re-emphasized the need for consumers to demand the transparency that the CCPA requires of businesses.

We can build tracking systems that incorporate transparency and privacy by design

The most well-publicized effort to create some form of tracing app has been the joint project announced by Apple and Google. Both companies have fully committed in their announcements to make “privacy and transparency paramount” in a wholly opt-in system. Technology experts have described the system’s built-in privacy features as follows:

“The basic system uses your phone’s Bluetooth to anonymously track who you have been in close proximity to. If you opt in to the system, your phone will spot when you’ve been near other people who get diagnosed with covid-19, as long as they also use the system. You won’t know their identity and they won’t know yours, but your phone will flash a notification letting you know you’ve been at risk of exposure.”

This focus on transparency by two technology giants is promising. Even the normally hyper-cautious ACLU has stated that the two companies’ proposal “appears to mitigate the worst privacy and centralization risks . . .” Yet, as the technology giants recognize, widespread adoption of this opt-in app will require a high degree of public trust.
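
The actual Apple/Google protocol is more elaborate (rolling proximity identifiers derived from daily keys), but the core privacy idea in the description above – broadcast short-lived random tokens, remember the tokens you hear, and match locally against tokens voluntarily published by diagnosed users – can be sketched in a few lines of illustrative Python. Every name below is invented for the example:

```python
import secrets

class ContactTracer:
    """Toy sketch of decentralized proximity tracing: phones exchange
    random tokens, and exposure matching happens on the device."""

    def __init__(self):
        self.my_tokens = []        # tokens this phone has broadcast
        self.heard_tokens = set()  # tokens heard from nearby phones

    def next_broadcast_token(self) -> str:
        # A fresh random token per interval, so observers cannot link
        # broadcasts over time back to one person.
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def record_nearby(self, token: str) -> None:
        # Stored locally; nothing identifying the other person.
        self.heard_tokens.add(token)

    def check_exposure(self, published_positive_tokens: set) -> bool:
        # A diagnosed user consents to publishing their *own* tokens.
        # Matching is local: no server learns who met whom.
        return bool(self.heard_tokens & published_positive_tokens)

# Two phones near each other:
alice, bob = ContactTracer(), ContactTracer()
bob.record_nearby(alice.next_broadcast_token())

# Alice is later diagnosed and opts in to publishing her tokens:
if bob.check_exposure(set(alice.my_tokens)):
    print("Notification: you may have been exposed.")
```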

It is not just the technology giants who are working on solutions for opening up our economy. Indeed, as Dr. Ryan is proud to note (at 33:33): “. . . we’ve had ideas for apps from people as young as 14 or 15, from individuals, from small startup companies, from huge globally based companies.” Some of these small startup companies have made it clear that they have designed privacy into their products.

Of these startup apps, the one that appears to be furthest along is Private Kit: Safe Paths, which combines centralized and decentralized approaches, though any centralized data will be released only in an anonymized and aggregated manner. The Safe Paths team has promised to incorporate encryption in the next version, with assistance from a co-creator of the RSA encryption algorithm. As a final trust-building element, team lead Ramesh Raskar was recently quoted as being “emphatic that his code is open source” so that:

‘every part of the code should be visible to everybody, every day’—and that no government or tech company would have exclusive control over a centralized database that it could abuse. Users wouldn’t learn anything else about the infected person, such as age or sex.
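
The reporting does not spell out how Safe Paths performs that anonymized, aggregated release, but a common trust-building pattern for location data – sketched below in illustrative Python, with invented parameters – is to coarsen precise points to grid cells and publish only counts, suppressing any cell too small to hide an individual:

```python
from collections import Counter

# Illustrative parameters only, not Safe Paths' actual design.
GRID_DEGREES = 0.01   # ~1 km cells: coarse enough to hide exact addresses
MIN_COHORT = 10       # suppress any cell with fewer than 10 visits

def to_cell(lat: float, lon: float) -> tuple:
    """Snap a precise GPS point to a coarse grid cell."""
    return (round(lat / GRID_DEGREES), round(lon / GRID_DEGREES))

def aggregate(points):
    """Release only per-cell counts, and only for well-populated cells,
    so no individual's trail is recoverable from the published data."""
    counts = Counter(to_cell(lat, lon) for lat, lon in points)
    return {cell: n for cell, n in counts.items() if n >= MIN_COHORT}

points = [(40.7128, -74.0060)] * 12 + [(41.0, -73.0)]  # 12 visits vs. 1
print(aggregate(points))  # only the 12-visit cell is published
```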

Conclusion: This is not the end of privacy, though it may be – with apologies to Winston Churchill – the end of the beginning

Privacy rules and regulations represent, relatively speaking, a new ideal. It has been less than two years since the implementation of the GDPR. The CCPA only went into effect at the beginning of this year, and the Attorney General is not scheduled to begin enforcing it until July 1, 2020. Our older privacy laws, such as the Health Insurance Portability and Accountability Act (“HIPAA”) and the Children’s Online Privacy Protection Act (“COPPA”), have only seen signs of aggressive enforcement in the last several years. And, in many ways, privacy rules have yet to be fully tested, with enforcement actions slow to materialize and those that do happen often ending in somewhat disappointing results.

Perhaps this was all just the beginning. Now we face a true test, on a scale that exceedingly few could ever have imagined. To address this challenge, we could indeed build “scary” systems that “erode,” “infect,” and “sacrifice” our privacy rights. But it doesn’t have to be that way. Our privacy rules don’t require it. Our technology tools don’t require it. If we as a society go the “big brother data methods” route, it will only be because we chose to do so. We have not just an alternative, but an imperative to instead design for transparency and trust.

We just have to be clear what privacy is really about.  And we just have to ask the right questions.

* * * * * *

Nothing contained in this blog should be construed as creating an attorney-client relationship or providing legal advice of any kind.  If you have a legal issue regarding cybersecurity, domestic or international data privacy, or electronic discovery, you should consult a licensed attorney in your jurisdiction.