Privacy Nihilism is Pervasive. Are Our Laws to Blame?
I attempt to make the case for why we need more complete, clear, and consistent privacy laws.
“I don’t care about privacy, I have nothing to hide.”
As someone who has lived and breathed privacy & data protection for nearly a decade, I’ve heard this statement, or some variation of it, thousands of times. I’m only slightly exaggerating here. It’s so pervasive in my world that I stopped keeping count. Still, each time I do hear it, a little part of me dies inside.
Imagine if people said “I don’t care about free speech, I have nothing to say,” or “I don’t care about freedom of religion, I don’t believe in anything.” People stage mass protests and even overthrow governments when their right to speak is infringed, or when they can’t practice their faith. And yet, when it comes to our right to privacy – to determine what we share with others (and what we hold back), to control information that relates to us, or simply to be left alone … we collectively shrug.
“Privacy is dead”, we sigh in resignation. “Who cares if Facebook or TikTok know everything about me?”, we think as we click ‘Accept all’ to yet another annoying cookie popup, or blindly check the box on ‘I have read and agree to the Privacy Policy’ even though none of us do.1
Despite the fact that nearly every country recognizes a right to privacy, and a growing number recognize a separate right to protect our personal data (also referred to as ‘data privacy’), there’s a tremendous amount of privacy nihilism in the world.2 Why have so many people given up on privacy and data protection? Why is this nihilism so prevalent?
With apologies for bastardizing some Shakespeare, the fault lies not in ourselves, but in our laws – specifically, the hundreds of confusingly complex, unclear, and inconsistent global laws covering technology, and in particular, the protection of our privacy and personal data online and off. The fault isn’t only in the laws, mind you, but I truly believe that fixing them would be a good start.
There’s Got to be a Better Way, Right?
Privacy and data protection laws usually go well beyond cookie popups or website privacy notices – the best laws are designed to preserve our autonomy, choice, right of self-determination, and dignity, and to ensure that our data is protected, accurate, and used fairly, transparently, and for legitimate reasons. The best laws keep the worst violations of our rights at bay.
After all, we live in a world where automated decisions are made about us daily, sometimes based on inaccurate (unintentionally or otherwise) personal information stored in massive databases and shared indiscriminately with nameless third parties. These decisions affect our ability to get hired for jobs, receive benefits, rent or buy a home, or travel freely. Facial recognition failures lead to false convictions, while near-constant public surveillance leads us to modulate our behaviors.
Meanwhile, our most sensitive secrets get passed around to data brokers, governments, and advertisers like a church collection plate, and are constantly exposed on the dark web by hackers who exploit weak data security practices and our very human instinct to never delete anything. All that data hoarding & sharing means it has become much harder to recover from the kinds of youthful indiscretions or regrettable life choices that previously would have naturally been forgotten.
The trouble is, while most privacy and data protection laws start out with the best of intentions, they rarely succeed. That’s at least in part because they’re large, often inscrutable, inconsistent, and packed with well-meaning but vague edicts that don’t actually do much. Well, they don’t do much, that is, beyond providing job security for lawyers & consultants shilling “privacy theatre” solutions – things like checklists, audits, training & awareness exercises. These things seem like they’d be effective, but in the end they’re little more than legal marshmallow fluff.
Most data protection and tech laws barely constrain the worst offenders, who usually have the talent, resources, and connections to skirt the rules and game the system. At the same time, they leave smaller organizations overwhelmed and unsure of what to do. And most of us don’t even know about the rights we’re given under these laws.
No wonder so many have given up and declared privacy to be dead.
But the problem isn’t one of negligence or willful bad behavior alone. I argue that these problems will only get worse as more countries introduce their own reactive laws that attempt to address new technologies like artificial intelligence, rein in Big Tech’s excesses, or prohibit each new way we humans find to torment one another online. It doesn’t have to be this way – we can all do better when it comes to protecting our personal data and ensuring our privacy online and off, but it’s going to require a whole lot more than privacy “awareness”, reading 20-page privacy notices, or privacy theatre bullshit.
First, we need to fix the laws themselves.
Completeness: Not Just the What and Why, but Also the How
It’s ironic that we’re bound by modern laws so few of us can make sense of. Ignorance of the law is no excuse, as the old chestnut goes.
Laws are usually painfully detailed about the importance of doing something (like providing privacy notice to users, or reporting a data breach), or not doing something (like ‘selling or sharing’ personal information about an individual, or placing cookies on people’s computers without consent). They’re also detailed about the why, including the legislative history and the need for the law. What they rarely include is clarity, or the technical details of the how.
They leave that crucial detail open to (mis)interpretation.
This makes compliance costly and difficult for data controllers, while keeping individuals in the dark about their rights, or what they should expect when it comes to the use of their data.3 Worse still, it leaves application of the law open to clever legal shenanigans and litigation delays, especially when the consequences for getting it wrong are high, but actual enforcement is low.
Sometimes, this gap is filled in with regulatory guidance or amendments to the law. Sometimes judicial decisions add some clarity, but that can take years to come about. Journalists and experts sometimes help. But so often we’re left to our own devices. And frequently, we get it wrong.
Let me give a real-world example. As a data protection consultant, I regularly guide clients through handling data subject access requests, and help individuals make them. Many data protection laws give individuals a right of access to their personal data. But the devil is usually in the details: what must be provided, by whom, how & when, and what individuals must do to meaningfully exercise their rights. For example, laws vary wildly when it comes to the following (see the sketch after this list for how that variation plays out in practice):
Restrictions - Are there any restrictions on what information can be requested or must be provided? Do I have a right to anything someone has about me? When can a business say no? What if the request is resource intensive or repetitive? What if someone is asking because they want to sue?
Process - Are there rules on where or how an access request is made? Do I need to call, or submit a form? Can I ask for data about another person on their behalf? Does it matter where I or the business is based in the world? What laws apply?
Obligations - What other details must be provided during an access request? Do I as a business need to share details on who I’ve shared data with? How much information can I ask for to verify identity? Can I share a summary or does it need to be a complete copy? What rights are available to me if an organization ignores my request?
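To make that variation concrete, here’s a minimal, purely illustrative sketch in Python (not legal advice, and not any real compliance tool). The jurisdictions, field names, and numbers are simplified assumptions on my part – roughly the GDPR’s one-month window with a possible two-month extension, and the CCPA’s 45 days with a possible 45-day extension – and anything real would need checking against the actual statutes and regulatory guidance.

from dataclasses import dataclass

@dataclass
class DsarRules:
    """Simplified, assumed parameters for handling an access request in one jurisdiction."""
    response_deadline_days: int     # baseline time the controller has to respond
    extension_days: int             # possible extension for complex or numerous requests
    must_describe_recipients: bool  # must the response cover who the data was shared with?

# Illustrative approximations only - real rules carry many more conditions and exceptions.
RULES = {
    "EU_GDPR": DsarRules(response_deadline_days=30, extension_days=60, must_describe_recipients=True),
    "CA_CCPA": DsarRules(response_deadline_days=45, extension_days=45, must_describe_recipients=True),
}

def days_to_respond(jurisdiction: str, is_complex: bool = False) -> int:
    """Return the (simplified) number of days available to answer a request."""
    rules = RULES[jurisdiction]
    return rules.response_deadline_days + (rules.extension_days if is_complex else 0)

print(days_to_respond("EU_GDPR"))                   # 30
print(days_to_respond("CA_CCPA", is_complex=True))  # 90

Even this toy version needs its own row of parameters per law; multiply that by dozens of jurisdictions and dozens of other obligations, and the compliance burden (and the room for getting it wrong) becomes obvious.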
And this is just one tiny aspect of data protection. I could write a book applying this same analysis to dozens of different aspects of most laws.
In short, poorly written, incomplete laws affect everyone, because nobody knows what to do. Complete laws (or at least better guidance from the source) that explain how to comply and set reasonable expectations for everyone involved mean the system flows more smoothly for all of us.
Clarity: Laws Should Pass the “Pub Test”
Part of what makes our current legal regime so painfully difficult is that laws aren’t written for the people; they’re written by politicians (who are often lawyers) for other lawyers, judges, and regulators to interpret. But laws shouldn’t require a law degree or specialization to understand.
Instead, they should be simple, jargon-free, and clear enough to be explained over a pint or two at the local pub. Speaking from personal experience, a beer often makes the driest stuff way more interesting.4
That means we need laws that are short and clear, or that at least include a concise, easily accessible, and well-publicized summary of expectations and requirements. Think of it as a legal ‘TL;DR’.5 Making laws smaller and clearer will help, because it’s much easier for normal humans to process a 20-page law than a 300-page tome that reads like War and Peace, though I’ll settle for laws written in plain English (or other applicable languages).
Take traffic laws, for example. While I can’t speak to any one country’s specific legislative text, the rules of the road have been carefully distilled, refined, and clarified to the point that most of us can understand them. If you ever sit for a driving test, you’ll get a small handbook explaining those rules. After all, we have to know enough of the law to pass the written test and the behind-the-wheel examination. There’s no reason we can’t have the same for privacy and data protection laws.
Clear, easy-to-parse legislation (or at least better clarification and practical advice) isn’t just easier to understand; it also makes global consistency easier. This, in my opinion, is why most data privacy and protection laws around the world have adopted the EU General Data Protection Regulation’s definition of personal data rather than the California Consumer Privacy Act’s version of personal information. The GDPR definition is clear and concise, whereas the CCPA’s adds a load of complexity and exceptions.
Ironically, there are at least six US state laws that require legislation to be written in readable, ‘plain English’, including a law in California. Sadly, few lawmakers seem to follow their own laws. Perhaps it’s a clarity issue?
Consistency: If it’s Bad For Me, it Should be Bad For Thee
My third suggestion is that laws should be consistent, particularly when it comes to banning invasive, harmful, or abusive uses of technology. If a behavior, a use (or abuse) of tech, or a business model is so bad that there needs to be a law about it, that law should apply consistently to everyone. Is it any less bad when governments abuse facial recognition systems, use AI to make automated decisions about us, or track our movements online than when the private sector does? Why should it be OK for a small tech startup to sell or share our personal details just because it processes thousands rather than millions of customer records?
Data privacy and tech laws shouldn’t be conditional, any more than traffic laws are. The size and political reach of the actor shouldn’t dictate the legality of, or the obligations around, protecting our data and privacy rights. It’s just as bad if an Amazon delivery van runs a red light and mows down a pedestrian as it is if you or I do it. The size of the monetary penalty is a different story, of course, but the law should apply consistently.
In cases where laws must be conditional, those conditions should be explicit and defensible. The legislative rationale for granting an exception should make sense and be clearly communicated, not buried on pages 67-90 of a 300-page bill or left open for the well-resourced to lawyer loopholes around. To strain the traffic analogy a bit more: it’s reasonable to permit an ambulance, fire truck, or police car to run a red light when its lights and sirens are on. There’s an obvious rationale for that exception, and it feels fair and explainable to all of us.
The advantage of a law written simply, clearly, and consistently is that it becomes much harder to hide loopholes or create advantages for those with the time, resources, and political connections to subvert it. We all start on a more level playing field, knowing that abusers of the law can’t hide behind opaque exceptions or clever lawyering.
More importantly, consistency has the potential to spread. For example, in October 2022 the EU passed a law (known as the Common Charger Directive) mandating USB-C as a universal charging standard for cell phones, tablets, and a range of other portable devices by the end of 2024, with laptops following in 2026. New wired-charging devices sold in the EU must support USB-C, and there are no exceptions for particular brands or business models. Since the law was passed, similar rules have been adopted or proposed in India, Taiwan, and California, and more are on the way. And because manufacturers build the same devices for the whole world, it’s likely that Apple, Samsung, Dell, Google, and the rest will standardize globally: making this stuff is costly, so why build different devices with different ports for different markets?
Consistency forces consensus, and that forces behavioral change. It’s why smoking indoors is banned almost everywhere, why slavery is outlawed everywhere and child marriage is increasingly rare, and why most countries recognize a right to privacy. Maybe in a few years, the Common Charger Directive and laws like it will mean that our collective junk drawers (and landfills) are a little less cluttered with single-device chargers. And maybe, by creating more consistent laws, we’ll end up with a more consistent framework for privacy and data protection.
Changing the Law to End Privacy Nihilism
We need to start treating privacy and data protection more like other fundamental rights. One way to get there is by fixing the laws. As many have reminded me, this isn’t the only thing that needs to be fixed. There is no single solution when it comes to dealing with complex systems.
But, as a friend reminded me, fixing the law is a good place to start, and it can kick off a virtuous circle: it makes addressing the other problems (bad behavior, market forces, our lack of empathy and our difficulty seeing harm when it happens to someone else) a bit easier.
To sum things up: We don’t necessarily need more laws; we just need better ones. Laws that don’t require a magnifying glass or a law degree to understand. Laws that don’t create inconsistencies and inexplicable loopholes. Laws that are fit-for-purpose, explainable, clear, and apply to all.
Now, I don’t think this will be easy – getting politicians to change how they do things is no small feat, especially when there are many vested interests working to keep the laws obscure and vague. But my hope is that some of you will take my message to heart. Either directly, by advocating for better, clearer, more complete & consistent laws (or pushing your elected representatives to do so), or indirectly, by breaking free of privacy nihilism and refusing to declare privacy dead.
Even if you only manage that last bit, I’ll be happy.
For what it’s worth (and since many have asked), I suspect this will be the first in a series where I discuss the other problems that lead to privacy nihilism. I wanted to start with the law, because that’s the thing I know best. But it’s clear that there’s more to be done, even with a perfect legal framework to work with.
A final note: I spent about a month on this piece, though the content has, in some form or another, been rattling around in my brain for about three years, give or take.
2024-05-15: After receiving a great deal of constructive feedback, I have made a few edits to this piece: to clarify that banning the tech itself isn’t really what I meant (rather, the associated behaviors when using the technology), and to point out that fixing the laws by themselves will not be a panacea. I never meant to imply either, but clearly that didn’t come across in the original piece.
A big thanks to John Marshall, Jeff Jockisch, Shoshana Rosenberg, and Merry Marwig, and to the many wonderful people whose brilliant comments, kudos, and criticisms helped shape this piece.
If you liked this, leave a comment. If you hate it, and think I’m an idiot to think such simplistic things, leave a comment. If you want to drop me a law review article explaining why complexity in the legal system has been discussed before, I’ve probably read it, but leave a comment anyway. If you don’t want to mass-blast something, drop me a line directly.
Seriously, I crave attention.
And if you did like it, or my writing generally, please share this! Maybe with your politician friends, if you’re so inclined. But really, anyone you’ve heard complain that privacy is dead should also read this.
I get that this is a stupid hill to die on, but, definitionally, it’s not a privacy policy – it’s a privacy notice (or better still, a transparency notice). A policy guides what an organization does internally to protect personal information, whereas a notice explains to others what the organization is doing with their data and what rights they have. If you want to make a person like me sad, you’ll keep calling it a policy, though.
For example, the right to privacy or to a private life is enshrined in Article 12 of the Universal Declaration of Human Rights, Article 8 of the European Convention on Human Rights, the Fourth Amendment to the US Constitution, and Article 7 of the Charter of Fundamental Rights of the European Union, as well as in countless constitutions and legal provisions.
An increasing number of countries also recognize a related right: the right to protection of personal data. This right requires that entities who use our personal data do so fairly, for specified purposes, and on a lawful basis. It also gives individuals the right to access their data and to correct inaccurate data that relates to them. This right is recognized in Article 8 of the Charter of Fundamental Rights, and in laws like the EU’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), a growing number of other US state laws, and China’s Personal Information Protection Law (PIPL).
Most laws refer to the entities (individuals or organizations) that decide what to do with personal data (collecting, storing, processing, sharing, deleting, and so on) as controllers of data. Some laws, like the CCPA, refer to these entities as ‘businesses’; others as ‘covered entities’. For this piece, I’m calling them controllers.
Well, except for laws on data transfers outside of the EU. That’s just a confusing mess no matter how many beers you’ve had.
For example, the European Commission provides a very user-friendly page on the Digital Markets Act, which explains the major aspects of the law clearly and simply and includes a video overview and examples of dos and don’ts. https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-markets-act-ensuring-fair-and-open-digital-markets_en
Thank you for the thoughtful and thought-provoking post. I am fully on board with simplicity and clarity. I don't remember much from my days at university in Switzerland, but one thing I do remember is Eugen Huber's laudable goal of making the Swiss Civil Code a book that everyone can consult at home and understand. There shouldn't be any lawyers required. And look at the Swiss Data Protection Act. Not perfect, but 32 pages (https://www.fedlex.admin.ch/eli/cc/2022/491/en), and structured and printed (if you use the PDF) in a way that you can actually read the thing and find the relevant clauses (hello, US lawmakers!). The ICO faces a lot of criticism, but three cheers for their focus on plain and simple language in their guidance (hello, EDPB!).
I am less on board with trying to be prescriptive and consistent. I think a good law should be risk-, outcome-, and principle-based. Should the small dental practice around the corner face the same DPO, DPIA, ROPA, etc. requirements as Microsoft? I am also not sure we necessarily need better laws; more important to me is that regulators collaborate better on consistent and helpful guidance that solicits input from all the stakeholders before publication (the EDPB could learn from the ICO and FTC on this aspect). Many more thoughts but so little time :) Thanks again.