As a German living in the United States, I often find myself baffled by Americans' cavalier attitude towards data privacy.
For instance, in a situation I recently witnessed, the attitude of one of my American friends could be summarized like this:
“You’re a private company that wants a DNA sample? Sure, go for it! I mean, what could go wrong? And let me fill out your survey about my medical history while we’re at it!”
Explaining to them why data privacy matters is an uphill battle. It often comes with a side dish of “you’re just being paranoid.”
To put it bluntly: people who are concerned about data privacy are not paranoid. People who aren't concerned about it are being naive.
And I don’t mean that in a shaming way… based on our background, culture, and life experience, we’re all naive about some things. That’s why it’s so important to have a diversity of opinions.
My own opinion on this is shaped by my German culture, my legal background, and the coincidence that one of my parents’ friends happens to be one of the leading privacy advocates in my home country.
Here's a one-sentence introduction to why data privacy matters: right now, your data might be used in ways you don't want, whether that is to charge you a higher price based on your user information or to set your insurance rates.
But that's just the tip of the iceberg, and I want to talk about what lurks beneath the surface, threatening the (allegedly) unsinkable ship we find ourselves on.
To do that, let me answer a question I often get from Americans when I talk about data privacy: “What’s the worst that could happen?”
Given where I grew up, I'm pretty aware of the worst that could happen. It's all over our history and it's enough to chill you to the bone.
So, to shake you out of your complacency and to make you reconsider before willingly spreading more of your sensitive data, let me introduce you to some worst-case scenarios. (Just to be clear, everything I describe in this article is either a thought experiment or actual history, not the kind of ill-advised Nazi comparison I have warned against.)
Thought experiment: data privacy and the Fourth Reich
When you grow up in Germany, the education system doesn’t let you forget what could go wrong.
That's why, as part of my legal training in Berlin, we visited the place where the "Final Solution" was discussed and its implementation planned: a villa in Wannsee that's now a museum. (In case you don't know, the "Final Solution" was the code name for the murder of all Jewish people within reach.)
Our visit to the place where these awful discussions took place was disturbing. The villa itself is beautiful, with lush greenery, a lovely lake, and peaceful surroundings. In other words, this suburban oasis was entirely at odds with the horror of the history that had taken place in it.
It was a good lesson to not judge a book by its cover… or to not assume that evil can’t take place in what appears to be paradise.
As we all know, here's what happened: during the Holocaust, two-thirds of the Jewish population of Europe was killed, a loss of six million lives. Up to 220,000 Roma people (around 25% of the European Roma population) were also killed.
For a moment, imagine someone like Hitler rising to power today in a country that had the military and logistical capacities that Germany had in the 1930s and 1940s.
How many Jews do you think would survive in that situation? How many Roma?
When Germans play out that thought experiment, we typically conclude that nowadays, given modern surveillance methods, that number would be close to zero.
What could the Nazis have done if they had location tracking and face recognition, alongside CCTV surveillance everywhere?
And that thought alone should be reason enough to be wary of a post-privacy world.
Also, just in case you’re assuming that this does not personally concern you because you don’t fall into the groups I just mentioned or into any other groups that are discriminated against in your country, think again.
Holocaust survivor Simon Wiesenthal warns:
"For your benefit, learn from our tragedy. It is not a written law that the next victims must be Jews. It can also be other people."
For instance, when the Khmer Rouge killed almost a quarter of Cambodia's population, one of the groups they targeted was people they perceived as educated or "intellectual" (which often included people wearing glasses or speaking a foreign language).
That describes me rather well, and given that you're here reading a rather long blog post, you might have "qualified" as a worthwhile persecution target alongside me.
In the absolute worst-case scenario, that's what's at stake on the battlefield of data privacy and surveillance: the easier identification, location, and extermination of individuals or entire groups of people whom someone in power wants dead.
Thought experiment: data privacy in the Not-So-Democratic Republic
Let’s assume the absolute worst wouldn’t happen. Let’s instead consider the “medium-worst” of what could go wrong.
Again, German history has a helpful lesson for you: from 1949 to 1990, as you all know, there were two Germanys.
The German Democratic Republic (West Germany’s more autocratic sister) had something called the Ministry of State Security, which is commonly known as the Stasi.
The Stasi has been described as "one of the most effective and repressive intelligence and secret police agencies in the world."
In an article, Germany's current chancellor, Angela Merkel, who grew up in East Germany, describes how she coped with the constant surveillance, saying it was important to "not let them drive you crazy."
You may be wondering what the point of all that was. As historian Hubertus Knabe put it:
The main purpose was to control the society. […] The East Germans knew, of course, that they were surrounded by informers, in a totalitarian regime that created mistrust and a state of widespread fear, the most important tools to oppress people in any dictatorship.
The Stasi collected information about people because they knew it was power.
It’s the same reason all those companies are collecting data about you — because it gives them power over you.
Of course, their motivation is different and less nefarious: while the Stasi wanted to prevent a revolt against their totalitarian system, data collectors want to earn money from your data and advertise to you.
But, as the Cambridge Analytica scandal has shown, Skunk Anansie was right when they bluntly stated that: “Yes, it’s f***ing political. Everything’s political.”
Question for you: who do you think has amassed more data about people, the Stasi or your favorite tech companies?
I mean, one of these groups doesn’t even need to use surveillance as we’re happily sharing everything with it.
At this point, people often point out that, unlike the Stasi, tech companies are not part of a governmental structure and don't have an army at their disposal. In other words, so the argument goes, we should be less concerned about them.
Here’s the thing though: just because a company is striving towards doing something good today and just because it doesn’t have an army now doesn’t mean that this will always be the case.
There is certainly a historical precedent for a company effectively acting like a country in its own right: at its height, the East India Company had an army of 260,000 men (twice the number of the British standing army). It also controlled a majority of the Indian subcontinent and engaged in the slave trade.
Can you imagine what harm a company like that could do with your data?
However, let's assume companies will remain as they are now: army-less. Who do you think would prevail if a sinister government tried to force well-intentioned tech companies to hand over their data?
A few thoughts about naivety
There’s one quote that best summarizes my attitude towards people’s ignorance about the importance of data privacy.
It’s taken from the BBC show Merlin, a show that depicts the legendary wizard Merlin as a young boy living in a Camelot that persecutes magic. Or, to be more direct: a Camelot that’s waging a genocidal war on the magical people.
In that show, Camelot’s knight Mordred is secretly a magic-user. In a conversation with his old friend Morgana, she points out that King Arthur wouldn’t accept him if he knew about his magic.
To which Mordred expresses his belief that one day, King Arthur would accept the magical people.
Morgana's reply: "Your naivety would be charming if it wasn't so dangerous."
While Morgana has gone off the deep end in her hatred towards Camelot and become evil herself, she does have a point about naivety being dangerous.
The dangers of the black swan
Don't believe anything bad could happen with all the gathered data? Well, as the Spanish-American philosopher George Santayana put it: "Those who cannot remember the past are condemned to repeat it."
Right now, if you're fortunate enough to live in a democratic society that respects the rule of law, you might be able to get away with being naive.
But we're one black swan event away from a dystopian future.
In case you’re wondering, a black swan event (a term coined by Nassim Nicholas Taleb) is an event that:
- is surprising to the observer,
- has a major effect, and
- gets rationalized in hindsight.
Let’s look at what happens when one of these hits. (And no, the Coronavirus situation isn’t a black swan.)
From heaven to hell in an instant
In 1913, Europe saw itself as the height of civilization. If you want to get a feeling for just how optimistic Europeans must have felt back then, put on Beethoven's "Ode to Joy" and imagine that you have never heard of World War 1 or World War 2.
In his book Flashpoints, George Friedman describes the general European sentiment as follows: “To many, it seemed as if they were at the gates of heaven.”
Then Archduke Franz Ferdinand was assassinated (the black swan event) and, as we all know, things escalated quickly from there.
Truth be told, when I first learned about the sequence of events that led to World War 1 (and, by extension, Hitler’s rise to power, the Holocaust, and World War 2) as a child, I didn’t quite get how the assassination of one Archduke could have all these catastrophic effects.
I think that’s because the whole sequence of events was a black swan. Sure, we can rationalize it in hindsight but in reality, it’s not exactly a straight line.
So, don’t for a moment think that a black swan event couldn’t throw us back into a hellhole.
It could.
And being naive about it won’t save you.
That Europeans in 1913 were ignorant of the horror that was about to descend upon them didn't change a thing. If anything, it made them less prepared.
The good news about data privacy
Now that I've spelled all that out, do I think it's likely that the worst-case (or even the medium-worst) scenario will come to pass?
No, I don’t. I’m cautiously optimistic about the future. I just know that there’s a difference between optimism and naivety.
The reason we should be wary about the sinister possibilities I discussed in this article is not that they’re very likely…it’s that they’re so devastating if they happen.
Do I think all that’s happening with data and surveillance right now is bad and that tech companies are evil?
No, I don’t. Life is complex, and so is Big Data. I think data gathering and surveillance can have positive impacts, too, and that tech companies (or rather, the people leading them) can have good intentions.
Right now, there's a lot of icky stuff happening with your data (such as insurance companies using data about things like your TV habits to predict how costly a customer you will be for them).
But there's a world of difference between icky and genocidal.
When I convinced my friend to not share her medical history with a private company that already had her DNA samples, I didn’t do it because I was worried about an impending genocide or authoritarian regime. I did it because I was concerned about the impact it might have on her or even her blood relatives.
What if my friend’s daughter applied for insurance and that company (or rather, their algorithm) considered her mother’s medical history and decided that she’s too high-risk to take on? Or if her son wanted to get a 30-year mortgage but the algorithm concluded that his family’s medical history doesn’t look promising for longevity?
While that would be bad, it's nowhere near "Fourth Reich" levels of bad.
Again, the reason we should be wary about the sinister possibilities I discussed in this article is not that they're very likely… it's that they're so devastating if they happen.
And hopefully, you’ll consider that the next time you think about sharing sensitive personal information or assume that data privacy doesn’t matter.