Washington, D.C. in 2054: Thanks to the Internet, state surveillance of people is so advanced that their behavior can be predicted. In the film “Minority Report,” there are no more crimes because the state eliminates potential perpetrators before they act. This becomes a problem for police officer John Anderton, played by Tom Cruise, when an algorithm predicts that he will commit a murder within the next 36 hours.
Pure science fiction? Not necessarily. Intelligence agencies have already stored such vast amounts of data about us that much of our lives can be analyzed and predicted—for example, our purchasing behavior. And governments around the world are increasingly assembling centrally stored digital identities for every citizen from previously scattered data, identities that will soon feed into what the NSA calls “pre-crime centers”.
The video promises that the UN Digital ID will ‘streamline information sharing, daily workflows, access to platforms and buildings’, etc. It will store all of your personal, health, security, travel, financial and pension data.
This means a global alliance of powerful public and private players, headed by the UN, is taking all the data crawling that has been done on us much further. They are working on a transnational digital identity for every person, one that is to include as much of the existing data about each individual as possible. We will only be able to identify ourselves biometrically—with face, iris, or fingerprint—and release data on request. The promise: we will decide for ourselves whether to release our data. This is window dressing. The concept of transnational digital identity paves the way for total surveillance.
ID2020 — Sugarcoated propaganda at its finest
It is about private aspects of our identity—our bodies and health; interests and friendships; thoughts, feelings, and sexuality. Our privacy is the space in which we are allowed freely to develop our personality. It is a space protected by constitutions worldwide, about which others should only know what we voluntarily tell them.
“The management of identity is still completely inadequate worldwide,” says Dakota Gruener, head of the ID2020 organization in New York City. “250 million children have no birth certificate, millions of refugees no papers. There is a proliferation of virtual identities on the Internet, which online fraudsters and child molesters exploit to the full. And law-abiding citizens suffer from unnecessary controls.”
A lack of efficient identity control costs the world hundreds of billions of dollars every year. ID2020 wants to change that. Behind it stands an alliance of high-tech corporations such as Microsoft and Accenture, the Rockefeller Foundation, and major aid organizations such as Mercy Corps, CARE, and the Bill Gates-funded vaccination alliance GAVI. The UN refugee agency UNHCR is also a close cooperation partner.
“Identity is a human right. We all need to be able to prove who we are. But one in seven people worldwide cannot prove who they are. He or she is therefore largely excluded from health care, education, and banking. At the same time, we privileged people also struggle with countless usernames and passwords on the Internet. Because we can’t prove who we are, our online transactions remain risky. All of these problems can be solved by our concept of a transnational digital identity. Moreover, it protects our personal data, which we always carry with us and which we alone have.” — Dakota Gruener
In addition to the digital photo, it is also possible to include fingerprints, as a (for now) voluntary feature, in the ID card.
Needless to say, this is all just marketing blahblah to sell you the total control that digital IDs combined with CBDCs will bring over your life.
“This technology, which is also used to trade bitcoins, is ideally suited to ID2020’s concern. We think of digital identity as a collection of documents and other information about the person. This allows the individual to credibly substantiate certain attributes. He can show, for example, his college diploma, proof of immunization, proof of his credit score, or information about his occupation.” — Dakota Gruener
Like the U.S. government, the EU Commission is also enthusiastic about the project of a digital identity that is open to everyone; a project that would allegedly put an end to the countless virtual identities on the Internet, online fraud and child abuse.
The EU outlined its far-reaching vision for digital identity in a 2019 report from its EU Blockchain Observatory:
“When we say digital identity, we need to understand it as the sum of all attributes that exist about us in the digital world; an ever-growing and complementary collection of data points. This could be important documents, but it could also be an account on a social media platform, the history of our purchases from online retailers, or testimonials from friends and colleagues. There’s really no limit at all.”
Test subjects are lured with false promises
Unlike in many national systems, individuals are supposed to retain control over their data. If a bank, a landlord, or a border official wants to know details about you, you use a smartphone app to release only the information you wish to share. Everything else remains hidden. ‘Self-sovereign identity’ is the buzzword they use for this.
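The selective-disclosure idea described above can be sketched with a simple salted-hash commitment scheme. This is a deliberate simplification for illustration only; real self-sovereign identity systems use signed credentials and, increasingly, zero-knowledge proofs, and all names here are hypothetical:

```python
import hashlib
import secrets

def commit(attributes):
    """Issuer commits to each attribute with a salted hash; the holder keeps the salts."""
    salts = {k: secrets.token_hex(16) for k in attributes}
    commitments = {k: hashlib.sha256((salts[k] + str(v)).encode()).hexdigest()
                   for k, v in attributes.items()}
    return salts, commitments

def disclose(attributes, salts, keys):
    """Holder reveals only the chosen attributes, together with their salts."""
    return {k: (attributes[k], salts[k]) for k in keys}

def verify(disclosed, commitments):
    """Verifier checks each revealed value against the published commitment."""
    return all(
        hashlib.sha256((salt + str(value)).encode()).hexdigest() == commitments[k]
        for k, (value, salt) in disclosed.items()
    )

identity = {"name": "Alice", "age": 34, "passport_no": "X123456"}
salts, commitments = commit(identity)          # commitments are public
revealed = disclose(identity, salts, ["age"])  # holder reveals only the age
print(verify(revealed, commitments))           # → True; name and passport stay hidden
```

The point of the sketch is the asymmetry: the verifier learns only the revealed attribute, yet any tampering with the revealed value makes verification fail. Whether such selectivity survives real-world power imbalances is exactly the question the article raises below.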
The ID2020 project will initially start as a vision and framework concept. Within this framework, governments, organizations and companies are to develop their own projects that will later grow together.
One example is the Known Traveller Digital Identity (KTDI) project. It is intended to enable worldwide travel without papers. The governments of Canada and the Netherlands are involved in this World Economic Forum project, as are airlines, airport authorities, hotels, credit card and rental car companies—and the multinational Accenture Group as a technology supplier.
As with all digital identity projects, those who participate first have their biometric data stored—especially their face, so that they can be recognized at checkpoints. Those traveling with KTDI also provide personal data as a confidence-building measure, explains Christoph Wolff, the project’s director.
“The critical areas are, of course, who you are, where you live, things of that nature.” — Christoph Wolff
Past travel or credit card history could also be stored on the blockchain, Wolff says. (If anyone knows the reason as to why a border patrol agent should be interested in my credit card purchases, feel free to explain.)
“Once this system has some momentum and is used, yes, past border crossings are also stored. That of course increases credibility because you can just provide more validated data.” — Christoph Wolff
Wolff promises that anyone who joins KTDI can expect to have ‘an extremely pleasant trip’—at the airport, when picking up the rental car, at the hotel:
“When the traveler arrives and can identify himself by his biometrics, i.e., by his face, this information flows together in the background, and the traveler is classified as trustworthy in 99 percent of the cases. He can then cross the checkpoint without standing in line or being checked.” — Christoph Wolff
Digital registration of refugees
“We’ve digitized the health history of refugees there,” Dakota Gruener reports. “If they show up at one of the four health stations now, the staff knows immediately what’s going on. As a second step, we’re now providing refugees with digital proof of the education they’ve received through job creation programs.”
Their new digital identities, filled with important documents, should help refugees put their lives on solid footing after their stay in the camp, says Dakota Gruener.
ID2020 is running another project in Bangladesh, in partnership with the government and the GAVI vaccination alliance (SURPRISE!). There, she says, it’s all about digital immunization records and digital identity.
“In Bangladesh, to date, only 20 percent of all children receive a birth certificate. At the same time, however, almost all children are vaccinated against diseases. That gave us the idea of linking the two things: on the one hand, we are strengthening the vaccination system by introducing a digital vaccination certificate, and on the other hand, we are using the digitization of the vaccination system to establish a digital identity for the children.” — Dakota Gruener
Bill Gates and the digital vaccination certificate
“As far as the corona pandemic is concerned, I think we all want to resume our normal lives as soon as possible. But that depends critically on whether we can prove a current corona test or, in the future, a vaccination. Proof of corona vaccination must become a requirement for cross-border travel. And proof of vaccination must be reliable so that lives are not unnecessarily put at risk. No paper that could be lost or forged; no, a digital proof of vaccination on a biometric basis: The camera at the border authority or even at the entrance to the soccer stadium can tell from my face whether I have been vaccinated.” — ID2020 partner Bill Gates in a March 24, 2020, TED interview
Ironically, the proof of vaccination apparently needs to be more reliable than the vaccine itself.
“The pandemic would thus lose much of its terror,” Dakota Gruener hopes. “And the Corona vaccination would open up a unique opportunity to enter the digital identity for billions of people.” How convenient for them.
Concepts collide with applicable law
A probing look reveals troubling findings in no fewer than seven respects:
First, under the EU’s General Data Protection Regulation, personal data may only be collected and processed to the minimum extent necessary—and only for well-specified purposes. The collection and storage of data for general administrative purposes planned by ID2020 stakeholders and supported by the EU Commission diametrically contradicts this regulation.
Second, according to the General Data Protection Regulation, stored personal data must be deleted as soon as the specific purpose of its collection ceases to apply, or as soon as data subjects revoke their consent to its storage. Deletion, however, does not work on a blockchain. A blockchain is essentially like a running bank statement: the account balance is not stored separately but follows from the sum of all entries. The moment you cross out an entry, the signature mechanisms that couple the entries together and prove they are unaltered cease to work. It is like ripping one page out of a stack of bank statements: you can no longer tell whether the balances are correct, because the math no longer adds up.
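The append-only property at issue here can be demonstrated with a minimal hash chain, a toy stand-in for real blockchain data structures (the block layout and function names are invented for this sketch): each entry stores the hash of its predecessor, so removing or altering any entry invalidates everything after it.

```python
import hashlib

def chain(entries):
    """Build a hash chain: each block records the hash of the previous block."""
    blocks, prev = [], "0" * 64
    for data in entries:
        h = hashlib.sha256((prev + data).encode()).hexdigest()
        blocks.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return blocks

def valid(blocks):
    """Recompute every hash; any gap or alteration breaks the check."""
    prev = "0" * 64
    for b in blocks:
        if b["prev"] != prev:
            return False
        if hashlib.sha256((prev + b["data"]).encode()).hexdigest() != b["hash"]:
            return False
        prev = b["hash"]
    return True

ledger = chain(["id:alice", "vaccination:2021", "border:2022"])
print(valid(ledger))   # → True: the intact chain verifies

del ledger[1]          # "delete" the middle entry, as GDPR erasure would require
print(valid(ledger))   # → False: the remaining chain no longer verifies
```

This is exactly the tension with a right to erasure: the same coupling that makes entries tamper-evident also makes them undeletable without breaking the chain for everyone downstream.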
Third, according to the General Data Protection Regulation, a responsible authority is liable for the proper handling of personal data. However, there is no such responsible entity in blockchain-based data processing—only the automatism once set in motion, which no single entity can control.
Fourth, the biometric data generated during the establishment of digital identities is used as raw material for the use and further development of facial recognition systems. Such surveillance systems are already commonplace in Chinese cities, and may soon be so in the United Kingdom. The German Ministry of the Interior is experimenting with them at Berlin train stations. However, in none of the cases mentioned is anything known about the consent of affected people, which is provided for, for example, by the EU’s General Data Protection Regulation.
Fifth: ID2020’s concept of digital identity envisions that we will only ever release to requesting entities the information they need and that we want to share. This is unrealistic. Being able to selectively release information about ourselves sounds good in theory, but it thoroughly ignores the power imbalance inherent in almost every identity check. If my employer, a border official, or my landlord wants a document from me, I can hardly say no.
There is a risk of arbitrariness and abuse
Sixth: in fact, the market—that is, the needs of companies, UN agencies, governments, and consumers—will shape the rules of transnational digital identity. We are already seeing that the substantive discussions about the sense and nonsense of such mechanisms completely miss the mark. What is worrying is the normative force of the factual: at some point certain standards and procedures will be established, and they will simply become facts that you as a consumer cannot ignore without excluding yourself from certain services, or from society altogether.
Seventh, the initiators promise that the blockchain holding our digital identity will be safe from hackers, and that no government will be able to siphon off our data through a backdoor. That is the second illusion, and a blatant lie. Digital identities will collect the data of our lives for decades. But who knows today what hacking techniques will exist in 2034, let alone 2054? And—regardless of whether blockchain or published source code—every IT system can contain backdoors. Once data is stored somewhere, it can also get out. It is that simple. There is also a very clear trend across all industrialized countries: intelligence services are increasingly securing legal access to these systems through laws written specifically to undermine constitutionally guaranteed rights. The US leads the way with the CLOUD Act, which even allows authorities to access data located on servers of American providers abroad.
Sinister new world
The concept of transnational digital identity optimizes the increasingly dense network of commercial and government surveillance for digital progress. A web that we ourselves are busily knitting by sending—via smartphone and fitness tracker, via black boxes in our cars and Amazon’s Alexa—countless gigabytes of highly personal data to all kinds of companies.
That data is linked to points in time. This means that from the traces we leave behind, someone with access to all of this data could trace almost all of our daily behavior: every single step, every movement. Our car logs how loud we turn up the radio, and at what point and at what time we put on the turn signal; that could be correlated with traffic-light timing. All this information can be put together to form an overall picture of what we do and don’t do every day.
All this information is being sent to the servers of security authorities faster and faster: The U.S. picks up what Google, Apple, and Meta collect. The EU Commission is fighting doggedly for data retention, and now wants to oblige operators of social platforms to search even encrypted messages for traces of child abuse. This will make it possible to search for practically anything, and virtual house searches without cause will become commonplace.
Meanwhile, several EU countries are enabling better monitoring through their own digital identity projects. France, for example, will soon be registering its citizens biometrically. Anyone who participates will then be able to conduct almost all official business on their smartphone. Those who refuse will continue to stand in line in the corridors of government offices, until access to these services is denied to them completely.
Germany’s government is pushing ahead with two projects in particular: Firstly, citizens are to receive an electronic patient file ‘in order to receive optimum health care.’ This will contain all health data, including psychiatric diagnoses, and will be stored on servers run by a private company. Secondly, there are plans to convert each citizen’s tax ID into a comprehensive personal identification number. The state will then be able, at the push of a button, to obtain a multifaceted picture of that citizen. Needless to say, the federal and state data protection commissioners have constitutional objections to both projects.
This has nothing to do with democracy anymore
In a not-so-distant future in which digital identity operators have enough tools to monitor us all comprehensively, algorithms would come into play that, as in the movie “Minority Report,” systematically diagnose our behavior and derive predictions from it. Imagine measuring the length of time a person fixes his gaze on a small child: if he crosses a certain threshold, he is suddenly a pedophile. The grave danger is that such numbers will be used as circumstantial evidence to make assessments about people, and then predictions about their behavior. In doing so, we rob our society, our world, of an essential element: the free development of one’s own personality. And with that, we are heading into something like a monitored state, without it being a surveillance state in the classical sense.
A state in which democracy exists only on paper. If, for example, a politician becomes uncomfortable, embarrassing data can be dug up at any time to put him under pressure. That is not democracy: such politicians can no longer be guided by what their voters want, but only by what those people want who can destroy them at any moment, because they know all their weaknesses.
Increasing digital surveillance obviously threatens our democracy far more than, say, Vladimir Putin does, yet concern about one’s own data mobilizes almost no one.
Why is there no mass movement for data protection? Perhaps a socio-psychological conundrum is the cause. First, there is the contradiction that we are reluctant to reveal anything about ourselves, yet want to know as much as possible about others. The less we know about another person, the worse it is for us, because we cannot be sure we are behaving correctly; and the better it is for the other person, because he or she retains the room to develop. Of course, the reverse also holds: not every free development of a person is positive for the people around him. There are people who harm other people.
Because we want to prevent or limit that, we humans simultaneously strive to gather as much information as possible in order to identify, to punish, to prosecute, and so on. Identification and anonymity are antagonists, and we want both: anonymity for ourselves, identification of others.
Monitor, report, denounce
There is a second factor in people’s low level of commitment to protecting their data in the so-called paradox of privacy: Although our privacy is important to us, we voluntarily disclose the most intimate information—as payment for advantages and conveniences in everyday online life. We don’t feel that we are ultimately paying a high price because it doesn’t cause any pain for the moment.
Third, and finally, we experience the Janus face of digitization—obvious benefits on the one hand, threats on the other—as extremely complex and confusing. When the issue was still the census, it was simple: you could say yes or no to the census. Today it is vastly more complex. People want to protest, but there are thousands of loose ends that would have to be pulled. Surveillance is woven so finely into everything that, taken together, it amounts to total surveillance; to resist it, you would practically have to demand that the Internet be shut down, which nobody can seriously ask for. That is why so little resistance materializes.
Despite all this, there are still optimists who—despite the seemingly irresistible triumph of digital administration and transnational digital identity—believe that total surveillance is avoidable. They believe that data protection laws and the courts can put a stop to it, assuming those courts are not yet compromised. And indeed, as recently as today, the European Court of Justice once again declared data retention inadmissible in quite a few member states, thus strengthening the principle of data erasure.
If we want privacy, if we want to preserve data protection at least in part in an increasingly digitized and logged world, then deletion is the most important data protection instrument imaginable, because the past must also be allowed to be forgotten at some point. In other words, data may only be kept for as long as it is needed and must then be deleted immediately, as the General Data Protection Regulation and, earlier, the Federal Data Protection Act stipulate.
More and more people are experiencing the pressure of commercial and government surveillance as suffocating, and in light of this, we can only rely on the principle of hope. I hope that people do not sleepwalk into a surveillance society. I hope that more and more people wake up to the fact that we are about to be saddled with an extremely dangerous transnational digital identity.