A woman uses her smartphone to register her personal data on a menstrual tracking app.
Atlantico: The changes to US abortion law proposed by the Supreme Court are prompting clinics to look into digital privacy. Some clinic workers say they are turning to encrypted messaging apps and Zoom meetings to leave fewer paper trails in case Roe v. Wade is overturned. How does this reflect the reality of the situation?
Fabrice Epelboin: It reflects a chillingly simple reality: the moment abortion becomes illegal, one can imagine legal proceedings that would demand the recovery of data in order to catch those who break the law. That is how the law works. It is also an opportunity to realize that all the platforms to which we hand over our digital data, without really thinking about the consequences, are paid for with our freedom. Digital activists have been saying this for ten or fifteen years, but until now it remained somewhat abstract. Now almost everyone realizes it, and it is too late. Because this is not only a matter of identifying a woman who is seeking an abortion or who has had one; a pregnancy leaves abundant digital traces that make it possible to characterize it. A few years ago, a family received promotions for diapers without understanding why, before realizing that their 16-year-old daughter was pregnant, something the marketing system had inferred from her purchases: it had compared her profile to those of hundreds of thousands of pregnant women and extrapolated. So it is very good that clinics are taking measures to reduce their patients' digital footprints, but it will not be enough, especially since we recently learned that many hospital organizations in the United States share their data with Facebook. There will be a thousand other ways to hunt these women down. And this applies to abortion as to every other subject. If I suffer from liver cancer and consult sites that use Google Analytics, Google will eventually conclude that I have liver cancer. The next step: nothing will prevent Google from providing this information to insurance companies, which will then refuse to provide me with health coverage. Worst of all, this does not necessarily involve a violation of my personal data: since the GDPR, data brokers have found ways to circumvent the regulation.
Is it possible to change course and get out of this situation?
Today it is no longer really possible to go back, but it is possible to look things in the face and perhaps develop a collective awareness. The wavering of abortion rights in the United States seems to me a remarkable opportunity for such awareness. Women's rights are a sensitive topic, and this will force us to confront the reality of the world we have been living in for more than fifteen years. It will undoubtedly translate into a setback for American women's rights; since the threat comes from the United States, it is logical that American citizens are its first victims. But it is also one of the last opportunities we will have to realize what kind of world we have entered. We have been in it for over a decade. These companies are stronger than most countries. We simply have to accept that it is too late to fight them head-on. At best, we can probably identify solutions, no doubt by strengthening the fight against corruption.
To what extent are these technologies actually used in the United States and France?
We know nothing about it at all. What is certain is that most of the GAFAM are moving into insurance. The Health Data Hub, the repository in which all of our health data is stored, has been entrusted to Microsoft, and Microsoft has invested heavily to acquire a company specialized in predictive health based on health data, a company with an eye on insurance. Tomorrow, inevitably, supplementary health coverage will be determined by our digital footprints. The current welfare-state system will decline, which is unfortunately inevitable, in favor of private insurance that not everyone will be able to afford, not necessarily for lack of means, but also because of biological inequality in the face of health problems, which artificial intelligence will be able to assess so as to insure only profitable customers. I have used this example for a long time without it making much of an impression, but now the drift is very real.
Is personal data already actually used for judicial purposes?
When Mila, the French teenager who was massively harassed online, was targeted, her harassers' data was recovered. This still works poorly in France, because the GAFAM cooperate very little with a justice system which, for its part, has remained in the 20th century and does not know how to interact with them. But prosecuting someone on the basis of their personal data is very common.
Is it possible to escape from this system?
On an individual level, there are two diametrically opposed paths. The first is to stop using digital tools altogether, though in the absence of data, insurers will likely refuse to take the risk of insuring you. The second is to understand in detail what we do with our data and how it circulates. Unfortunately, this is within the reach of only a small elite, who can lie, cheat, compartmentalize their profiles and deceive the AI. When you really understand how your personal data is produced and exploited, you have this power, but it is very complex.
Collectively, the game is already lost: governments have in any case abdicated in the face of the GAFAM. Digital sovereignty is no longer in the hands of states; with the exception of China and Russia, the rest of the world is, digitally speaking, an American colony. Our major hosting providers, from Orange to Atos via Thalès, became GAFAM franchisees during Macron's first five-year term. There is no going back.
On the business side, the problem is that the services offered by the GAFAM are practical, efficient, professional and often hard to match. In the short term, relying on them is a very good deal, one that aligns perfectly with the urgency of presenting an annual report that boosts the share price. In the long run, it means locking yourself into technologies that will eventually siphon off future profits through license fees that will rise relentlessly. The alternative is to invest the human, technical and financial resources needed to ensure one's independence, but very few companies do so, and it is hardly possible for publicly listed ones.
What made us let ourselves be trapped in this situation?
Digital illiteracy. Business leaders, political leaders and most journalists understand absolutely nothing about digital issues. It is therefore all too easy to lie, to manipulate, to defame whistleblowers, to impose GAFAM solutions, and to pass off those who warn of the risks as conspiracy theorists.
To what extent has the population accepted the situation?
I don't think they have accepted the situation; they simply don't understand it. We live in a world where the rule of law has receded dramatically. Many laws are broken. But if tomorrow the government decided to restore the rule of law on the basis of personal data, it would be a massacre. We would be able, if we really wanted to, which is not the case, to catch most tax evaders through their personal data.
Can we imagine a general revolt if social control becomes more extensive?
I don't think so; there will be other, more pressing reasons for revolt long before that. This system of social control is more insidious than others, such as China's. To some extent, we have already accepted a certain form of social censorship put in place by the social networks; we know vaguely what goes on there. Since the Cambridge Analytica affair, most people have realized that a system so effective at selling us products can also sell us ideas, and that this kind of manipulation, which was at the heart of the Cambridge Analytica affair, is practiced without restraint by the French government, without shocking anyone.
China is always cited as an example of a country where social control is strongest. Can we imagine a Western country behaving like Beijing?
All recipients of the CAF, France's family-benefits agency, are already subject to a form of social credit. We can also mention the credit score in the United States. There is already an embryo of social credit in France, the United States and Japan, all of which are democracies. China should not be seen as an exception but rather as a vanguard, one that has the relative honesty to be clear about its intentions regarding the use of these technologies for social control; the West has not been left behind. The INDECT project, which dates back a decade, had control and social stability as its stated goal, and again, this did not shock many people. From a digital standpoint, everyone, dictatorships and democracies alike, is moving in the same direction. China does it openly where France does it discreetly; that is the only difference. At most, we can say that we are lagging behind.