For internet and technology companies, collecting user data remains one of their main sources of income. But this business model includes a security risk for users, as demonstrated by recurring cases of undisclosed commercial use, massive leaks, and hacking incidents. Is there a credible solution for strengthening users’ privacy?
Companies such as Google and Apple focus on collecting daily user data, mostly via mobile phones, combining information from applications over time – the user's calendar and agenda, for example. A multitude of apps track the user's location in real time, while health and sports apps dig deep into their biometric information. This data is collected and analyzed, allegedly to offer more tailored and sophisticated services. In fact, most users do not realize they are handing over a wealth of data to service providers and platform owners, free of charge.
Privacy activists, such as the Austrian lawyer Max Schrems, have expressed strong concerns about this model. They highlight the risks of increasingly frequent privacy violations and abuses. This was perhaps best illustrated by the Facebook scandal known as the Cambridge Analytica case – in which British consulting firm Cambridge Analytica obtained the personal data of 87 million Facebook users without consent, in order to “provide analytical assistance to the 2016 presidential campaigns of Ted Cruz and Donald Trump”.
Schrems says he warned Facebook representatives about the data-mining activities of Cambridge Analytica, but could not convince them to act:
They [Facebook representatives] explicitly said that in their view, by using the platform you consent to a situation where other people can install an app and gather your data.
But why would you care about privacy if you have nothing to hide, after all? Whistleblower Edward Snowden answered this question in a Reddit discussion in 2015:
Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.
Real risks linked to the use of information technology platforms
French software engineer and data expert Gaël Duval has been involved for years in free software development, including the Mandrake Linux distribution – an operating system (based on a Linux kernel) which can legally be modified and shared with others.
Duval then decided to build an OS that would provide mobile phone users with heightened protection of their data: /e/OS.
Global Voices spoke to him to understand how communication technology impacts lives, presenting both an opportunity and a risk. Here is his view on the evolution of such technology:
This is a philosophical question. I personally have very mixed feelings about it because I've always been passionate about technology. But at the same time, I feel that sometimes it's too much, and I miss the time when you had to find a phone booth to have a call. It was probably a more carefree and [slower-paced] life. Younger people might be surprised to know that until I was five, there was no phone at home and no television. So sometimes I feel I lived a part of my life in a totally different world, that doesn't exist anymore. On the other hand, it's really exciting to see what we can do with modern technology, like having an HD video call with someone on the other side of the planet, and seeing all those electric cars that, at least, are not burning petrol and [filling] our lungs with the exhaust fumes.
Besides the seductive dangers of nostalgia for those who still remember analog times, we are also facing a real risk of dependence on information technology – a 2018 study linked excessive smartphone use by children with behavioral problems, including attention deficit disorder (ADD) and depression. A survey published in 2020 by Common Sense Media found that 50 percent of teenagers in the Los Angeles area feel dependent on their smartphones.
The risk inherent in our use of such technology was recently openly acknowledged by insiders from the industry in the Netflix documentary The Social Dilemma, which includes testimonials from former employees of Big Tech – including Google, Twitter and Facebook – explaining how they purposefully nurtured user addiction for profit.
Some governments have reacted by upgrading protective legislation in order to both raise user awareness and place more responsibility on technology companies. In May 2018, the European Union's General Data Protection Regulation (GDPR) came into force. The law adds multiple constraints to data management, such as requiring users' explicit consent for the use of their data and requiring companies to delete personal data once it is no longer needed for the purpose for which it was collected. It also introduces extremely large fines for those who do not respect these rules. But its enforcement is limited by a lack of resources among national data protection authorities, and it is, of course, only applicable in EU member states.
A tool to empower users
This current climate convinced Duval of the need to create a tool that would allow people to take control of their own data, as he explains:
Our slogan is “Your data is YOUR data,” because our personal data belongs to us, and those who pretend that it shouldn't are either against freedom and democracy, or they have a business that is fuelled by advertising – because personal data can help sell ads at a much higher price.
This is how the OS he created operates:
/e/ is a digital ecosystem that provides a smartphone operating system that doesn't send [to Google] any piece of your personal data, like your searches, your geolocation… and that respects users’ data privacy. It doesn't look at the user's data for any purpose. It also provides basic online services such as an email address, some storage, a calendar, a way to store your contacts – everything linked with the smartphone operating system.
Duval said that when it comes to personal data, Google and Apple are in the same boat – this data fuels Google's business model, which is essentially based on selling advertising, while Apple, despite claiming to protect its users’ privacy, receives an estimated 8 billion to 12 billion US dollars each year to pre-install Google search on iPhones and iPads.
Using an iPhone, a user sends about 6MB of personal data to Google, per day. It's double [that amount] for Android users. Besides, Apple hardware is a closed box, without any transparency about what's happening inside. You have to trust them. We, on the other hand, support “auditable privacy”: all the /e/OS and the cloud software source code (the “recipe” for building the products) is open-source. It can be challenged by specialists and audited.
In a context of growing dependence on smartphones, it is clear that protective legislation alone is not enough to raise awareness and equip users with the right tools and knowledge to protect their data privacy – and this is where a digital tool that makes users more responsible and proactive can play an important role.