Decolonizing data to tackle digital authoritarianism

Image by Parker Coffman on Unsplash. Free to use under the Unsplash License.

In recent years, much has been written about the growing threat posed by governments that increasingly collect, restrict, and manipulate data for authoritarian purposes. While “data” is key to understanding the changing nature of state repression, it has a double role in digital autocracies: a tool of oppression and also a means of colonialism and assimilation.

In 2016, Thatcher, O’Sullivan, and Mahmoudi used the metaphor “data colonialism” in their research to describe how governments’ control and possession of data creates a power asymmetry between them and their citizens. According to Escalante et al., data colonialism refers to “the world’s saturation with data flows, but also to the communities and planetary spaces whose power to say no have been erased.” Despite producing enormous amounts of data, users of smartphones, computers, and other technologies have minimal ownership of that data. Meanwhile, states continue to colonize the accumulated data, which reflects almost every aspect of their citizens’ lives (such as their ethnicity, political views, religious and spiritual practices, health, and sexual orientation), for their political interests. Data colonization thus denotes a new strategy through which states and big tech companies use technology for repressive purposes.

A recent illustration of how state-led data colonialism threatens individuals, especially members of racial, ethnic, and religious minority groups, is the Chinese government’s accumulation of and access to biodata in the Xinjiang Uyghur Autonomous Region (XUAR). The government runs an extensive assimilation project in the XUAR, which seven countries, including the United States, have recognized as a cultural genocide against the minority Uyghurs. A recent leak of Chinese government records from public security bureaus in the XUAR once again revealed the militarized and inhumane nature of the detention camps, where over 1 million Uyghurs have been arbitrarily detained. The government’s colonization of biodata is a critical aspect of its assimilation project.

In 2016, the Chinese government launched the “Physicals for All” program, which requires collecting the Uyghurs’ biological data under the pretext of free healthcare. The collected data provides highly detailed biological profiles of the Uyghurs, from iris patterns to DNA samples. In so doing, Physicals for All facilitates the government’s expanding digital authoritarianism in two ways. First, it provides new opportunities for surveillance in the XUAR. Collaborating with local high-tech companies, the Chinese government has built a complex surveillance network using the Uyghurs’ biodata. Iris prints, voice patterns, and other biological markers are used to develop new surveillance technologies that can easily identify, classify, and label minorities. The government thus practices an expanded version of digital authoritarianism to oppress minorities in the region. Second, the government’s monopoly over biodata enables the assimilation of the Uyghurs: officials use it to identify women who are pregnant or deemed to “need” forced abortion and involuntary sterilization. Hence, the state’s data colonialism opens the way for oppression, assimilation, and ongoing atrocities in the XUAR.

Data colonialism resembles traditional colonialism in its appropriation of human life. States use their ownership of data to regulate the behaviors and the cultural and religious practices of minorities. The Citizen Lab’s recent report on digital transnational repression shows that digital autocracies take advantage of colonized data to regulate and oppress their citizens even when they are abroad. According to the report, countries like China, Saudi Arabia, Syria, and Vietnam use various technologies to silence and intimidate anti-government activists overseas. The activists interviewed in the report describe the emotional stress and insecurity this technological oppression causes. Their governments’ transnational repression pushes them to change their behavior in many ways: “keeping a low profile online, posting pictures of specific locations only after leaving them, and asking that conference biographies be kept offline,” among others. They also describe the digital targeting of their families. One Uyghur activist abroad states that the Chinese government forced his relatives to act as proxies, gathering information about him during their WeChat conversations.

In light of these examples, there is a need to decolonize data in order to confront digital authoritarianism. But how? A primary strategy could be encouraging people to become creators of their own technological tools rather than mere consumers. Although many people hold technical skills beyond basic coding, most still rely heavily on ready-made software and algorithms. Users of technology, especially those from minority and Indigenous groups, should be able to participate actively in technology-making processes. This goal can only be achieved through a redesigned, more inclusive education system that provides the tools individuals need for technical development.

Another strategy could be holding authoritarian countries accountable when they colonize data for repressive purposes. The international community should advocate for stronger technology laws and regulations, monitored by political and civil society initiatives. Technology develops fast, while creating tech policies and laws takes much longer; as a result, autocracies exploit the current regulatory gaps. More responsive political and legal regulation can reduce the risk of future violations. In this way, even if it cannot be eliminated immediately, the degree of data colonialism can be lessened.


Please visit the project page for more pieces from the Unfreedom Monitor.
