
Unfreedom Monitor Report: Information

Categories: Advocacy, Disinformation, Media, Tech & Tools, War & Conflict, Unfreedom Monitor

Image courtesy Ameya Nagarajan

Authoritarian regimes have long had a complicated relationship with media and communications technologies. The Unfreedom Monitor [1] is a Global Voices Advox research initiative examining the growing phenomenon of networked or digital authoritarianism. This extract of the report on Information is part of the series of reports to come out of the Unfreedom Monitor research. Read the full report here [2].

Information manipulation is a common practice in the countries selected for this report. Generally, dis/misinformation campaigns in these countries are state-backed and benefit either dominant or competing political actors. Disinformation becomes a weapon in the fight for power in societies polarised by contentious political events (e.g. coups, elections, changes of government and protest mobilisation). In most cases, disinformation aims to compromise political opponents, highlight the achievements of the political regime or suppress dissent. A crucial factor in these societies is the combination of comparatively high internet penetration and public enthusiasm for social media with a degree of traditional media unfreedom. Disinformation strategies, however, are becoming not just a tool of domestic politics but also a way to project political influence across borders.

Although digital authoritarian strategies are driven and delivered by modern IT tools, in essence they resemble classic propaganda techniques. The tactics aim to legitimise certain narratives by injecting them into the media ecosystem and then repeating them until they become the new common sense for the population. This approach is amplified by internet technology, which allows the creation of credible-looking media ecosystems that mimic real ones and are populated with inauthentic accounts imitating human online behaviour.

Meanwhile, advertising targeting on social media creates opportunities to run influence campaigns that now stretch well beyond the Cambridge Analytica case. It is debatable, however, whether targeted communication alone can change or form an attitude. Instead, the audience must be prepared to change their attitude as a result of what Jacques Ellul conceptualised back in the 1960s [3] as pre-propaganda: “the conditioning of minds with vast amounts of incoherent information, already dispensed for ulterior purposes and posing as ‘facts’ and as ‘education’.”

This term is relevant to today’s digitised strategies of information manipulation. As the overview of disinformation cases and practices shows, disinformation campaigns are becoming more sophisticated and dispersed, aiming to prepare the audience to accept a certain point of view. As Ellul writes, pre-propaganda, “without direct or noticeable aggression is limited to creating ambiguities, reducing prejudices, and spreading images, apparently without purpose.” As with disinformation, the primary effect is psychological: creating an alternate picture of reality for the individual.

To summarise, three factors define information manipulation in the observed countries today. First, the technical capabilities of disinformation are continuously enhanced, especially using AI, to overcome the measures taken by IT platforms and to make disinformation look more trustworthy. Second, the grand strategies and narratives behind disinformation are becoming more complex, as narratives and tactics are deployed for an amplified psychological effect, such as creating mistrust or raising doubt; they are used much like pre-propaganda and propaganda itself. The third factor remains the underlying traits of human nature, such as lazy thinking and the tendency to consume more emotional content.

There are two main streams of discussion on the future of tackling disinformation. The first suggests that the most promising tool for fighting disinformation is empowering societies through continuous media literacy and overall improvement of the quality of the media environment; incentivising high-quality journalism and supporting civil society are among the primary measures. The second focuses on raising the quality of the content circulated on platforms, including social media. In addition to continuous investment in content moderation, one recommendation is to prioritise authentic and high-quality content [4]. Some such initiatives have already been run by companies, including Google [5].

Yet, as the cases show, eliminating political disinformation in authoritarian countries where the information field remains under state control may be unrealistic. Although much of the public uses global social media platforms, which can be moderated according to the platforms’ own standards, many countries block these platforms, limit access with legislative tools, or grow domestic alternatives. Disinformation thus becomes a working tool of digital authoritarianism, using digital means to indoctrinate the population with narratives favoured by the state. At the same time, the discourse on disinformation becomes a tool of ongoing repression of freedom of speech, as states use fake news legislation to silence dissent. However, the development of blockchain technology may empower content creators and digital activists in these countries to fact-check independently and incentivise the creation of authentic content.

Read the full report here [2].
