Facebook removes Ukrainian pro-government and opposition networks for ‘coordinated inauthentic behaviour’


Facebook found coordinated influence campaigns with ties to Ukrainian pro-government and opposition politicians. Image by Pete Linforth from Pixabay.

In a May 6 report, Facebook said it had removed two networks of accounts targeting Ukrainian users for ‘coordinated inauthentic behaviour’ – a term the platform uses for accounts that were set up to engage in coordinated manipulation of public opinion or to share disinformation. The intriguing part? One of the groups was found to have ties to Ukraine's ruling party, Sluha Narodu (Servant of the People).

Digital influence goes mainstream

Ukraine is no stranger to social media manipulation: botnets, astroturfing in the comments, coordinated account blocking, and sock puppet Facebook accounts have all appeared in its online public sphere. Though most suspicious activity is said to originate outside of the country (GV's own investigations have found bot accounts and fake websites linking back to Russia), the country's own political players are also increasingly competing for influence online.

According to a 2019 report on the snap parliamentary elections in Ukraine by OPORA, a civil rights network and election monitor, online political advertising was used by all political parties, which collectively spent roughly 1.8 million US dollars on Facebook ads. Though Ukrainian electoral laws don't regulate online political advertising, these materials are guided by a specific set of rules on Facebook and are officially stored in the platform's Political Advertising Library. This allowed OPORA researchers to analyse the ads and assess party spending on online PR.

Beyond official political ads, however, experts suggest the Ukrainian political social media space is rife with “black PR” and emotionally charged orchestrated comment campaigns. Identifying them as such is tricky, as they are not subject to the same transparency requirements as mainstream political ads. This is why official notice from Facebook and other platforms about such activities is important.

What did Facebook remove?

In its report, Facebook indicated it had removed two different networks of accounts that had engaged in coordinated inauthentic behaviour. The first one consisted of “105 Facebook accounts, 24 pages and five Instagram accounts that originated in and targeted Ukraine”. Facebook said the suspicious activity came to their attention through an investigation by Bukvy, an independent news media outlet in Ukraine.

The people running the network combined fake and authentic accounts to share a mix of legitimate news and their own content with their 23,000 followers, and also managed what looked like news pages to drive traffic to websites posing as legitimate media sources. Although the people behind this activity attempted to conceal their identities and the coordinated actions, Facebook's investigation found links to individuals associated with Sluha Narodu (Servant of the People), the ruling political party in Ukraine on whose ticket President Zelenskyy was elected and whose title mirrors the TV series which propelled him to fame.

The people behind this activity posted in Ukrainian and Russian about news and other topics, including corruption, politics, the automotive industry, satirical content about COVID-19, supportive commentary about the Sluha Narodu party and the current Ukrainian government, and criticism of opposition parties and politicians, including former president Poroshenko and the mayor of Kyiv, Vitali Klitschko.


An example of content shared by the account network tied to the Ukrainian Sluha Narodu party. Image: press handout by Facebook.

Another, significantly larger group of accounts was identified and removed by Facebook on a tip from the U.S. Federal Bureau of Investigation. A total of 477 Facebook accounts, 363 pages, 35 groups (followed by 2.37 million people) and 29 Instagram accounts (with a following of fewer than 30,000) that originated in and targeted Ukraine were taken down. The accounts also spent close to half a million US dollars on Facebook advertising to promote their posts.

Facebook's investigation attributed this network to “entities and individuals sanctioned by the US Treasury Department” in January 2021 — namely, politicians Andrii Derkach, Petro Zhuravel, and media companies linked to the sanctioned Begemot Media, as well as political consultants associated with two other political actors: Volodymyr Groysman and Oleg Kulinich.

While Derkach and others were sanctioned for alleged interference in the U.S. elections, their Facebook activity in this case focused solely on Ukrainian audiences and themes and was part of a larger “deceptive influence operation” encompassing different social media platforms and websites.

They built a ready-made network of seemingly independent media websites and social media assets that worked to promote content favourable to the three politicians and their political groups, while also engaging in similar behaviour, likely for hire, in support of other political actors across the entire political spectrum, including competing political parties in Ukraine. The operation consistently posted anti-Russia content.

Ben Nimmo, Facebook's Global IO [influence operations – GV] Threat Intelligence Lead, noted in the report that Ukraine is “among the top sources” of coordinated inauthentic activity that the platform has identified and removed in recent years and that “the vast majority of deceptive campaigns” have targeted domestic audiences. In Nimmo's view, this signals a worrying development for “the burgeoning industry of what we call IO-for-hire that offers media and social media services involving inauthentic assets and deceptive amplification across the internet”.

At the same time, Nimmo said Facebook recognises the role of non-governmental organisations and independent investigative reporters in Ukraine “who continue to uncover and report on these deceptive campaigns” and suggested it would be worthwhile to assess the impact these public interventions are having on the success or failure of such influence campaigns.
