A New Kind of War: From Russia, With Disinformation, Part II


This is the second segment of a three-part series exploring how Russian disinformation has impacted and continues to impact U.S. elections. Catch up with Part I, in which Powell explored the roots of Russia’s information warfare and how it pushed propaganda to the U.S. electorate across social media platforms ahead of the 2016 presidential election.

Trolls and Bot Armies

When we think of cyberwarfare, we usually picture something measurable and detectable, like a malicious virus spreading among computers and altering how they operate. Russia’s information warfare is different: it was designed to target social media sites. Russia created thousands of fake user accounts, called “bots” (automated accounts, short for “robot”) and “trolls” (accounts run by paid human operators), to bolster its false news stories, share propaganda, and work on users at an emotional level. These accounts are interactive and engage real social media users in real time.

Wikipedia defines a troll as a person who posts inflammatory, extraneous, or off-topic messages in an online discussion forum, chat room, or blog, with the primary intent of provoking readers into an emotional response or otherwise disrupting normal on-topic discussion.

Bot and troll accounts are set up to disseminate propaganda and target people with specific messaging designed to inflame tensions. Unlike with computer viruses, there is no downloadable software or anything tangible to protect us from these attacks. You can use internet services to check whether an account is behaving like a bot, but beyond that, all you can do to distance yourself from bot activity is block and report suspicious accounts.
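What do those checker services actually look at? Bot Sentinel and similar sites don’t publish their exact models, but as a rough sketch, here are the kinds of behavioral signals bot-detection research commonly flags. To be clear: every threshold and weight below is invented purely for illustration and is not Bot Sentinel’s algorithm.

```python
# Illustrative sketch only: a few behavioral signals commonly cited in
# bot-detection research. This is NOT Bot Sentinel's actual algorithm;
# the thresholds and weights below are invented for demonstration.

from dataclasses import dataclass

@dataclass
class Account:
    age_days: int          # how long the account has existed
    tweets_per_day: float  # average posting rate
    followers: int
    following: int
    pct_retweets: float    # fraction of activity that is retweets (0.0-1.0)

def bot_likeness_score(acct: Account) -> float:
    """Return a 0.0-1.0 score; higher means more bot-like behavior."""
    score = 0.0
    if acct.age_days < 90:            # very new account: mild flag
        score += 0.2
    if acct.tweets_per_day > 50:      # superhuman posting volume
        score += 0.3
    if acct.following > 0 and acct.followers / acct.following < 0.1:
        score += 0.2                  # follows many, followed by few
    if acct.pct_retweets > 0.9:       # almost pure amplification
        score += 0.3
    return min(score, 1.0)

# Example: a week-old account posting 120 times a day, mostly retweets
suspect = Account(age_days=7, tweets_per_day=120,
                  followers=15, following=800, pct_retweets=0.95)
print(f"bot-likeness: {bot_likeness_score(suspect):.2f}")  # -> 1.00
```

Real detectors combine far more signals than this, including content patterns, network structure, and timing, but the intuition is the same: bots behave in ways humans rarely do.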

I first became aware of this type of information warfare in late 2016. A friend suggested that I use Twitter to find out about the Women’s March that was planned for January. I was pleasantly surprised to find thousands of people uniting and sharing information. I wasn’t prepared for the trolls and bots. At first glance, it appeared that people were attacking one another, but as I scrolled through these posts, I began to see people who were aware that certain accounts were not actual users, but bots and trolls. I was intrigued and also grateful for the users who pointed out those accounts that were likely automated — the ones to “block and report.” It made discerning fact from fiction easier. 

Since then, I have continued to be on the lookout for bots and trolls. Here is a screen capture of my Twitter post responding to an Atlantic story about Donald Trump’s refusal to visit a World War I cemetery in France and his disparaging remarks about soldiers who died serving our country.

[Screenshot: my tweet, alongside the reply from a suspected bot account]

What I posted was, admittedly, emotional. Next to it is a reply from what I suspect was a bot trying to provoke an emotional response. Why did I suspect this? The tweet cited Breitbart News as its source, and Russian bots and trolls were found to be active in boosting stories from conservative-leaning outlets like Breitbart News and Sputnik News. When I ran the account through the Bot Sentinel website, it came up as “Problematic.” Translation: likely a bot or troll.

Social media is a perfect hunting ground for bots and trolls. The biggest platforms we use to share personal stories, opinions, and family photos have been collecting data on us all along. That in itself isn’t surprising; what is surprising is how that data has been misused. Facebook, the world’s largest social media platform with 2.4 billion users, allowed Russian apps connected to Mail.ru (a Russian internet company with apparent ties to the Kremlin) onto its site, where they collected data from users as well as their friends. Facebook user data also flowed to Cambridge Analytica, a political consulting firm that used it to build PsyOps-style targeting programs: determining whom to target and what kind of propaganda would work on specific individuals. Then the bot and troll armies were sent out.

Researchers from the RAND Corporation drive home how Russia’s speed at disseminating information through bots and trolls lets it get ahead of stories, gaining the edge in shaping the narrative around the facts, or even the interpretation of events.

“Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. This propaganda includes text, video, audio, and still imagery propagated via the Internet, social media, satellite television, and traditional radio and television broadcasting. The producers and disseminators include a substantial force of paid Internet ‘trolls’ who also often attack or undermine views or information that runs counter to Russian themes, doing so through online chat rooms, discussion forums, and comments sections on news and other websites. Radio Free Europe/Radio Liberty reports that ‘there are thousands of fake accounts on Twitter, Facebook, LiveJournal, and vKontakte’ maintained by Russian propagandists. According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.”
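Do the arithmetic on that quota: 135 comments over a 12-hour shift works out to roughly one comment every five minutes, hour after hour, for every single troll on duty.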

Every time Trump lies, bots and trolls quickly begin to parrot his words. By using bots and trolls to spread and amplify propaganda, these PsyOps attacks can take root and do irreparable harm. At some point the lies become so ingrained in our social behavior that the technology is no longer necessary: once a belief system has been altered, the people who have absorbed the propaganda will share and elevate it themselves.

Which is more dangerous: having your computer hacked, or your mind?


Come back tomorrow to read Part III, which focuses on how Donald Trump uses the same tactics as Russia to spread disinformation to the American people.



