Let’s Talk About Bots

Let’s talk about bots.

I just spent two days tracking a massive troll/bot network on Facebook. The network contained thousands of profiles from around the world, but not all of them were fake, so let’s go over how bots actually function.

First, bots vs. trolls vs. trolling:
A bot is a fake profile programmed to post and respond automatically.
A troll is a fake account run by a real person (sometimes known as a sock puppet identity or impostor account).
Trolling is a conversation habit real people with real profiles use to annoy other real people.

Bots are common on Twitter due to the brevity of tweets and the ease of the retweet system. Trolls are more common on Facebook, which allows for more flexibility in comments. From a user’s perspective, it can be very difficult to tell the difference between the two, but there are a few tell-tale signs of a fake profile.
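To make those tell-tale signs concrete, here is a minimal sketch of the kind of heuristics bot-detection tools lean on: account age, follower/following ratio, and posting rate. The profile fields and thresholds are illustrative assumptions, not any real platform’s API.

```python
from datetime import datetime, timezone

def suspicion_score(profile: dict, now: datetime) -> int:
    """Score a profile on a few common bot heuristics.

    `profile` is a plain dict with illustrative fields; real platform
    APIs name and expose these differently. Each tripped heuristic
    adds a point; a higher total means a more suspicious profile.
    """
    score = 0
    age_days = max((now - profile["created_at"]).days, 1)

    # Very young accounts are a classic sign of a throwaway profile.
    if age_days < 30:
        score += 1

    # Following far more accounts than follow back suggests mass-follow spam.
    if profile["following"] > 10 * max(profile["followers"], 1):
        score += 1

    # Posting dozens of times a day, every day, is hard for a human.
    if profile["total_posts"] / age_days > 50:
        score += 1

    return score

# Example: a week-old account that follows thousands and posts ~100x a day.
account = {
    "created_at": datetime(2018, 3, 22, tzinfo=timezone.utc),
    "followers": 12,
    "following": 4800,
    "total_posts": 700,
}
print(suspicion_score(account, now=datetime(2018, 3, 29, tzinfo=timezone.utc)))  # 3
```

No single heuristic is conclusive; it is the combination of signals that should make you look closer.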

Now that we have that established, there are three different types of bots (also known as Sybil accounts):
1) Pay bots: These accounts share something like a news article but attach a link to another page that is paying for the service, boosting traffic to that page. High-traffic pages can be bought and sold.

2) Spam bots: These spread spam. The advertisements are annoying but generally harmless, except when the accounts operate as a network. Spam troll/bot networks attack pages and groups to establish connections, then later send spam or request money from individual users. The FBI is currently investigating this pattern, and these networks are often of foreign origin.

3) Influence bots: These are dangerous. These are bots used to control and distort public perception and dialogue. They will often hijack trending hashtags to manipulate meaning and content. They can spread radicalism, fake news, and propaganda, and are regularly used in disinformation campaigns. Russia is not the only actor using influence bots for propaganda, but it is the best at it. The act of manipulating and controlling the perception of public consensus is known as “astroturfing” (Rubin, 2017).
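To see what hijacking a trending hashtag looks like mechanically, here is a small self-contained sketch that flags a hashtag when several distinct accounts push near-identical text. The sample posts, the field layout, and the threshold are made up for illustration; real detection pipelines are far more sophisticated.

```python
from collections import defaultdict

def flag_astroturfing(posts, min_accounts=3):
    """Flag hashtags where several distinct accounts post near-identical text.

    `posts` is a list of (account, hashtag, text) tuples; real data would
    come from a platform API. Organic use of a hashtag produces varied
    wording, while coordinated campaigns repeat the same message.
    """
    # Group accounts by (hashtag, whitespace/case-normalized post text).
    repeats = defaultdict(set)
    for account, hashtag, text in posts:
        key = (hashtag, " ".join(text.lower().split()))
        repeats[key].add(account)

    # One message echoed verbatim by many accounts is the tell-tale pattern.
    return {
        hashtag
        for (hashtag, _text), accounts in repeats.items()
        if len(accounts) >= min_accounts
    }

sample = [
    ("user_a", "#election", "The polls are RIGGED, share before it's deleted!"),
    ("user_b", "#election", "The polls are RIGGED, share before it's deleted!"),
    ("user_c", "#election", "the polls are rigged,  share before it's deleted!"),
    ("user_d", "#election", "Just voted, long line but worth it."),
]
print(flag_astroturfing(sample))  # {'#election'}
```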

You can best protect yourself on Facebook by not accepting friend requests from people you don’t physically know; if you do, you risk exposing yourself to scammers and propagandists. On Twitter, I strongly recommend connecting your profile to a bot detection program like Botometer (formerly known as BotOrNot). DO NOT use anything still calling itself BotOrNot: the name was hijacked by a fake program that tags legitimate news sites as fake news (Stewart, Arif, & Starbird, 2018).
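If you would rather check accounts programmatically than through the website, OSoMe also publishes a botometer Python package. Here is a minimal sketch; the credentials are placeholders for your own Twitter app keys, and the exact parameter names and result fields may vary between package versions.

```python
# pip install botometer
import botometer

# Placeholder credentials: replace with your own Twitter app keys
# and your API key for the Botometer endpoint.
twitter_app_auth = {
    "consumer_key": "YOUR_CONSUMER_KEY",
    "consumer_secret": "YOUR_CONSUMER_SECRET",
    "access_token": "YOUR_ACCESS_TOKEN",
    "access_token_secret": "YOUR_ACCESS_TOKEN_SECRET",
}

bom = botometer.Botometer(
    wait_on_ratelimit=True,
    mashape_key="YOUR_BOTOMETER_API_KEY",  # key parameter name varies by version
    **twitter_app_auth,
)

# Returns a dict of scores; higher values mean more bot-like behavior.
result = bom.check_account("@example_handle")
print(result["scores"])
```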


Because of the rapidly changing nature of the digital realm, tactics of information distortion change just as quickly. Be vigilant and watch for methods of manipulation.

Don’t wind up part of a troll/bot network.


References

Botometer® by OSoMe. (n.d.). Retrieved March 29, 2018, from https://botometer.iuni.iu.edu/#!/faq%23twitter-login

Davis, C. A., Varol, O., Ferrara, E., Flammini, A., & Menczer, F. (2016). BotOrNot: A system to evaluate social bots. In Proceedings of the 25th International Conference Companion on World Wide Web (pp. 273–274). https://doi.org/10.1145/2872518.2889302

Rubin, V. L. (2017). Deception detection and rumor debunking for social media. In L. Sloan & A. Quan-Haase (Eds.), The SAGE Handbook of Social Media Research Methods. London: SAGE. https://uk.sagepub.com/en-gb/eur/the-sage-handbook-of-social-media-research-methods/book245370

Stewart, L. G., Arif, A., & Starbird, K. (2018). Examining Trolls and Polarization with a Retweet Network. https://doi.org/10.475/123_4
