In contrast with amateur trolls, professional attackers of democracy are far more subtle and effective. They start by posting or tweeting warm, positive messages designed to build a social media following. Rolling Stone writes:
Professional trolls are good at their job. They have studied us. They understand how to harness our biases (and hashtags) for their own purposes. They know what pressure points to push and how best to drive us to distrust our neighbors. The professionals know you catch more flies with honey. They don’t go to social media looking for a fight; they go looking for new best friends. And they have found them.
Disinformation operations aren’t typically fake news or outright lies. Disinformation is most often simply spin. Spin is hard to spot and easy to believe, especially if you are already inclined to do so. While the rest of the world learned how to conduct a modern disinformation campaign from the Russians, it is from the world of public relations and advertising that the IRA learned their craft. To appreciate the influence and potential of Russian disinformation, we need to view them less as Boris and Natasha and more like Don Draper.
As good marketers, professional trolls manipulate our emotions subtly. In fall 2018, for example, a Russian account we identified called @PoliteMelanie re-crafted an old urban legend, tweeting: “My cousin is studying sociology in university. Last week she and her classmates polled over 1,000 conservative Christians. ‘What would you do if you discovered that your child was a homo sapiens?’ 55% said they would disown them and force them to leave their home.” This tweet, which suggested conservative Christians are not only homophobic but also ignorant, was subtle enough to not feel overtly hateful, but was also aimed directly at multiple cultural stress points, driving a wedge at the point where religiosity and ideology meet. The tweet was also wildly successful, receiving more than 90,000 retweets and nearly 300,000 likes.
This tweet didn’t seek to anger conservative Christians or to provoke Trump supporters. She wasn’t even talking to them. Melanie’s 20,000 followers, painstakingly built, weren’t from #MAGA America (Russia has other accounts targeting them). Rather, Melanie’s audience was made up of educated, urban, left-wing Americans harboring a touch of self-righteousness. She wasn’t selling her audience a candidate or a position — she was selling an emotion. Melanie was selling disgust. The Russians know that, in political warfare, disgust is a more powerful tool than anger. Anger drives people to the polls; disgust drives countries apart. (emphasis added)
The researchers, Darren Linvill, associate professor of communication, and Patrick Warren, associate professor of economics, discussed their research with KUOW, an NPR affiliate station, in a nine-minute interview. KUOW writes:
To stop trolls from exploiting existing tensions in American society, he says people need to question why we’re seeing certain messages and the consequences of sharing them before hitting retweet.
“I think that there’s a lot that you can do,” Warren says. “If you’re mindful of the origins of the information you’re sharing, it can make a big difference.”
Linvill: “... I think it doesn’t ultimately [matter] if it’s a Russian troll or an Iranian troll or a Chinese troll, I think one needs to be careful when you’re interacting with anonymous accounts not to retweet someone just because they use the same hashtag as you did and you agree with them, but also not accuse people of being Russian trolls just because you disagree with them. I think that’s one of the biggest impacts of Russian disinformation is that we don’t trust each other anymore and it’s really dangerous and it’s a lasting impact.”
Warren: “I think it’s important to realize that when you share something on social media, you’re doing two things. You’re sharing a message, but you’re also bringing prominence to the account you’re sharing. And so the question you should be asking yourself often on social media, in addition to the obvious question that we all start with, which is: Is this real or not? The next question you should be asking yourself is, why am I seeing this? Algorithms kind of rule our lives on social media. And what these guys are trying to do is get people who shouldn’t be central to the conversation to become more central to the conversation due to their gaming of the algorithm.”
Defensive disinformation vs. offensive disinformation
Defensive disinformation is used by professional government trolls to deny, distort, and distract from information the government wants to hide. For example, the Saudi Arabian government ran botnet trolls on Twitter that falsely denied the Saudi government murdered journalist Jamal Khashoggi.

By contrast, offensive disinformation is content specifically designed to manipulate emotions and attitudes by focusing on social stress points and playing on personal ideology. This kind of propaganda focuses on what is important to the people in the target country, not in the troll farm's country. The goal is to reinforce differences in existing attitudes and beliefs and to use those differences to foment social division and distrust in institutions (e.g., the professional media), in fellow citizens, and in out-groups.
The ideology target
In a previous discussion here, I attacked political ideologies as a factor that significantly contributes to, or directly causes, major social and political problems. Strongly held ideological beliefs make it much easier to reject inconvenient facts, truths, and sound reasoning. The research discussed in this OP makes it clear that professional trolls intentionally reinforce and then target ideological differences to foment social distrust and discord.

For self-defense against troll manipulation, the researchers suggest asking some self-reflection questions when you are confronted with social media content from an unfamiliar source. First, ask yourself: is this true? For ideologues, believing a lie is easy when the lie fits personal ideological belief. Second, ask: why am I seeing this? Trolls know how to manipulate the algorithm. Third, ask: what impact would sharing or upvoting this have on others? That question calls for a measure of empathy, which in a way is the opposite of the self-righteous belief that troll lies and manipulation so easily reinforce.