Unpacking Education & Tech Talk For Teachers
Identifying Disinformation Strategies
In today’s episode, we'll review seven general persuasion strategies used in disinformation campaigns as well as five strategies that are more specific to disinformation being shared on social media platforms. Visit AVID Open Access to learn more.
#277 — Identifying Disinformation Strategies
10 min
AVID Open Access
Paul Beckermann 0:01
Welcome to Tech Talk for Teachers. I'm your host, Paul Beckermann.
Transition Music 0:06
Check it out. Check it out. Check it out. Check it out. What's in the toolkit? What is in the toolkit? So, what's in the toolkit? Check it out.
Paul Beckermann 0:17
The topic of today's episode is Identifying Disinformation Strategies. I have to admit, it's becoming more and more difficult to tell if something I read online is real or fabricated. I find myself being skeptical about a lot of the content I see posted, especially on social media. Part of the problem is that disinformation campaigns are on the rise. People are deliberately spreading misinformation with the intent to deceive others. And that adds to the confusion. Technology continues to add to the problem as well, making it easier and faster than ever before to spread these falsehoods. In this type of misinformation environment, it's important that we take control and gain back the ability to separate truth from lies. If we're going to cut through this confusion and combat disinformation, we have to become educated on the strategies used by these bad actors in their disinformation campaigns. We must learn what these people do to fool us into believing their fabricated content. If we know what to look for, we'll be better able to spot misinformation when we see it in action. Let's start this empowerment journey by looking at seven general disinformation strategies that are commonly used to deceive and persuade.
Transition Music 1:35
Let's count it, let's count it, let's count it down.
Paul Beckermann 1:38
Number one, discrediting authoritative voices. According to the article, Four Key Ways Disinformation is Spread Online, published by the World Economic Forum, disinformation campaigns often attempt to discredit experts or authorities who support the opposing viewpoint. They might do this by spreading false accusations, connecting the authority figures to conspiracy theories, or by painting them as corrupt. If they can taint the opposition's credibility, they can make their false claims seem more believable.
Number two, impersonation. This strategy involves taking on the appearance or voice of a real person, and using that identity to spread false information. This might mean making up a fake quote and illegitimately attributing it to someone. Or it could mean using technology to create a deep fake, which is a video, picture, or audio recording of someone that has been manipulated in some way. These videos or voice recordings can make it appear as if someone is saying something that they never actually said.
Number three, creating fake personas. This is different from a deep fake, where someone takes a real person's likeness and manipulates it. In the case of a fake persona, a new person is simply invented out of thin air. These fabricated personas might then be labeled as experts or tagged with false credentials. Perhaps they're staged in photographs that place them in official or important locations. These fake personas are used to convey authority and to convince an audience that their words or point of view have merit. They're not real, but they seem convincing and authoritative.
Number four, appeal to emotion. The American Psychological Association, or APA, points out that people are much more likely to react to and share content that elicits a strong emotional response. Content that makes someone angry or fearful is especially effective.
Number five, polarization. Disinformation campaigns often try to divide people into conflicting and polarized groups. These divided identities might be defined by things like a political party, class, or race. The Brookings Institution recognizes this approach and calls the spread of fake news "a symptom of our polarized societies, where people actively seek out information that affirms their point of view, regardless of its authenticity or accuracy."
Number six, conspiracy theories. A conspiracy theory is a belief that a secret plot is being carried out by powerful people to accomplish some type of sinister goal. Dr. Karen Douglas, a professor of social psychology at the University of Kent, says people are drawn to conspiracy theories in an attempt to satisfy three important psychological motives: the need for knowledge and truth, the need to feel safe and secure, and the need to feel good about themselves both as individuals and as members of the groups they belong to. People engaging in conspiracy theories often feel like they're finding a collective truth that satisfies these needs. Because conspiracy theories are so powerful, disinformation agents may devise and amplify one to advance their agenda.
Number seven, false connections. This is the practice of putting misleading headlines on a story or article. Essentially, the headline, visual, or caption does not support or align with the content of the article. For the many information consumers who never read the article, the headline itself becomes misinformation. Those who do read the article may be subconsciously influenced by the headline, which, in turn, may shape how they interpret the content of the article.
So those are seven persuasion strategies that are used in general. Next, let's narrow in a bit and look at some strategies that are used specifically on social media platforms. These approaches often utilize the previous seven strategies, and then activate them in more specific ways to fit social media.
Transition Music 6:51
Here is your list of tips.
Paul Beckermann 5:57
Number one, micro targeting. The World Economic Forum describes this approach as the process of analyzing social media accounts in order to specifically target or direct posts toward people who are most likely to believe the content and amplify it through resharing. People might be targeted because of their affiliation with a political party or an economic demographic. Advertisers do this with products and disinformation players do it with ideas and false messaging.
Number two, astroturfing. Astroturfing involves posting overwhelming amounts of content from fake accounts, making it artificially appear that a specific point of view has more grassroots support than it actually does. Appearance can seem like reality, and if a lot of people appear to support a specific point of view, others may be persuaded that the frequently posted perspective and content must have merit.
Number three, flooding. This strategy involves flooding social media with posts and comments to drown out other perspectives with sheer quantity. While similar to astroturfing, this approach is less concerned with the perceived grassroots source and more focused on the overwhelming quantity of posts.
Number four, unsuspecting actors. The goal of this approach is to get a prominent or influential person to share misinformation. Because that person is well-known and likely respected, the post not only amplifies the message to many followers, but it also gives credibility, by association, to that person. This strategy is supported by the power law of social media, which states that: "Messages replicate most rapidly if they're targeted at relatively small numbers of influential people with large followings." In other words, when a well-known person shares misinformation, it can spread very quickly to their many followers.
And number five, bots and trolls. Both of these approaches involve insincere and misleading posts on social media accounts that are intended to persuade or influence perspectives. Bots are automated computer programs that typically target like-minded audiences who are inclined to believe the group's talking points. These people are easier to convince since they're already inclined to support that point of view. Bots, however, are also easier for social media monitoring algorithms to detect, so they're more apt to be kicked off the platform. Trolls, on the other hand, are real humans who join online conversations. Like bots, they're there to promote specific viewpoints or to spread misinformation. Because they're human, trolls are harder for the algorithms to detect, and they may be able to convince even less gullible social media users by interacting with them personally and responding directly to their posts or questions.
So there are seven general disinformation strategies as well as five that are more specific to social media spaces. I encourage you to think about these strategies as you engage in social media and other online spaces. If you see something that seems a bit outlandish or you hear an influencer spreading an inflammatory point of view, ask yourself if one of these strategies is in play. I also encourage you to help your students become more aware of these concepts. The more informed we all are collectively as a society, the better chance we'll have to combat misinformation. And our students will be participating members of our adult world very soon. Let's prepare them to interact with it with their eyes wide open. Be sure to tune in next week as I continue this conversation. We'll be going one step further and exploring ways that consumers can be protected from misinformation and disinformation.
To learn more about today's topic, and explore other free resources, visit AvidOpenAccess.org. And of course, be sure to join Rena, Winston, and me every Wednesday for our full-length podcast, Unpacking Education, where we're joined by exceptional guests and explore education topics that are important to you. Thanks for listening, take care, and thanks for all you do. You make a difference.