First, a look back at the good old days of 2018, when there was a social media site called “Twitter” that was, at least somewhat, trying to do something about the spread of misinformation.
Sloan School of Management, MIT, March 2018:
A new study published in Science finds that false news online travels “farther, faster, deeper, and more broadly than the truth.” And the effect is more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information.
Falsehoods are 70 percent more likely to be retweeted on Twitter than the truth, researchers found. And false news reached 1,500 people about six times faster than the truth.
The study, by Soroush Vosoughi and associate professor Deb Roy, both of the MIT Media Lab, and MIT Sloan professor Sinan Aral, is the largest-ever longitudinal study of the spread of false news online. It uses the term “false news” instead of “fake news” because the latter “has lost all connection to the actual veracity of the information presented, rendering it meaningless for use in academic classification,” the authors write.
To track the spread of news, the researchers investigated all the true and false news stories verified by six independent fact-checking organizations that were distributed on Twitter from 2006 to 2017. They studied approximately 126,000 cascades — defined as “instances of a rumor spreading pattern that exhibits an unbroken retweet chain with a common, singular origin” — on Twitter about contested news stories tweeted by 3 million people more than 4.5 million times. Twitter provided both data access and funding for the study.
Now, an educator is trying a new approach to making students social-media literate.
Our beliefs inform our decisions and actions, so being misinformed can have serious consequences. Yet while protecting ourselves from misinformation is essential, trying to debunk every false claim after it pops up can feel like an overwhelming and endless game of Whack-A-Mole.
Thankfully, inoculation theory may provide one solution (McGuire and Papageorgis 1961). Similar to how a vaccine builds immunity to a pathogen by exposing our bodies to a weakened form of the pathogen, we can build immunity to misinformation by exposing our minds to a weakened form of misinformation. We can deliver misinformation in a weakened form by combining two elements: a warning of the threat of being misled and counterarguments or refutations explaining how the misinformation is misleading (Traberg et al. 2022). A growing body of evidence shows that inoculation can train our minds to identify (and therefore not fall for) misinformation and has the potential for large-scale use with long-term protection (Traberg et al. 2022).
There are a variety of types of inoculation (see Table 1). The primary inoculation methods are either fact-based or technique-based (a less studied type is source-based). Fact-based inoculation corrects misinformation with factual explanations and is therefore limited to a particular topic. Technique-based inoculation explains the strategies used to mislead, such as logical fallacies or rhetorical techniques, providing resistance against the same techniques in different types of misinformation (Cook et al. 2017).
Inoculation delivery mechanisms can be either passive or active (another less studied type is experiential). Passive inoculation occurs when the facts or techniques used to mislead are explained to the audience, while active inoculation builds mental immunity by getting people to actively create the misinformation themselves (Roozenbeek and van der Linden 2019).
While it’s possible to debunk specific examples of misinformation after exposure, it would be virtually impossible to keep up with the firehose of misinformation. Who has the time or energy? Instead, we can prebunk against broad categories of misinformation by inoculating students against the most common techniques used to mislead.
Imagine a young child watching a magic trick for the first time. If we wanted to help her understand that there was no magic involved, just sleight of hand, we could simply explain what the magician did. But to really help her master how and why she was fooled, we could teach her how to perform the trick herself. These two approaches—passively explaining the trick versus actively doing the trick—can both inoculate against strategies magicians use to mislead. But no explanation can replace the perspective gained from performing the magic trick. (Plus, it’s a lot more fun!)
Classroom activities that combine an active delivery mechanism and technique-based methodology are an effective way to inoculate students against broad categories of misinformation.
In “Please Don’t Fail Me,” students are instructed to imagine that it’s the end of the semester, and they’re failing because they didn’t do the work. On a discussion forum, they are to write an email to the professor to argue why they should pass the class anyway, using at least four fallacies from the lecture. Importantly, students are told to have fun, an instruction many of them heed quite well.
Consider the following “email”:
Hey Prof Trecek-King,
So, I couldn’t help but notice you gave me a 34% for the year, and see, I’m gonna need you to bring that grade up a little bit. The reason my grades have been slipping lately is actually because my uncle’s friend’s kid’s dog just had babies, and one of them got hit by a car. My car. I accidentally killed my uncle’s friend’s kid’s dog and now my uncle’s friend’s kid is depressed, which honestly has been weighing on my heart lately. Also, other than that, if you fail me in this class then I’m not gonna get into the graduate school I wanted to get into, and I’ll never be able to get my doctorate and then even more people will die. If you fail me in this class people WILL die and it will be your fault.
The way I look at it is like this, why would you give me a failing grade? Yeah, I didn’t do any of my homework but there are homeless people. Literally homeless people. Everywhere. You should put more of your focus and energy on that if you really care so much. I even asked my mom and dad if they think my final grade is fair, and they agree with me. It’s not fair. So, anyways PLEASE update my grade and I would appreciate it sooooo much. Thank You.
Science for Life Student
After their initial post, students identify and explain fallacies other students used in their emails. In this case, students correctly identified the fallacies committed: slippery slope, red herring, appeal to emotions, and appeal to authority. Other commonly used fallacies include cherry picking, appeal to the masses, false choice, and ad hominem (with some pretty hysterical attacks on my character). This assignment is used concurrently with the Cranky Uncle game, which uses cartoon humor to teach players the fallacious techniques used to deny science (Cook 2021; Cook et al. 2022; Cook 2022).



People who learn cold reading techniques out of curiosity or fun are often alarmed at how readily others fall for them and ascribe psychic abilities to the performer.
Sometimes I curse having learned the Wood technique years ago; it’s useful, ingrained, but oft-times I find myself going back to look for nuance. These days it’s all in the presentation.
That whole firehose thing dates back a long time, Alinsky I think: sucking all the air out of the room, making so much noise, pumping so much s..stuff there’s no room to get anything in edgewise. We’ve seen it everywhere from town halls to the comments on obscure blogs, literally the monkeys’ strategy: overwhelm with nothing …
Tangential note: I adopted the howler monkey as my avatar as a nod to my years on talk.origins newsgroup and its long-running cre-evo wars. One of the more resilient creationists called the regular pro-science crew a bunch of poop-flinging howler monkeys, and we embraced the epithet (meetups in meatspace were called Howlerfests).
This sounds like motivated cognition to me. Liars who want an audience have to tell lies that appeal to some demographic… so they have (or can pick) a clear target, and one they have a good chance of selling the lie to. Those who aim at truth are delivering information that they take to be correct– and they don’t worry much about whether the audience will like it: what’s true is true, whether we (or anyone else) likes it or not. Tell them what they want to hear, and they will listen… but they won’t learn a damn thing.
Their neuronal connections will be reinforced. The brain wants and seeks out certainty.
Unlearning something (bigotry, widely-shared superstitions, etc.) with which you were raised is especially difficult. My own shedding of once-fervent supernatural beliefs had me in a frequent state of disorientation not far from madness until I could tease out what was real and what was indoctrinated.