Capitol Insurrection and Conspiracy Beliefs
Jane DOE: The following podcast was released in 2021 on a separate platform. IC leadership, thought leadership, titles, current events, and technology may have changed and evolved since its original release.
Dr. Mikey Biddlestone: … essentially those conspiracy stereotypes are used to justify the fact that this is an existential threat to us. And this is our last stand, this is our last chance to fight against this existential threat to our in-group.
Jane DOE: The opinions and views expressed in the following podcast do not represent the views of NIU or any other U.S. government entity. They are solely the opinions and views of Jane DOE and her guests. A mention of organizations, publications, or products not owned or operated by the U.S. government is not a statement of support and does not constitute U.S. government endorsement.
(Intelligence Jumpstart intro music)
Welcome back to the Intelligence Jumpstart podcast. I am your host, Jane DOE. When do conspiracy theories cease to entertain and become dangerous? I am going to discuss just that with our guest today. Dr. Mikey Biddlestone is a Postdoctoral Research Associate at the Cambridge Social Decision-Making Lab. His primary research focus includes the social factors associated with conspiracy beliefs, developing misinformation interventions, and investigating psychological reactions to everyday politics. He has published empirical work on conspiracy beliefs in the COVID-19 pandemic, led a number of theoretical reviews detailing the social processes involved in conspiracy beliefs, and helped to develop a misinformation checklist with the Doublethink Lab in Taiwan.
Jane DOE: Dr. Mikey Biddlestone, thank you so much for being on the Intelligence Jumpstart.
Dr. Mikey Biddlestone: Thanks so much for having me. I'm really excited to have a chat about all things misinformation and conspiracy beliefs and things like that.
Jane DOE: I’m excited too, and I want to start by asking … what led you to focus … to focus your research on conspiracy theories and misinformation?
Dr. Mikey Biddlestone: I think I've always been interested in things that are … that I wouldn't necessarily have thought of in the first place, or that are not necessarily intuitive to me. And the thing with conspiracy theories in general is that they're quite creative. They kind of create this interesting narrative for stuff that would be really interesting, and kind of exciting, if it were true. So that's the real appeal of this topic to me. But on a personal level, I've had a couple of friends who have, I guess you could say, fallen down the rabbit hole in that sense. And I think, at first sort of implicitly, I definitely wanted to understand why they've gone through that process. And this has actually really helped me understand their experiences. And it's also made me realize, you know, that I can't change their minds myself. I can only really, I don't know, understand, and hopefully inform them if they're willing to listen. But yeah, that's the kind of personal level of it as well.
Jane DOE: That's really interesting. I like that a personal experience led you to a very fascinating focus. And it's very true … it is so easy to get swept down any number of rabbit holes. Because when I was getting ready for this discussion today, it was just like, oh, my gosh, is that true?
Mikey Biddlestone: No, definitely, yeah.
Jane DOE: … wait no … I had no idea about some of the stuff that’s out there. It's kind of scary in some ways, and very humorous, in others. And I'm just not completely sure how to feel.
Mikey Biddlestone: Yeah, definitely.
Jane DOE: But when I first started thinking about conspiracy theories, I thought about, you know, the Kennedy assassination, but also, like, you know, growing up, we watched movies like National Treasure … I don't know if you've ever seen that, where they have all these conspiracies and secret societies … and it's all kinds of adventure. And you know, it's very cool stuff.
So, there are some conspiracy theories that I think of as ridiculous and mildly entertaining … I mean, they're entertaining. But, in 2019, the FBI identified fringe conspiracy theories as a domestic terrorist threat. And that is a big leap from watching Benjamin Franklin Gates run through the Library of Congress and tunnel under the White House. So, how would you define a conspiracy theory?
Dr. Mikey Biddlestone: Okay, so, I mean, in around the late 90s, I would say, there was kind of this agreed-upon, I don't know, scholarly definition, if you could call it that, of conspiracy theories … that they're basically about a small group of secretive, powerful actors who, behind closed doors, act nefariously and plot in secret in order to achieve their own selfish goals, basically. Their own selfish political goals.
So that hasn't necessarily changed, currently. But one of the interesting debates at the moment is around … well, there are two interesting debates … around the kind of mentality or the way that these beliefs form, but also, in terms of the definition, including powerful groups. So, if we look at history, of course, there are a lot of examples where people see governments, or these kinds of New World Order-type enemies or outgroups, as very powerful, and therefore they assume that they have the ability or the resources to create this kind of global conspiracy.
So that does tend to be the sort of general notion of what most conspiracy theories capture. However, there are also a lot of conspiracy theories, especially on the right wing, or maybe the extreme right wing, I should say … that tend to target relatively powerless groups as well. These can be ethnic minorities or religious minorities. And these often capture ideas that, you know, our society is being infiltrated by these groups who, in real terms, are underrepresented and have relatively fewer material resources to actually engage in or push a conspiracy in the first place. So, in that sense, it's kind of less realistic, even though both have unrealistic elements to them.
Jane DOE: Yeah. So, I'm curious … is there a specific schema or profile that we can build or use, to identify the types … the sorts of people who believe in conspiracy theories? And I really don’t want to profile anyone, but … ugh
Dr. Mikey Biddlestone: Sure, that’s cool, yeah … Yes, there's definitely … there's a profile, and there are kind of caveats to that. Right? So the idea that we're kind of grappling with at the moment, which we've made a bit of ground on, theoretically, is this distinction between belief in a given or specific conspiracy theory … like, JFK was assassinated by the CIA … versus this idea of a conspiracy mentality, which captures this kind of general propensity to perceive the world in conspiracist terms, regardless of what the topic is or what groups are involved. So, the idea here is that with specific conspiracy theories, most people are … you know, at least kind of swayed, or are suspicious of some circumstances. And that's kind of just the human condition. Right? We have pattern recognition mechanisms in our brains … and there are genuine conspiracies that have happened in history. Right? So, it'd be kind of silly to just dismiss all conspiracy theories outright.
That being said, people who have what we tend to refer to as a conspiracy mentality tend to believe in mutually conflicting conspiracy theories as well. So, for example, there's this paper where they looked at whether people would believe in the conspiracy theory that Princess Diana was murdered by the royal family, but also that she faked her own death. And more recent research has basically shown that people who have a conspiracy mentality will believe both of those conflicting things … right, she can't have faked her own death if she's been killed by the royal family. And it's a similar thing with the Osama bin Laden conspiracy theories … that he was murdered way before, or that he faked his own death. And people will believe both of these conspiracy theories at the same time, right?
That is what tends to characterize this propensity to perceive the world in conspiracist terms. It’s less based in reflective reasoning, or analytical judgments, where you kind of check that your logic is consistent … that becomes irrelevant … and it's more that everything is a conspiracy.
That being said, it depends on how we measure it. Maybe there's some elements of maybe they think that some could be true, but they're not fully true. So, there's bits that we need to tackle there in the literature still, but yeah, that distinction is belief in one specific conspiracy theory versus belief in basically all conspiracy theories.
Jane DOE: It’s so interesting how our minds work, and that makes sense. So that kind of made me think about a study that was done by the University of Chicago’s Project on Security and Threats. After the January 6th insurrection, they did a study with all the, you know … arrest documents, court documents … to bring together a fairly comprehensive understanding of the people who were arrested, and to examine the persistent and ongoing threat posed by actors or events on January 6th. And there were some pretty interesting conclusions from this study.
Such as, the population was older than we would assume it would be. I’d like to think that folks who believe and do foolish things are younger … they don't have a lot of worldly experience, or, you know, they're still naïve. But 66% of those arrested were older than 34 years old. Then, 85% of them were actually employed, and not just employed, but employed as … you know, doctors, lawyers, architects … white-collar professions. Right?
And perhaps the biggest surprise was that 89% of those arrested actually had no direct affiliation to the actual militia and fringe groups that were identified as a domestic terrorist threat. These were folks that appeared to be acting alone or independent … or you know … independent of group membership.
And there are other implications for that … I guess the other kind of result was that they found that a lot of the people who attended and came to DC were from counties that Biden had won in the 2020 election. Individuals who felt like they were being displaced. And I don't know if you've heard of the Great Replacement … it was a theory from Renaud Camus, you know, a white nationalist idea. But if the population of white adults in those counties that Biden won dropped below 60%, I think they said those counties were 18% more likely to send people to the Capitol on that day.
Dr. Mikey Biddlestone: Interesting … okay.
Jane DOE: So, I mean, all of that is very interesting, because there are a lot of characteristics I just wouldn't think of as belonging to somebody who would, you know, actually believe in a conspiracy theory … but then mobilize with an intent to commit violence. So, I guess my question here is how conspiracy theories go from the ridiculous … like you mentioned, Princess Diana faking her own death, and at the same time, she was killed by, you know, the royal family … to dangerous and violent. How does that evolution … I don't know what to call it … how does this happen?
Dr. Mikey Biddlestone: Sure. Yeah. So, so I guess kind of what we're describing is like a version of radicalization, right? It's like conspiracy radicalization. And there are a few elements to that I was thinking of when you were explaining those factors. So, I think one of the things that's kind of important to mention … specifically with the January 6th insurrection … is that there was a lot of targeted misinformation on social media. Right? This has been found in a lot of network analysis of social media and this kind of thing. One of the things that we know is that the evidence isn't so conclusive in terms of how age relates to conspiracy beliefs and susceptibility to misinformation in general. However, there is some evidence to suggest that when someone is susceptible to misinformation because of their age, it tends to be explained by the fact that they have less digital media literacy.
So, the idea there is that if we intervene and say, you know, this is how you can detect misinformation online … and this is how easy it is to create misinformation online, in a very cheap way, and for it to be spread very easily … that tends to be an element of age, right? And there may be this kind of, you know, idea of … digital natives and digital immigrants, I think it's called something like that. But I don't think that explains it enough, really, because what I think that explains is kind of the top-down element of it. Right? … which is that we know that certain groups were targeted with those types of pieces of misinformation, disinformation, and conspiracy theories.
However, there's also this element of, why do those stick? Right? Why do those conspiracy theories, or those ideas, stick to those people? And like you said, why did they mobilize people? So firstly, conspiracy stereotypes of groups are sort of used, you know … for example, in Nazi Germany … in order to paint certain outgroups as highly agentic or competent, but also cold or, you know, unempathetic. So, the idea there is that those groups are going to be perceived as extremely threatening, but also able to garner the means to engage in some conspiracy or threat against you as an in-group. Right?
So, what that does is it kind of plays on your political affiliations and your political ideology. So that can sometimes be tied to, you know, certain groups … often antisemitism with conspiracy theories. Right? Conspiracy stereotypes of Jewish people tend to be that they are cold and agentic, essentially. But also, in terms of how that leaks into reality … essentially those conspiracy stereotypes are used to justify the idea that this is an existential threat to us. And this is our last stand, this is our last chance to fight against this existential threat to our in-group.
And as you mentioned, the Great Replacement theory … this is actually a really common sentiment in conspiracy beliefs as they form … which is that my group, as an existential force or an agentic movement in the world, is threatened. And that means that I'm going to be completely wiped out, which, of course … it plays on my fear that I'm going to die. It plays on my fears that I'm not going to have control over my life, or autonomy … any security over my life. I'm never going to be able to have control over any of the systems or the society that I live in. So basically, all of those things are ingredients that create this situation of existential fear. And the only thing I can do is fight back against it in a very extreme way, as a last stand against the people who are trying to destroy the way we live and who we are, basically. That's the language of these kinds of conspiratorial notions.
Oh, and finally, sorry, on that bit that you mentioned in terms of education, and that kind of socio-economic status as well. Basically, there's a lot of evidence on … what we call motivated reasoning in psychology. The idea here is that you can have very high cognitive ability in certain domains. But if you are motivated to use certain knowledge to protect what you already believe, or the groups that you already belong to … you're not going to be looking for the correct information, you're only going to be looking for information that justifies what you want to believe. So, the idea there is that you can be the smartest person in the world with the highest cognitive ability … if you have a strong identification with, let's say, a certain political party … all you're going to do is use those very high cognitive abilities to bolster or support what you already believe. So that's why that seems conflicting, but it kind of makes sense in that way.
Jane DOE: Yeah. No, that's very true and so interesting. So, I’d like to step back for a minute. When we're talking about misinformation, I'd like to clarify how you define misinformation, because we have both misinformation and disinformation. And a lot of times when we're talking about conspiracy theories, I would use the term disinformation, because it's intentional, and it's meant to cause harm. It's, you know, a way of causing dissent. Whereas misinformation is part of being human. Because sometimes … I don't know if you ever played the game telephone … you know, the message that gets to the end of the line is going to be completely different. And it's not because of, you know, malice or anything, it just happens. So how do you … how do you define misinformation?
(Manolis Minute intro music)
Manolis Priniotakis: I’m Manolis Priniotakis, NIU’s Vice President for Research & Infrastructure … and this is this week’s Manolis Minute.
In the next episode, I’ll be speaking with John Cohen. He’s in the role of the senior official performing the duties of the under secretary for intelligence and analysis AND the counterterrorism coordinator for the Department of Homeland Security (DHS).
We will be talking about domestic extremism and the intelligence challenge. He has a long title for a complicated position – or in his case, two positions: the top DHS intelligence officer as well as the CT role.
The first person to lead DHS intelligence was longtime CIA officer Charlie Allen, who first served as Assistant Secretary before the position was upgraded to Under Secretary for Intelligence and Analysis.
In the pantheon of CIA legends, it may be a little controversial to say this … but only a little … there are CIA legends and then there is Charlie Allen. In a career spanning many decades that connected him with several complicated national security issues … to include the collapse of the Soviet Union and Iran-Contra, he’s noted as among the best. His service at DHS was a coda in his federal service – but an important one.
From its humble beginnings after the creation of the Department of Homeland Security … with the passage of the Intelligence Reform and Terrorism Prevention Act (IRTPA) of 2004 … DHS Intelligence and Analysis … or I&A, as it is commonly known … is now a key part of the contemporary IC and has the important job of working to safeguard the homeland while protecting civil liberties and respecting human rights.
Thanks again for listening to Intelligence Jumpstart. For more information on NIU, please visit our website, www.ni-u.edu.
(Manolis Minute exit music)
Dr. Mikey Biddlestone: So, that can basically be any incorrect piece of information or belief, regardless of whether that's intentional or not, as you … as you mentioned. Right? So, this can also be sensitive to the time period that it's brought up in. So, you know, one piece of information might be misinformation at first, but then become fact, or the other way around, depending on the time period. But it can also depend on the kind of context and your understanding of everything. Right? So it can be that you're leaving out certain pieces of information … and that can be in a manipulative way or by accident. And as I mentioned earlier, you know, the kind of cognitive biases that lead us to do that sort of thing, where we'll accidentally leave out a certain important topic … which means that we're much more susceptible to misinformation that may otherwise be kind of harmless, you know.
And as you mentioned, yeah, when it becomes disinformation … that's when this kind of, I guess, vagueness or gray area … where facts are kind of incomplete or slightly incorrect, but have a kernel of truth in them … that's when those are mobilized, or strategically used, in order to spread disinformation, or spread false information, I should say … in order to have some sort of political gain. Or maybe even … even businesses can spread disinformation in order to counter their competitors. Right? … and make them seem illegitimate … that kind of thing, in many different contexts. That being said, you know, on top of disinformation, you also have propaganda, which tends to be more of a kind of mobilized or streamlined, state-sponsored version of disinformation, essentially.
Jane DOE: Gotcha, so basically, Nazi Germany … their campaign against Jewish communities across Europe.
Dr. Mikey Biddlestone: Exactly, exactly. Yeah, that would definitely be propaganda, because, of course, it's state-sponsored. Yeah.
Jane DOE: So, I have a couple of questions here … and I'm trying to … zero in on them … But so, because of the First Amendment … people, with, you know, some limitations, have the right to say what they want to say … obviously, they can't go into a crowded theater and scream fire, or, you know, purposely, like, tell somebody that a polling place has moved … so that that person doesn't get the chance to, you know, vote and participate in democracy. Those are things you can't do under the First Amendment.
But it does allow people to consume any type of information … there is no … they have the right to consume the information that they want … like you said, motivated reasoning … they want to find information that fits in with their current belief systems. So, is there any … what does accountability look like in this? And I mean, is that something we should even be worried about when we're talking about potential interventions?
Dr. Mikey Biddlestone: Yeah, so that's a really good question. I think, honestly, there's not a very clean answer to that. I think in terms of accountability … our knowledge of the harm that speech can cause … and I should preface this with, you know, this may be slightly my personal opinion … but this is kind of from my experience of conducting research on perceptions of morality and harm. Our perceptions of harm, and our understanding of what harm our speech causes, update as time goes on. Right? So as a society, or as some societies, you know, we tend to have these kinds of social norms that change … you know, there's some language we all come to accept is less acceptable to use. Right? So certain racial slurs or things like that.
That being said, it doesn't end there, you know … there are still things that we update, there's still harm that we realize is caused, maybe in a kind of indirect way. And some people might understandably argue that, you know, indirect harm muddies the waters in terms of the accountability that we're talking about.
So, in that sense, I think it's important to help people understand that there's no end to updating these ideas of where free speech ends or doesn't end. Right?
That being said, that change or progress can really be mobilized by certain groups to argue that this is a slippery slope to policing speech. And that is a valid concern. You know, people should be attentive to, and concerned about, how far these laws go … and what the implications of a lot of these laws and rules are. So, I think, basically, the roundabout point I'm trying to get to is that this is such a gray area that I'm not sure we should necessarily focus on accountability to counter misinformation or susceptibility to conspiracy theories and such.
What we need to do is educate people and allow them to be protected against these attempts to sway them in political ways, or other ways socially. So, while, you know, that is a complicated question, I don't think there's a clear answer to it. And therefore, we need to focus our efforts where we know it's going to work … where we know it's going to protect people … which I can go into in a bit … how that might work in terms of, you know, intervening and educating people and helping them know how to approach things critically … despite the fact that, yeah, they may be very smart, they may accidentally use certain biases and things like that.
Jane DOE: Absolutely. I would love to hear more about that. I think interventions, like you said … and education … are the key. And I'm very curious to hear your thoughts on certain interventions … I don't know, activities, processes … that have, you know, had positive results.
Dr. Mikey Biddlestone: Yeah. So, there's actually quite a few interventions that seem to have some supportive evidence for them at the moment. And what we tend to refer to in a general sense is these interventions to reduce susceptibility to misinformation. And in some cases, to reduce a willingness to share misinformation as well. So, the idea there is that we want to stop the spread of misinformation in general, which includes both of those things, which are your kind of susceptibility and your willingness to share these ideas.
These are based on lots of different psychological theories from past literature that never focused on misinformation; those have sort of been adapted for this kind of updated topic. So, one very influential and kind of straightforward thing in psychology is about how we think. Right? The types of thinking styles we use to understand information. The general idea is that on one side, we have this reflective, analytical, rational thinking style … which motivates us to look for conflicting information to determine whether what we believe is true or not. Or, we have these kinds of intuitive, automatic, or non-reflective thinking processes or styles … where we tend to basically not reflect on our thinking, and we just kind of go with what our motivations tell us to do.
So maybe unsurprisingly, we know that conspiracy beliefs are kind of born out of this intuitive or automatic thinking style, as opposed to analytical thinking styles. Ironically, a lot of conspiracy theorists would say they are rational skeptics and things … but regardless, what we find in the research is, if you believe conspiracy theories, or you're susceptible to misinformation, you tend to rely more on these intuitive thinking styles.
So, what scholars have done is they've kind of tried to use this to develop interventions to counter our reliance on intuitive reasoning. One way they can do this is with what they call accuracy nudges. Which is, before maybe you decide to share a piece of misinformation on social media, they give you this prompt that says, “Are you sure? … do you want to think about this for a second?” … and that simple prompt basically engages people's analytic thinking styles. And it's been shown basically to reduce people's susceptibility to misinformation. Or, what we call improving their truth discernment, so they're able to detect the difference between real and fake news a lot more effectively. And this also has downstream effects, in that they're less willing to share misinformation online.
That being said, there's some evidence that this might not be effective across the board. So, for example, people who are Republican voters, or tend to be more conservative ideologically, are less susceptible to this intervention strategy … meaning that it tends to not influence them as much. It doesn't nudge them into being more accurate. So even though that's a very scalable intervention … where you could just, you know, have this prompt on social media, just this little window that you click off … it's not necessarily been shown to be super effective.
So, there are many other versions, which are things like labeling the sources and giving reliability estimates for the sources of the information as well, which have also been suggested to be good. The issue there is that you're kind of always catching up with the misinformation. Right? You're always using the rhetoric of the misinformation … when you're fact-checking, or you're debunking information, you're always trying to catch up with the statement, and you're always basically legitimizing the language that's used in the misinformation. And what we know is, when you present a piece of misinformation and a fact alongside each other, the misinformation essentially cancels out the fact. So, basically, what you're doing when you're fact-checking is you're giving more credence to the misinformation itself.
So, one possible remedy to this, which some scholars are working on, is this idea of inoculation. So, inoculation theory was developed in around the 1960s … in order to basically understand why and how people are brainwashed … usually in a political sense. So, these are kind of … what we call cultural truisms, such as ideological views, things like this. And how to kind of un-brainwash them, or deprogram them, you could call it, in the case of cults.
So, the idea here is that we use a vaccine metaphor. What we're doing is we expose you to small doses of misinformation, alongside the reason why this is a piece of misinformation … and what strategy a merchant of doubt may be using in order to make you uncertain about something. So, an example of this could be that they use logical fallacies, like incoherence … conflicting ideas … and try to kind of glaze over that. They use false dichotomies … they use scapegoating … emotional manipulation … and they tend to try to polarize people. There are lots of examples of this, right?
What you can do with inoculation is present a piece of misinformation and say, this is misinformation because, for example, what it's trying to do is appeal to your emotions. It's trying to really garner something up in you, to make you angry or something like that. And they can also make minority views, or things that are not necessarily mainstream views, seem like they are mainstream … or like some kind of silenced majority … by giving them false amplification. So, what that does is it essentially vaccinates you against future misinformation that you encounter. The benefit of this is that you don't have to constantly catch up with misinformation, because it has this kind of general protective element to it.
So that being said … there's this issue that, in the metaphorical sense, misinformation immunity does wane over time. So after about three to six weeks, the effect of inoculation against misinformation maybe isn't as strong. But extending this vaccine metaphor again, you can give people booster shots of inoculation. Right? The general way we do it at the moment is we get people to play games where they are someone spreading disinformation online, and it's explained to them how this is a piece of disinformation. It's quite difficult to get lots and lots of people, across a wide range, to play these games online. Basically, they take a long time, and they require effort and motivation to do. So, what we're scaling at the moment, and there's some evidence it might work, is these kinds of little 30-second ads on YouTube that can explain to you how inoculation works and how disinformation can sway you and make you susceptible to disinformation in general. So, this seems to be quite a promising avenue, as a kind of general immunization against misinformation.
Jane DOE: That’s great … and ….
Dr. Mikey Biddlestone: That's quite a lot of information. I didn’t know where to stop in between.
Jane DOE: No, that's great. It is a lot of information … but it is incredibly complex and so, so fascinating. I think … when you think about the spread of misinformation … and all the legal and, you know, political ramifications that surround it all … the one thing I do have a question about … because you mentioned digital natives, or Gen Z, earlier … and we have this group, this generation, coming up who have always been on computers, they've always been … they're associated with social media, and they've essentially been exposed to all of these messages since birth.
So, one of the things that you all mentioned in your research was if … if they had already been exposed, that a lot of these interventions would not necessarily be effective. And there also had to be some sort of motivation to, like … go to the games or whatever … like you just mentioned. Would it be effective to start some of these interventions at a younger age? Because by the time you get to college, I mean, you pretty much already have your own identity. And you know, you've been exposed to your friends, you've been exposed to your parents’ belief systems, and all of that plays into who you are … But there are also studies that conclude that the political ideology of where you go to school plays a bit into who you are and developing your belief system. So, would it be effective to even, you know, expose younger, grade-school-age children to some of these games? Or do they actually have the cognitive capacity to really understand the meaning behind all of that?
Dr. Mikey Biddlestone: Yeah, I think … that's a really good way to look at it. Because you know, of course, yeah, we can all just say, yeah, just in schools educate kids and that kind of thing. But you're absolutely right. There may be some, some cognitive hurdles to that, right, in terms of their development.
That being said, I think that may be a concern that we have for a lot of concepts that we now teach kids. That, you know, we've kind of developed ways to make them accessible to them. So, I think on that, you know, we have to be creative. Right? Of course, it’s easier to say that than to do it. But, I think, you know, there are ways to speak to people in terms of their generational language or their medium. Right? There's no research that I'm aware of on this. But you know, for example, you could maybe use memes to appeal to certain generations that, you know, are used to, as you mentioned, those kinds of social media environments.
You know … but I guess, like you said, even younger generations are going to be having a different medium of communication … you know, TikTok is a relatively new thing. And it's quite a new concept to have that kind of short video bite thing. You could mobilize those short videos to appeal to them, you know, in that same space that they seem to enjoy … that kind of thing.
But with kids, in terms of the cognitive element, I think you could definitely simplify terms. Right? So, for example, you know, when I mentioned the different strategies that we make people aware of when we inoculate them against misinformation … you do have to give a definition. Right? I can't just say, oh, they use an ad hominem argument, and that means that they're trying to create misinformation about stuff. We have to explain that an ad hominem argument is focusing on the person as opposed to the situation or the arguments. And what that does is it kind of deflects from the actual facts of the situation, or the actual important elements of what we're talking about in a political discussion. Right?
That can, I think, be simplified, probably in lots of different ways, or in a kind of video format, or an implicit way, so that it kind of goes in with repetitive education. And maybe, as you mentioned, games, you know, if we develop games that are kind of simplified as well. I really do think that we can train really young kids to understand this stuff. If not, at least build some sort of intuition that they can't necessarily articulate, but where they're kind of suspicious of attempted manipulations of their understanding of the world.
The general idea is that we're basically motivating people to be resistant to persuasion, right. So, if we can build that into an educational format, in whatever means necessary, I think that would really help, in general.
Jane DOE: So that's from a young age. Interesting. Shifting gears a bit … do you have ideas or an opinion about how governments should approach disinformation? The US Communications Decency Act … it came about in 1996, I believe … and it has not been updated since. And, you know, Facebook … well, it's now Meta … Mark Zuckerberg and Twitter … they've all appeared before Congress on the Hill … and so there's been a lot of, you know, going back to the accountability, asking questions about what they're doing to protect … protect their users. How do we keep people safe while not infringing on their rights? Because I keep on thinking about McCarthyism. I don't know if you're familiar with that.
So back in the '50s, a member of Congress targeted those who he believed subscribed to communism. People were being targeted because of their thoughts … their beliefs … and it caused a lot of harm to the public and challenged our Constitution … as he … and I know it wasn’t just one man, but he seemed to be policing thought and weaponized the accusation to keep his critics under control.
At the end of the day, what can we do to take steps to make sure that something like that … something that was very insidious … doesn’t happen again?
Dr. Mikey Biddlestone: Just to double check, are you referring to kind of like vilifying certain disadvantaged groups? Or do you mean the kind of label of "that's a conspiracy against me," and that being kind of a damaging gray area? Or both?
Jane DOE: Kind of both, so we have this liberty, because of the First Amendment, to say these things … to believe things. But is there … does the U.S. government … or corporate America have a responsibility to get a handle on the … what continues to be an elevated problem of disinformation. Or would any effort be considered thought policing or considered an official attempt to disenfranchise groups with specific beliefs? I’m not sure if that makes sense.
Dr. Mikey Biddlestone: I completely understand what you mean. Yeah. Thank you for the clarification. So, I think … there's definitely a real danger in being perceived as discriminating against people who identify as kind of rational skeptics or conspiracy theorists … especially in a democracy. Right? So, there's this recent paper, just submitted with a colleague, Kenzo Nera, and his colleague in Belgium, where basically he was looking at: does perceived discrimination against the conspiracy theorist in-group … this identity as a conspiracy theorist … increase your strength of identification with this group? Does this make you more set in your ways, to call yourself a conspiracy theorist and to be proud of this label? Right?
The idea here is that that's often a response in discriminated groups, in order to deal with feelings of low self-esteem associated with discrimination and stuff. So, what we found was, if there's perceived discrimination in society against conspiracy theorists as a group, this doesn't increase identification as a conspiracy theorist … this kind of just doesn't really bother people. Right? But if this perceived discrimination comes from politicians, or perceived powerful groups, this will strengthen their identification with the conspiracy theorist in-group. And the reason for this is because they perceive more discrimination based on a conspiracy itself … which is the idea that the term conspiracy theory is itself a conspiracy, made up by the CIA in order to delegitimize anyone questioning anything in society. So that's basically a long-winded way of saying, we need to be really careful about labeling people conspiracy theorists, especially when it's coming from the government or powerful groups.
And there is possibly a way to go about this. And basically, I think where it falls is in how we communicate information and evidence to the public. So especially with things that are kind of … where we need this collective effort in order to tackle them … so, things like the COVID-19 pandemic, or strategies to tackle climate change as well. Basically, we need the public to be on side. But you can't just say, I'm here to persuade you, this is why you should do what I say. Right? Because we know that that doesn't work; people react in a negative way. So, one way to go around this is this kind of framework that some of my colleagues have come up with, that we're testing at the moment … which is basically being honest and open about information, regardless of whether it's kind of good or bad for the topic. Let's take vaccines as an example. Saying that there are actually some very small instances of harmful effects of vaccines does not actually hurt people's trust in the people who are trying to promote vaccines to the public. Right? What it does do is it increases long-term trust in that government agency, or that powerful group, promoting these vaccines. Right?
So, one way to go about this is what we call the PROVE framework. So, it's P-R-O-V-E. The “P” stands for pre-bunk, which is like the inoculation that we were talking about earlier … basically, preemptively prepare people, educate them, with ideas that might make them resistant to efforts to confuse them, to make them uncertain, or to kind of mobilize them in a manipulated way.
And the “R” stands for reliably inform. Which is basically: instead of trying to persuade the public, trust them … that if you give them the right information, they're gonna come to the right decision themselves. And if they don't, that's because the information is not yet certain enough. Right? That's the idea. If you have an educated public, that should work. Right? … or at least to a certain extent, where it's not so damaging … where it's a tiny minority, you know, with the misinformation.
The “O” is to offer balance, which is, you know, if there are some uncertainties, or there's an alternative perspective on something, then you should raise that to show that you're being impartial and you're not trying to sway in one direction. That being said, you should only offer balance when it's appropriate, because, for example, with climate change, the overwhelming scientific consensus means that it's kind of inappropriate to present balance, because the balance isn't supported by scientific evidence.
The “V” stands for verify evidence. That's basically: show people how to verify the evidence and understand whether it's good or worth looking at. You know, for example, sample sizes, the types of experimental manipulations or designs used, that kind of thing.
And finally, the “E” is explain uncertainty, or disclose uncertainties, so that people know the areas in which we still need information in order to make an informed opinion. Even though you're saying, I'm reliably informing you, there's also this stuff that we still need to work out, and the jury's out on that, essentially.
So basically, the idea here is that even though it might worry people, it might make them, you know, scared … some reality is quite concerning … the idea is that you're a powerful group that can mobilize them to support this positive thing in the world. And they'll trust you in order to do that. Right? So, the point is, you can't just pretend to be trustworthy in order for people to trust you; you have to be trustworthy in the first place.
Jane DOE: I really like that. I like that PROVE model. So, I'm gonna go ahead and take that.
Dr. Mikey Biddlestone: Seems to not necessarily hurt views … which is a good thing, you know … it hasn't necessarily made people more likely to, you know, take vaccines, that kind of thing … but at least for now, yeah, it doesn't make people turn off the idea … which is kind of what we're looking at: people don't react negatively … at least when you're being honest with them. Which is promising.
Jane DOE: Right. That is so fantastic. So, Mikey, you’ve been so generous with your time. You’ve definitely given me a lot to think about … regarding how I, or groups I am a part of, define others. And I have one last question for you, and it's more of a personal interest. I don't know if you've ever heard of the website Snopes. It's S-N-O-P-E-S.
Dr. Mikey Biddlestone: Oh, I’ve heard of this before … I’ve not been on it.
Jane DOE: Right. It is a fact-checking website, and folks can check stories shared across social media or … or elsewhere … you go to this website, and it tells you whether it’s true or false.
And maybe we don't want to advertise this if there is one. But I'm kind of curious to know if there is a database of conspiracy theories out there that can tell you, like, this is the conspiracy theory, this is where it began … the origins of it … and, you know, how it's grown … that type of thing? Because it would be really interesting and educational to see how these theories began and have evolved over time.
Dr. Mikey Biddlestone: It’s so true. Yeah, that would be really helpful … I'm just trying to think, because, you know … I'm on what they tend to call academic Twitter. You know, so you tend to see a lot of tools that people promote, these kinds of online tools, but I'm not sure if I've seen that. I don't think there is a database mapping conspiracy theories over time.
What they tend to do, I think, is they'll look at how one certain conspiracy movement will develop over time. So, for example, there's lots of QAnon on 4chan. And they basically download all of the data with some metadata saying, you know, when this comment was made … what kind of words they were using, to put it in a category … that kind of thing. But I don't think … no, I don't think there is a way to really look at conspiracy theories over time. I mean, researchers have looked at mentions of conspiracy theories in certain newspapers, to see whether they increase over time. And they found that they basically fluctuate; they don't necessarily increase. They fluctuate based on political circumstances. So if there's an election, suddenly conspiracy theories are mentioned constantly. Right? But no, unfortunately, I'm not aware of any yet … of any historical database … I mean, I use Wikipedia for that, basically, for the oldest ones I can find.
Jane DOE: Yeah. That's great. Thank you so much for your time today. This was such a fascinating discussion.
Dr. Mikey Biddlestone: Well, thank you so much. It's been a pleasure. Yeah, really interesting chat as well. I love talking about this stuff. And yeah, hopefully, it gets better in the future. We're in worrying circumstances, but I think we have some promising possible interventions that might work for the future, hopefully.
Jane DOE: Yeah, absolutely … it’s true … Thank you. Thank you so much.
Dr. Mikey Biddlestone: Thank you. Cheers.
(Intelligence Jumpstart exit music)
Jane DOE: Thank you for listening to the Intelligence Jumpstart podcast. We'd love to hear from you about what you liked and what you'd like to hear more of. If you would like to hear more about a specific topic or issue, send us a note at NIPress@niu.odni.gov. To learn more about NIU visit our website at NI-U.edu.