Countering Disinformation: Feeling is Believing

It's springtime in 2020, and disinformation continues in full swing while governments and international organizations try their best to counter it. Disinformation helped bring measles back to Europe and is already playing a role in global markets in response to the coronavirus outbreak. Meanwhile, disinformation campaigns by malign foreign actors are eroding people's confidence in public institutions, elections, and each other.

Disinformation campaigns targeted against democracies have two main goals. First, they seek to further divide and polarize democracies along ideological lines. Second, they attempt to undermine trust in the institutions that sustain a strong democracy. Media, government ministries, academia, and the electoral process are all regular targets of disinformation campaigns.

In working with various partners and experts to address hybrid threats, much of the discussion focuses on countering disinformation campaigns.  Whether it be fake news, deep fakes, or the use of trolls and bots, past and current examples are examined and new solutions are always on the table.

And yet so many of the proposals to counter disinformation we see today rely on the same kinds of responses: education, teaching critical thinking, fact-checking, and technical solutions to stop disinformation sources before they can spread their toxic messages too far. Yet with each new solution enacted, we do not seem to be making much progress in countering disinformation. People are still believing it, spreading it, and even defending it – and I think I know why.

Our solutions are all too often aimed at how people think when they should also be focused on how people feel.  We are trying to reach people’s heads when we need to reach for their hearts as well.

“Seeing is believing, but feeling is the truth” ~ Thomas Fuller

Disinformation is a subset of an old and well-known phenomenon: propaganda. Propaganda traffics mostly in emotions, and not just negative ones. Propagandists appeal to fear, anxiety, and hatred, but also to loyalty and love. Once we understand that propaganda is mainly emotional manipulation, we see how readily human emotions can be exploited, often for malign purposes.

The antidote to disinformation has recently been seen as finding factual truth and using it to refute falsehoods. But taking this approach exclusively means ignoring what we have learned from neuroscience.

Recent research tells us that we all consider ourselves rational thinkers, yet neuroscience shows that we exercise pure critical thinking far less than we think we do. Instead, we mostly practice “motivated reasoning.”

So what is motivated reasoning? Neuroscience tells us that our reasoning is actually laced with emotion. Not only are the two inseparable, but our feelings about people and ideas arise in a matter of milliseconds, much faster than our conscious thoughts. They arise fast enough to be detected with an EEG device, well before we are aware of them. Human evolution required us to react quickly to stimuli in our environment to increase our chances of survival. Thus, due to one of our innate survival skills, our emotions play a significant role in how we receive and process information.

In the process, we push threatening information away and pull friendly information close to us.  In many ways, we end up applying fight-or-flight reflexes to the information itself.

This is not to say we are driven by our emotions, but rather that emotions precede our thoughts when we first encounter and process information. Reason is slower and more deliberate, and science tells us it does not operate in an emotional vacuum. As a result, our emotions can set us on a course of thinking that is biased and not always supported by facts, especially on the topics that matter most to us.

And yet policymakers keep trying to use left-brain responses to address what is also a right-brain phenomenon.

Another dynamic at play when dealing with disinformation is the psychological concept of cognitive closure. When faced with uncertainty or in an environment of anxiety, people seek rapid clarity or closure, which can make them vulnerable to ideas that offer clear binary choices. In embracing this black-and-white worldview and the resulting identity those beliefs provide, people then aggregate into like-minded groups and often insulate themselves from alternative viewpoints.

Propaganda and disinformation prey on desires for certainty, identity, and belonging.  Targeted disinformation amplifies the fear and anxiety that humans feel in uncertain times, and taps into an emotional desire to reject ambiguity.  This is why we so often encounter individuals with strongly held beliefs who process new information as a threat.

Thus, in yet another survival product of human evolution, cognitive closure causes us to seek safety not only in certainty but within a tribe of people who reached the same quick conclusions. And this occurs not necessarily because people are closed-minded, but because their survival reflexes, driven by emotion, have been hacked.

When it comes to why people choose to spread disinformation, we encounter yet another emotional dynamic: the desire for higher social status.

In 2018, a team of researchers from the Massachusetts Institute of Technology (MIT) analyzed the diffusion of the major true and false stories spread on Twitter from its inception in 2006 to 2017. The researchers found that disinformation spread through Twitter farther, faster, deeper, and more broadly than the truth in all categories of information.

The MIT study found that false news was 70% more likely to be retweeted than the truth. The reason was not only human decision-making under uncertainty but also the fact that novelty attracts human attention. When information is novel, it is not only surprising but also perceived as more valuable. Sharing it makes people feel they have a raised social status, since they are “in the know” or have “inside information.”

Disinformation operators know this and are using neuroscience to tap into the emotions of their target audience – motivating them to share false information that appears novel and thus earns the sharer more attention from peers and, in turn, higher social status.

And once that false item gets shared often enough, people will see the same false story several times in their social media feeds as more of their friends decide to share it. Due to something called the illusory truth effect, this repetition of false information makes the information feel more true. And when something starts to feel true, people are more likely to share it.

Solutions: The Head and The Heart

We all think we are discerning critical thinkers, and most of us believe that it's other people who are more vulnerable to fake news. Unfortunately, none of us is immune to this dynamic. Whether we realize it or not, any of us can end up spreading disinformation.

Too often, when we think we're reasoning, we are actually rationalizing. In the words of psychologist Jonathan Haidt: “We may think we're being scientists, but we're actually being lawyers.” Our “reasoning” is too often used as a means to a predetermined end as we try to win our “case.” And too often it's full of our own biases, no matter how objective we've convinced ourselves we are.

So, what can be done?

On the surface, providing education on critical thinking, evidence-based responses to disinformation, and technical solutions seem to be a reasonable place to start.  This is where most governments and international organizations like the EU and NATO have focused their efforts.

And yet all of these solutions are aimed at people's minds and not at their emotions. Our responses ignore where neuroscience tells us we should be focusing our efforts. Is it any wonder progress has been slow?

Indeed, we need fact-checking, technical solutions, and a wider understanding of critical thinking, but without an emotional component, we’ll always be a step behind disinformation operators who continue to exploit neuroscience to destabilize our societies through disinformation.

On the education front, in addition to teaching critical thinking, we can offer some new skills to limit the effects of disinformation by drawing on lessons from memory studies. Quite simply, we can encourage people to withhold judgment on new information they encounter. When they do, they are more likely to evaluate that information – giving the rational thought process time to catch up with the initial emotional signals.

When people take the time to evaluate, they are less likely to accept (much less share) any disinformation or misinformation they’ve encountered. Essentially, we all need to slow down. When people slow down, they do a better job of distinguishing fake and true information.

This seems applicable to social media, where some changes to how information is shared could be helpful. If platforms required people to evaluate the information they are about to share, they might be less likely to share false or misleading information.

As for creating a more balanced mental and emotional approach to countering disinformation, there are two groups in society that are well equipped for this task: marketing and advertising experts, and politicians.

Marketing and advertising professionals design campaigns to reach the heads and hearts of consumers, and they constantly monitor which messages resonate and which ones don't. And while some governments have tapped into this expertise to hone specific messaging efforts, most have yet to seek their advice and counsel on their entire approach to countering disinformation.

Much like advertising professionals, politicians have an innate sense of how to tap into both the emotions and the thoughts of voters to win support for a policy or for themselves. Their entire campaign staffs are devoted to developing and deploying these kinds of messages, and to refining the overall message based on feedback.

In fact, the average parliamentary candidate’s campaign messages contain more potent emotional content than almost any of the counter-disinformation efforts being put to use currently in the West.

We may find that, in order to properly leverage neuroscience to effectively counter disinformation and protect our societies from its destabilizing effects, we need to commit more resources (of the right kind) to achieve the results we are looking for. But we won't know unless we take a hard look at our current approaches, are brutally honest about our results, and see whether we can make better use of what science tells us about how to counter disinformation.