Fighting fake news: What Daniel Kahneman could teach us about this problem
Let’s not beat around the bush on this one: fake news – or rather the explosion of sensationalist, non-credible journalism – is one of the most serious problems we face today. It’s an extraordinary issue that has done even more extraordinary damage to society and to the beliefs we construct about the world around us. But you knew this already, so what could I possibly say about it that’s so groundbreaking? This post offers a different perspective on fake news, though the deeper message behind it goes far beyond that. With the help of Daniel Kahneman’s book “Thinking, Fast and Slow”, I want to discuss an important insight into the way we think and perceive the things around us.
However, before we get started I have to introduce you to the main characters of this story.
Say hello to System 1 and System 2.
Since I tend to quote shows, videogames, and movies a lot, I have to clarify that these aren’t actual characters. Kahneman uses this metaphor to depict the two systems in our mind that control how we think, and it’s worth briefly explaining it to understand the rest of the post. In essence, System 1 is our intuition. When you walk into your office on the first day of work, System 1 is in charge of forming first impressions of the people and assessing the overall “vibe” of the workplace. This is not a deliberate process; it happens automatically, with little to no effort. When you read a headline that sounds interesting or crazy or funny, System 1 is the part of your mind that instantly registers it as interesting or crazy or funny and thus makes you click it. The system is always on and buzzing whenever you’re conscious. This is a very brief and condensed overview – the book has a far more elaborate explanation of System 1 – but the most important thing to understand in this context is that System 1 produces split-second impressions and feelings that happen involuntarily and with little or no effort.
On the other hand, System 2 is the little sibling that ideally would (and should) be in charge of operations but is unfortunately too lazy to do so. This system handles concentration, problem solving, self-control, logical reasoning, and the deliberate construction of thoughts. System 1, for example, cannot solve riddles or multiplication problems, structure ideas for a blog post, or question the validity of an article or headline that passes by on your news feed. That is the job of System 2. Again, there’s much more to these concepts than I can cover in this post. However, a crucially important feature of System 2 that you need to remember in the context of this topic is that it is fundamentally lazy and will only be activated if absolutely necessary. A great example of how lazy System 2 is, and of how much free rein it allows System 1, comes from the famous bat-and-ball problem. Nevertheless, to summarize the relationship between the two sibling systems, Kahneman states that:
System 1 runs the show; it’s the puppet master that only cedes its control when things get too difficult. Only then does System 2 have the last word. Alright, I may have bored you to death with this psychological preamble, but with that out of the way we can now talk about the main story.
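For readers unfamiliar with it, the bat-and-ball problem goes like this: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball – how much does the ball cost? System 1 blurts out “10 cents,” which is wrong. A tiny sketch (working in cents to dodge floating-point noise) shows the System 2 arithmetic:

```python
# Bat-and-ball problem: bat + ball = 110 cents; bat costs 100 cents more.
TOTAL, DIFF = 110, 100

# System 1's intuitive answer: "the ball costs 10 cents."
intuitive_ball = 10
assert intuitive_ball + (intuitive_ball + DIFF) != TOTAL  # totals 120 cents -- wrong!

# System 2's algebra: ball + (ball + DIFF) = TOTAL  =>  ball = (TOTAL - DIFF) / 2
ball = (TOTAL - DIFF) // 2
bat = ball + DIFF
assert bat + ball == TOTAL and bat - ball == DIFF
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5 cents, bat = 105 cents
```

Five cents, not ten – and noticing that requires System 2 to actually show up for work.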
Sidenote: For those of you who remember or watched it, I think a decent (though imperfect) comparison comes from the show Yu-Gi-Oh. In this case, I think Yugi Moto is System 1 – dealing with all the simple, everyday life affairs – while the Pharaoh is System 2 – called into action when shit gets real and a serious duel has to happen because Yugi just doesn’t have the capacity for that.
What does all of this have to do with fake news, then? I believe fake news sends our System 1 into absolute, unrestrained overdrive. Think about the nature of this stuff: you read a sensational, though not too outrageous, headline that is easy to digest and plays to your feelings. What!? That’s insane! There’s no need for System 2 to come into action, because the story appeals primarily to your emotions rather than your rationality and the headline is not too complicated to understand. So you click the headline, read the story, and become outraged at whatever ludicrous “fact” you’ve just learned about the world. Remember that System 2 is lazy, so it grants System 1 free rein to do as it pleases as long as the story isn’t completely out of whack with your preexisting beliefs (pigs simply cannot fly, no matter how much you want to believe it, so System 2 won’t let that one pass).
That’s primarily the problem: journalism tailored exclusively to the anarchic and impulsive System 1 instead of our rational, though lazy, System 2. It was inevitable, then, that a problem like fake news would arise once these conditions combined into a perfect storm. However, it’s important to emphasize, as Kahneman does in the book, that these mental failures don’t necessarily mean we’re stupid (hard to believe, I know). We are simply, by default, lazy creatures prone to such imperfections. He puts it nicely on page 114 when he says that, cognitively speaking, “sustaining doubt is harder work than sliding into certainty.”
That’s why questionable articles are so easy to digest and so scarce in structured arguments. The easier it is for us to slide into certainty, the less likely it is that our hero System 2 is called into action. System 2 has the key; it just can’t find the door. Thus, we keep reading questionable stories that are just believable enough for System 1 to buy them, and those repeated impressions harden into consistent beliefs. Referring to the results of the bat-and-ball problem, Kahneman laments the implications of such phenomena.
Does the blame lie with the lazy/unethical journalism or with our “shoot first and ask questions later” System 1? Neither of the two, I believe. I think it lies primarily with the laziness of our System 2 to be more engaged when we perceive the world around us and look beyond what meets the eye. Fake news just capitalizes on this cognitive laziness. But speaking of looking further…
What you see is all there is (WYSIATI)
WYSIATI (a poetic acronym, isn’t it?) is one of my favorite concepts that I’ve read so far in the book. At first sight it may seem like no more than an elaborate version of intellectual myopia, but there’s really more to it. Kahneman states that System 1 is “a machine for jumping to conclusions,” particularly when sense has to be made from limited information. Although he doesn’t use the following example specifically for WYSIATI, I think it illustrates the point accurately.
Imagine you read the following hypothetical(!) statement: “In a telephone poll of 300 seniors, 60% support the president.” For many people, it’s highly likely that their devilish System 1 already constructed a basic story in their minds: the elderly support the president. What you see is all there is. As Kahneman discusses, “omitted details of the poll, that it was done on the phone with a sample of 300, are of no interest in themselves; they provide background information that attracts little attention.”
System 1 wants to create a coherent story as easily as possible in order to avoid the involvement of its pesky, smarter sibling System 2 that would question the source of the story. The elderly support the president, there you go, I don’t want to spend any more mental effort on thinking about the nuances of this story. What you see is all there is, as long as the details aren’t too notably absurd (a poll of 5 seniors) or weird (a poll of Super Saiyan seniors).
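Those “uninteresting” background details are exactly what a roused System 2 would probe. As a hedged illustration – the poll is hypothetical, and I’m using the standard 95% normal-approximation formula – a sample of 300 carries real uncertainty:

```python
import math

n, p = 300, 0.60  # hypothetical poll: 300 seniors, 60% support
# 95% margin of error under the usual normal approximation
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"60% support, +/- {moe:.1%}")  # roughly +/- 5.5 percentage points
```

So the “true” figure could plausibly sit anywhere from the mid-50s to the mid-60s – and that’s before asking who even answers telephone polls. None of this occurs to System 1.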
A wonderful real-life example of WYSIATI that I came across a while ago is the following article and, especially, its headline (link). Before you continue, quickly read that article and pay close attention to your initial reactions as you do so. Alright, now that you’re done, I hope you noticed that the empty jar is, per 100 grams, actually marginally cheaper than the full one (using System 2, simply convert the pounds to grams and do some quick math). Now be honest: were you outraged by the story, or did you catch yourself before coming back here? If you didn’t catch yourself, that’s fine; I’m not here to shame you. However, this perfectly illustrates both the concept of WYSIATI and the hyperactivity of that mischievous System 1.
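Since I haven’t reproduced the article’s exact figures here, the prices and weights below are made up purely to show the System 2 move: normalize both prices to the same unit before comparing. The conversion factor (1 lb ≈ 453.6 g) is the only hard fact in the snippet.

```python
GRAMS_PER_LB = 453.592  # conversion factor: 1 lb in grams

def price_per_100g(price_usd: float, weight_lb: float) -> float:
    """Normalize a price to dollars per 100 grams."""
    return price_usd / (weight_lb * GRAMS_PER_LB) * 100

# Hypothetical jars -- NOT the article's actual prices.
full = price_per_100g(3.50, 1.00)    # full jar: $3.50 for 1 lb
empty = price_per_100g(2.50, 0.75)   # smaller "empty" jar: $2.50 for 0.75 lb

print(f"full:  ${full:.3f} per 100 g")
print(f"empty: ${empty:.3f} per 100 g")
# With these made-up numbers, the "empty" jar is marginally cheaper per 100 g.
```

Three lines of arithmetic is all it takes – which is exactly the effort a headline built for System 1 is betting you won’t spend.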
It’s so much easier to slide into certainty when faced with information that sounds plausible enough, plays to our emotions and preexisting beliefs (businesses are greedy!), and avoids laborious use of our System 2. Unfortunately, we think that what we see really is all there is…
The interactions between the two systems and the concept of WYSIATI are not limited to how we read the news. Ask yourself this: is your System 1 active when you hold negative, harmful beliefs about yourself? When you read that an acquaintance achieved something awesome in her life, do you immediately say, “well, that just shows how I’m a piece of crap and other people are way better than me”?
Understand that WYSIATI affects your thinking here too because all you see is how shit you are – the content of the message – instead of allowing System 2 to analyze the reliability of the message or thought – “do other women/men really think I’m such a piece of crap? How much actual proof is there of that?” Remember, it’s easier to slide into certainty than to question whether your life is genuinely so crap and awful.
Credits and closing words
This entire post was inspired by Daniel Kahneman’s book Thinking, Fast and Slow. Although I’m only halfway through the book at the moment, I hope I haven’t misconstrued his ideas; these are all my interpretations of his teachings. I was inspired by his insights and, for some curious reason, thought they applied aptly to the fake news pandemic occurring at the moment.
Why did I write this? Partially because of a misplaced sense of moral obligation to speak about this problem, but mainly because it is emblematic of a deeper issue that I stress time and time again on Cowboy Funk: despite the remarkable potential of the human mind, it has serious weaknesses. These weaknesses can make us think we’re worthless, that the stories we read are more credible than they truly are, or that we’ll never achieve anything worthwhile in our lives. The fake news episode is just one consequence of our lack of vigilance toward our mental frailty. Who knows what’s next? The main takeaway from this post, therefore, is to be aware of your mental Systems and mobilize System 2 as much as possible in your daily life. Don’t allow System 1, the other you, to run riot.
You can find Daniel Kahneman’s book on Amazon, eBay, and wherever else human beings buy their books these days. And no, this is not a sales pitch. It’s just a damn good book that teaches us how fallible our minds are and why we need to guard against our own mental weaknesses. This post touched on just a sliver of Kahneman’s fascinating insights, so you should really buy and read the book if you want to enjoy its full brilliance!
Also, make sure to listen to this fantastic NPR episode where the hosts track down a fake-news publisher and interview him about why he does what he does: link.
See you, Space Cowboy.