The Psychology of Fake News (And What Tech Can Do About It)


Cecy Correa • November 13, 2018 • Los Angeles, CA

In her presentation at RubyConf 2018, Cecy Correa explores 'The Psychology of Fake News' and its implications for technology and society. She begins with the infamous 1938 'War of the Worlds' broadcast, long cited as proof of media's power to manipulate public opinion and trigger mass panic. This historical context leads into a discussion of modern technologies, such as facial manipulation and audio alteration, that can create misleading narratives. Correa emphasizes the rapid spread of misinformation on social media (an MIT study found that false information propagates six times faster than the truth) and examines the psychological mechanisms behind why people believe fake news.

Key Points Discussed:

  • Historical Context: The 'War of the Worlds' broadcast exemplifies the impact of media on public perception; the mass panic it supposedly caused is largely a myth built on overstated contemporary reports.
  • Psychological Insights: Correa draws on concepts from 'Thinking, Fast and Slow', differentiating between fast, intuitive thinking (System 1) and slower analytical reasoning (System 2). Beliefs often shift in response to emotion rather than rational deliberation.
  • Community Influence: As false information is shared among communities, the collective belief often reinforces misinformation, making it challenging to correct falsehoods.
  • Knowledge Illusion: The common misconception of knowledge leads individuals to overestimate their understanding of complex issues, which can contribute to the spread of misinformation.
  • Mitigation Strategies: Correa proposes a three-pronged approach (education, design, engineering) to combat misinformation:
    • Education: Integrating humanities into STEM education is vital for understanding technology’s societal impact.
    • Design: Ethical design practices in advertising and user experience should reduce the spread of misleading content.
    • Engineering: Technical countermeasures must be paired with human oversight to address complex challenges like deepfakes.

Correa concludes with a call for technologists to take responsibility in addressing misinformation. The path forward involves enhancing education, employing responsible design, and utilizing robust engineering solutions. The overall takeaway is that combating misinformation is a collaborative effort that requires critical thinking and ethical frameworks to navigate the complexities of the post-truth world.


RubyConf 2018 - The Psychology of Fake News (And What Tech Can Do About It) by Cecy Correa

Fake news spreads six times faster on social media than true stories. As technologists, our industry has built the tools that enable the spread of disinformation across social media, the web, and beyond. But fake news is nothing new; it has accompanied each advance in the technology that powers the spread of information, from the printing press to blogging. What makes fake news so appealing? Is it a tech problem or a human problem? In this talk, I will explain the psychology that makes fake news appealing to our brains, and what technology can learn from this psychology to build better tools.


00:00:15.470 All right, my name is Cecy Correa, and I'm here to talk to you about the psychology of fake news, why people believe in it, and what we can do as developers, designers, and technologists to create experiences that are more honest and transparent for our users. To begin, I want to talk about Orson Welles and his 1938 broadcast of 'War of the Worlds.'
00:00:38.100 If you were here earlier during the break, you listened to Orson Welles's rendition of 'War of the Worlds,' which was broadcast on October 30, 1938, as their Halloween episode. Orson Welles, being the genius that he was, didn't want to do just any rendition of 'War of the Worlds.' He wanted to do something different, so he thought, wouldn't it be great to tell this story as a series of breaking news bulletins? They structured it as a news broadcast and presented it as if it were happening in real time. So on the night of the broadcast, they started off with a pretend music program, then weather updates, before announcing breaking news that a UFO had landed in New Jersey. Panic ensued. Reports indicated that due to this broadcast, people panicked in the streets, packed their bags, and attempted to flee to safety. There were even reports of individuals contemplating suicide out of overwhelming fear.
00:01:35.430 A study published in 1940 attempted to quantify the impact of this event, suggesting that approximately 1 million people listened to and were affected by the broadcast. This event has since become a cautionary tale about the power of media, and it is often discussed in media studies textbooks. That was then, and this is now. I want to play a clip for you, and the first thing I want you to do is close your eyes and just listen to the audio.
00:02:55.500 [Audio clip plays] Now that the clip is almost over: who did you think that was?
00:03:21.209 [Audience response: Barack Obama] Okay, now watch his mouth. [The clip continues: 'The context is that our political parties have moved further apart, making it harder to find common ground. When I said in 2004 that there were no red states or blue states in the United States, I was wrong.'] The video shows how technology can manipulate reality: facial-manipulation software was used to alter footage of President Barack Obama. Similarly, programs like Adobe's Project VoCo can take recordings of someone's speech and generate audio of them saying whatever we type out. If you're interested in learning more, I recommend the article at futureoffakenews.com and the Radiolab episode 'Breaking News,' which delves into the ethical considerations of these technologies.
00:04:51.419 In that Radiolab episode, one key question posed to the creators of these technologies was whether they were concerned about potential misuse. They responded that they believed people would possess the critical thinking skills necessary to discern fake content, assuming people were aware of the technology. However, this creates plausible deniability. If someone is caught in a compromising situation on video, they can easily claim it's a fake, leading to a society where no one can believe anything. We now live in a post-truth world where misinformation can spread rapidly, often faster than the truth.
00:06:15.060 At this point, we cannot simply pause and reflect on whether we should have built these tools; they are already here. We cannot turn back time or hide the technology. Therefore, we must figure out how to address its unintended consequences, and to do that, we need to understand why people believe in fake news.
00:06:41.070 We also need insight into the psychology behind fake news, specifically how beliefs are formed and how we process knowledge. Let's start with beliefs, referencing Daniel Kahneman's book 'Thinking, Fast and Slow,' which elaborates on the mechanisms of human thought. Our brains operate in two systems: System 1, which is fast, automatic, and intuitive, and System 2, which is slower and more analytical. We use our slow, analytical mind, System 2, to form beliefs; however, how we react to those beliefs relies on System 1.
00:08:00.149 For example, when we examine headlines from two different fake news websites owned by the same person, we can identify how they are written with similar provocative language, eliciting strong emotional responses. This technique, often referred to as yellow journalism, has been employed since the 1890s due to its effectiveness at appealing to our emotions, engaging our intuitive lizard brain.
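To make that pattern concrete, here is a toy Ruby sketch (my illustration, not something from the talk) that flags headlines leaning on emotionally charged trigger words; the word list and the scoring scheme are invented for the example:

```ruby
# Toy sensationalism check: scores a headline by the fraction of
# emotionally charged trigger words it contains. Word list is illustrative.
TRIGGER_WORDS = %w[shocking outrage destroyed slams exposed secret banned furious].freeze

def sensationalism_score(headline)
  words = headline.downcase.scan(/[a-z']+/)
  return 0.0 if words.empty?

  hits = words.count { |w| TRIGGER_WORDS.include?(w) }
  hits.to_f / words.size
end

puts sensationalism_score("SHOCKING: Senator SLAMS critics in furious rant")
# => ~0.43 (3 of 7 words are trigger words)
```

A real classifier would be trained on labeled headlines rather than a hand-written word list, but the signal it exploits, emotional charge aimed at System 1, is the same one Correa describes.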
00:09:01.320 When faced with challenging questions, we often opt for easier, emotionally charged answers. For instance, when asked how financial advisors who exploit the elderly should be punished, we might consider how much anger we feel toward these predators, rather than engaging with a more analytical assessment of justice. This emotional response can often blur the line between genuine knowledge and feelings.
00:10:06.750 Let's discuss the knowledge illusion, the central idea of Steven Sloman and Philip Fernbach's book 'The Knowledge Illusion.' In a classic exercise, participants proclaim that they know a lot about bicycles, only to fail entirely when asked to draw one. This experience leads to a reevaluation of their actual knowledge. We often do not realize how little we know until confronted with our inaccuracies.
00:12:08.430 Shallow knowledge is, in fact, by design; if we were constantly aware of everything we don't know, our brains would be overwhelmed, leading to dysfunction. The authors of 'The Knowledge Illusion' illustrate that to progress technologically, we can't expect to know every aspect of a complex topic.
00:12:40.240 Moreover, strong feelings often do not correlate with a deep understanding of issues. As more individuals come to believe something together, those beliefs are reinforced. Community plays a significant role in the spread of fake news: a story elicits an emotional response that drives individuals to share it, thereby reinforcing their connection to the community, regardless of its truthfulness.
00:14:02.690 A 2018 MIT study revealed that false information spreads six times faster than true information on Twitter, which can be traced back to these inherent psychological tendencies. Returning to the 'War of the Worlds' event, I want to clarify that much of what we believed about its impact is a myth.
00:15:06.440 The claims of widespread panic from the broadcast were vastly overstated. They were bolstered by newspapers of the time, which were losing ground to radio and eager to discredit the competing medium. In reality, panic and mass hysteria were minimal: subsequent surveys indicated that only a small percentage of people were even listening to the broadcast, and police reports confirming panic or suicide attempts were nonexistent.
00:16:03.220 Yet, as people repeatedly read about this supposed hysteria, a collective belief formed around the idea that it had occurred. Thus, fake news can create emotional responses that perpetuate myths, such as the impact of 'War of the Worlds.' Can we really disprove a fake news story that has already spread? While some methods might expose misinformation, the reality is that people resist admitting ignorance.
00:17:04.580 Returning to the bicycle analogy, if we ask people to explain a complex policy instead of drawing a bike, they might feel uncomfortably challenged, leading them to remember the experience as one of embarrassment rather than learning. The crux of the issue is finding a way to encourage critical thinking without creating defensiveness.
00:18:26.700 When addressing the perpetuation of misinformation, we need to incorporate a three-pronged approach: education, design, and engineering. First, let’s focus on education. During an interview with the Guardian, Mozilla Foundation’s Mitchell Baker emphasized the importance of integrating the humanities into a STEM education.
00:19:11.350 We must ensure that future technologists possess the framework and vocabulary necessary to contemplate the relationship between technology and society. It isn't about reducing the importance of STEM but rather enriching it with an ethical understanding of human behavior, privacy, and vulnerability. Hearing similar sentiments from various speakers further validates this perspective.
00:20:10.000 This isn’t just about educating technologists; it’s about cascading that knowledge into our communities. For instance, my boss mentioned the Cyber Collective in Brooklyn, which organizes free meetups at the local library to discuss vital topics like fake news and cybersecurity. These straightforward conversations help raise computer literacy before tackling more complicated discussions about online fact-checking. We must not only educate ourselves but reach out to those around us as well.
00:21:02.480 Next, let's discuss design. My friends Marissa Morbi and AJ Davis proposed a panel for South by Southwest on designing for a post-truth world. Marissa stressed the importance of understanding native ads and seamless user experiences: native ads can appear harmless, but when misused by disinformation campaigns they help misleading content blend in.
00:22:09.000 As an example, I want to highlight a screenshot from a platform where disguised ads blend in with organic posts, leaving users unsure what is paid content. Ad experiences need a balance of friction and transparency to protect users from misinformation.
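As a hedged sketch of what 'friction and transparency' could look like in code (my illustration; the talk shows no implementation), a feed renderer can simply refuse to display a paid post without an explicit sponsor disclosure:

```ruby
# Sketch of a feed renderer that enforces ad disclosure.
# FeedItem and the label format are invented for this example.
FeedItem = Struct.new(:body, :sponsored, :advertiser, keyword_init: true)

def render(item)
  return item.body unless item.sponsored

  # Transparency: a paid post never renders without naming its sponsor.
  raise ArgumentError, "sponsored item missing advertiser" if item.advertiser.nil?
  "[Sponsored by #{item.advertiser}] #{item.body}"
end

puts render(FeedItem.new(body: "Local news update", sponsored: false))
puts render(FeedItem.new(body: "Miracle cure!", sponsored: true, advertiser: "Acme Corp"))
```

The deliberate failure on unlabeled ads is the 'friction': it makes undisclosed sponsorship a bug rather than a default.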
00:23:00.100 Now, let's address engineering. After the release of the 'Future of Fake News' video, CNN ran a segment on deepfakes and how unprepared we are for the technology. As these technologies develop, both human oversight and AI are needed to systematically counter increasingly sophisticated deepfakes and information bots.
00:24:28.810 For example, companies must acknowledge their responsibility to moderate content. Traditional editorial practices remind us that this is not an entirely new problem; practices for vetting information before it spreads already exist. The collaborative platform Fact Check Me represents both a technological and a human effort to track stories and give users analytical feedback.
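That human-plus-machine division of labor can be sketched in a few lines of Ruby (my illustration; this is not how any named platform actually works): people write the verdicts, and software automates the lookup when a link is shared:

```ruby
require "uri"

# Invented sample of a human-curated fact-check database:
# the verdicts are written by human reviewers; the lookup is automated.
FACT_CHECKS = {
  "fakenewsdaily.example" => "Debunked by human reviewers: fabricated quote",
}.freeze

def check_link(url)
  host = URI.parse(url).host
  FACT_CHECKS.fetch(host, "No fact-check on file for #{host}")
end

puts check_link("https://fakenewsdaily.example/story")
# => "Debunked by human reviewers: fabricated quote"
```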
00:25:35.590 Additionally, bots' role in spreading misinformation has evolved, with increasingly sophisticated methods for amplifying divisive narratives on social media. Efforts like the Botcheck.me Chrome extension reflect the pressing need to combat the rise of automated misinformation.
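To give a feel for what automated bot detection involves, here is a deliberately naive rule-based sketch (my illustration; real tools like Botcheck.me use trained machine-learning models, not hand-written rules like these):

```ruby
# Naive bot-likelihood heuristic based on account features.
# Thresholds and signals are invented for illustration.
Account = Struct.new(:posts_per_day, :account_age_days, :default_avatar, keyword_init: true)

def bot_score(account)
  signals = [
    account.posts_per_day > 50,     # inhuman posting volume
    account.account_age_days < 30,  # very new account
    account.default_avatar          # no profile customization
  ]
  signals.count(true) / signals.size.to_f
end

suspect = Account.new(posts_per_day: 200, account_age_days: 5, default_avatar: true)
puts bot_score(suspect) # => 1.0 (all three signals fire)
```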
00:26:45.020 Jessica Powell's reflections on the contradiction between Silicon Valley's precision in targeted advertising and its claimed helplessness in preventing misinformation further highlight our industry's responsibility. It is imperative that we, as professionals, take accountability for our role in addressing these challenges.
00:27:28.200 To conclude, tackling misinformation requires a collaborative effort that involves education, ethical design, and robust engineering strategies. We must adapt to our evolving post-truth landscape as we navigate the complexities of fake news and its impact on society. Our ability to create tools that promote critical thinking will largely dictate our future.
00:29:06.810 If you have any questions or want to see my slides and research, feel free to reach me at @sassycorrea on Twitter. You should see my slides posted shortly. I’m also happy to answer any questions in the hall or on Twitter. Thank you.