00:00:11.360
Awesome, thank you so much! Yes, let's go with the rules; I don't care what you all say, you can fight me on this. My name is Ceci, and I'm here to talk to you about the psychology of fake news and what we can do about it as technologists. Before I get started, I just want to give a shout out to Keep Ruby Weird. Before I decided to become a programmer, I was kind of soul-searching, trying to figure out what programming language I should learn, and the community I joined was really important to me.
00:00:25.250
So, I decided to buy a ticket to the inaugural Keep Ruby Weird back in 2014, not knowing anything about Ruby. I sat right there in the front row, and I saw this guy talking about his cats, and I knew that this was the community for me. I've been into Ruby ever since, so I'm still a newbie, but I'm very grateful for this community. It's just amazing to be here talking to you all right now, so thank you, Keep Ruby Weird!
00:00:54.310
Before we start talking about fake news, I want to touch on the future of fake news. If you want to learn more about some of the topics we're going to cover today, you can visit FutureOfFakeNews.com. That's an episode produced by Radiolab that delves into fake news in much more depth.
00:01:08.310
Now, I'm just going to show you a quick clip: "This film has been modified from its original version, formatted to fit the screen. It never happened. On the back end now of my presidency, as it nears completion, there are still all kinds of issues that I'm concerned about. The single most important thing I can do is to address the fact that our parties have moved further and further apart, making it harder to find common ground. When I said in 2004 that there were no red states and blue states in the United States, I was wrong."
00:01:54.110
That, in summary, is the clip: as his presidency neared completion, the parties had moved further and further apart, making it increasingly difficult to connect. And it never happened, which brings us to the future of fake news. If we look at the technology behind it, the face manipulation isn't quite there yet; you can tell it's not real. But the audio technology behind it is fascinating. Essentially, it comes from a piece of software called Project VoCo, developed by Adobe.
00:03:05.079
What this software does is take existing audio, and since any person speaking for just a few seconds will run through every sound necessary to produce the English language, it can create audio clips, in that person's voice, of words they never actually said. This is interesting because you don't necessarily need video of someone actually moving their mouth: you can take existing footage that doesn't show the mouth and pair it with a generated audio clip to produce something that sounds very close to what that person could have said.
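Since this is a Ruby crowd, here is a tiny, purely illustrative Ruby sketch of the underlying idea. This is not Adobe's actual API or code (which isn't public); the phoneme labels and "clips" below are made-up placeholders standing in for real audio segments sliced out of an existing recording of the speaker.

  # Toy concatenative synthesis: not Project VoCo, just the general idea.
  # Each "clip" is a placeholder string standing in for real audio samples.
  PHONEME_CLIPS = {
    "HH" => "<hh.wav>", "EH" => "<eh.wav>", "L" => "<l.wav>",
    "OW" => "<ow.wav>", "W" => "<w.wav>", "ER" => "<er.wav>", "D" => "<d.wav>"
  }

  # Build "audio" for a word the speaker never recorded by concatenating
  # phoneme clips in order. Real systems also smooth the joins and match
  # prosody, which is the genuinely hard part.
  def synthesize(phonemes)
    phonemes.map { |p| PHONEME_CLIPS.fetch(p) }.join(" + ")
  end

  puts synthesize(%w[HH EH L OW])  # a "hello" the speaker never said
  puts synthesize(%w[W ER L D])    # likewise "world"

The point is that the raw material is entirely real audio of a real person; only the ordering is new.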
00:04:39.490
That technology isn't out there publicly yet; it's still forthcoming. If you want to learn more about it, again, check out FutureOfFakeNews.com. Now let's discuss how we arrived at the world we're currently in, where many people blame fake news on the social media bubble created by platforms like Facebook.
00:05:00.000
First of all, we have selective feeds. Essentially, this means that platforms like Facebook track the things you click, like, and comment on, and aim to show you more content that fits that profile. From a product perspective, this makes sense: they want you to keep engaging with their platform. The outcome, however, is that people only ever receive one type of information.
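To make that product logic concrete, here is a minimal Ruby sketch of engagement-based ranking. The topic tags, actions, and weights are invented for illustration; this is not Facebook's actual algorithm, just the general shape of one.

  # Illustrative feed ranking: score topics by past engagement, then
  # show the posts whose topics you have engaged with the most.
  ENGAGEMENT_WEIGHTS = { click: 1, like: 2, comment: 3 }

  def affinity(history)
    history.each_with_object(Hash.new(0)) do |event, scores|
      scores[event[:topic]] += ENGAGEMENT_WEIGHTS[event[:action]]
    end
  end

  def rank_feed(posts, history)
    scores = affinity(history)
    posts.sort_by { |post| -scores[post[:topic]] }
  end

  history = [
    { topic: :cats, action: :like },
    { topic: :cats, action: :comment },
    { topic: :politics, action: :click }
  ]
  posts = [
    { topic: :politics, title: "Election op-ed" },
    { topic: :cats, title: "Cat video" }
  ]

  rank_feed(posts, history).each { |post| puts post[:title] }
  # The cat video comes first: you engage with cats, so you get more cats.

The feedback loop is the important part: whatever you engage with today determines what you are shown tomorrow.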
00:05:29.700
If you encounter a piece of information you disagree with, it's very easy to block it. On top of that, platforms like Facebook show link previews: a headline, a short blurb, and a photo that give you the gist of whatever the article is trying to convey. As a result, people no longer click through to articles; if you consume information in a feed, you're likely just skimming. You may like something, but for you to actually click through, it really has to grab your attention.
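For what it's worth, those previews are typically built from the article's Open Graph meta tags. Here is a rough Ruby sketch of how a platform might extract one, assuming the nokogiri gem is available; the HTML below is a made-up example.

  require 'nokogiri'

  # A link preview is just the page's Open Graph tags: a headline,
  # a blurb, and an image, which is often all a reader ever sees.
  def link_preview(html)
    doc = Nokogiri::HTML(html)
    og = ->(prop) { doc.at("meta[property='og:#{prop}']")&.[]('content') }
    { title: og.call('title'), blurb: og.call('description'), image: og.call('image') }
  end

  html = <<~HTML
    <html><head>
      <meta property="og:title" content="Ten Ways to Write Headlines">
      <meta property="og:description" content="Number nine will shock you.">
      <meta property="og:image" content="https://example.com/shock.jpg">
    </head></html>
  HTML

  p link_preview(html)  # prints the title, blurb, and image URL as a hash

Publishers control these tags themselves, which is exactly why the preview can be tuned to provoke a reaction without anyone ever reading the article.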
00:06:54.690
This gives rise to clickbait, which becomes a real problem when you have to write headlines that grab people's attention: I'm going to share with you ten ways to write effective headlines, and number nine is going to shock you! What happens now, as a result of these clickbait trends, is that we get more and more targeted content.
00:07:34.390
News outlets are beginning to shape their content in ways they know their audience will react to, whether positively or negatively, because your reaction is what drives clicks. This is why news coverage is no longer objective; it focuses more on what will resonate with the audience than on presenting the truth. For example, two major media outlets can cover the same event from two completely different perspectives.
00:08:19.420
Who do you believe? You're probably going to trust whichever outlet aligns more with your worldview. This makes it very hard to parse what is happening around us. Even established news sources can fall into this pattern, making it easy for fake news outlets to use similar approaches. You could have two news outlets with opposing political leanings, both owned by the same person. This demonstrates how easy it is for misinformation to be spread with just a little effort.
00:09:40.320
And fake news outlets can profit significantly: reports show that at the peak of the 2016 election cycle, websites dedicated to fake news were making between ten and thirty thousand dollars a month. Fake news also has a long history; clickbait is really nothing more than yellow journalism, a term coined in the 1890s.
00:10:30.080
This type of news coverage has persisted for over a century because it works. Let's discuss what convinces people to believe things that are distorted or untrue. We’re going to look at some psychological principles that explain this phenomenon, starting with system one and system two thinking as outlined in the book 'Thinking, Fast and Slow.'
00:11:16.020
Essentially, we have two ways of thinking: system one is fast and intuitive, while system two is slow and analytical. System one thinking shapes our beliefs, which is part of why beliefs are so difficult to change: they feel intrinsic to our identity. When you're confronted with information that challenges your worldview, processing it requires analytical, system two thought, and that is hard work.
00:12:24.920
That is why processing information that differs from our beliefs or worldview is more difficult, and why we tend to consume information that aligns with what we already believe: it's easier. Another important principle comes from the book 'The Knowledge Illusion,' which explains how little we actually know about the world.
00:13:04.540
The authors argue that if we had to master all details about everything we interact with daily, we simply wouldn’t function. For example, if you needed to know every detail about a car beyond just driving it, you'd be overwhelmed. Our brains do a fantastic job of allowing us to function without needing a deep understanding of everything we encounter.
00:15:01.360
Furthermore, there was a study where participants were asked to draw a bike after rating their own knowledge of bikes. Many felt confident they knew bikes well but struggled to draw one accurately. That kind of experience highlights how much we overestimate our own knowledge, and that we really only retain what we need to know.
00:16:04.260
The participants would generally rate their knowledge lower after attempting to draw a bike. This reveals that people are typically unaware of their ignorance until personally confronted with it. We are wired to be okay with not knowing; however, we tend to overestimate what we know. Additionally, if a belief is shared within a group, it gets reinforced.
00:16:54.800
We only need to know a little about something, and if that little is shared among our peers, the belief gets stronger. This dynamic makes it difficult to convince people otherwise when they're faced with alternative information. Clay Johnson captures this well with his concept of an information diet.
00:18:50.310
Imagine your favorite delicious but unhealthy food, like a bacon cheeseburger. Now think about the healthy food you eat but don’t particularly enjoy, like kale. In terms of an information diet, the information you agree with is like eating the delicious burger that brings pleasure to your brain, while challenging information is like eating something healthy that you dislike.
00:19:41.350
This illustrates why, when people encounter facts that challenge their beliefs, those facts are hard for them to digest. If we go back to system one and system two thinking, challenging a belief requires analytical thinking, which is simply harder.
00:20:12.770
To summarize: deep beliefs do not require deep understanding. Those beliefs are reinforced by being shared within groups, which makes differing viewpoints even harder to reach. And because we're often unaware of what we don't know, we don't go looking for content that challenges us.
00:20:32.650
It’s far easier to consume information that aligns with our beliefs, much like junk food is easy to enjoy. This is how we create and sustain bubbles of misinformation. People demand accountability from tech companies, including platforms like Facebook, for spreading fake news.
00:21:25.700
While I don't necessarily agree that it's entirely tech's fault, it's important to note that we have built tools that can reinforce those behaviors. A quote I like from Alan Kay illustrates this point well: 'The internet was built so well that most people think of it as a natural resource, like the Pacific Ocean, rather than something created by humans.'