Episode 2
Ever get fooled online? It might be because of the way your brain works. Professors Steven Sloman and Lisa Fazio describe cognitive biases and give advice to help students recognize and overcome common errors.
Earn professional development credit for this episode!
Fill out a short form featuring an episode-specific question to receive a certificate.
Please note that because Learning for Justice is not a credit-granting agency, we encourage you to check with your administration to determine if your participation will count toward continuing education requirements.
Resources and Readings
Steven Sloman
Cognitive, Linguistic & Psychological Sciences, Brown University
References:
- Steven Sloman and Phil Fernbach: The Knowledge Illusion: Why We Never Think Alone
- Daniel Kahneman: Thinking, Fast and Slow
Lisa Fazio
Psychological Sciences, Vanderbilt University
References:
- Calling Bull: Data Reasoning in a Digital World
Transcript
Monita Bell: So, I got duped recently on a social media platform—only for a few minutes; I just wanna point that out. I was ecstatic that a music artist I loved started following me. And I immediately sent a screenshot to my sister as evidence of this amazing news. But then ... that blue check mark was conspicuously missing, as were thousands of followers.
Turns out this particular account was using the same profile photo as my beloved artist and included the word “official” in its user handle. They got me. I knew what to look for, and they still got me. Briefly. But what about when the thing that gets us isn't a fake social media profile? Mind you, watch out for that. With all these information breaches on Facebook and Google, you really can't be too careful.
What about when the thing that gets us is some actual fake news? What about when misinformation pops up in otherwise credible places? Or in a student's history assignment? Have you ever retweeted a story without reading it because you agreed with the headline? Or ever shared something on Facebook and found out later it wasn't completely true? Odds are, you have, and there's a reason for that.
The way our brains work, and the way my brain was working when I got fooled recently, is to take shortcuts, or heuristics, to process information. Our brains make these split-second decisions based on things we've already seen or heard. So, even when something not quite right slips in there, we might not notice it.
So what can we do about our brains? We're gonna get into that in today's episode. I'm Monita Bell, your host for The Mind Online. This podcast comes to you from Teaching Tolerance, a project of the Southern Poverty Law Center. In each episode, we'll explore an aspect of the digital literacy world, what educators and students alike need to know and how educators can guide students to be safe, informed digital citizens.
Today, you're gonna hear from Lisa Fazio, an assistant professor of psychology and human development in Vanderbilt University's Peabody College. You'll also hear from Steven Sloman, professor of cognitive, linguistic and psychological sciences at Brown University.
And, spoiler alert, they're both gonna tell us how bad we are at actually knowing things and at fact-checking—students, educators, baby boomers, centennials, everybody. I don't care how old you are.
But, on the flip side, they're gonna tell us how we can know and do better. Let's get into it. First up, my conversation with Steven Sloman.
Hello, Steven. Thank you so much for talking with me today.
Steven Sloman: It's a great pleasure, Monita.
Monita Bell: I think our listeners are gonna have a lot to take away from this conversation. Just to start, will you introduce yourself and tell us a little about what you do?
Steven Sloman: Sure. My name is Steven Sloman. I'm a professor of cognitive, linguistic and psychological sciences at Brown University, and I study how people think, how people reason about the world and how we divide the world up into parts, and how we collaborate with each other—everything having to do with cognitive process, or the process of thought.
Monita Bell: Okay. Thank you for that. What we are very concerned with on this podcast, and in our work, is how educators can get their students to think about the ways that they think as they engage with digital technologies and social media and otherwise participate online.
So, can you talk about how the cognitive processes that you study play into how young people engage with the digital world? And all of us, really, I suppose.
Steven Sloman: Absolutely. That was gonna be my first comment: that young people are people, too. I study how people think, and young people just think like other people. Sometimes they have less knowledge, but the fact is that adults often don't have very much knowledge either.
So, what I have discovered, over the many years that I've been studying thought, is that we think about the world, and the world is incredibly complex. Everything you think about, from the lowly ballpoint pen, to complicated things like space shuttles—each of them has an immense amount of complexity and interacts with other things in extremely complex ways.
So, the secret of cognition is simplification. We're constantly simplifying the information that comes in. We're simplifying our understanding of how the world works. We're simplifying the strategies we use to make decisions and solve problems.
One way we simplify is by using what's called heuristics, or rules of thumb. For instance, if we see a new person, then we use this bank of prior information in order to understand who that person is, to predict their behavior, to know what the best way to interact with them would be.
But we can't use all the information we know about people. There's just too much of it. And so, we simplify often by using stereotypes. We take general knowledge we have about classes of people and understand people in terms of that general knowledge. It's important to understand two things about stereotypes.
One is that they're often based on something. They don't generally come completely out of the blue. On the other hand, they can be very wrong about individuals. By virtue of using stereotypes, we often get the facts really wrong. That's one example.
Another example is what's sometimes called the availability bias. We will use information that is really recent, that's really available in memory, in order to make judgments and decisions, even if that information isn't representative of the world at large.
So, if I'm driving along on the highway, I'll be honest with you—don't tell anybody—but I generally drive a little bit above the speed limit.
Monita Bell: I don't think you're alone.
Steven Sloman: But if I pass a crash, then I slow down. Right? And, for the next several miles, until I've forgotten about the crash, I actually drive at a more moderate pace; I'm more careful because the possibility of a tragic event is so available to me that it governs my behavior.
Another example, I think, is the way we respond to terrorist incidents. Often society gets very scared and closes its borders and starts checking people and becomes scared of people of various nationalities because that's where the terrorists came from. But then we forget.
We become hypervigilant immediately after the incident even though, in a sense, that's the safest time: because everybody's hypervigilant, it's the worst time for a terrorist to act.
But, when the event is available, that's when we act. When there's just been a hurricane, then we're very sensitive to the possibility of hurricanes. So, we respond more than we should to what is sitting in our memories. That's another example of a heuristic that leads to certain kinds of biases.
The source of bias that has interested me most in the recent past is the degree to which we depend on other people for our thinking. So, as I said, the world is incredibly complicated. We can't work things out. So, often, we depend on other people to work things out for us.
I like to say we outsource our cognitive problems. If we wanna know what the best brand of cereal to buy is, it's actually a complicated problem, right? There's lots of nutrients involved. We don't know exactly what the fat and caloric content is, et cetera.
And so, we depend on the people around us to tell us what good cereals are. Marketers depend on this fact. They try to be the experts that provide the information. If we're deciding what our political position should be on, say, medical care, gee, that's an incredibly complicated issue. In fact, all federal issues tend to be incredibly complicated—energy, warfare, any economic issue. You name it, these are incredibly complicated issues. We can't work things out ourselves, so we depend on the people around us.
What we've seen, recently, is that this dependence on people around us has caused us to coalesce into tribes where we have different groups of people with very different positions. The people within each group tend to have a common position. That's not by chance. That's because they depend on each other for their thinking.
If you ask the vast majority of individuals why they feel the way they do about, say, the Supreme Court, people aren't gonna be able to give an incredibly intelligent answer because we're not lawyers and we haven't studied the Supreme Court decisions. There's just so little we can say.
So, we depend on others to tell us whether we should support a particular nominee or not. This leads to a particularly scary kind of bias because it leads to polarization.
This may be getting worse by virtue of the nature of modern digital technology.
Monita Bell: Why is that?
Steven Sloman: Modern digital technology … So, first of all, let me try to dispel a common misconception. There's actually surprisingly little evidence that social media, per se, is the source of all our problems.
People who have studied this cannot find direct evidence that social media, as distinct from other digital technological formats, has created a problem. Which isn't to say that digital technology in other forms, like cable TV, for instance, might not be really important.
But I think what's happened is we have two groups in society that have decided to define themselves in opposing ways, and what technology has done is give every dissenting voice a huge audience, and people really respond to negativity.
We respond to fear. We respond to resentment. And so, when you have people who take advantage of our responsiveness to fear and resentment and other negative tones of voice, let's say, they can broadcast their feelings across to the entire population even beyond American borders now.
That leads to these waves of reaction—waves of acceptance by some and reaction by others—and so the others take advantage of digital platforms to express resentment and disaffection with the other side. And you get into this ping-pong that leads to a huge amount of polarization.
Monita Bell: Cognitively, why do we tend to respond more, or at least more vocally, to these negative things—the fear and resentment? Why is it that responses to positive things seem to be less apparent, or, you know, why do people make less noise about that?
Steven Sloman: Well, it seems that our cognitive system is designed to be more responsive to possible costs, possible dangers in the world than it is to possible benefits. You can see this at the lowest levels of cognition. So, if there's the possibility of a tiger in the environment, then our cognitive systems are designed to make us turn away and run.
And that makes a lot of sense, right?
Monita Bell: Right.
Steven Sloman: It makes sense because negative things can often have really tragic consequences. They can end our lives, whereas, positive things, they're not quite symmetric. There is no positive thing that is going to do something as good for us as a negative thing can do bad.
Even if we find the bounty of food that keeps us healthy for the next period of time, we can't be assured that we're gonna have it for eternity. Nothing is gonna give us eternal life, whereas, other things are gonna end our lives.
So, there's a sort of basic asymmetry in the world such that negative things really are, in some sense, worse than positive things. And the human mind is constructed to be responsive to that. Here's one simple demonstration. Suppose I say, "I'm gonna flip a coin, and if it comes up heads, you give me ten dollars, and if it comes up tails, I give you ten dollars." Would you accept that bet?
Monita Bell: Mmm … I don't know.
Steven Sloman: Yeah, right. Not clear. Most people wouldn't. Most people say, "Nah. I'll just stick with the status quo. I'll stick with the way things are."
Monita Bell: Yeah.
Steven Sloman: And the reason is that the prospect of winning ten dollars, for most people, isn't as positive as the prospect of losing ten dollars is bad.
Monita Bell: Right, right.
Steven Sloman: So, the negative thing looms larger. The possibility of loss looms larger than the possibility of gain. This is actually a real stabilizing force in society because it means that we tend to be happier with what we have than we are with the prospect of getting other things.
Now, this is certainly not always true, but for the most part, people stick with their spouses, right, because the fear of losing a spouse looms larger than the prospect of getting a new one. We certainly feel that way about our other family members. The prospect of losing a child is horrifying.
Now, the prospect of getting a new child is, of course, very positive. But it's not quite as massive in its emotional effect as the prospect of losing a child. So, the fact that we are, in some sense, not willing to risk things for the prospect of gains, because we're so scared of losing, is a kind of stabilizing force in society. It makes us happier with the way things are.
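A little arithmetic makes the coin-flip example concrete. What follows is a minimal sketch using the value function from prospect theory, the framework behind Kahneman's Thinking, Fast and Slow; the loss-aversion coefficient λ ≈ 2.25 is a commonly cited estimate from that literature, not a figure from this conversation, and diminishing sensitivity is set aside for simplicity.

```latex
% In dollars, the bet is exactly fair:
\[
\mathbb{E}[x] = \tfrac{1}{2}(+10) + \tfrac{1}{2}(-10) = 0
\]
% But if losses are weighted by a coefficient \lambda > 1
% (here v(x) = x for gains and v(x) = \lambda x for losses),
% the felt value of the same bet is negative:
\[
V = \tfrac{1}{2}\,v(+10) + \tfrac{1}{2}\,v(-10)
  = \tfrac{1}{2}(10) + \tfrac{1}{2}(-\lambda \cdot 10)
  \approx 5 - 11.25 = -6.25 < 0
\qquad (\lambda \approx 2.25)
\]
```

The bet is fair in dollars but negative in felt value, which is why most people decline it: the possible loss looms roughly twice as large as the equal-sized gain.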
Monita Bell: I'm thinking about the heuristics that you've been talking about in terms of how we receive and process information. When we think about the educators out there who are teaching young people to navigate the various information sources that they come across—in this, I guess, wide world of, in some cases, very negative information or information that is responsive to something negative—what should educators know about guiding their students when it comes to being aware of these heuristics and how they affect what students are coming across as they seek out information?
Steven Sloman: Yeah, that's a great question. I think the major lesson is to teach students that they're not gonna fully understand things themselves because every issue is just too complicated for them to understand. This actually violates what we hear on TV all the time.
We're told, "Get the information yourself. Make your own decision." I actually think that's bad advice. On something like medical care—nationalized medical care—you've gotta study for decades and decades to really understand nationalized medical care.
And I don't think sitting at the computer studying the issue for an hour or two is gonna be sufficient to tell you how you should feel on the issue. It's gonna tell you very little, in fact. I think we have to accept that we have to depend on others for our information.
But that doesn't mean we can't reason. It means what we have to reason about is where we're getting the information from. What we have to reason about is whose opinion we're going to respect.
And that itself is a complicated subject. I tend to believe people who have credentials, who've studied an issue, who have demonstrated a certain amount of objectivity, who've gotten things right in the past and demonstrated they understand how things work, and who have the trust of other people that I respect.
In some sense, I think that what we should be teaching people, and not just students but everyone around us—I mean, our entire society could learn this—is that what we need is a serious discussion about who to trust. And we have to get clearer criteria about who we think we should trust and who we shouldn't.
I mean, it's interesting that the different poles of political perspectives in our society seem to have very different perceptions of that, right—who they should trust.
Monita Bell: Right. I think that raises an interesting point because when we talk about, from an educational standpoint, evaluating sources, for instance—
Steven Sloman: Yeah.
Monita Bell: We’re taught to look for these cues that speak to credibility.
So, you look at, say, what does the URL end in? Does it end in an .edu, does it end in a .org? You look at how recently the information was published, et cetera. But, increasingly, it's easy for people who lack credibility to make something look credible.
So, it sounds like what you're saying is we need to do a lot better job, a more thorough job, of investigating the criteria for credibility.
Steven Sloman: Yeah. Exactly. And as you just pointed out, it's getting harder and harder because people are getting better and better at deceiving us.
Monita Bell: Mmm. Mm-hmm.
Steven Sloman: So, it seems to me the other thing we need, in order to deal with the fact that there's so much misinformation and falsity out there that we have to try to filter out, is a society in which it's okay to say, "How did you find that out?" Or, "Why should I believe you?" Right?
Monita Bell: Mm-hmm.
Steven Sloman: I worry that in today's day and age, even in the most educated circles, it's not cool to question people. We're often taught not to challenge people because we might be making them feel uncomfortable. But if we want to get to the truth ... Look, I have passed on misinformation myself. I don't know anyone who hasn't heard something that they passed on, and later it turned out that what they heard was false.
So, if we could develop a culture in which it's okay to challenge and to check, because we all really wanna get it right ... I actually think that's necessary in order to walk that line that you're talking about, figuring out what's true and what's not, especially in digital media.
Monita Bell: One thing I also hear you saying is that it's nearly impossible to really know something is true without a shadow of a doubt.
Steven Sloman: That's true.
Monita Bell: Yeah.
Steven Sloman: Yeah. No, look, that's a fact. I mean, the theory of evolution is a theory. Right? And there's a sense in which it could be false. I happen to believe it because there's a preponderance of evidence in its favor. But, it's just a theory, and it may be that the sun doesn't rise tomorrow.
It's a very, very low-probability event, thank God. But it's a possibility. So, we can never be sure of anything.
Monita Bell: Thinking back to this idea of cognition, and how we come to know and how we process information, one thing I'm thinking—I just wanna get your thoughts on this—is that before educators start talking to their students about doing online searches and weeding through all the information, perhaps they first need to think about the various ways that we actually take in information.
So, they need to be thinking about all the heuristics. They need to be thinking about what kinds of voices rise to the top versus those that get pushed down. So, the negative versus the positive. Would you say that, in your estimation, that's a fair assessment?
Steven Sloman: Absolutely. But I would like to add something, which follows directly from the point you just made. Which is that however much information a student collects, however careful they are, they're not gonna discover the whole story. Whatever they produce is not gonna be the final word.
And so, the other thing that a teacher has to instill in a student is the understanding that their information is limited, that their knowledge is necessarily limited and that they shouldn't be too confident in their view.
We have to reduce the level of hubris in our society. None of us know everything, and, in fact, all of us know shockingly little about things.
Monita Bell: That's an excellent point. It also made me think about something you have talked about before, which is this concept of “contagious understanding.”
Steven Sloman: Yeah.
Monita Bell: I wanna make sure I have this right. Contagious understanding, then, is this idea that because people around us understand something, we believe we understand it too?
Steven Sloman: Exactly. Yes. So, we've—
Monita Bell: And, therefore, that we have knowledge.
Steven Sloman: Exactly. So, we've shown this in the laboratory—exactly what you just said. That's right. We just tell people that others understand, and that makes people feel like they understand a little better themselves.
Monita Bell: Mm-hmm.
Steven Sloman: It's this notion of outsourcing, right?
Monita Bell: Right.
Steven Sloman: We outsource our understanding, and the fact that other people understand gives us personally a sense of understanding. Which in general, is fine, right? I mean, I understand how my car works in the sense that I can drive it. Even though it's incredibly complicated, and there's no way I could fix it if something broke. So, I do depend on the understanding of the manufacturers of the car and mechanics.
The issue is that people don't fully appreciate the extent to which their knowledge is actually sitting in other people's heads.
Monita Bell: Okay. Which goes back then to the credibility thing—who we choose to place our trust in.
Steven Sloman: Exactly.
Monita Bell: Okay. Fantastic. Is there anything else you think educators should know as they go about trying to teach their young people how to know and how to go through all these webs of knowledge that they encounter? Something that we haven't touched on yet?
Steven Sloman: Yeah. Let me say one last thing. It's actually the heart of this book that I wrote with Phil Fernbach, called The Knowledge Illusion. What we describe in there is how the process of explanation reduces people's hubris. So, one way to get someone to appreciate how little they understand is simply to ask them to explain how it works.
If you ask someone to explain how a bicycle works, they generally come out saying, "Oh, it's much more complicated than I thought. I guess my understanding is more limited than I thought." And, certainly, if you ask people to explain how a political policy would have its effect, people have remarkably little knowledge.
And so, it's a way for people to discover themselves how little understanding they actually have. And, at the same time, to learn something, right? And to discover the gaps in our understanding.
Monita Bell: Mmm. Mm-hmm. I like that. So often our endeavor is to go find all the resources you can on this topic: who said this, who said what. But, in many ways, you're suggesting, I think, part of the goal should be to find out how little we do know.
Steven Sloman: Right.
Monita Bell: And then to determine the gaps.
Steven Sloman: And we can do that by trying to put the information together. By trying to generate an explanation that synthesizes the information, and in the process, we'll learn something, but one thing we'll learn is how little we understand.
Monita Bell: Well, fantastic. Thank you so much. I think the only thing I have left to ask you is if, other than The Knowledge Illusion, which you just mentioned—thank you for that—can you think of any other resources that might be useful to educators as they grow in their own development around this topic of cognition and knowing?
Steven Sloman: There are a lot of resources online that discuss cognitive illusions and cognitive bias, and the heuristics that people use. One very popular source is a book by Daniel Kahneman, who's a psychologist who won the Nobel Prize in economics for his work on how people make judgments and decisions, and he has a very readable book called Thinking, Fast and Slow that people might enjoy if they haven't read it already.
Monita Bell: All right. Steven Sloman. Thank you very much for your time today. I appreciate it.
Steven Sloman: Great pleasure. Thanks for having me.
Monita Bell: Bye-bye.
Steven Sloman: Take care. Bye-bye.
Monita Bell: That was Steven Sloman, professor of cognitive, linguistic and psychological sciences at Brown University. Next, Lisa Fazio is going to explain why we all stink at fact-checking and what we can do about it, especially as it concerns teaching our students.
Did you know that Teaching Tolerance has other podcasts? We've got Teaching Hard History, which builds on our framework, Teaching Hard History: American Slavery.
Listen as our host, history professor Hasan Kwame Jeffries, brings us the lessons we should've learned in school, through the voices of leading scholars and educators.
It's good advice for teachers and good information for everybody.
We've also got Queer America, hosted by professors Leila Rupp and John D'Emilio. Joined by scholars and educators, they take us on a journey that spans from Harlem to the frontier West, revealing stories of LGBTQ life that belong in our consciousness and in our classrooms.
Find both podcasts at tolerance.org/podcasts, and use them to help you build a more robust and inclusive curriculum.
Hello, Lisa. Thank you so much for talking with me today. I really appreciate it. First, will you just start by introducing yourself and telling us a little bit about what you do?
Lisa Fazio: Yeah. I'm Dr. Lisa Fazio. I'm an assistant professor of psychology at Vanderbilt University's Peabody College. I study how children and adults learn new information about the world—both correct information and then also incorrect information. And then, if they do learn false things, how we can correct that knowledge.
Monita Bell: On the topic of that incorrect information, I know you've done some exploring of fake news. That's something we're definitely interested in exploring, as well. How did you become interested in studying this particular topic?
Lisa Fazio: Yeah. Back when I was an undergrad, I got really interested in false memories, and how you can kind of suggest things to people, and then they'll falsely remember different events in their life. But then in grad school, we started studying, not false memories for events, but false memories for facts.
People misbelieving that the Atlantic is the largest ocean on Earth. Things like that. That work really led me, nicely, into this work on fake news: why we aren't that good at noticing errors in the world around us, and why we pick up those errors and use them later on.
Monita Bell: Mm-hmm. That speaks directly to something you've written about, which is the Moses Illusion. Can you just tell our listeners what that is and why it's important?
Lisa Fazio: Yeah. The best example of the illusion is you ask people the question, "How many animals of each kind did Moses take on the ark?" Most people will say, "Two." Even though, when you ask them to explain or give them some time, they realize that, well, of course, they know that it was Noah, not Moses, who took the animals on the ark.
But what we find is, not just with that question, but with lots of similar questions and similar facts, as long as the information's close enough to being correct, our mind seems to just skim over it, and assume that it's the correct information.
Monita Bell: This is related to another concept I know you've written about called knowledge neglect, right?
Lisa Fazio: Yeah. This is the broader term we use for all of the times when people have relevant knowledge in their heads, but they don't use it in the situation at hand. So, you don't notice that it's Moses, not Noah. In other work, we've shown that people believe repeated statements to be more true than things they've only heard once.
What's interesting is that that happens even when you know the correct information. So, even among people who know that Scottish men wear kilts, if you hear the sentence, "The skirt that Scottish men wear is called a sari," twice, then you believe it's more true than if you've only heard it once.
So, even though you've got that prior knowledge, it doesn't protect you against this repetition effect.
Monita Bell: Why do our minds skip over these things, even when we have the facts? What are our brains doing to cause that to happen?
Lisa Fazio: Yeah. Most of the time, it's a really useful heuristic. Most of the time, people tell us true things, but they do also sometimes misspeak, repeat themselves, mispronounce things. Our brains are really good at assuming the proper answer and going along as if nothing was wrong.
It would take a lot of mental energy to constantly be checking everything that we hear against everything that we know about the world. So, instead, as long as it seems close enough, our brain just assumes that it's correct and moves on.
Monita Bell: Okay, so, tying this back to fake news: there's so much false information around us, and, certainly, in the digital world that we live in, it's much easier to come across that false information and to have it come at us in different forms and on different mediums and platforms, et cetera. You've pretty much said, in your work, that we're bad at fact-checking. I guess what you also just said speaks to that: our brains are doing these things that make us bad at fact-checking.
Can you just talk a little bit more about that? That even when we're, you know, reasonably smart or looking at different resources on a given topic, why are we so bad at actually checking things for facts?
Lisa Fazio: Yeah. One thing I like to point out is this is just a function of human brains. So, it doesn't matter if you're liberal, conservative, smart, not so smart. Those factors don't really play a role. This is just a shortcut that our brain uses to be fast and efficient as we're processing information.
But that doesn't mean that it's hopeless. There are some things that we can do that help us to be better at fact-checking and better at noticing errors. So, we're better at noticing errors when we know more about the topic.
History graduate students were less likely to fall for Moses Illusion-type questions that had to do with history facts, whereas biology graduate students were less likely to fall for questions built on biology facts. And, similarly, really slowing down and actively trying to notice false information can be useful in detecting those errors.
Monita Bell: As this concerns K-12 educators who are trying to make sure that their students have a good solid sense of digital literacy and the skills to navigate the digital world, how does this translate into what they're doing in classrooms—the slowing down and all that? Can you speak to that a little bit?
Lisa Fazio: Yeah. I think there's a few tips that educators can give their students. The first is before you share anything on any social media, just pause, take three seconds, five seconds, and think about how you know that this is true.
And, if you're not sure, then a quick Google search is normally really helpful in confirming or disconfirming what it is you're about to share. The other thing I like to tell people is that if something feels too good to be true, or if you read it, and you really, really want it to be true, that's probably a signal that you should check and see if anyone else is confirming that information.
Monita Bell: Like Snopes? [chuckles]
Lisa Fazio: Exactly.
Monita Bell: Put in snopes.com, yeah. Mm-hmm. Okay, so beyond the teaching of the slowing down, are there other things you would suggest that educators do in helping their students become better at fact-checking?
Lisa Fazio: Yeah. I think one useful thing is to be better able to evaluate graphs and data that are put out there. There's lots of ways that people misuse data or misrepresent graphs in order to make them misleading. There's a great website called callingbullshit.org, or, for younger audiences, there's also callingbull.org, and it's some professors who've put together really great materials on how to notice and debunk some of these common ways to misinterpret data in graphs.
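One of the classic tricks those materials cover is the truncated y-axis, which makes a tiny difference look dramatic. Here is a minimal sketch in Python with matplotlib that plots the same made-up numbers twice, once with a truncated axis and once with a zero-based one; the brands and values are invented for illustration, not taken from the Calling Bull site.

```python
# Two views of the same hypothetical data: a truncated y-axis (misleading)
# versus a zero-based y-axis (honest). All numbers are invented.
import matplotlib.pyplot as plt

labels = ["Brand A", "Brand B"]
values = [98.2, 99.1]  # nearly identical, by construction

fig, (ax_trunc, ax_full) = plt.subplots(1, 2, figsize=(8, 3))

# Misleading: the axis starts at 98, so a 0.9-point gap fills the frame.
ax_trunc.bar(labels, values)
ax_trunc.set_ylim(98, 99.2)
ax_trunc.set_title("Truncated axis: gap looks huge")

# Honest: the axis starts at zero, so the bars look nearly equal.
ax_full.bar(labels, values)
ax_full.set_ylim(0, 100)
ax_full.set_title("Zero-based axis: gap is tiny")

plt.tight_layout()
plt.show()
```

Asking "where does this axis start?" is often the fastest check a student can run on a chart.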
Monita Bell: Okay. Awesome, yeah. Thank you for specifying those URLs. I know those will be very useful to the educators in our audience. Are there any other resources you would recommend for teachers interested in learning more about this?
Lisa Fazio: I think reading up on reverse image search. That's the other—
Monita Bell: Mmm. Mm-hmm.
Lisa Fazio: —key thing I think is really useful in debunking things. So many of the false, fake news things we see now are reusing images in the wrong place or composite images. And, doing a quick reverse image search can be really valuable to notice those.
Monita Bell: Okay. And to go through those exercises with students, as well.
Lisa Fazio: Exactly.
Monita Bell: We're in such a meme culture. Yeah, that's very useful.
Lisa Fazio: Yeah. I haven't heard of anyone using this assignment yet, but I just think it's ... it would be so fun. Having students come up with the best piece of fake news they can. Put everything in there that they think makes it most likely to be believed, to go viral.
And, in the process of doing that, the students are gonna have to research, "What is it that makes fake news more likely to be believed, to be shared?" Things like that. I think that could be a really fun way to teach some of these activities.
Monita Bell: Just to follow up with that, what would you see, I guess, as the larger implications of students better understanding what makes folks more likely to believe or click? What do you think they would ultimately take away from that?
Lisa Fazio: Yeah. I'm hopeful that they'll then realize when they're being persuaded in those same ways—
Monita Bell: Mmm. Mm-hmm.
Lisa Fazio: —and it'll help them notice when that fake news comes across their news feed.
Monita Bell: So, just in considering all of the research and the work you've done, in the grand scheme of things, what would you say are some of the biggest misconceptions or the things that people get most wrong when it comes to understanding how we receive and interpret information, especially in this digital world?
Lisa Fazio: Yeah. I think one thing people often believe is, "It can't happen to me. This is something that only happens to other people." And we've got lots of evidence that that's not true. This is something everyone can fall for, and everyone does sometimes fall for it.
The other one is that this is some kind of new thing in society, whereas misinformation has always been with us; it's always been a part of society. I think what's new now is how quickly and far it can spread because of digital media.
Monita Bell: Okay. Do you have any final thoughts for our listeners who are educators out there?
Lisa Fazio: No, but I really hope that they will do some of this work with their students. I think it's really important, and they're out there on the front lines giving these students tools to notice this misinformation.
Monita Bell: Yeah, and I really ... I would say a big takeaway for me, just from what you've been saying, is the fact that we just have a lot of work to do to overcome what our brains naturally do—
Lisa Fazio: Right.
Monita Bell: —with the shortcuts. It's just something that we have to work toward when it comes to parsing all the information out there.
Lisa Fazio: Exactly. And, one last thing I wanna add is that fake news doesn't just happen in political contexts. There's all sorts of fake health information and other types of information. So, I encourage teachers to think broadly about what types of misinformation their students might be seeing.
Monita Bell: Okay. I think that's a great way to wrap this up. Thank you so much. Just one more time, can you tell us who you are and what you do?
Lisa Fazio: Of course. Lisa Fazio. I'm an assistant professor of psychology at Vanderbilt University's Peabody College. I study how people learn and remember both correct and incorrect information.
Monita Bell: Thanks for tuning in to The Mind Online, a podcast of Teaching Tolerance, which is a project of the Southern Poverty Law Center. I'm your host, Monita Bell, senior editor for Teaching Tolerance. This podcast was inspired by Teaching Tolerance's Digital Literacy Framework, which offers seven key areas in which students need support developing digital and civic literacy skills and features lessons for kindergarten through 12th-grade classrooms.
Each lesson is designed in a way that can be used by educators with little to no technology in their classrooms. The Digital Literacy Framework and all its related resources, including a series of student-friendly videos and a professional development webinar, can be found online at tolerance.org/diglit. That's tolerance.org/diglit. This episode of The Mind Online was produced by Jasmin López with help from Tasha Lemley and Talia Blake. Production was supervised by Kate Shuster. And special, special thanks to our guests Steven Sloman and Lisa Fazio. And to Teaching Tolerance senior writer, Cory Collins. Like what you heard? Then share this podcast with your friends and colleagues, and remember to rate and review us on iTunes or wherever you listen.