In this final episode, highlights from our guest interviews walk listeners through the seven key areas of Learning for Justice’s Digital Literacy Framework.
Earn professional development credit for this episode!
Fill out a short form featuring an episode-specific question to receive a certificate.
Please note that because Learning for Justice is not a credit-granting agency, we encourage you to check with your administration to determine if your participation will count toward continuing education requirements.
Resources and Readings
- Heidi Beirich (Episode 3): Intelligence Report, Southern Poverty Law Center
- Meredith Broussard (Episode 11): Artificial Unintelligence: How Computers Misunderstand the World
- Katy Byron (Episode 9): MediaWise, The Poynter Institute
- Rafranz Davis (Episode 4): Lufkin ISD
- Lisa Fazio (Episode 2): Psychological Sciences, Vanderbilt University
- Keegan Hankes (Episode 10): Intelligence Report, Southern Poverty Law Center
- Erica Hodgin (Episode 5): Civic Engagement Research Group, University of California, Riverside
- Joe Kahne (Episode 5): Civic Engagement Research Group, University of California, Riverside
- Alicia Johal (Episode 8): Sweetwater Union High School District
- Matthew Johnson (Episode 1): MediaSmarts
- Safiya Noble (Episode 3): Algorithms of Oppression: How Search Engines Reinforce Racism
- Betsy O’Donovan (Episode 11): Journalism, Western Washington University
- Lois Parker-Hennion (Episode 7): Tappan Zee High School Library
- Gordon Pennycook (Episode 9): Behavioural Science, University of Regina
- Meenoo Rami (Episode 10): Minecraft Education, Microsoft
- Melissa Ryan (Episode 6): "Ctrl Alt-Right Delete" newsletter
- Steven Sloman (Episode 2): Cognitive, Linguistic & Psychological Sciences, Brown University
- Will Sommer (Episode 6): "Right Richter" newsletter, The Daily Beast
- Julia Torres (Episode 7): Disrupt Texts
- Kelly Weill (Episode 8): The Daily Beast
- Shana White (Episode 1): Constellations Center for Equity in Computing, Georgia Tech
- Sam Wineburg (Episode 4): Stanford History Education Group
Monita Bell: I’m no longer in the classroom, but the English teacher in me is still hanging around. She shows up now and then when I’m editing and working with writers, when I might say, “What’s your thesis?” or “How do these ideas support your overall point?”
Even when I’m helping my husband with a presentation for work, I might suggest, “Try doing some freewriting to figure out your main idea and how you want to convey it.”
Monita Bell: I love a good outline. This is why outlining is good.
Monita Bell: In fact, I’ve spent most of the past year talking to all kinds of experts with all kinds of angles on teaching digital literacy, and it’s all part of an outline, of sorts. A framework. We know educators love a good framework.
Monita Bell: Welcome to this final episode of The Mind Online! I’m your host, Monita Bell. This podcast grew out of Teaching Tolerance’s Digital Literacy Framework. And while we’ve been talking about the ideas behind that framework for the past year, we thought that a good way to wrap up our season would be to go back to each part of the framework and hear our guests’ expertise on it.
Monita Bell: The framework itself is pretty short: seven items, each with a few supporting ideas. A simple approach to some pretty complicated stuff. Let’s walk through it.
Monita Bell: First, the basics: “Students can locate and verify reliable sources of information.”
Monita Bell: Good information doesn’t always lead to good decisions, but bad information is often worse than none at all. The key word there is “reliable,” right? The internet is full of all kinds of information, but the heart of digital literacy is being able to distinguish between reliable and unreliable sources. This matters for all of us and most certainly our students.
Monita Bell: Here’s what some of our guests have told us about the problems of evaluating sources for reliability—and some of the many solutions they’ve suggested for educators.
Matthew Johnson: So it’s very easy now for almost anybody to create a website that looks as professional as any legitimate website—in many cases, more professional.
Katy Byron: Even if it went viral, that doesn’t necessarily mean it’s legit.
Gordon Pennycook: If you expose someone to a fake news headline, the simple act of having read it makes it seem more likely to be true, subsequently.
Lisa Fazio: Fake news doesn’t just happen in political contexts. There’s all sorts of fake health information and other types of information. So I encourage teachers to think broadly about what types of misinformation their students might be seeing.
Sam Wineburg: We’re in an age of cheap internet templates where anybody with $25 and a little bit of tech savvy can create a spiffy-looking website. … The great majority of the kinds of markers that we used to be able to rely on—is it a .org and is there a contact address listed—we can no longer depend on those things.
Steven Sloman: I worry that in today’s day and age, even in the most educated circles, it's not cool to question people. We’re taught not to challenge people often because we might be making them feel uncomfortable.
Safiya Noble: Have your students do Google searches. Have them look for news stories about the topics of the day. See what kinds of things they’re finding, and then put them into alternate environments. Send them to the library and have them not just look in the library database, which again, is another kind of constraining system, but have them walk the stacks. Have them go and interview people and talk to people, older people in their communities, about those phenomena.
Will Sommer: I think really the key is stressing to people and stressing to students tools about critical thinking and being able to distinguish between what is, say, in a newspaper or a magazine or even on cable news versus something that is an anonymous post on an internet forum.
Matthew Johnson: We also know that they will make an effort to fact-check things when they’re doing research on someone else’s behalf. So, when they’re trying to get some health information for a friend who asked them to help them find it or finding a recommendation of a product for a relative, or something like that, they do take the steps there. But they don’t take the steps when things come to them through social media, and we know, of course, that that is how so much news comes to them.
Julia Torres: We are helping students to learn how to evaluate web resources, and then how to negotiate the gigantic amount of information that’s available in this digital world. How do they learn who they can trust, and what entities they can trust, and who’s trying to get their attention via this webpage for commercial purposes? Then also how are their opinions or biases being swayed or manipulated?
Matthew Johnson: One of the ways that we can deal with that is to make fact-checking part of our civic duty. To valorize fact-checking, and also penalize spreading bad information.
Gordon Pennycook: This is not some boring thing. This is actually a super exhilarating, exciting, fun activity: to be able to use the powers of your brain to figure out what’s true in the world.
Betsy O’Donovan: First of all, journalism is a discipline of verification.
Gordon Pennycook: That’s what education is for: to teach people to be critical thinkers.
Kelly Weill: Research like you can be sued.
Monita Bell: That was Matthew Johnson of Canada’s MediaSmarts, Katy Byron of MediaWise, behavioral science professor Gordon Pennycook, professor of psychology Lisa Fazio, and Sam Wineburg of the Stanford History Education Group. You also just heard from Daily Beast tech reporter Will Sommer, librarian Julia Torres of Disrupt Texts, professor of psychological sciences Steven Sloman, information sciences professor Safiya Noble, journalism professor Betsy O’Donovan, and Daily Beast reporter Kelly Weill.
Monita Bell: Now, critical thinking—the kind that these folks are talking about—starts with some healthy skepticism. And our guests have done some critical thinking of their own about getting students to examine online sources—another part of getting reliable information.
Safiya Noble: The internet sometimes truncates and shortens our patience for complex thinking.
Steven Sloman: We’re constantly simplifying the information that comes in. We’re simplifying our understanding of how the world works. We’re simplifying the strategies we use to make decisions and solve problems.
Sam Wineburg: We have to be very, very careful in indicting young people for not really having a grasp of what they find online. I think really the situation is better captured by saying that the great majority of us have a difficult time making thoughtful discernments of what to believe and what to reject when we encounter things online. So, let’s let young people off the hook, and let’s just talk about pretty much all of us.
Steven Sloman: The world is incredibly complicated. We can’t work things out. So, often, we depend on other people to work things out for us. I like to say we “outsource” our cognitive problems. If we wanna know what the best brand of cereal to buy is, it’s actually a complicated problem, right? There’s lots of nutrients involved. We don’t know exactly what the fat and caloric content is, et cetera. Marketers depend on this fact. They try to be the experts that provide the information. If we’re deciding what our political position should be on, say, medical care, gee, that’s an incredibly complicated issue.
Shana White: The one big thing I would definitely emphasize is that you can’t trust anybody on the internet. That sounds really sad to say, but to me, it’s similar in person. Like, I tell my children, you can’t just trust people on the street. The internet is just a really, really large highway. So, you can’t necessarily trust people have the best intentions on the internet, and if you go in with that mindset, you’re going to be a little more protective.
Steven Sloman: What we’ve seen, recently, is that this dependence on people around us has caused us to coalesce into tribes where we have different groups of people with very different positions. The people within each group tend to have a common position. That’s not by chance. That’s because they depend on each other for their thinking.
Safiya Noble: Critical thinking is who wins and who loses in a particular situation, and why. What questions are we not asking? What biases or past discriminations or past wrongs might we still be operating under, and how might it be different?
Steven Sloman: It means what we have to reason about is where we’re getting the information from. What we have to reason about is whose opinion we’re going to respect. And that itself is a complicated subject. I tend to believe people who have credentials, who’ve studied an issue, who have demonstrated a certain amount of objectivity, who’ve gotten things right in the past and demonstrated they understand how things work, in the past, and who have the trust of other people that I respect. In some sense, I think that what we should be teaching people, and not just students but everyone around us—I mean, our entire society could learn this—is that what we need is a serious discussion about who to trust. And we have to get clearer criteria about who we think we should trust and who we shouldn’t. I mean it’s interesting that the different poles of political perspectives in our society seem to have very different perceptions of that: who they should trust.
Monita Bell: You just heard once again from Safiya Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism; Steven Sloman, professor of psychological sciences at Brown University; and Sam Wineburg of the Stanford History Education Group. You also heard some wisdom there from Shana White of the Constellations Center for Equity in Computing at Georgia Tech.
Monita Bell: Of course, information online doesn’t just sit there; it’s being pushed to us all the time in all kinds of ways. Just like Dorothy in The Wizard of Oz, we need to look “behind the curtain,” so to speak, and see who’s pulling the levers. And it’s important to help students learn to do this too. That’s where the framework moves next, actually: “Students [will] understand how digital information comes to them.” Whether it’s in a game or Google, digital literacy means asking why information is where it is and who’s responsible.
This season, we talked a lot about search algorithms and how they work. Many of our guests encouraged educators to push students beyond the surface of results from popular search engines. Let’s start with professor Safiya Noble.
Safiya Noble: In the past, we’ve always relied upon teachers and schools and textbooks and librarians, literature, art, to help us develop our education and become knowledgeable. I think we cannot substitute those multiple inputs with a search engine.
Sam Wineburg: Because often the first click is destiny. You see it so many times: People click on something, they’ll go down a rabbit hole, and five minutes later they’ll be watching cat videos on YouTube. It’s happened to all of us, right? We go and we Google something, and then, about five or six clicks later, we can’t even remember how we got to where we got.
Safiya Noble: The other thing that I think people need to think about, young people in particular, is that expecting to ask complex questions and getting an answer in .03 seconds is really antithetical to everything we know about being deeply educated, well-educated and having a depth of knowledge. There are some things you cannot know in .03 seconds. We should not socialize ourselves to thinking that there are instant answers to complex questions.
Lois Parker-Hennion: Getting them to dig a little bit deeper and go beyond that surface information that they find in a Google search is one of the challenges, and one of the things that I try to teach students.
Alicia Johal: There’s tons of engaging science videos about relevant, new inventions or discoveries, and so, some of them are as short as one or two minutes, and it’s just enough to wow the kids, that then they want to talk about it and ask questions.
Meenoo Rami: I think we learn by tinkering. We learn by trial and error. We learn by talking to other people. And I think this idea of play-based learning actually would seem quite natural to educators, especially educators who believe in giving the students an opportunity to meet the students where they are, and using things that they’re passionate about and bringing that into the classroom.
Alicia Johal: And then, science is one of those subjects where you can’t always see what you’re teaching them. Like when I’m teaching them about genetics and chromosomes and the way DNA replicates, they can’t do that in front of them. They can’t even see that under a microscope. So, they have to look at a video to really see a simulation of what that looks like at the microscopic level.
Julia Torres: If someone had put Octavia Butler in front of me when I was in school, my whole life would have changed, my whole reading life would have changed. If someone had put Assata Shakur in front of me, my whole life would have changed, my whole reading life would have changed. But I came upon her much later in life. One of my goals, as an English teacher and as a librarian, is to make sure that people are exposed to texts in which they can see themselves reflected linguistically, culturally, ethnically. My goal is to make sure that happens sooner in life, and not later.
Meredith Broussard: So, technochauvinism is the notion that technology is superior to all other solutions. I would argue that really we should be thinking about what is the right tool for the task, because sometimes the right tool for the task is a computer, undoubtedly. Then other times, the right tool for the task is a board book in the hands of a child sitting on its parent’s lap. One is not inherently better than the other, and we should think about what’s right for the moment.
Monita Bell: That was Meredith Broussard, author of Artificial Unintelligence: How Computers Misunderstand the World. You also heard again from Safiya Noble and Sam Wineburg; with librarians Julia Torres and Lois Parker-Hennion; San Diego middle school science teacher Alicia Johal; and Meenoo Rami, manager for Microsoft’s Minecraft Education.
Monita Bell: In many cases, computers might be the right tools, but that doesn’t absolve us from thinking critically about the information presented to us online. It just doesn’t. This season, our guests taught us that algorithms aren’t neutral. In fact, they can be deeply and dangerously biased. That’s why we should teach students to critically interrogate search results.
Heidi Beirich: The fact of the matter is that Google, and most of these places, tend to reflect back to us the horrific biases we already have in society. When you crowdsource page ranks, or you crowdsource photos, or you crowdsource anything, you’re going to get just what’s deep seated in our society: racism, white supremacy, bigotry.
Meredith Broussard: For a long time, we pretended that technology was neutral or objective, and it is not. It is subjective. It is biased. It is opinions embedded in code. We need to move beyond this reflexive notion that technology is somehow better than humans, and we need to really critically examine technology to figure out what kind of an impact it’s having on the world, and also what kind of world are we creating when we use technology―especially when we use biased technology.
Will Sommer: I mean, this is something that a lot of extremist groups are very aware of. They’re actively trying to recruit young people, especially young men. We’ve seen white supremacist groups try to recruit young people through the kind of things you would expect them to be on: edgy forums like 4Chan, through video game chat rooms, stuff like that. These sort of ideologies, whether they be white supremacist or various conspiracy theories or a mixture of all of those, they’re very active and very consciously trying to recruit young people on the internet.
Meredith Broussard: I love technology.… I’ve been coding ever since I was a little girl. I love building things with technology, but I don’t love the way that technology is used to surveil people. I don’t love the way that technology has been weaponized against poor communities, against communities of color. I don’t like the way that certain groups are excluded from being seen by technological systems.
Heidi Beirich: The autofill feature on Google has been a cesspool, and they’ve been working on it, but for a while there, if you typed in ... Say you were going to ask a question about Muslims, so you type in “Muslim” and then it would say, like, “All Muslims are terrorists.” And it had the same kinds of problems with every single minority population. Forget it, it’s not just about minorities, it’s about women—terrible, misogynistic material. So you had to be careful for all that, and students today have to know how to navigate this. Teachers have to help them understand how flawed this stuff is.
Melissa Ryan: And we just know so much of this conversation is being driven by bots and bot networks, which are obviously controlled by a human. For me, it was very sobering when I realized, “Oh, it’s not 50,000 men coming after me on Twitter. It’s 20 very bored guys with a bot net.”
Heidi Beirich: So it’s really easy to get sucked in, and when you have fragile-minded people, young people, they can get lured in by older racists who start teaching them why they should hate, giving them a worldview that’s toxic, and frankly just wrong. Or the way that the algorithms themselves work on the mainstream platforms can basically suck you down a rabbit hole of hate.
Keegan Hankes: This really does come back to finding ways to teach diversity and inclusion and to decrease this feeling of isolation and anxiety that so many hate groups are trying to amplify all the time. I mean it really does come back to the core mission of Teaching Tolerance in my mind, which is finding ways to be inclusive, finding ways to bring more diversity into the classroom, and finding ways to get these students to open up their minds and stay, you know, a little bit more expansive than maybe some of these online communities would have them be.
Monita Bell: That was Keegan Hankes and Heidi Beirich from the Southern Poverty Law Center’s Intelligence Project; Meredith Broussard, data journalism professor at New York University; Will Sommer, tech reporter for The Daily Beast; and Melissa Ryan, author of the newsletter “Ctrl Alt-Right Delete.”
Monita Bell: Let’s take a quick break. When we come back, we’ll get into how educators can help support students in being good digital citizens.
Monita Bell: Did you know that Teaching Tolerance has other podcasts? We’ve got Teaching Hard History, which builds on our framework, Teaching Hard History: American Slavery. Listen as our host, history professor Hasan Kwame Jeffries, brings us the lessons we should have learned in school through the voices of leading scholars and educators. It’s good advice for teachers and good information for everybody.
We’ve also got Queer America, hosted by professors Leila Rupp and John D’Emilio. Joined by scholars and educators, they take us on a journey that spans from Harlem to the frontier West, revealing stories of LGBTQ life that belong in our consciousness and in our classrooms. Find both podcasts at tolerance.org/podcasts, and use them to help you build a more robust and inclusive curriculum.
Monita Bell: Welcome back! As we continue to walk through Teaching Tolerance’s Digital Literacy Framework, we’ve made our way to the world of online communities.
Monita Bell: The internet is a great, big online community full of the beautiful diversity of our world. And bots. A lot of bots, but also real people. Just like us adults, young people want to find their people online and hang out with them. But of course, some forms of interaction are better than others.
Monita Bell: That brings us to area three of the framework: “Students can constructively engage in digital communities.” Good digital citizens create inclusive spaces, just like we should be doing IRL. And digitally literate students can recognize hate or bias when they see it. And, hopefully, they’ll stand up for what’s right. Here’s what some of our guests had to say about participating in online communities.
Alicia Johal: I get so frustrated when schools have rules like banning devices because we’re missing out on an opportunity to teach kids really valuable and imperative life skills.
Betsy O’Donovan: Yeah. We all get tired, and it’s a lot of work, but it’s so worth doing, to just check our facts, to take a second and just be sure. It also makes us more trustworthy to other people, right? “Don’t let folks borrow your good reputation to share lies,” is the thing I tell my students. Every time I share something that’s wrong, I’ve let somebody take advantage not just of me but of my friends and the people who trust me to share things that are accurate. I find that super offensive. I’m not willing to let that into my life. I want to be a trustworthy person, and that means making sure that what I share is coming from other trustworthy people.
Rafranz Davis: We’re not giving students enough choice to make a lot of these decisions themselves. We’re automatically thinking about the one or two students who will make mistakes, and even when they do, we don’t think about how to help them through those mistakes. We immediately do things like block them from all access whatsoever, which happens across the board.
Shana White: Social media’s an awesome tool to leverage for students. Like I said, to build that broad audience, and access people they might not necessarily be able to access in their local community, … to reach out to experts, … to expose students to things that they might not necessarily get to see or hear. ... I think it’s a great tool in that means, but again, teachers, we have to also be mindful to use our agency for good and not evil. To leverage social media as a positive thing for our students in the classroom. To build them up, but to also help them amplify their voice.
Steven Sloman: There’s actually surprisingly little evidence that social media, per se, is the source of all our problems. People who have studied this cannot find direct evidence that social media, as distinct from other digital technological formats, has created a problem.
Erica Hodgin: It’s incredibly important for young people to know how to navigate digital information, to be able to know how to express themselves in digital spaces and to be able to use those kinds of tools—not only to learn, but also to communicate.
Julia Torres: And then how do you make sure that your personality and the way that you show up in the real world coincides with what you're portraying yourself to be in this digital world? Then how are you interacting with folks and negotiating that space and consuming? We want students to think of themselves as producers of the content, but then also consumers, and looking at that through a critical lens.
Katy Byron: And that’s the stuff that kids are so scared about, is misinformation about themselves being spread and going viral in their school community online. That really struck a heart chord for me. It was so sad to see these kids talking about that. I mean, my biggest takeaway right now is that it’s so hard to be a teenager in today’s universe with digital info. It’s really tough.
Monita Bell: Once again, you heard from Katy Byron of MediaWise; science teacher Alicia Johal; Shana White of Georgia Tech’s Constellations Center for Equity in Computing; professor Steven Sloman, and Julia Torres of Disrupt Texts. You also heard Rafranz Davis, who directs Professional and Digital Learning in Lufkin ISD in Texas; and Erica Hodgin of the Civic Engagement Research Group at the University of California at Riverside.
Monita Bell: So far, we’ve encouraged students to get out there and explore the online world with a critical eye. But, as we’ve heard, it’s not always a friendly place. That’s why the fourth area of Teaching Tolerance’s Digital Literacy Framework is “Students understand how online communication affects privacy and security.” Julia Torres and Alicia Johal reminded us about digital footprints and safety and explained how they approach these concepts with students.
Julia Torres: What’s a digital footprint, what does that look like, what kind of footprint are you leaving?
Alicia Johal: It’s also having a conversation with students about what’s safe—and what’s not—to post online. Like, don’t post your mother’s maiden name, or your home address, or your social security number.
Julia Torres: Some people consider safety to be protecting them from realities of life that belong to our students in our most marginalized communities. It’s their daily reality. It doesn’t do them any good to pretend like it doesn’t exist. I’m talking specifically about things like police brutality, and issues around class and race that people in other communities choose to protect children from.
Rafranz Davis: And then last but definitely not least, is digital copyright. I think social media blurs that line for what copyright means and nobody seems to care anymore. … something can appear on television or in an online video, and it’s immediately downloaded and uploaded to another channel. When people are doing that, they’re actually stealing the creative rights of those who were the original creators. But we often don’t think of that because we think, “It’s online, it’s here, I can just share it.” But there are actual, real ramifications to that.
Alicia Johal: It’s hard for teachers to take risks with something like social media, because there’s a level of maturity that’s required by the students, and it’s amazing how quickly one tweet or photo can be shared, widespread. And I think that for a lot of teachers, it’s a safety issue. You’re not only dealing with your students, but this is somebody else’s child, and you have to be able to have a really honest conversation with the class and teach them how to be responsible online, and how to post to social media.
Kelly Weill: But I think there’s a few conspiracy communities that really thrive on YouTube. … As a lot of researchers have noted, there’s kind of a reactionary tendency in a lot of gaming videos right now, or the gaming community right now.
Matthew Johnson: We know from a number of sources, including our own research, that young people are more likely to encounter hate material online accidentally, rather than looking for it. So, it’s important to teach them from very early on that this can happen, that you can find things you weren’t looking for.
Kelly Weill: I think young people are particularly susceptible to being radicalized because you’re young, you don’t have fully formed political ideologies.
Betsy O’Donovan: We’re really at the very beginning of this, and even though the way things work on Facebook or the way things work on YouTube or the way Google Search works, is what we have known for most of the lifespan of the internet to date, does not mean that that is what it should look like in 200 years. And we have to take that view of the future, where we have an obligation to our grandchildren and their grandchildren to make sure we get this technology right and that we hold the people who are writing the algorithms and writing the software and creating these data tools, we have to hold them accountable for making it good for America and good for the world, right? This is the stuff of civic life.
Monita Bell: In addition to Julia Torres and Alicia Johal, you also heard Rafranz Davis, Matthew Johnson, Kelly Weill and Betsy O’Donovan.
Monita Bell: Look, students aren’t just passively floating around online. They are making things—ideas, comments, movies, memes. Educators have a great opportunity to help students not just be conscious of their roles as producers, but also see themselves as conscious producers. Area five of the framework is all about making things: “Students understand that they are producers of information.”
Monita Bell: We’ll hear Erica Hodgin and Joe Kahne of the Civic Engagement Research Group, Julia Torres and Alicia Johal again, and we’ll start with librarian Lois Parker-Hennion, a former member of the Teaching Tolerance Advisory Board. By the way, shout-out to librarians! They get stuff done!
Lois Parker-Hennion: You know, we used to have students doing note cards and standing up and making a speech, maybe using a PowerPoint slide. Now things are more interactive and students are creating YouTube videos, they’re going on StoryCorps, they’re doing educational videos, they’re creating games, creating their own blogs.
Joe Kahne: Part of what makes the digital revolution so important is that it creates opportunity for all of us and in some ways especially for young people because they’re on the cutting edge of so much of this—not just to consume content, but also to remix content, to produce content and definitely to circulate content.
Erica Hodgin: And so I always recommend that teachers pick one thing that will help to further expand their curriculum and what they’re already doing in the classroom and basically just start small and try one thing and then to sort of build from there and to really then learn alongside your students.
Julia Torres: There are students I know who have started nonprofits, there are students who are doing great things in the world using the internet and using the audience that they cultivate. But it’s getting students, I think, to really not just be consumers of whatever is out there online mindlessly, but to critically evaluate what they’re seeing and what they’re consuming, and then look at their role in terms of production.
Alicia Johal: Before I started using YouTube, I noticed that my students were really all about it more than I was, and I found out that many of my students were creating and posting YouTube videos, and they were creating content, and they had followers. Whether they were recording themselves singing or playing a video game, I didn’t know how many of my students were digital producers, not just consumers. I think we assume that “Oh, they’re young, they’re just listening to music or watching videos.” No, a lot of our kids are making content, sharing it online, communicating with other people, sharing their ideas.
Monita Bell: So people are doing a lot of connecting and creating on the internet. But it’s also a place where people are buying and selling just about anything. Everyone’s vying for our attention—and paying good money for it. Information isn’t just power. It’s money. A lot of money. An important part of being digitally literate means being an informed consumer. That’s area six of the framework: “Students understand their role as customers in an online marketplace.” This means being able, for example, to recognize an advertisement when they see it. Which, these days, isn’t always so easy. And knowing how companies like Facebook and Twitter make money.
Monita Bell: Here’s Betsy O’Donovan, assistant professor of journalism at Western Washington University; UCLA Information Studies professor Safiya Noble; and Daily Beast reporter Kelly Weill.
Betsy O’Donovan: Most of us could stand to take a moment and really pause whenever we’re taking an action on the internet and ask ourselves, “What am I giving away and who benefits?”
Safiya Noble: But, truly, Google Search is an advertising platform. And it optimizes content based on its clients that pay it—pay the company—to help create more visibility for their content. It’s also optimized by people who have an incredible amount of technical skill—these might be search engine optimization companies.
Betsy O’Donovan: Our advertising to children used to be pretty limited to Saturday morning cartoons and after-school cartoons, and you would see toy advertisements or snack foods that look really cool. Now with YouTube, we have this constant barrage of sales messages to children because children are extremely effective at getting their parents to spend money.… If you are a child in America today, you’re seeing an insane number of advertisements.
Kelly Weill: So we don’t know exactly what goes into the algorithm, it’s sort of a black box. But what we do know is YouTube makes its money by displaying advertisements, so the more people watch, the more ads YouTube can show, and the more money it makes. So, to maximize that viewership, YouTube built an algorithm that tries to keep people on the site as long as possible. It tries to recommend videos that it thinks people will like. And those recommendations are partially based on things you’ve already watched, like for instance, if I watched some music videos by indie bands, I might get similar bands recommended. But one thing it does is, to keep people interested … it seems to push to extremes. And you might watch something more moderate, but it will recommend a more extreme version of that.
Safiya Noble: Young people right now, I think, go on the internet and in many ways, especially when they go through something like Google, they think this is going to bring them a truth, or impartial kinds of information. What does it mean? I think this is why you and I both might have been a bit jarred when we did our searches on “black girls” years ago, and found that pornography and hyper-sexualized material dominated the first page of results. That’s really different than seeing sexually explicit or eroticized advertising in a magazine that might be targeted toward me. I can, of course, be harmed by consuming those images, but I don’t think the magazine is a teller of truth in the way that people relate to Google or search engines as being these fair, neutral, objective, credible spaces.
Kelly Weill: And what I’ve heard from multiple young men is that they started watching videos pretty innocuously, but maybe as young men, they had inherent biases—maybe anti-feminist biases—is what I’ve heard a lot, and these videos sort of played on it. Even if it wasn’t the point of the video, someone might make a joke that kind of confirms that bias, and then YouTube, because it noticed this correlation between gaming videos and anti-feminist videos, would start recommending things that are more of a political nature. So that’s how several men I’ve talked to just got funneled down from things that were pretty innocuous, gaming videos—there’s nothing inherently wrong with it—to really more fringe ideologies, without their noticing.
Betsy O’Donovan: So, for example, in ProPublica’s case, they looked at how political ads, but also particularly housing ads or job ads, were being served. And they discovered that there were these controls on Facebook that allowed, for example, landlords to target their advertising only to white renters or only to people of a certain age or income bracket, which is illegal.
Kelly Weill: So, I think because people are so drawn to these fringe videos, it’s profitable for YouTube to promote them because people are going to stay on the site, and they’re going to watch them.
Monita Bell: Really, what we’re trying to get at—with this whole framework and, through all of this—is active civic engagement. And in our world right now, digital literacy is critical to that aim. That means engaging in critique and using digital tools to lift up justice. To make the world a better place. Not to be corny, but that’s what it is. And so this is where the framework ends up, in a place where we should all be working toward all the time. Area seven: “Students can evaluate the value of the internet as a mechanism of civic action.” It’s one thing to see the levers of power behind the curtain; it’s another to tap into power to fight injustice.
Betsy O’Donovan: Because young people are demanding more from democracy, and they are concerned about the state of the republic, and they want the skills to watchdog powerful entities and people. So, I am not scared about the future, not even a little bit.
Erica Hodgin: We do find that it’s really important for young people to be able to learn effective strategies to work for change. If they do want to make a difference around an issue, for them to be able to understand, “What are the kinds of tactics and strategies that I can use?” and, “Where and when and to what extent is digital media gonna be helpful?”
Betsy O’Donovan: Oh, gosh. I love high school journalism, and I think we are in an era when high school journalism is facing significant challenges, and yet, student journalists, teenage student journalists, are doing phenomenal work. So, I think about how the students at Parkland got really thoughtful, deep coverage of the events at their school, and they were able to create a record that was in real time, extremely powerful and valuable, and very focused on their community. They did that despite significant challenges and criticism from national figures.
Julia Torres: How do we leverage digital literacy to share power with students in a way that helps them to enact change in their world?
Betsy O’Donovan: The other thing that journalism is, is a civic act. I teach an intro to newswriting course. … About half the students in my class don’t ever expect to be professional journalists, but they come into the class and they’re often quite surprised by the fact that they have all of the same rights that journalists do, and, if not morally obligated to exercise them, they certainly can improve civic life by exercising these skills that we teach specifically as a practice of journalism. Things like verifying information before you share it on social media or making a request for public records, wherever you live, so that you can check in on what the government is doing, what the people who are making decisions about your tax dollars are doing, what people who are making decisions about your education are doing. … And when they find that out, they may not ever draw a paycheck from a media company, but they are participating in journalism as a civic activity. And that is what the First Amendment gives us: everyone in America, citizen or not, has a set of rights about inspecting our government, about speaking when we find something that is wrong, and about not being suppressed as we’re doing that. … So, to everyone who’s listening to this, I hope you will be an active participant in that.
Monita Bell: That was journalism professor Betsy O’Donovan; with Erica Hodgin of the Civic Engagement Research Group; and teacher-librarian Julia Torres of Denver Public Schools and Disrupt Texts.
Monita Bell: Y’all, that’s a wrap on The Mind Online! I can’t believe it—wow! We made it to the end. I’m Monita Bell, your host and managing editor for Teaching Tolerance. It’s been my absolute privilege and honor to have you join me on this journey into digital literacy. So thank you, thank you for being with me! And thank you to all of my guests this season for sharing with us the many ways we can know and do better online—and support students in doing so too. Shout-out to our production supervisor, Kate Shuster, and to producers Jasmin Lopez and Barrett Golding. And another special thank you to my colleagues, Teaching Tolerance Editorial Assistant Anya Malley and Senior Writer Cory Collins, for all the assists. To Podington Bear, thank you for the music.
We hope you’ll continue to learn from what you’ve heard here and dig into the Digital Literacy Framework at tolerance.org/diglit. That’s tolerance.org/d-i-g-l-i-t. And share the love with your colleagues and friends. And remember, if you share on social media, use #TeachDigLit. Thanks for listening!