You Are the Product

Episode 11

Reflections on how the attention economy affects social media and journalism, with Meredith Broussard, author of Artificial Unintelligence: How Computers Misunderstand the World, and Betsy O’Donovan, assistant professor of journalism at Western Washington University.


Earn professional development credit for this episode!

Fill out a short form featuring an episode-specific question to receive a certificate. Click here!

Please note that because Learning for Justice is not a credit-granting agency, we encourage you to check with your administration to determine if your participation will count toward continuing education requirements.

Subscribe for automatic downloads using:


Apple Podcasts | Google Music | Spotify | RSS | Help

Resources and Readings

Meredith Broussard
Website | Twitter | NYU Journalism

Books and Articles:

Betsy O'Donovan
Website | Twitter | Western Washington University: Journalism

Recent Research:



Monita Bell: If you’re not paying for the product, you are the product. 

People used this concept to describe television back in the 1970s, and almost half a century later, you could say it applies perhaps even more so to the internet.

Betsy O’Donovan: Most of us could stand to take a moment and really pause whenever we’re taking an action on the internet and ask ourselves, “What am I giving away and who benefits?”

Monita Bell: That’s researcher and journalism professor Betsy O’Donovan. Today I’m going to be talking with her and with data journalist and professor Meredith Broussard.

Meredith Broussard: For a long time, we pretended that technology was neutral or objective, and it is not. It is subjective. It is biased. It is opinions embedded in code. We need to move beyond this reflexive notion that technology is somehow better than humans, and we need to really critically examine technology to figure out what kind of an impact it’s having on the world, and also what kind of world are we creating when we use technology―especially when we use biased technology.

Monita Bell: You’re listening to The Mind Online, a podcast for educators from Teaching Tolerance. I’m your host, Monita Bell. I’m so happy you decided to join me today. This episode explores the attention economy, how the big data that makes technology companies money is made from little bits and bytes of, well, you and me—and all of us, really.

Yet, though they’ve amassed the most enormous assemblage of information in human history, the tech companies continue to get even the most basic things wrong. That’s the warning of Meredith Broussard’s book, Artificial Unintelligence: How Computers Misunderstand the World. Mind you, she’s no anti-tech, living-in-the-past Luddite. She started programming at age 11. She entered Harvard in 1991 to study computer science, not common then for a woman, and definitely not for a woman of color. You might say she’s a bit of a techie. 

Here’s my chat with Meredith. You’ll learn more.

Meredith Broussard: The first line of the book is “I love technology,” because I do. I’ve been coding ever since I was a little girl. I love building things with technology, but I don’t love the way that technology is used to surveil people. I don’t love the way that technology has been weaponized against poor communities, against communities of color. I don’t like the way that certain groups are excluded from being seen by technological systems.

There’s a woman named Joy Buolamwini at MIT Media Lab who has done some really amazing work on a project called “Gender Shades,” where she discovered that facial recognition systems don’t recognize people with darker skin.

Monita Bell: Oh, wow.

Meredith Broussard: They’re better at recognizing men than they are at recognizing women, and they’re very good at recognizing light-skinned men, and they’re very bad at recognizing dark-skinned women. So, it brings up the question of who is seen by these systems, who is marked as human by these systems.

Monita Bell: Oh, wow. Hmm.

Meredith Broussard: The defaults programmed into these systems are not the defaults that I think we want as a society.

Monita Bell: Right. You’re a data journalist. What is data journalism, and how’s that different from other types of journalism that our listeners might be more familiar with?

Meredith Broussard: Data journalism is the practice of finding stories in numbers and using numbers to tell stories. It’s a relatively new form of journalism. Sometimes as data journalists, we do data visualizations. Other times we write computer code in order to commit acts of investigative journalism, and you’ll often read works of data journalism in outlets like ProPublica or Vox or “The Upshot” section of the New York Times. 

In my book, Artificial Unintelligence: How Computers Misunderstand the World, one of the things that I do is I break down a lot of the technology buzzwords so that people can really easily understand them. Because one of the things that has happened in the high-tech world is that people use misunderstandings about technology in order to create barriers. So I’m really interested in empowering people to go beyond those barriers and truly understand what these technologies are that people are increasingly using to make decisions on our behalf.

Monita Bell: Generally, when people are mining this huge data, what are they seeking out?

Meredith Broussard: That’s a good question. I’m really interested right now in artificial intelligence, which is a buzzword that comes up in connection with big data a lot. So, the term AI suggests to a lot of people something from Hollywood. It makes us think about The Terminator, or it makes us think about Commander Data from Star Trek.

Monita Bell: Yeah.

Meredith Broussard: The thing is, those Hollywood images are really deeply embedded in our collective subconscious, and so those are the first stories that we go to. But the reality behind artificial intelligence is that it’s just math. It’s really complex, gorgeous, magnificent math, but it’s just math. We have to keep in mind what’s imaginary about AI and what’s real about AI. The Hollywood stuff, that falls under a category called “general artificial intelligence,” and it’s totally imaginary. It’s really cool, but it’s totally imaginary. Then, narrow AI is what we actually have, and that’s all mathematical.

Most of the AI that we have out there right now is a kind called “machine learning,” which, again, is just math. What machine learning is, is it’s just computational statistics. It’s making predictions based on existing data. But the problem is, when you use machine learning to make decisions, all the machine learning model does is replicate the world as it is, and it doesn’t get us closer to the world as it should be.

Monita Bell: You were talking about barriers a moment ago, that people like to use, say, this math to create barriers. Can you give me some examples of what you’re talking about?

Meredith Broussard: I think the most iconic example is the COMPAS algorithm, which is an algorithm that was used to generate a recidivism score, a score that claimed to measure how likely someone was to commit another crime. Julia Angwin, the journalist who was then with ProPublica, got data on crimes and on outcomes for these individuals after their arrests and after their sentences, and she analyzed it to find out how accurate this COMPAS algorithm was. It turns out that the algorithm was racist.

Monita Bell: Mm-hmm (affirmative). 

Meredith Broussard: It was mathematically impossible for this algorithm to treat white offenders and black offenders fairly and equally. One of the things that happened was that the unconscious biases of the creators found their way into the algorithm, because this is something that happens when we create technology. All technology includes the unconscious biases of its creators.

For a long time, we kind of pretended that technology was neutral or objective, and it is not. It is subjective. It is biased. It is opinions embedded in code. We need to move beyond this reflexive notion that technology is somehow better than humans, and we need to really critically examine technology to figure out what kind of an impact it’s having on the world and also what kind of world are we creating when we use technology―and especially when we use biased technology.

Monita Bell: You just mentioned this idea of us thinking technology’s always better, which brings me to technochauvinism. Explain what is technochauvinism, and how do you think it manifests in the ways that we think about the internet and social media, in particular.

Meredith Broussard: Technochauvinism is the notion that technology is superior to all other solutions. I would argue that really we should be thinking about what is the right tool for the task, because sometimes the right tool for the task is a computer, undoubtedly. Then other times, the right tool for the task is a board book in the hands of a child sitting on its parent’s lap. One is not inherently better than the other, and we should think about what’s right for the moment.

I think with social media, when social media started, there was this technochauvinist idea that, say, social media was going to replace journalism and that it was somehow going to be better to use social media for everything because it was on the computer. I think that it is clear by now that social media is not an adequate replacement for journalism. We still need journalism. We still need curation. The gatekeepers at the social media companies really need to do a better job of understanding social systems and making technology that works with social systems.

Monita Bell: What do you mean by “social systems”?

Meredith Broussard: I think that we can go back to Safiya Noble’s work. Safiya Noble’s book, Algorithms of Oppression, is just…it’s outstanding. She’s one of the really terrific thinkers out there today.

Monita Bell: I could not agree more about Safiya Noble’s work examining algorithms. That’s why I’m interrupting Meredith’s conversation right now just a bit to hear from Professor Noble. She talked with us in Episode 3, titled “Did You Google It?” Here’s an excerpt.

Safiya Noble: Young people right now, I think, go on the internet, and in many ways, especially when they go through something like Google, they think this is going to bring them a truth or impartial kinds of information. So, what does it mean? I think this is why you and I both might have been a bit jarred when we did our searches on “black girls” years ago and found that pornography and hypersexualized material dominated the first page of results.

That’s really different than seeing sexually explicit or eroticized advertising in a magazine that might be targeted toward me, and I can kind of, of course, be harmed by consuming those images, but I don’t think the magazine is a teller of truth in the way that people relate to Google or search engines as being these fair, neutral, objective, credible spaces.

Monita Bell: That was Safiya Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism. Now, back to my conversation with Meredith Broussard.

Meredith Broussard: There’s another book coming out that I would also recommend by Ruha Benjamin called Race After Technology: Abolitionist Tools for the New Jim Code.

Monita Bell: Oh, wow, “Jim Code.”

Meredith Broussard: Yes, “the new Jim Code.” I think these three books together give you a really good perspective on how we should be thinking about social issues and what role technology plays in perpetuating social problems and perpetuating the unconscious biases of the creators of the technology. In Safiya’s book, one of the things she writes about is the way that when you Googled “white girls,” you would get images of girlhood, and when you Googled “black girls,” you would get porn.

Monita Bell: Right.

Meredith Broussard: This shows us that the people who were creating this search technology didn’t notice this as a problem, and if they did notice it as a problem, they didn’t care enough to change it until they were called out publicly.

Monita Bell: Right, exactly. People who are creating the codes are necessarily going to be reflecting the biases of the society we live in, and we need to recognize that.

Meredith Broussard: Yeah, and we can also have more diverse teams creating technology.

Monita Bell: Mm-hmm (affirmative), that part.

Meredith Broussard: Another example of this is the Apple Watch. When the Apple Watch launched, they made a big deal about how it had all these health-tracking applications, but you know what? It didn’t have a period tracker as a default, something you would remove from the watch only if it weren’t relevant to your life.

Monita Bell: Right.

Meredith Broussard: If there were more women on the development team, or more women in upper management, that could have been discussed. Somebody probably would have caught it. That’s incredibly useful. Fifty percent of the population could take advantage of that technology.

Monita Bell: Exactly.

Meredith Broussard: That seems like a good business decision to me.

Monita Bell: Right. That’s a great example. Where do educators fit into this picture when we think about technochauvinism, for example? What connections do you see for K–12 educators in classrooms?

Meredith Broussard: One of the things that I write about in the book is a computational investigation that I undertook, where I built artificial intelligence software to answer the question, “Do kids in Philadelphia schools have enough books to learn the material on the state-mandated standardized tests?” Because the same people who write the tests write the books. At one point, I got all “crusader-y,” and I thought, “All right, I’m going to build some kind of computational system that’s going to fix the system, and it’s going to allow kids to learn the material better because clearly something is not working,” because at that point only 50 percent of Philadelphia kids were passing the standardized tests, and I thought, “We can do better than this.”

I realized that you don’t actually need a fancy computational system. You just need the book. When you’re dealing with a huge population of kids and hundreds of schools and thousands of teachers and a multimillion-dollar budget, it’s actually much harder than anybody imagines to implement, say, a one-to-one laptop program or to give everybody iPads for electronic textbooks.

Then also, the medium that you use makes a difference. Print is actually better for deep learning, while reading on a screen is really good for shallow learning. It’s good for looking up a fact to settle an argument that you’re having over lunch, right? Or it’s good for point-to-point directions, but if you really need to develop knowledge and if you need to really deeply intellectually engage with something to learn it, well, print is actually a better interface for that. This is what all of the cognitive science says. Books, it turns out, are much cheaper than computers. If you drop a book, it doesn’t break. The book doesn’t need to be recharged. You could have it out in the rain without it breaking, and you don’t need plugs in the classroom for books.

One of the things that happens in urban public school districts is their buildings are very old, and they often don’t have the electrical and HVAC systems that you need in order to support 30-odd kids in a classroom, all of whom need power and need air conditioning so that the computers don’t break.

Monita Bell: Right.

Meredith Broussard: Right? When you actually factor in all of the different things that you need to do in order to have a really high-tech classroom, and then you look at how much that costs and what are the logistical concerns, it’s actually much simpler to use books, in most cases.

Monita Bell: Mm-hmm (affirmative). 

Meredith Broussard: As K–12 educators, we need to think about, again, what is the right tool for the task. We need to think about the voices who are telling us, “You need to get kids using computers all the time.” I think computers nowadays are pretty easy. Kids don’t really have a huge amount of trouble learning how to use Microsoft Word or Google Docs. The programs used to be a lot more complicated. They’re pretty easy nowadays.

Monita Bell: Right.

Meredith Broussard: The rhetoric of "Oh, you’re going to be left behind if you don’t learn to code and if you don’t have computers embedded in every lesson," that’s actually just empty rhetoric. We should be more thoughtful about the ways that we use technology in the classroom, and we should also feel free to not use technology in the classroom.

Monita Bell: When you think about, then, the ways in which your body of work intersects with K–12 education, what advice would you give to educators as they try to promote digital literacy with their students?

Meredith Broussard: I would say read my book. That’s a really good starting place. The book is very accessible, and, in fact, I wrote it so that people could understand these concepts and be empowered to make better decisions about what we can and should do with technology.

Monita Bell: Meredith Broussard is the author of Artificial Unintelligence: How Computers Misunderstand the World. She teaches at the Arthur L. Carter Journalism Institute at New York University. For links to the articles and books Meredith mentioned, look for The Mind Online and find this episode on our website. Now, a quick break.

Monita Bell: Did you know that Teaching Tolerance has other podcasts? We’ve got Teaching Hard History, which builds on our framework, Teaching Hard History: American Slavery. Listen as our host, history professor Hasan Kwame Jeffries, brings us the lessons we should have learned in school through the voices of leading scholars and educators. It’s good advice for teachers and good information for everybody. 

We’ve also got Queer America, hosted by professors Leila Rupp and John D’Emilio. Joined by scholars and educators, they take us on a journey that spans from Harlem to the frontier West, revealing stories of LGBTQ life that belong in our consciousness and in our classrooms. Find both podcasts on our website, and use them to help you build a more robust and inclusive curriculum.

Betsy O’Donovan was a journalist at small newspapers and big newsrooms, like ESPN’s. She was a fellow at Harvard’s Nieman Foundation for Journalism, and she wrote papers about journalism for the Knight Foundation and the Poynter Institute, and she teaches journalism at Western Washington University. It’s fair to say she thinks about journalism a lot, especially when it’s online. Here’s more from Betsy.

Betsy O’Donovan: Most of us could stand to take a moment and really pause whenever we’re taking an action on the internet and ask ourselves, “What am I giving away and who benefits?” The example I like to use in my classrooms, the one that always blows students’ minds, is when we talk about BuzzFeed’s quizzes. This is a super-fun area, because I am a sucker for BuzzFeed. It’s popcorn and candy to me, as well as some really solid journalism on the backside.

Monita Bell: Yeah.

Betsy O’Donovan: But I get a kick out of these quizzes that are like, “Tell us six ingredients you love at Chipotle and we’ll tell you which Disney prince you’re most likely to go on an adventure with,” right?

Monita Bell: Right.

Betsy O’Donovan: That’s fun and it seems really harmless, and I would argue that for the most part, it can be, but there are less-benign uses of that kind of information, where, for example, ProPublica did a really great investigation into how Facebook ads are served. Every piece of information that you drop into Facebook is used to make educated guesses about what information or posts to serve to you based on how likely it is that people in a similar profile like that information. 

For example, in ProPublica’s case, they looked at how political ads, but also particularly housing ads or job ads, were being served, and they discovered that there were these controls on Facebook that allowed, for example, landlords to target their advertising only to white renters or only to people of a certain age or income bracket, which is illegal.

Monita Bell: Interesting. Hmm.

Betsy O’Donovan: That’s one of the darker undersides of data. The other example is something that’s really emerging with YouTube, which is, there was a pretty scary story... I believe it was in The Guardian... about how YouTube’s algorithm pulls people down really dark rabbit holes based on what kind of content…

Monita Bell: Yeah, we’ve actually talked about that on this podcast. It is quite creepy.

Betsy O’Donovan: Yeah, and especially when you’re talking about things like, the most recent example is pedophilia and this notion of when people are consuming content that features children, pretty rapidly, depending on the kind of content that you’re consuming, it can go some very dark places. That’s one of the things that we’re surrendering when we surrender our data to these companies, is we’re surrendering to their judgment, and that’s not even human judgment. We’re surrendering to an algorithm that does not have built-in ethics or morals or common sense.

Monita Bell: When it comes to money, what are some concrete examples that educators can use to help students understand how the internet, including social media and YouTube and others, actually monetizes our attention and our interests?

Betsy O’Donovan: Oh, absolutely. The first and most obvious example is when we think about where advertising is placed. Obviously, in an attention economy, when we’re giving content away for free, that’s because we are trying to maximize the attention on any piece of content. We don’t want to create a roadblock to somebody looking at that BuzzFeed quiz, because we have attached advertising to it, and we are able to turn to advertisers and say, “I have X number of people’s attention, and here are the demographics of those people,” their... For example, NPR’s audience is about 11% of Americans. It’s extremely white, wealthy, coastal, educated. It’s a really valuable demographic for advertisers because it has a lot of disposable income.

Media companies are very interested in who their audiences are and who’s consuming their content, because there is a different dollar value attached to different groups of people. 

Monita Bell: Hmm. Mm-hmm (affirmative). 

Betsy O’Donovan: For example, advertising to children used to be pretty limited to Saturday morning cartoons and after-school cartoons, and you would see toy advertisements or snack foods that look really cool. Now with YouTube, we have this constant barrage of sales messages to children, because children are extremely effective at getting their parents to spend money. So even though they are not themselves a wealthy demographic, they’re an incredibly powerful group for advertisers to connect to, because they are not necessarily sophisticated consumers who are looking for the downside of a sales message. If you are a child in America today, you’re seeing an insane number of advertisements, relative to people like me, who grew up in the ’80s. Our media consumption was really limited to the comics pages and to cartoons on TV. 

One thing I would encourage educators and parents but also students to do is really start thinking about asking, “I want this piece of information, or I want this piece of entertainment, but what’s coming attached to that, and is it worth it to me? I’m selling my attention. Am I getting enough out of it,” which is a really odd thing to think about when we’re constantly consuming content. But it becomes a really important question.

Monita Bell: Yeah, we are selling our attention. Our attention clearly has value to people who are making money, and so we should value the attention, but what can we be getting out of this attention that we’re giving to others?

Betsy O’Donovan: Right now it’s “I have some need: I have five minutes in the grocery line, or I’m waiting for a bus, or I finished my assignment early and my teacher said I can use some internet time.” We’re basically trading our attention for not being bored. One thing we might want to start thinking about is “Am I getting good information, or am I getting good entertainment?” This is more relevant for journalism than it is for cartoons. But one example is, of course, political coverage in the 2016 election, which is the first time that we saw people on Facebook, Snapchat and Instagram really creating political content that had no basis in reality. They were literally just making stuff up and presenting it as factual, because they didn’t really care about what the consumer needed. They only wanted to create something that was so spectacular or intriguing that people would click through, and they basically would get advertising revenue from that. They felt no concern whatsoever for the people who were maybe getting bad information and subsequently making a choice that, if they had complete or better information, they wouldn’t have made.

Monita Bell: You are a journalist and you teach journalism, and we’re talking about the fact that so much of the information we consume comes from the internet and social media to the point that many people may see journalism as a dying profession. Why do we still need journalism?

Betsy O’Donovan: I love it when people ask me this question, Monita. I really, really do, actually.

Monita Bell: Yes. Good.

Betsy O’Donovan: Because it opens up this really fun conversation about the fact that journalism isn’t really, as I would describe it, a profession. It can be a job, but I believe firmly that journalism is two things. First of all, journalism is a discipline of verification, right? When we are talking about journalism, by definition, it means that someone somewhere has checked the facts and is able to backtrack those facts and assert, because they have confirmed them rigorously, that what they’re presenting to you is valid information, that is verifiable and that you could follow back to its source too. Journalism is a discipline of verification.

The other thing that journalism is, is a civic act. I teach an intro to newswriting course, and it satisfies a writing requirement at my university. So, about half the students in my class don’t ever expect to be professional journalists, but they come into the class and they’re often quite surprised by the fact that they have all of the same rights that journalists do, and, if not morally obligated to exercise them, they certainly can improve civic life by exercising these skills that we teach specifically as a practice of journalism. Things like verifying information before you share it on social media or making a request for public records, wherever you live, so that you can check in on what the government is doing, what the people who are making decisions about your tax dollars are doing, what people who are making decisions about your education are doing.

I’m working with college students, and they get delighted when they can do things like seek the public records about how much their professors are paid or seek public records about how many athletes have unpaid parking tickets at the university, and really just start paying attention to whether the playing field is even by checking the original records. When they find that out, they may not ever draw a paycheck from a media company, but they are participating in journalism as a civic activity. That is what the First Amendment gives us: everyone in America, citizen or not, has a set of rights about inspecting our government, about speaking when we find something that is wrong, and about not being suppressed as we’re doing that.

To everyone who’s listening to this, I hope you will be an active participant in that, because that is... I don’t think journalism as a profession is dying, because I think people will always need verified information, and that is a time-consuming activity. I think people will always be willing to pay for that, maybe a smaller group of people, but still I think it’s alive as a profession. What’s more exciting is, enrollment in our major has massively increased over the last three years, in part because young people are demanding more from democracy, and they are concerned about the state of the republic, and they want the skills to watchdog powerful entities and people. So, I am not scared about the future, not even a little bit. 

I’d also like to point to a really wonderful resource: “On the Media,” the public radio show from WNYC, has an outstanding series called the “Breaking News Consumer’s Handbook.” One of the things that we have seen over time is that in breaking news events, whether it’s a mass shooting, like at Parkland, or a natural disaster like Hurricane Katrina, or any time there is a significant event, you tend to see a lot of static in the information ecosystem, where people, deliberately or unintentionally, because they are moving too fast, are introducing false information, or they’re introducing lies into the public narrative.

What the “Breaking News Consumer’s Handbook,” which is really just recipe card–sized lists of eight or ten things you can do, what it does is it gives you just quick reminders, you can tape it to the side of your computer if you have a monitor, you can tape it to the top of your laptop. It basically says, “Here are things to watch out for if you’re dealing with breaking news events, and here’s how to make sure you’re not duped or participating in the distribution of bad information.”

Monita Bell: Speaking of the practice of journalism, do you feel like there is hope left for, say, high school newspapers in this attention economy we’re in and this increasingly digital age? And maybe also talk about what you feel is the importance of school newspapers.

Betsy O’Donovan: Oh, gosh. I love high school journalism, and I think we are in an era when high school journalism is facing significant challenges, and yet, student journalists, teenaged student journalists, are doing phenomenal work. So, I think about how the students at Parkland produced really thoughtful, deep coverage of the events at their school, and they were able to create a record that was in real time, extremely powerful and valuable, and very focused on their community. They did that despite significant challenges and criticism from national figures. 

There’s also a wonderful example of a student newspaper in Kentucky. Fundamentally, they were facing a challenge from their administration about legitimate and true coverage of events, and those students stood up, sued over it, won their case, and they did so actually with the help of the other SPLC. The Southern Poverty Law Center is of course an incredible institution, but the other SPLC is the Student Press Law Center, and this is a resource that my students use very regularly. We’re in kind of constant contact with them as we are pursuing our rights as journalists.

SPLC works with high school journalists to ensure that their rights are not being infringed on. If there are educators and students out there who are concerned about their right to speak because they are a school-funded institution, or their right to speak because their high school is acting in loco parentis and is trying to prevent them from covering events, the Student Press Law Center is a powerful resource. They will not go to court for you, but they will provide legal advice and assistance and often connect students to attorneys who are interested in open records and free speech wherever they are in the United States. I hope people will put their number on speed dial.

Monita Bell: It seems that we’re seeing a lot of traditional media outlets leaving local markets, creating what some people call “news deserts,” and now, more than before, information access is becoming even more of an equity issue. Are you seeing those patterns of news deserts too?

Betsy O’Donovan: I am. Actually, there’s a significant body of research emerging from the University of North Carolina around news deserts. Penny Abernathy, a scholar there with a deep background in journalism around the U.S., is studying this issue pretty intensively. News deserts are, I would argue, a significant threat to American democracy, because the erosion of a strong local information ecosystem is correlated with disengagement from civic life.

One thing that’s happening that’s really exciting is, first off, we’re seeing that student journalism is sometimes filling those gaps. For example, in Chapel Hill, North Carolina, the Daily Tar Heel, which is the independent, nonprofit, student-run newsroom that’s been operating for over 125 years there, has really stepped into the gap where other local media used to cover the city, and the Daily Tar Heel has said, “We’re not just going to be concerned with student life and campus life. We have a responsibility to people in this town where we live.” 

I see the same potential for high school journalism. Who is better equipped to thoughtfully cover school boards and subjects like teaching evaluations and test scores? Students have particular insight into how all of that runs inside their schools, and marrying that intimate insight, which comes from being in that community, to data and reporting gives them the opportunity to do important schools coverage. 

We definitely have a problem with news deserts, and the challenge now is how to backfill them. The one thing that really does frighten me is that I’m not confident American democracy can succeed in ignorance, or in a vacuum of good, complete information about how our powerful institutions are working.

Monita Bell: We’ve been talking about media markets and money and what all of this means for the present and future of good digital citizenship. I know our listeners are really invested in thinking about how to apply these ideas to make their communities and the world a better place. I’d love to hear your thoughts about what all this means for our future.

Betsy O’Donovan: Well, I think we are really at the very beginning of this. Just because the way things work on Facebook, on YouTube or in Google Search is what we have known for most of the lifespan of the internet to date does not mean that is what it should look like in 200 years. We have to take that long view of the future, where we have an obligation to our grandchildren and their grandchildren to make sure we get this technology right. The people who are writing the algorithms, writing the software and creating these data tools, we have to hold them accountable for making it good for America and good for the world, right? This is the stuff of civic life.

We cannot have good institutions, we cannot have a great country or great global information economy if that is fundamentally corrupted by structural racism, for example, or structural economic inequality. The internet really does need to reflect a better society than the one we have today. That means demanding that it be remade until it works for everyone. 

We see, for example, that in Europe there’s a right to privacy, a right to be forgotten, the idea that something that’s on the internet doesn’t have to live forever.

Monita Bell: I would say even the other steps you were mentioning before, advocating for truth when we see misinformation and disinformation online, and ensuring that we are presenting the truth when we put out information and share information.

Betsy O’Donovan: Yeah. We all get tired, and it’s a lot of work, but it’s so worth doing, to just check our facts, to take a second and just be sure. It also makes us more trustworthy to other people, right? “Don’t let folks borrow your good reputation to share lies,” is the thing I tell my students. Every time I share something that’s wrong, I’ve let somebody take advantage not just of me but of my friends and the people who trust me to share things that are accurate. I find that super-offensive. I’m not willing to let that into my life. I want to be a trustworthy person, and that means making sure that what I share is coming from other trustworthy people. 

Journalists, we operate by an ethical code that prevents us from being particularly activist in a lot of our lives. Good, ethical journalists have to understand that because we get the last word with what we print or what we broadcast, we need to hold ourselves separate from actions like marching, for the most part. Even when we have personal sympathies, the requirements of this discipline are that we remain skeptical and really push ourselves to be accurate.

The one space where I feel journalism, and people who are concerned for journalism, must be activist is in keeping a clean information ecosystem. We have every right to be activists around the First Amendment and to be activists around making sure that misinformation is not polluting our environment and confusing the world. That’s our one job. We’ve got to do it well.

Monita Bell: Thank you for taking the time to join me today for this episode of The Mind Online, a podcast from Teaching Tolerance. I’m your host, Monita Bell, managing editor for Teaching Tolerance. I want to give a special thank you to my guests, Meredith Broussard, an assistant professor in NYU’s Arthur L. Carter Journalism Institute, and Betsy O’Donovan, an assistant professor of journalism at Western Washington University.

This podcast was inspired by Teaching Tolerance’s Digital Literacy Framework, which offers seven key areas where students need support developing digital and civic literacy skills, and features lessons for kindergarten through 12th grade classrooms. Each lesson is designed in a way that can be used by educators with little to no technology in their classrooms.

The Digital Literacy Framework and all its related resources, including a series of student-friendly videos, a professional development webinar and a PD module, can be found online. That’s D-I-G-L-I-T.

This episode was produced by Barrett Golding. Thank you to NYU’s Journalism Department and the Audio Technology, Music and Society program at Fairhaven College for recording our guests. Our production supervisor is Kate Shuster, and our music is by Podington Bear.

You’ll find links to all the resources we discussed in this episode online. Just look for The Mind Online and find this episode, “You Are the Product.”
If you like what you’ve heard, subscribe and share with your colleagues and friends. When you share on Twitter or Instagram, use hashtag #TeachDigLit. Thanks again for listening, and I’ll see you next time. 