It's 1 p.m. on the East Coast, and that means it's time for The Future of Democracy. I'm John Sands, and I'm really excited about today's conversation. The show is about the ideas and trends shaping the landscape of American life today, and one such trend is the decades-long decline in trust in the news media. As a foundation that cares deeply about how Americans are informed and engaged in our democracy, Knight launched the Trust, Media and Democracy initiative in 2017 to inform solutions to slipping trust in journalism and other important institutions in our society. For this work, we've commissioned and supported independent research that seeks to understand this trend, to shed light on Americans' evolving relationship with the news media and the ways in which Americans seek information and engage in the democratic process. A key partner in this effort is the Pew Research Center's journalism and media program. Their gold-standard public opinion research explores public attitudes about the news media and how those attitudes can help us understand social and political phenomena like polarization, which has become a fixture of American public life today. Leading that work at Pew is the director of journalism research, Amy Mitchell. Amy is responsible for the center's research related to news and information, and she's an expert in research methods and design. She specializes in how technology is changing the flow of news and information today and the influence of political identity on Americans' news choices. For a conversation about America's evolving relationship with the news media and the role of public opinion research in our democracy, there's really nobody better. So Amy, thanks so much for joining us today. Thank you so much, John. It's great to be able to join you, and I certainly have appreciated all the support from the foundation these years. Absolutely. It's our privilege. Let's start with kind of a high-level view of Pew Research Center.
Tell us a little bit about the center. For an audience who may have heard of Pew, who may have heard of the Pew Charitable Trusts at some point in their life, tell us a little bit about what the center does and the ways that it informs public conversations. Sure. So we talk about ourselves as a nonpartisan fact tank. What we mean by that is that we work to inform the public about the issues, attitudes, and trends that are shaping the country and the world. And we do it in a way that is not taking a stance or a view, but is looking to provide the facts, to provide a foundation of knowledge and understanding about a range of different subject areas. So for example, I direct our research around news, information, and journalism, but we have a lot of focus and research in other areas: on religion, on politics and democracy, on technology and the internet, social trends, racial and ethnic issues. So there's a whole range of different areas that the center focuses on, but it's really built around the idea of wanting to provide understanding through fact-based data gathering. We're known most for our survey work, but we also use a lot of different methodologies, and the journalism team certainly does that to a pretty wide degree: from survey work to content analysis, which was a recent set of research we just had come out, too. There's a lot of demography work that the center does, pulling in data through computational science methods, looking at large mass data studies, focus groups, and other kinds of work. So you mentioned that you lead the journalism and media work, but what are some of the other topics the center informs? What are the other program areas? Sure. So again, one would be religion: religion in America, religion in the world.
There was just another study that came out this fall, the second look at Jewish Americans, which is fascinating: a really deep dive into their views and attitudes about religion and their place in society. We have a whole group that focuses on what we would call social trends and issues, which is everything from same-sex marriage, sexual identity, and gun control to how those issues relate to different populations. We have an area of focus on race and ethnicity and the dynamics surrounding that, and another on science and technology. And obviously, there was a lot of work during the COVID-19 pandemic that that group focused on. And so let's shift to talk a little bit more in depth about your program, journalism and media. Why did you prioritize journalism and media? Why did the center set up this kind of vertical? What's so important about those trends? Well, certainly, if we think about it, how, and the degree to which, people stay informed and keep up with what's happening in the world has a really dramatic impact on how society functions and on the direction society goes. Think about something like the rise of social media at the same time as cell phones, and how much that changed the way people hear about things: their exposure, what their sources are, how they even get to the news in the first place. And then there's also understanding the industry itself. If we look at the range of sources and platforms that people can turn to now, through the political polarization you were talking about, many people have become separated to quite a large degree in their sources of news and information.
And what we've seen in some of our work over the last few years is how that connects to differences in views, in beliefs, and in knowledge about current events. So it really does shape the direction and the function of society, based on what people are learning about and their views about the role of the news media in the country: is it important for there to be a watchdog role? What is the role of a free press in America? In your experience, how do you see this work being used? Who are the key audiences that you're trying to reach? First and foremost, I'd say, everything we do at the center, all the research that we produce, is made available to the public in its entirety, free of charge. So that's the foundation: the mission is to provide it to the public writ large, both in the sense of exploring the methodologies, and perhaps using some of those methodologies in their own research, as well as learning from the findings themselves. But then also certainly policymakers. We see the work cited a lot in discussions around how to shape policy, whether it has to do with something like, I'll just mention, gun control or immigration, or the FCC's thinking and discussion on technology and what kind of rules there should be. Certainly other kinds of decision makers too, a lot at the local level: how are people being informed? Where are they turning for the news? What do they know about? What are they missing when it comes to information flows? It's used a lot in school coursework too. It is always kind of cool to see, both in terms of the subject matter learning, but also the methodology. So a lot of statistics classes and things like that will use our work.
So I'd really say it's a range: a lot of foreign governments, foreign leaders, and certainly, in our work, communicators who want to figure out how people are thinking today and how to reach and connect with the public in a larger or more dynamic way. I think a lot of people who read the news will probably get to the third or fourth paragraph of any kind of topical news story about the story of the day and find some number from the Pew Research Center that talks about American attitudes or opinions on X or Y topic. How exactly does your program interface with journalists and newsrooms? When journalists are working on stories, how do you plug into the process of them researching and developing their work? Well, we have a terrific communications team that works to get our work out to the journalists and the news media that work in that field. And we don't give anything out until it's published, or under a kind of embargo where people are agreeing to look at things, and we do that on a very wide basis. So we share our work. We do briefings for newsrooms, certainly in areas around news and information. For example, when we did the local news study, a big, massive study across the U.S. about how Americans are both using and thinking about local news today, I did a briefing for all of the editors across the Eastern region of the Fox local news networks to talk about those things. So we will do that. We'll share our reports and information. And then we get a lot of inquiries from reporters, if they're working on a story, asking whether we have data on X, Y, and Z. And then we'll certainly set up conversations to talk with them about that.
So I want to shift in just a second to talk about some of the recent work you all have done. But the question that kind of lingers in my mind every time I see one of these Pew Research Center numbers is that it's just amazing how you all seem to have your finger on the pulse of pretty much everything. When something happens, there's always a Pew Research Center number to go with it. And I'm curious how you think about that, how you develop projects so that when a story happens, there is an accurate measure of public opinion on it. Yeah, that's a super fun part of the work we do: trying to think ahead and ask, where is there really a lack of knowledge or a lack of understanding? What's the direction of where things are going? So if I think about our area, and where things are moving: as we've seen the separation in where people turn for their news, and even more dramatic differences in what they will trust and distrust, or outright reject, then the next question is, well, does it matter? Does it matter in terms of the views, the knowledge, the sense of reality? And so we're able to go that next step and look at what kind of connections we see or don't see when you look at groups based on what their media diet is. So it's trying to think ahead and then figuring out, okay, here's the question that may well be the next big thing we need some information on. Now, how do we go about doing it? Is it a content study? Is it a survey? Is it a combination? Is it auditing something like the top news channels on YouTube, when YouTube was getting talked about more and more as a place people were going for news? So we did a small survey component, and then we did an audit and a content analysis of what's actually there.
What is it that people are getting? What's the content? Who are the players in that space? And what are they seeing? So that was really offering a very first, foundational sense of YouTube as a source of news. You've teed up a really great question to close this out, but I'm going to save that for the end. And by the way, I want to remind our audience that if you have questions, feel free to add them to the chat. Let's shift and talk a little bit about some of the recent work that your team has put out. You did a big study late last year on news consumption in the digital era, and then more recently, a kind of franchise study that you've been doing for decades now on the first 100 days of the new presidential administration. Let's talk first about the news consumption study. What was unique about the approach that your team took when you were developing this particular study? Were there particular methodological challenges that you were trying to address? And also, of course, what were some of the key findings? Yeah. Thanks. I'd love to share a few. So the news consumption study was actually a really big methodological exploration to look closely at, and share publicly, what we see in how we're asking questions about news intake. It's one thing to ask about views and attitudes, but when you're asking about consumption, about what things people are doing, you've got a thousand different ways people could be accessing news. So what are they even thinking about when you ask them how much they follow the news? We wanted to look at, in today's digital era, how well is it working, and are there ways we can improve on it? And actually, if we can share, I think I have some slides with a few of the key things. So it really was a very multifaceted (let me just make this go forward here), multifaceted exploration.
So we used a number of different data sources, and it was really fun to dig in through these different ways. We started with cognitive interviews, which we hadn't done a whole lot of at the center before. That's where you have a small group of people taking a survey online while, at the same time, they're talking with somebody who's running these cognitive interviews (it's a firm we hire, so we weren't doing the actual interfacing ourselves) to see if they're understanding the questions that we're asking: are you stumbling over this one? Why? So there's a ton of input that happens in real time as people are going through the survey. Then, based on some of the things we learned there, we went into split-form survey experiments to test certain changes we might make, which I'll share a little more about. And then we did a full nationally representative survey where, for a subset of that group, we were also able to gather their passive digital data, because there's been a lot of talk about, and we've experimented with, using passive data to analyze certain things. And we wanted to see, well, how well does that do? Is it a viable alternative to survey work? So, yes, there's a passive data part. Some of the things we explored, for example, were how much people understand the things we're asking about. We may think, oh, sure, streaming, everybody knows that, or using a streaming device, or getting push notifications. But how much does the public even understand? If we ask about that, can they answer accurately, let alone can we use it? And what we saw is that most people knew a fair amount, or at least something, about most of these newer kinds of technologies and devices. But, and I didn't put the data in here, what we found was very few are using them.
So there was a general sense of knowledge, but there may be less actual practice in some of these areas than one might think if we weren't doing the survey work there. Then we also asked, how much does the public know about original reporting? So when asking about news producers, the organizations producing the news, one of the things we saw was that there's actually not a great understanding of who's producing the news and who's an aggregator sharing it through other kinds of platforms. So you see here 57% said they didn't know whether Google News does its own reporting or not. For Facebook, you have quite a range there. Some of the standbys, ABC News and The Wall Street Journal, do report, but still only about half said yes, it does its own reporting, and in many cases the largest segment was people saying they're not sure. We also wanted to test different wording. There's some suggestion about asking how often do you get news versus putting it in terms of a typical week, to make people think about their past week. And what we found when we experimented with this was that there really wasn't much difference; it doesn't seem to make much of a difference if you change that wording. Another thing we explored was, when you ask that often/sometimes/rarely question, what does it mean? What does often mean for some people versus other people? What does rarely mean? So we also asked people separately how many days a week they got news from these different platforms. And what you see in the bars in that first column is that the often group, a solid majority, except for radio here, was pretty much at seven days a week. So we're pretty much talking at least every day.
And the rarely group is pretty much down at zero or one day. So they do seem to represent things pretty well, and sometimes is kind of everything in between. Sometimes really is the murkiest, but it is also the one that's in the middle. Then if we look at the passive data exploration we did here, and this is a lot to look at, but it lays out one of the things we wanted to get a sense of. A lot of the survey work that we do asking about news consumption wants a sense across all the platforms, all the places you could go: television, radio, the different kinds of sources, social media. This is just digital, so it's only digital behavior. And even within that it's pretty limited; there's a lot that isn't tracked. It will track one device, what somebody says is their primary device, but that's all it tracks. And in certain cases, like if your work computer is your primary device, companies often don't allow that tracking, so then you're on a different device. It doesn't track in-app behavior, so if you're in Facebook, you know you're in Facebook, but you don't know what people are doing there. So the question is, does that matter? One of the things we saw here is that in most cases, when you ask people how often they get news, the responses were higher than what the digital data showed. But the mismatches tended to be in areas where people said they were heavy social media news consumers, and you don't get that data, right? Or they get news in a lot of different ways, or they have other devices. So it suggests that there can be value in passive data for maybe certain very specific types of studies, but as a general approach, it has challenges. And so we did really find that surveys continue to offer a viable method.
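As a rough illustration of the survey-versus-passive-data comparison described here, a minimal sketch in Python: map each self-reported frequency category to an implied number of news days per week, then flag panelists whose tracked behavior falls well short, excluding heavy social-media users, whose in-app news use the meter can't see. The field names, the days-per-week mapping, and the threshold are all invented assumptions for illustration, not Pew's actual method.

```python
# Hypothetical sketch: self-reported news frequency vs. passively tracked
# behavior. Field names, mapping, and threshold are invented for illustration.

FREQ_TO_DAYS = {"often": 7, "sometimes": 3, "rarely": 1, "never": 0}

def find_mismatches(panelists):
    """Return ids of panelists whose tracked days fall well below their self-report.

    Heavy social-media news consumers are excluded, since the meter does not
    capture in-app behavior and is expected to under-count them.
    """
    mismatches = []
    for p in panelists:
        implied = FREQ_TO_DAYS[p["self_report"]]
        tracked = p["tracked_news_days"]  # days/week with a tracked news-site visit
        if not p["heavy_social"] and tracked < implied - 2:
            mismatches.append(p["id"])
    return mismatches

panel = [
    {"id": 1, "self_report": "often", "tracked_news_days": 7, "heavy_social": False},
    {"id": 2, "self_report": "often", "tracked_news_days": 2, "heavy_social": True},
    {"id": 3, "self_report": "often", "tracked_news_days": 1, "heavy_social": False},
]
print(find_mismatches(panel))  # only panelist 3 is a genuine mismatch
```

The point of the exclusion rule is the one made in the interview: a gap between self-report and tracked data isn't evidence of over-reporting when the tracking itself is known to miss a panelist's main channels.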
There are ways we can improve, and we have started to make some changes in our wording in those areas where we did see some difference. This is super fascinating. At Knight, we partnered with Gallup, as you know, and have tried a couple of times now to do something kind of along the lines of what you did, using an online platform to try to come up with some comparative way of looking at self-reported news consumption patterns versus what people actually do when given the choice on a platform to look at various news stories. So my mind's blown, and kudos to you all for being able to actually land it. We got a question from the audience, which is around the different ways that different races and ethnicities engage with digital news sources. Does this report touch on that? This report doesn't. It doesn't really do a population study, so it was more looking at the broad public. Somebody could dive into that data, though it's a little bit smaller sample size than we would normally use for a big population split looking at things demographically, but methodologically you could do that. We've certainly done other work where that has been more of a focus. And you'll see, for example, that in general there was greater reliance on social media for news among certain ethnic groups and Black adults than you saw for white adults, and certainly differences by income in terms of the types of places people turn for news and even how much they're keeping up with the news. The other thing we saw, for example, when it came to COVID-19 and information needs, was that Black Americans were looking for, or following very closely, information about COVID on a range of things at a much higher rate than were white Americans: access to testing, hospital access, where can I get X, unemployment benefits, and things like that. So there are real differences not only in news, but also in information needs.
But it's important to note that, like you mentioned, the data set is available for the general public to download if anybody wants to do their own analysis and try to dig into those issues. Yep, once we have it ready and together. And if there's ever anything you don't see or can't find, please reach out to us and we will do our best to be sure we can give you access. Okay, let's shift and talk a little bit about the most recent study you released, which we've been calling the 100 Days Project, which looks at the first couple of months of coverage of the new presidential administration. Tell us a little bit about that study, how far it dates back, and what you've learned as the different iterations of the project have played out. Yeah, it's one that I've really enjoyed doing each time. We've been studying this since the start of the Clinton administration in 1993, and we actually went back to get that information, because we launched our journalism team a little bit after that. It's looking at how the news media cover the first months of a new administration. And one of the cool things has been how this study has evolved over time, not so much in what we're studying or capturing. We look at the assessment of the stories: if you look at what all the different sources are saying, is there a positive assessment about the administration, whatever the topic is, or a negative one, or is it neither, somewhere in between? We look at the topics being focused on and the sources being cited. We've added some dimensions to the content analysis since then, but it's kept a number of those areas pretty similar. But then also, when you think back to 1993 and those first couple of studies, the media landscape was so different and so much smaller. And so we've certainly made a lot of changes to be able to include more of the range of the media landscape.
We can't study everything, you couldn't get it all in the sample, but we try to offer a mix across platforms. And then, starting with the Trump administration, we grouped outlets according to their audience makeup: whether their audience leans to the right politically, leans to the left politically, or is more mixed, to see differences in coverage. And I will also say, this is a study of the first 60 days that comes out at the 100-day mark, and that has changed a couple of times year to year. So if you look at the Biden administration, you can see that overall, most of the stories were neither positive nor negative in their assessment, but there were slightly more negative stories than positive. But look at the differences based on the type of media outlet. Among outlets with a right-leaning audience, 78% offered a negative assessment, when you tallied up all the statements from sources and opinionated statements from journalists, versus quite a different balance when you looked at those with mixed or left-leaning audiences. And then compare this overall to what we saw with coverage of the Trump administration in its first 60 days in 2017, where you had in some ways a reverse: outlets with right-leaning audiences, which were much more negative on Biden, were far more positive on Trump than any of the others, than those with mixed or left-leaning audiences. And there, outlets with left-leaning audiences had 56% negative assessments of the Trump administration. So there are certainly differences in the coverage, but in many ways also in the focus of the stories themselves. If you look across all the outlets here, there was far more focus on leadership and character in those early days of the Trump administration than on policy and agenda, and the reverse here.
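The tally described here, shares of positive, negative, and neutral assessments among stories from outlets grouped by audience lean, can be sketched in a few lines of Python. The data, field names, and function are invented for illustration and are not the study's actual coding pipeline.

```python
# Hypothetical sketch of tallying coded story assessments by the audience
# lean of the outlet. Data and field names are invented for illustration.
from collections import Counter

def assessment_shares(stories, lean):
    """Percent of stories carrying each assessment among outlets of one lean."""
    counts = Counter(s["assessment"] for s in stories if s["lean"] == lean)
    total = sum(counts.values())
    return {a: round(100 * n / total) for a, n in counts.items()}

coded = [
    {"lean": "right", "assessment": "negative"},
    {"lean": "right", "assessment": "negative"},
    {"lean": "right", "assessment": "negative"},
    {"lean": "right", "assessment": "neither"},
    {"lean": "mixed", "assessment": "neither"},
    {"lean": "mixed", "assessment": "positive"},
]
print(assessment_shares(coded, "right"))  # {'negative': 75, 'neither': 25}
```

Running the same function per lean group is what makes the comparison in the interview possible: the same coding scheme, tallied separately for right-leaning, mixed, and left-leaning audience outlets.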
And I'm just going to offer one more quick thing here, where you can see that one of the neat things is being able to look at all of this over time. To do the over-time comparison, we actually have to put together a different sample. So we still use stories from print, from the print articles we get, as opposed to digital, because the structure is different in the papers, and we have the networks. It's a much smaller mix of outlets, but we can do that comparison over time and see the differences across these areas. The long-term trends are just absolutely fascinating. Every time I looked at this study when it came out, my mind was completely blown. We're getting close to time, but there are a couple more questions I want to touch on. One gets at the growing difficulty of accurately measuring public opinion. With the different kinds of platforms that are available to people, the ways people expect to be reached are changing, which means that survey outlets and survey researchers have to develop new methods to be able to accurately reach folks and take their temperature. How are you approaching this? Yeah, so certainly one example, and it's interesting, because the digital era has in some ways made it harder, but also easier, to reach people. There are just different kinds of challenges. One of the changes the center made was to transition from doing phone calls, which was sort of the hallmark back when we started, to doing it all online through the American Trends Panel. But there we put so much effort and methodological rigor into that recruitment process, to get people so that we can be sure we're offering a nationally representative look at the public. And that's really hard. There are certain parts of the population that may be more resistant to taking part in that.
So then how do you get them to be involved and to trust you? And that gives us another important reason, of course you want to do it anyway, to be sure our work is solid, that we're methodologically strong and really nonpartisan: we need to earn that trust, and the panel members, the people taking the surveys, need to feel like they can express their views. One of the positive elements of moving digitally is that, because you don't have a person you're interfacing with and have to answer to, there are certain sensitive topics where the research shows people are in many cases being more honest in how they answer those kinds of questions. And just before we wrap up, since you teased it, what's the next big thing? What's the next thing in media research that journalists are going to be writing about? Well, I would certainly say we've been doing a lot more in the last couple of years around misinformation: what are the facts people agree on? Are there facts people agree on? But also, I think, getting into the oxymoron of the private social spaces, right? Those private social spaces that people are using for their news intake in many cases now, to try to understand the dynamics and the impact on people's knowledge, people's views, people's sense of reality, and ultimately, how does that impact society overall? This has been fascinating. Amy Mitchell, Director of Journalism Research at Pew Research Center, thank you so much for joining us today. Thank you so much, John. All right. Our Knight Live programming will return in two weeks with a special episode of Informed and Engaged. You can take a look at all of our live programming and see recordings of past episodes at kf.org slash knight live. Thanks for joining. Until next time.