Aaron Powell And I'm Aaron Powell. Trevor Burrus Joining us today is our colleague Emily Ekins, a research fellow and director of polling at the Cato Institute. Welcome to Free Thoughts, Emily. Emily Ekins Thank you for having me. Trevor Burrus You've been studying public opinion for quite a while. Emily and I were actually interns together in 2010, and she was doing it then. At that time you were studying the Tea Party and doing a lot of polling about the attitudes that go into, I guess especially Republicans', political philosophy. And then we had the Trump phenomenon. So I guess you're a really good person to ask the question: where did Trump voters come from? Emily Ekins That's a big question, and it's the question that everyone keeps trying to answer. You want me to give it a try? Trevor Burrus Yeah, give it a try. Emily Ekins Well, the first thing I will say is that I think people have been a little too quick to look for a simple explanation, like the one thing that explains why people voted for Donald Trump, since he's such an unusual candidate who has said so many things that have offended people. People think, how could he have won? I've just recently conducted a new study with the Democracy Fund Voter Study Group. This was put together by the Democracy Fund. They brought together an ideologically diverse group of academics and pollsters to field an original survey right after the election and do some really in-depth analysis of Hillary Clinton and Donald Trump voters and try to understand the dynamics. I contributed one of four reports that were released this past week on the 2016 election. And what I found is that five different types of Trump voters came out to vote for him on election day. I think that's really important, because people keep looking for a single explanation for this surprise, and I think the answer is there is no simple explanation. 
There are certain things that make this election distinctive, and we can talk about that. But at the end of the day, Trump's voters are a typical coalition, which is how it always is when it comes to politics. So who are those five? What are the five kinds? All right. So the five kinds. The first one I call the American Preservationists, and I think these most closely align with the media accounts of a Trump voter. They have lower levels of education and income. They're under-employed. Among the working-age members of this group, half are on Medicaid, which is quite a lot, as you can imagine. And you wouldn't really think of them as Republicans. They want to raise taxes on the wealthy. They're very concerned about Medicare. So they're more economically progressive. So what made them vote for Donald Trump? Well, we can't be sure, but what I can tell you is that they are very, very skeptical of immigration. Not just illegal immigration but legal immigration as well. About 8 in 10 want to make it much harder for people to legally immigrate, and as you both know very well, the system is already very difficult to navigate. They want to make it even harder. So they seem to have really been drawn to Trump on some of that. They also have less favorable attitudes towards racial minorities. You've seen that caricature in the media, but that's one of five. Can I ask about that one really quick? Is there a reason why they're anti-immigrant? Are they very skeptical of it because they think immigrants take jobs? Because they think immigrants cause crime? Or because immigrants change American culture? I'm so glad you asked that question. By doing this analysis, we could kind of see that different motivations appear to drive people to be concerned about immigration. Some of the different reasons could be security concerns. Concerns about fairness. 
People talk about how it's unfair that some people get to come in illegally when other people come in legally. Others are concerned about assimilation. And then there are also those who are just flat-out ethno-nationalists who don't want people who aren't white coming into the country. And so where do all these groups fall? It's always hard to tell what motivates a person, right? They usually won't tell you. But for this group, I can tell you a few things that give us some clues. About half of them thought that you need to be of European descent to be really American, to be truly American. That sounds pretty ethno-nationalist. Yes, it does. Now again, there's the other half that didn't feel this way. But still, that was very surprising and shocking. None of the other four groups came close to this. This group is the most likely to think of their own identity, which was mostly white, as being very important to them. Most people don't go around thinking about their race, but this group does. There's also something called linked fate in the academic literature: some people believe that what happens to their racial group will impact them personally. This group is more likely to feel that way, more likely to think of whites as their group. And so what happens to, quote, other white people will affect them. That's, I think, the media caricature that we definitely saw going on when Hillary Clinton talked about the deplorables. I think those individuals were more likely to be found in this particular group. They're the most likely to think that you have to be Christian to be really American, to have lived here almost all of your life, or actually to have been born here. And that makes immigration difficult. 
If you think people have to conform to a really narrow set of characteristics to truly be a member of society, especially things that are immutable, things that people can't easily change, that makes it very hard to become accustomed to immigration. But other Trump voters were very, very different on these very same questions. So the American preservationists were what percentage of the total? About 20%. And then the next category would be? Well, how about I give you a contrast. I'll show you the group that was the most dramatically different from the American preservationists. I call them the free marketeers. They make up a slightly larger chunk, 25%. They have the highest levels of education and income. They are very favorable towards immigrants and racial minorities. They look just like Democrats on those questions. They're like Democrats in terms of wanting to make it easier to legally immigrate to the United States. But they're also very fiscally conservative. They don't think government should be so involved in healthcare. They don't want to raise taxes on the wealthy, and they're very supportive of free trade. So basically, on all the things I told you about the preservationists, the free marketeers in many cases had the opposite answers, not on every single one, but in many cases. And so you think, how on earth were these individuals all voting in the same party? It's not that uncommon. Parties are coalitions of very different types of people. They just don't seem to realize sometimes how different they are. But what they do have in common is that they both really hated Hillary Clinton. Really? Did you have a question on your survey, maybe with faces, kind of like the pain scale, like, how do you feel about Hillary Clinton, smiley face, something to try to measure the hatred of Hillary Clinton? What we use is something called a feeling thermometer, where we ask people... See, it's like the pain scale. 
Well, okay. You rate someone on a scale of zero to 100, zero being very cold and unfavorable, 100 being very warm. And people did not like Hillary Clinton. What's interesting, though, is that some of these voters liked her in 2012. They turned against her in 2016, which shows that all the negative media coverage, her emails, her server, the charges of corruption and all of that, did seem to have an impact on these voters. When you said some of these voters, are we talking about specific people? Did you have data from the same person and what they did in 2012, or are you just comparing groups? You're absolutely right, I'm glad you mentioned that. We do have data on these same individuals from 2012, and that's what makes this data set so exciting. We fielded this survey in 2016, but what we constructed is something called a panel survey, or a longitudinal survey. We asked people to participate who had also participated in a survey in 2012. And we asked a lot of the same questions, too, so that we could see how their attitudes changed, which is how we could see that some groups changed on trade and others didn't. And so we can go back and see: back in 2012, how did you feel about Hillary Clinton? I'm not asking you to remember how you felt; I'm actually looking at what you said. So it's far more believable and credible. And one of the groups that we haven't talked about yet, I call them the anti-elites. They make up about 19% of the coalition. Half of them had a favorable opinion of Hillary Clinton in 2012. On economics, they lean progressive. They're pretty moderate on immigration, maybe not quite as liberal as Hillary Clinton on immigration. But you think, why did they not vote for her? 
I mean, something happened that really turned these voters against Hillary Clinton. We can all just guess what we think it was, but obviously all that negative media attention made a difference. The thing that strikes me about the first group, the American preservationists, that's what they're called? Yes. The line from them to Trump seems somewhat clear to me. The things that they want are, to my mind, abhorrent, but Trump wanted those abhorrent things too, and so it makes sense for them to vote for him. The second group, the free marketeers, are these people just terrifically naive? How do you get from having that set of beliefs to thinking Trump is your guy, when he has campaigned against all of those beliefs? Well, a couple of things. They're loyal Republicans, and a majority said their vote wasn't a vote for Donald Trump. Their vote was a vote against Hillary Clinton. So if you really, really despise Hillary Clinton, then it's just, who's the other guy, and you vote for him. But then do these same groups show up during the primaries? Yeah, they voted for different types of people in the primaries. As you can imagine, the preservationists, you're absolutely right, are the core set of Trump supporters. They are the ones that catapulted him through the primaries. A majority of the free marketeers voted for one of the other 16 candidates, primarily Ted Cruz and Marco Rubio, if they voted in the primaries. A lot of the anti-elites also did vote for Donald Trump, but if they didn't vote for him, they voted for John Kasich. So you can kind of see there are different flavors of Republican. For the preservationists, Donald Trump was truly their flavor. And with the anti-elites, did they dislike Hillary Clinton in particular, or did they dislike elites across the board? You know, it's hard to say. We could definitely see that they dislike Hillary Clinton a lot. We can also see that they don't like elites. 
And then if you look at their immigration attitudes, a plurality of them supported a pathway to citizenship for unauthorized immigrants. They're not super hard-line on immigration, but compared to where the Democratic Party platform was, they weren't quite there. They were a little bit less comfortable with immigration. And in particular, it seemed like that might have been related to the temporary travel ban on Muslim immigration. That was one thing that made these Trump voters stand out from the non-Trump voters. Majorities of Trump voters supported the idea of a temporary travel ban. But that being said, the intensity of the support was very different. Among the preservationists, like 80% strongly support this kind of policy. For the anti-elites and the free marketeers, a majority of them supported the policy, but only like one in 10 strongly supported it. So what you can get a sense of is that they don't want to support this kind of policy, but they're frightened. They see things in the media, they see things that are happening in Europe, and it scares them. And although we can have our colleague, Alex Nowrasteh, explain the statistical probability that they would be harmed, that's not how humans usually think. People are strongly influenced by the stories they see in the media. And so I think the fact that Trump was rising in the polls right after the Paris attacks, after the Orlando shooting and what happened in San Bernardino, all of those things scared people. Trump responded, and he responded in a way that Hillary Clinton did not. And that may have helped him among some of these groups who otherwise would have been a little bit more reluctant to vote for him. So we have American preservationists, free marketeers, anti-elites. Fourth is... The fourth are actually the largest group, but they aren't quite as distinctive. I call them the staunch conservatives. 
They make up 31%. And they're just conventional social and fiscal conservatives. They're loyal Republicans. They are gonna vote for their Republican candidate. They're not as hard-line on immigration as the preservationists are, but yes, they are skeptical of it. It seems like they might be skeptical of it for slightly different reasons than the preservationists. They weren't like the preservationists in saying that you had to be white to be American, but they seem to be a little bit more concerned about assimilation and ensuring that the community has a sense of cohesion and belongingness. And for individuals like that, immigration can pose some challenges, in that at first it can be hard when you have different groups of people with different traditions and different languages coming together. And if you really like people to be cohesive, that can be challenging. So I think that's what we saw with this group. They're very fiscally conservative, so they look a lot like the free marketeers on all of the role-of-government-in-the-economy issues, but they're more in between the free marketeers and the preservationists on some of the immigration issues. And I'm sure that they would not be caught dead voting for Hillary Clinton. Right, they wouldn't. They're staunch conservatives. I mean, that's another thing too: the Clintons have been the absolute devil of the Republican Party for 20 years now. Maybe they think, I don't like Trump, but I will never, ever vote for Hillary Clinton. That's a very common opinion. You're absolutely right. If you look at what they said in 2012, it was like 4% that were favorable, and those were probably mistakes, people not really paying attention to the survey. Hanging chads, pregnant chads, things like that, yeah. So what's the fifth? The fifth, this is a small group. They're only 5%. I called them the disengaged. 
And really, these are the types of people who, when they take a survey, just say don't know, don't know, don't know. They don't really have a lot of opinions except on issues of immigration and elites. So if you say I don't know to every public policy question I ask you, except for issues of immigration and distrust of elites, that tells me something about who you are and why you voted for Trump. That fits with the Trump rhetoric. They tended to be a little bit younger and have a little bit less education. They don't pay attention to politics, but they do have a skepticism about immigration. Donald Trump made it very clear where he stood on a lot of those issues, and so he got their attention. So we've got these five groups making up the Trump coalition. Why weren't we talking about this more nuanced view up until now? Why, during the campaign and then immediately after the election, was all of the conversation about the first group? Well, think how long it took us just to go through all five of those groups. By now people have changed the channel. They've stopped reading the op-ed. It's easier to have a simple explanation. I actually have a document where I've been cataloging all the different theories that come out about why people voted for Trump. Collective narcissism was one; racism, nativism, populism, class anxieties. Rust belt woes, yeah. Exactly, because that's easier for people to remember. And I mean, reality is far more complicated than that. It's all of those. It's all of those, and for different people. That's another thing: people think, oh, well, sure, okay, it's not all racism, it's economic anxiety at the same time. But then they think that that's true for all of the voters, that they were all a little bit, or maybe a lot, racist and then a little bit concerned about the economy, when perhaps it was that some people have racial animus towards people of color and others do not. 
And some are concerned about the economy while others are not. And that's, I think, the piece that was missing. And why do we care? That's another question. I think there are a couple of reasons, but it's important to understand how diverse this coalition is if we are to understand the future of American politics. Well, it seems that this goes into a lot of the work you do in general and things that you spend a lot of time thinking about, which is public opinion and why people hold the views they do. If you're sitting on one side, if you're a Democrat, most Republicans look the same to you. And it's really easy to tell yourself a story that they're just racist xenophobes. And Republicans tell a similar story about Democrats. They say Democrats are all just, whatever, socialists, slow socialists, as I think you said on the last episode. So you definitely typecast the other side and lose the nuance. And that gets into the bias that is often in polling, I think, and in the way people think about politics, not really thinking in a nuanced way about the fact that there are a lot of reasons to hold opinions beyond just the other side being stupid and evil. You're absolutely right. And to that point, yeah, it can be used as a weapon. It's a lot easier to try to delegitimize your political opponents if you boil them down to a strawman that's very easy to knock down. I'm not defending either side here. I'm just reporting the data as it is. But it seems like that is probably part of the reason why people grasped onto those kinds of single theories. But they did it with the Tea Party too, which you've done work on, as I mentioned at the beginning, trying to say, well, the Tea Party are just racists. 
And I'm thinking specifically about a book review you did for the Cato Journal, discussing one polling data set trying to figure out the Tea Party, where the authors basically just conclude that they're all like KKK members or something. I'm overstating this a little bit, but there was no nuance to their analysis. And you see this a lot in polling. They just put their own biases in there, and you could obviously tell these authors really didn't like Republicans, and they do some polling and, wow, we were right, they are KKK members. What a surprise. I think that a lot of the work that was used to describe the Tea Party would be more accurately applied to certain segments of Trump voters, particularly those preservationists that we were discussing. Like you said, I wrote my dissertation on the Tea Party movement, and I did something similar with them that I did with these Trump voters. I did a cluster analysis using a statistical tool called latent class analysis. You basically allow a statistical algorithm to try to find these natural groupings of people, so it's less dependent on your own judgment. Your judgment can impact it in some respects, like what questions you even put into the statistical algorithm, but what it spits out, you aren't really controlling. And when I did that, I found several groups within the Tea Party. And the central thrust of the Tea Party really was more about limiting government's role in the economy and far, far less about immigration, changing demography, racial issues. It was far more about the economy, spending and deficits. With Trump, where is the center of gravity? It's far more in the area of immigration and concern about demographic change. 
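The "let a statistical algorithm find natural groupings of people" idea described here can be sketched with a simple clustering routine. True latent class analysis models categorical survey items probabilistically (tools like the poLCA package in R do this); as a rough stand-in, the sketch below runs a minimal two-cluster k-means on numerically coded answers. The data are made up to illustrate the mechanics, not drawn from the actual Voter Study Group survey.

```python
import random

random.seed(0)

# Two hypothetical respondent "types", answers coded on a 1-5 agree scale:
# columns = [raise taxes on the wealthy, restrict legal immigration, support free trade]
preservationist_like = [[4 + random.random(), 4 + random.random(), 1 + random.random()]
                        for _ in range(200)]
free_marketeer_like = [[1 + random.random(), 2 + random.random(), 4 + random.random()]
                       for _ in range(250)]
responses = preservationist_like + free_marketeer_like

def two_means(points, iters=20):
    """Minimal 2-cluster k-means: start from two arbitrary respondents,
    then alternate nearest-center assignment and mean updates."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        clusters = ([], [])
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[0 if d[0] <= d[1] else 1].append(p)
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers

centers = sorted(two_means(responses))
# The recovered centers land near the two group means -- roughly
# [1.5, 2.5, 4.5] and [4.5, 4.5, 1.5] -- without ever seeing the labels.
print([[round(x, 1) for x in c] for c in centers])
```

The analyst's judgment enters in which questions go into the algorithm, as Emily notes; the grouping itself falls out of the data.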
Do any of these five groups, or the coalition of them, represent something new, like a big shift in American politics, or are these groups that have always been there and just happened to coalesce around Trump? Well, there are two things. I think what surprises people is to see some of these groups, like the preservationists or the anti-elites, that hold views that seem very out of step with certain Republican orthodoxies. Even though Republicans may not actually cut spending, they talk about it more than the Democrats do, right? But the preservationists, that's not even their rhetoric. They're not speaking the language of tax cuts. Many of them actually used to be Democrats. About a third of them, four years ago, said that they identified as Democrats. And so it's surprising that you would have individuals that are so different from the Republican Party's stereotypical platform in the party. So I think that surprises people. I don't think it's unusual. One of the narratives of the election is that it wasn't so much that Trump won, but that the Democrats lost the election. Is it possible, using statistical methods, to control for Hillary hatred, so that we can answer the question: had the Democrats run anyone but Hillary, would they have won? This would be like one of Alex Nowrasteh's synthetic controls, when he tries to imagine a city if immigrants hadn't come. Can we run something where some very stereotypical Democrat runs? In baseball, there's a thing called wins above replacement, where you postulate the average baseball player and then figure out how much better or worse some player is. So we should be able to do that in politics. I'm not sure how you would do this. 
Statistically, counterfactuals are always very difficult to prove. But if you look at 2012, Barack Obama won a lot of the preservationists and the anti-elites, which really surprised people, because people say, well, if the preservationists have so much animus towards racial minorities, why did they vote for the first black president? It shows that people's attitudes are far more complicated than people realize, and he had an economic message that resonated with them. Hillary Clinton did not emphasize those issues the way Obama did. She seemed to focus more on identity politics. I would argue Obama didn't really do that during the 2008 and 2012 campaigns, and as a consequence, I think he won over these types of voters. So what would have happened had it not been Hillary? Well, it depends on who the other guy or other woman was. If they had a message more like Barack Obama's, perhaps they would have won. I think most people were surprised that Donald Trump won. Political scientists have these models where they're able to predict the outcome of an election based on economic indicators alone, and those had predicted a Republican win regardless. I thought that there was some limit to the efficacy of these models, but apparently they're pretty strong. So that would have predicted that any Republican would have won, regardless of whether it was Donald Trump or Ted Cruz or John Kasich. But perhaps if the Democrat had a message like Obama's that was more unifying and had this economic element to it, they could have captured a lot of the preservationists and the anti-elites. Let's talk a little bit about polling itself, because I think a lot of our listeners know that these polls happen, and especially in an election year, they come out every week, and there's Pew and there's Rasmussen and all these different names. How is polling generally conducted? 
I mean, there are multiple ways, but what's the general process? If you were putting this together, do you call people, get them to come over, or have them fill out a survey? And then how do you work with the data after that? Well, there are several different ways to contact people. In the olden days, people would identify respondents based on addresses. They would figure out what a representative sample would look like, and they would fly interviewers to the cities, and the interviewers would literally walk up and knock on the door and sit down with a family or a person, depending on what kind of survey it was, and conduct the survey. Sounds expensive. Very expensive. And as companies and government decided they didn't want to spend so much money, and technology was evolving and more and more people were getting access to telephones in their homes, and we're talking about a long time ago here, people started to transition, and a lot of people pushed back. They said, look, not everyone has a telephone in their house. You're not getting a representative sample. And the pollsters said, look, more and more people are getting telephones in their homes. This is prohibitively expensive. I think we can do a good enough job. So then they switched to the telephone interview. They'll have a list of questions and people in a call center, and they have machines that will call people. They create these representative samples beforehand, and then someone will call a person, ask them if they would participate in a survey, and ask them the questions. Now that's becoming prohibitively expensive for a couple of reasons. One, more and more people are not using landlines anymore. They're using cell phones, and there is a government regulation that says it's illegal for a machine to call a cell phone. You have to have a human being actually dial the number. 
And so that's very expensive, because you have to call like 100,000 people or something like that for these surveys. To have someone dial 100,000 different numbers is just insane. But people are doing it. They have big call centers that will do half landline, half cell phone. This has really given pollsters an incentive to look for new survey methods. In addition, people are taking surveys less and less on the phone, even if you do get them on their cell phone. They're just like, I'm too busy, I'm in the middle of something, and they don't take the survey. And so now, with the internet, people are starting to switch more and more to surveying people online. Now, what I mean by this isn't like when NBC.com sets up a poll and says, who do you think won the debate, and everyone votes for Ron Paul, like 90%, right? That is not what I'm talking about. I'm talking about firms like YouGov or Knowledge Networks or Ipsos. What they do is create these huge panels of people's emails, and they will determine if you should be sampled, contact you, and ask you to participate in a survey. You get a unique link, you click the link, and then you can take the survey online. What's really great about this way of surveying is that people don't have to share their opinions over the phone with a stranger. So they're more honest with you. And you can imagine the impact that has on issues today: immigration, Donald Trump, Brexit in the UK. Whether or not people feel comfortable telling you their true feelings is probably going to be better ascertained using an internet survey than over the phone. Is there a skewing in the kinds of people who respond to internet surveys? You could imagine there are certain demographics, certain kinds of people, who are more likely to answer a survey that shows up in their email inbox than others. 
Look, in any kind of survey method, there's always a problem with non-response bias. And coverage issues: there may be certain types of people who would never even have a chance to be included, or certain types of people who, even if they had a chance to be included, would always say no. That's always been a problem with surveys. But what I would suggest is, when it comes time for elections, you can actually look at what the survey predicted the results would be, compare that to the election outcome, and see how good of a job they do. And you can also compare these surveys to large-scale census data collection activities and see how good of a job they do there as well. Now, people are pushing back because they think the polls in this election were so bad. They actually weren't that bad. The election result was within the margin of error. And Hillary Clinton did get more votes than Trump. Is that true for Brexit too, and the British general election, where the polls just seemed to be inaccurate? It seems like a lot of the polls got it wrong in the UK. Although, if I'm remembering correctly, some of the online survey firms, like YouGov, did a pretty good job predicting what was going on. And people think that in part that might be because it's online. If you are afraid to tell someone, because it's not, quote, politically correct, that you support Brexit, you'll say so on an online survey, but you won't tell them over the phone. So when you are doing your survey research, when you're conducting a survey, do you partner with one of these firms? I assume you're not setting up a machine that's calling people from your office at Cato. Correct. I can just hear them dialing the phone all day. So you write up a list of questions and then pay a firm to conduct the survey? That's exactly right. 
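The margin of error mentioned above can be made concrete. For a simple random sample, the textbook 95% margin of error on a reported proportion is about 1.96 times the standard error of that proportion; real polls also apply design effects from weighting, so treat this as the idealized version, with made-up numbers.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Textbook 95% margin of error for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical poll of 1,000 people showing a candidate at 48%:
print(f"+/- {margin_of_error(0.48, 1000) * 100:.1f} points")  # -> +/- 3.1 points
```

Note that the margin shrinks only with the square root of the sample size: quadrupling respondents halves it, which is part of why polls settle around roughly a thousand interviews.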
And when you write the questions, that goes to my question, which is about biasing a question. You and I talk a lot about writing these questions and how they can be biased. When I was asking about finding racial bias in the Tea Party and so on, there are ways you can ask things that really can force an answer. It's sort of like a force in magic, where you can kind of force someone to take a card. There are ways of asking where you can look at the question and say, these sorts of questions are really, really bad. Do you see that a lot, in terms of how people use these questions to bias their results? From the reputable firms, not too much. And we do everything we can at Cato to make sure that our survey questions are unbiased and straightforward and that we're doing our best to measure what people actually think. But there are some limitations to how you ask a question that can create some of these problems, even if you don't want to insert any bias at all. And one of those is asking a question without any costs, which is what most of the reputable firms, I think, often find themselves doing. Part of it is that it's hard to insert all the possible costs. If we pass this repeal-and-replacement bill for the healthcare law, you could have a hundred different consequences, right? Are we gonna poll about all of them? And so what we often see is polling about benefits, as though policies are benefits only. So in healthcare, we saw things like, would you favor or oppose a law that would allow children, of course they call them children, to stay on their parents' health insurance policies until they're 26, even though most people would call a 25- or 26-year-old an adult. Worded that way. Worded that way. Aaron just said under his breath, I wouldn't, I just don't know. I wanted to point that out to everyone. If you follow Aaron on Facebook, he really loves the millennials. Technically, you are an adult by that age. 
But again, wording aside here, these questions will find like 75% of the population saying yes, because why not? Now, what we did in one of our surveys is we asked that same question the same way that everyone else does and found the same results. We're not trying to manufacture results; we found that. But then we ask a follow-up question, and that's where I think we're really adding some significant value: by adding these follow-up questions, we can show the nuance. And this time, what we did is we inserted real costs that come from academic studies. So a new study, coming out of Stanford, I believe, has found that this policy, it's called the Dependent Coverage Mandate, you know, where children are allowed to stay on these plans until they're 26. These economists found that this policy cost workers on average $1,200 a year. And this is whether or not you have a dependent child. So you could be 50 years old with no children living at home and you would be losing $1,200 a year. And it's not just one time; it would be every single year, if you have employer-provided insurance, which is many, many people. This is kind of the median voter, right? So we inserted that into the question. This is the follow-up question: would you favor or oppose allowing these young adults to stay on their parents' plans until they're 26 if it cost you $1,200 a year? Wanna guess what happened? I bet it changed. I'll go out on a limb and say that one. It flipped, it flipped. Strong majorities oppose the policy now when they learn that it would cost them $1,200 a year. Which is what many of our colleagues are constantly saying: there are all these unintended consequences. Obviously no one wanted to charge these people this much money, well, maybe some did, but a lot of them didn't realize they were gonna do it, right? They thought it was a free benefit. 
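The flip Emily describes, majority support with no cost mentioned versus majority opposition once a cost is included, is the kind of difference a split-sample wording experiment can check with a simple two-proportion z-test. A minimal sketch, with invented counts (the 75% and 40% splits below are hypothetical illustrations, not Cato's actual results):

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """z statistic for the difference between two sample proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical split sample: 750 of 1,000 favor the mandate when no cost is
# mentioned; 400 of 1,000 favor it once a $1,200-a-year cost is included.
z = two_prop_ztest(750, 1000, 400, 1000)
print(round(z, 1))
```

A z statistic far above 2 means a swing that large would essentially never arise from sampling noise alone, so the wording, not chance, is doing the work.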
So with polling, I think that's where a lot of the problems come from, where we say, do you wanna increase or decrease spending on education, on healthcare, on veterans, on roads? And with no cost. I mean, are we gonna cut spending somewhere else? Are we gonna raise taxes somewhere else, and on whom, and by how much? Without those costs included in the question, what they essentially ask you is, do you like education? Do you like children? Do you like veterans? And 75% of people say yes. 75, wow, that's... Oh, it depends. It depends. I guess 75% of people probably like children, and maybe veterans, education. It depends on the question. Do you like fun? Yeah. Yeah, I like fun. And then a lot of our friends on the economically progressive side will say, look, Americans really are in agreement with us. They wanna raise spending on all of these programs. And my response is, that's because you inserted no cost into the question, and all you're asking them is if they like the outcome. So what I think we're doing here is providing very needed nuance to these types of policy questions. Like, yes, people would love a free benefit, but they do not implicitly associate a cost with that benefit. When you do, when you provide that for them, we find out that Americans make trade-offs in a much different way. They don't wanna raise their taxes. They don't wanna cut spending on these other areas to make room for this new program. If this nuance is as easy as adding a follow-up question that just mentions a cost, why aren't we already being provided that nuance in our polling? Well, I'm doing it. Well, you're doing it, but why isn't it more widespread? I wanna interject that Emily and I have had this conversation before, where you were kind of like, this is so easy, and it is amazing that people don't do this. You kind of mentioned that in answer to Aaron's question. 
Well, to be fair, some pollsters do do this occasionally, but not all the time. And I think that the argument they would give, and it's a fair argument, is: well, we didn't know what the cost would be before we passed it. We had to pass it to find out what was in it. Or, alternatively, there are a gazillion costs and a gazillion benefits; how are we supposed to accurately put those all into one question and ask someone to pick between the two? And I think that that's a fair point. But what I would say is, if we really wanna know how people think about this issue, let's ask a variety of questions. Let's ask about several different benefits, several different costs, and we can kind of get a sense of where that median voter is, rather than go around and say, 75% of Americans support X, there is a clear mandate for the policy that I love. Let's have a bit more humble approach to public opinion. For people who encounter these polls all the time, they are used for policy purposes, I don't know if increasingly, but not just during an election season. You see politicians using them to push policies. And in one of my areas, Second Amendment firearms policy, we have the 90% of Americans support common-sense gun control rules, which is just a magnificently frustrating, empty statistic that has the same problems you outlined there. But for intelligent laymen who wanna look at a poll and try to figure out if they're being manipulated or lied to, is there a way you suggest for them to look behind the numbers and easily spot some sort of, this is probably a bad poll, kind of indicators? Well, some questions are more obviously bad questions than others. I would say polling that you see from the reputable major outlets like CBS, CNN, the New York Times, those are good questions. They don't insert the costs very often, for the reasons that I've described. 
But if you just know that going in, realize, well, these results would probably change if people thought about X, Y, or Z. It does matter if support for policy A is 51% with no costs included versus 90% with no costs included. That gives you some sense about where people are, right? And so on a lot of these gun questions, like assault rifle bans, support is just marginal. You see polls that are under 50% and over 50%. That tells you that as soon as you insert a few more costs in there, you'd probably see support decline. Now, advocates of these types of assault weapon bans, as they are called, would say, well, you're not including all the benefits that would accrue from banning these weapons. And so yeah, if we asked a variety of questions, we could kind of see where people shake out. Do you see many non-representative-sample problems in the polls that are widely discussed, methodological problems, as opposed to question problems? Are there many of these, obviously a bad representative sample, or bad ways of doing regressions, or something like that? Well, like I said, from the major outlets that do polling, like the Pew Research Center, CNN, the New York Times, I haven't seen that be a problem. For some of these, like certain political consulting firms, where they will get hired by a campaign or a group and they won't release their top lines, the top line is where you have the actual question wording and then the actual answers with the numbers associated with them. If they don't tell you their methodology, a lot of those pollsters are not to be trusted. And that's actually to your earlier question: if you wanna know whether to trust a poll, see if they've posted their full results online somewhere. See if they've explained their methodology. If they haven't, they're probably one of these consulting firms that gets paid to kind of weave a story. 
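When a sample really is unrepresentative, say, too many older landline respondents, pollsters typically correct it by weighting respondents to known population totals (post-stratification; real firms rake over several variables at once). A minimal one-variable sketch with invented numbers (the age groups, answers, and population shares are all hypothetical):

```python
from collections import Counter

def poststratify(respondents, targets):
    """Weight each respondent so the sample's age mix matches population targets.

    respondents: list of (age_group, answer) tuples
    targets: dict mapping age_group -> population share (sums to 1.0)
    """
    counts = Counter(group for group, _ in respondents)
    n = len(respondents)
    # Weight = population share / sample share for the respondent's group.
    return [(targets[g] / (counts[g] / n), ans) for g, ans in respondents]

# Hypothetical landline-heavy sample: older respondents are over-represented.
sample = [("65+", "support")] * 6 + [("18-34", "oppose")] * 4
targets = {"65+": 0.3, "18-34": 0.7}  # invented population shares

weighted = poststratify(sample, targets)
support = sum(w for w, a in weighted if a == "support") / sum(w for w, _ in weighted)
print(round(support, 2))
```

Here the raw sample shows 60% support, but once the older group is down-weighted to its (invented) true population share, weighted support falls to 30%, which is why panel quality and weighting choices matter so much.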
And if you are very interested, you could go to 538's pollster ratings, where they use an empirical method to rate pollsters on how well they predicted various outcomes, and they have an A through F rating. And some of these pollsters were getting Fs. And you know, those, I don't hear about them anymore. So I'm gonna ask about one poll that always stood out during the run-up to the election, and now with approval ratings: Trump would always tweet out the Rasmussen results, because he was always doing substantially better there than anywhere else. Why? So Rasmussen uses a unique method that a lot of pollsters maybe aren't totally on board with. They, if I'm remembering correctly, use a combination of what's called a robocall, which is one of those machines that will call people, but it's not a live telephone interviewer. It will be a computer that says, you know, would you please take the survey, press one for oppose, press two for support, press three for don't know. And I believe they combine that with some sort of online internet panel to try to get a younger cohort, because you can imagine these robocalls are only able to call landlines. They can't call cell phones, because of federal law. And so how do you get the people who don't have a landline? So the first thing is that people who have landlines are disproportionately more conservative, because they're older and they're more likely to have a landline. But Rasmussen tried to address this by adding in this internet panel. Again, I may not be remembering this a hundred percent, so I don't wanna be unfair to Rasmussen; I think this is what they do. But those are the kinds of concerns that exist, about whether it's capturing a good enough group. Yes, and so that online sample is supposed to get the younger group, but then the question is, how good is your online panel? And so there are only a few firms that are widely recognized to have a really good online panel. 
And these are firms like YouGov, Knowledge Networks, Ipsos; I'm not naming them all, but those are some that come to mind. And so that's the other issue. But to be honest with you, I like a lot of the questions that Rasmussen asks. I think they're good questions, and since people aren't a hundred percent sure about the methodology, it's very easy for them to dismiss the questions that they don't like. This might be too complex a question, so forgive me, because I don't know anything about polling really, but is polling getting worse? That's the first kind of question. We kind of discussed that some people think it is. But maybe there are more difficulties in polling now. We've discussed reaching youth with email addresses, and landlines and cell phones are probably also biased by race and all these different cohorts, where it might be increasingly hard to get a representative sample as people settle into their own niches in a variety of ways. So I guess I'm asking these two questions, which are maybe related, and maybe I'm just out to lunch and have no idea what I'm talking about, but is polling getting better or worse? And is it becoming increasingly difficult to get a representative sample because of this sort of diversification of opinions? And they may not even be related questions. Well, some people think it's getting worse. Some people think it's getting better, like a lot of different areas. I think that what we're seeing now is very similar to what we saw before. I mentioned earlier that they used to do polling by going to your doorstep and sitting in your living room with you and going through a hundred questions on a survey. And that just was not sustainable. And then technology came in and provided a new opportunity, a new way that was less expensive and arguably, in many ways, more effective, more accurate, through telephones. 
Well, now people are kind of abandoning their landlines and only using cell phones, and with the advent of the internet, more and more, I think it's like 85, 90%, have internet access in their house or they get it on their cell phones. So it's the same idea, where technology is coming in and providing a less expensive and, I would argue, more accurate way to measure people's opinions, because they're able to answer privately without having to share what might be an unpopular opinion with an interviewer. I mean, interviewer bias is a very serious problem for certain types of questions. And then also, online polling offers interesting ways to ask the question. So for instance, on the phone, you would say, who are you planning to vote for, and you maybe give them a bunch of names, but they're not really that informed. Online, you could show them a bunch of pictures and see which of these people are you gonna vote for. Which one of those is more predictive of the final vote at the end of the day? And what we're seeing is that a lot of these reputable online pollsters are doing very well at predicting the outcomes of elections, particularly in cases like Brexit, where people feel like they can't share or express their true feelings. Thanks for listening. This episode of Free Thoughts was produced by Tess Terrible and Evan Banks. To learn more, visit us on the web at www.libertarianism.org.