Why Do Opinion Leaders Misjudge Public Attitudes?

11 July 2019, 0412 EDT

Last week, Dina Smeltz, Jordan Tama, and I had a piece in the Monkey Cage on the results of our 2018 survey of 588 foreign policy opinion leaders. We found that these opinion leaders misestimated public attitudes in four areas: (1) US engagement in the world, (2) support for trade, (3) support for military intervention, and (4) support for immigration.

I did a thread on the results, which I'll summarize below, but I wanted to follow up with some reflections prompted by a thoughtful critique from Ken Schultz, which focused on our finding that elites thought the public was less supportive of military intervention than our public survey results suggested.

Survey Respondents

Let me first say a word about the survey respondents. We collected lists of names and email addresses of foreign policy leaders from different professions, including Congressional staff, the Executive Branch, think tanks, interest groups, and academia. Leadership Connect was the base source for most professional groups. For think tanks, we supplemented with lists of scholars from the top foreign policy-related think tanks according to the University of Pennsylvania's rankings. The TRIP survey (thanks Mike Tierney!) provided lists of scholars from the top 25 PhD programs in IR and all APSIA schools.

We emailed the survey to respondents and then followed up several times over the course of summer 2018. Think tankers and academics are over-represented in the survey (181 and 212 respondents respectively), but many of them have some government service (45% and 37% respectively).

Since we reinvigorated these leader surveys in 2014 with the Chicago Council on Global Affairs, it's gotten more challenging to recruit respondents from the Executive Branch and Congressional staff, something we're attuned to as we head into 2020.

We also draw on evidence of public attitudes from two nationally representative surveys, both fielded in summer 2018: one carried out by the Chicago Council through Ipsos and another that we contracted YouGov to carry out on our behalf. In the piece, we also report findings on immigration from the 2016 leader and public surveys.

Key Findings

Let me now summarize our main findings.

(1) Foreign policy opinion leaders think the public is less enthusiastic about international engagement than it actually is.

(2) Opinion leaders think the public is less supportive of international trade than it actually is.

(3) Opinion leaders underestimate public support for military intervention, particularly when endorsed by NATO.

On military intervention, we did something a little different, following up on experimental work that Tama, Jon Monten, Craig Kafura, and I published earlier this year in Foreign Policy Analysis and blogged about for OUP here.

In previous work, we tested the effects of UN endorsement of use-of-force operations on leader and public support for sending U.S. troops abroad. We found a bump in support for sending U.S. troops abroad across a range of scenarios, particularly among leaders, when subjects were assigned to the UN endorsement condition rather than a unilateral-action condition.

In this iteration, we brought in Josh Kertzer as a collaborator on the project. Building on scenarios used in other recent experimental work on the effects of multilateral endorsements on support for the use of force, we tested the effects of NATO endorsement in a scenario in which an African country was invaded by its neighbor. The scenario was calibrated to produce some disagreement among the public: on the one hand, there were both humanitarian and terrorism risks; on the other, the operation would take place in a region of perceived low strategic interest to the United States. It therefore wasn't cooked to favor intervention.

The experimental design randomized whether NATO supported or opposed the operation (recognizing that, in practice, NATO opposition is shorthand for NATO being unable to agree to endorse the operation). You can read the survey question here (scroll down to page 9).
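
For the curious, here is a minimal sketch of the kind of random assignment described above. The respondent IDs, condition labels, and function names are illustrative stand-ins, not our actual instrument or code.

```python
import random

# Hypothetical condition labels standing in for the survey's two treatments
CONDITIONS = ["NATO supports", "NATO opposes"]

def assign_conditions(respondent_ids, seed=2018):
    """Randomly assign each respondent to one endorsement condition."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {rid: rng.choice(CONDITIONS) for rid in respondent_ids}

# Example: ten hypothetical respondent IDs
print(assign_conditions(range(1, 11)))
```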

Here is what we found.

The gap between leader assessments and public opinion is especially pronounced in part because of the way we aggregated public opinion. Respondents were asked whether or not they supported sending U.S. troops, with responses ranging from "support a great deal" to "oppose a great deal." Intermediate responses included "support a moderate amount" and "support a little," with parallel options for opposition.

In the values we reported in the Monkey Cage, we coded any level of support for sending troops as support, whether "a great deal," "a moderate amount," or "a little." The gap between elite assessments and public opinion therefore looks especially large.

But even if we exclude those who supported the intervention only a little (26% in the NATO-supports condition), those who supported it a great deal (24%) and a moderate amount (32%) still amounted to 57% (given rounding), which is about 18-20 percentage points higher than opinion leaders' estimates of public support.
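
To make the coding decision concrete, here is a small sketch of the arithmetic using the shares reported above; the elite estimate is a hypothetical placeholder implied by the stated 18-20 point gap, not a figure from our data.

```python
# Shares in the NATO-supports condition, as reported above
support_shares = {
    "support a great deal": 0.24,
    "support a moderate amount": 0.32,
    "support a little": 0.26,
}

# Monkey Cage coding: any level of support counts as support
broad_support = sum(support_shares.values())

# Stricter coding that drops "support a little"
strict_support = (support_shares["support a great deal"]
                  + support_shares["support a moderate amount"])

# Hypothetical placeholder implied by the reported 18-20 point gap
elite_estimate = 0.38

print(f"broad coding:  {broad_support:.0%}")   # 82%
print(f"strict coding: {strict_support:.0%}")  # 56% here; 57% before rounding
print(f"gap over elite estimate: {strict_support - elite_estimate:.0%}")
```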

(4) Opinion leaders misjudge public support for immigration.

Finally, in our 2016 survey, we asked leaders to estimate public support for deporting migrants, and we asked the public directly whether it favored deportation or allowing migrants to stay. Here again, we found that elites misjudged public opinion.

Are Leaders Really Getting Public Opinion Wrong?

We interpreted the findings in the Monkey Cage as leaders getting public opinion wrong. We echoed arguments made two decades ago by Steve Kull and I.M. Destler in Misreading the Public, which suggested that leaders listen to vocal publics in the absence of good public opinion polling data.

While public opinion data is now more readily available, leaders may not be consulting survey data and may instead be listening to the loudest voices. Other studies also find that leaders misperceive public opinion. In a 2018 APSR article, Broockman and Skovron found bias among more than 3,000 state legislators on nine issues, including gun control and abortion rights. They attribute this conservative bias to the fact that conservatives were more active in contacting legislators during the 2012-2014 study period.

In a 2019 APSR article, Hertel-Fernandez, Mildenberger, and Stokes found that about 100 Congressional staffers misestimated public support for action on a variety of hot-button domestic issues, including climate change and gun control. They attributed the staffers' conservative bias to lobbying from interest groups and to the staffers' own ideological leanings.

On most of the international issues we asked about (engagement writ large, trade, and intervention), it's hard to imagine that lobby groups were responsible for the misperception. There are certainly labor unions and some domestic industries hostile to international trade, but pro-trade business groups have also been active in defending and expanding trade agreements like the Trans-Pacific Partnership. Immigration, until recently, has been an area of more one-sided mobilization.

On intervention, Ken Schultz offers a narrative that could rationally explain why elites think public support for intervention is lower than what we observed. As I wrote on Twitter, a hypothetical question about intervention in an African country is cheap talk, fairly easy to bless in the abstract; support might be lower if we asked about an actual situation like South Sudan. We might also expect public support to diminish in the face of partisan division or negative news coverage of a possible intervention. Perhaps elites are imagining how support will shake out over the course of public discussion.

Schultz's argument is that our scenario is like the Libya intervention, which never commanded more than 47% support. Moreover, in light of the difficult experiences in Iraq and Afghanistan, it is easy to imagine a public that is tired of intervention. Without any partisan cues about who is president, people have to make their own inferences about the likely situation. Schultz develops the argument in his longer thread.

We ended the Monkey Cage piece with the notion that elites are perhaps confusing broader public opinion with especially vocal publics. But who are the vocal publics? What indicators are useful for tracking them? Those who pay a lot of attention to the news, or those who are politically active? Are elites listening to vocal publics writ large or only to co-partisans?

As I noted in my Twitter thread, surveys may register what people think but not how deeply they care. Leaders may be reacting to publics who are especially passionate about their positions rather than to publics who don't care as strongly.

In preparing the Monkey Cage piece, we looked at attentive publics, those who pay a lot of attention to foreign policy news. But those folks are in some respects more like elites. They were more supportive of engagement than people who were less interested in foreign policy: 87% of very interested Republicans, 89% of very interested Democrats, and 79% of very interested Independents supported engagement, compared to 70% of the public overall.

On trade, 88% of attentive Republicans and 87% of attentive Democrats and Independents said trade was good for the U.S. economy, compared to 80% of the public overall.

On intervention, attentive publics were more supportive of sending U.S. troops abroad than others across a range of scenarios the Chicago Council asked about, including a North Korean invasion of South Korea, a Russian invasion of a Baltic ally, and a Chinese conflict with Japan over disputed islands. Some scenarios, like fighting violent extremist groups in Iraq and Syria, were broadly popular across different groups. Others, like a Chinese invasion of Taiwan, showed more knowledgeable groups to be more reluctant to send troops, a position probably closer to elite opinion as well.

On immigration, there were large partisan splits: attentive Republicans were especially likely to see immigration and refugees as a critical threat facing the country (75%), far more so than attentive Democrats (18%) or less engaged Republicans. Here, leaders of all parties might be listening to loud voices.

But are the people who consume the news also the ones contacting their legislators by phone, writing letters to their members of Congress, and engaging politically? In the Chicago Council mass public survey, people who were only somewhat interested in foreign policy were more politically active than attentive publics on a number of metrics, including writing or speaking to a public official about a political issue in the past four years: 84% of attentive Republicans vs. 93% of somewhat attentive Republicans, 75% vs. 89% for Democrats, and 80% vs. 92% for Independents. This was also true of other indicators, including attending a rally and donating money.
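
As a sketch of the kind of cross-tabulation behind these comparisons, the snippet below groups a toy respondent file by party and foreign policy interest; the data frame and column names are hypothetical stand-ins for the Chicago Council data.

```python
import pandas as pd

# Toy rows standing in for the Chicago Council respondent file (hypothetical)
df = pd.DataFrame({
    "party": ["Republican", "Republican", "Democrat", "Democrat",
              "Independent", "Independent"],
    "fp_interest": ["very", "somewhat", "very", "somewhat", "very", "somewhat"],
    "contacted_official": [1, 1, 0, 1, 1, 1],  # wrote/spoke to an official, 1 = yes
})

# Share contacting a public official, by party and attentiveness
rates = (df.groupby(["party", "fp_interest"])["contacted_official"]
           .mean()
           .unstack("fp_interest"))
print(rates)
```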

This suggests that politically active publics might be different from those who pay the most attention to foreign policy news. When elites think about public attitudes on different issues, whom are they imagining? In the case of immigration, the bias across all elites appears to be conservative: leaders seem to be paying attention to the most vocal actors on the right. That may not be true of all issues, though. On some, elites might imagine the sentiments of fellow partisans. Perhaps in the era of Resistance mobilization, the bias will start to shift to the left.

Obviously, this is a rough first cut at understanding whether elite misestimates of public opinion reflect more than bias and ideological blinders. I'm hopeful this is something we can pursue in subsequent work, since we, like others, found that elites are not especially good at gauging the level of public support across a range of policy questions.*

* As we move forward on parallel academic work on this project, we will be posting the full data files in coming months.