Implicit Bias, Political Identity

Earlier this month, Ezra Klein of Vox.com wrote a disturbing article about the changing nature of Americans’ identification with political parties. In it, he looks at the results of a recent study by political scientists Shanto Iyengar and Sean J. Westwood, whose research suggests that party affiliation isn’t simply an expression of our disagreements on ideology or issues. Rather, it’s a matter of tribalism: the transformation of party affiliation into a form of personal identity that reaches into almost every aspect of our lives.

In their study, titled “Fear and Loathing Across Party Lines” [PDF], Iyengar and Westwood write about an increasing intrusion of partisan behaviors into aspects of everyday life that are not ordinarily coded as political. They cite “likes” and “follows” for social media accounts associated with Republican and Democratic political figures, as well as for partisan television news-analysis hosts from networks like Fox News and MSNBC. They cite studies that suggest that residential neighborhoods are becoming increasingly politically homogeneous, and that parents are increasingly likely to express displeasure over the prospect of their offspring marrying into a family with a different party affiliation.

In an interview that Klein conducted for his article at Vox, Iyengar says that it’s almost certainly not a matter of increasing ideological extremism among voters from one party or another. If you look at Americans’ positions on the issues, Iyengar says, they are much closer to the center than their elected representatives. And yet, Klein writes, since the 1980s, Republicans’ feelings towards the Democratic Party, and Democrats’ feelings towards the Republican Party, have dropped off a cliff.

In order to find out how this works, Iyengar and Westwood conducted two experiments. In one, they used mock scholarship applications to measure participants’ political and racial biases. They impaneled 1,021 people and asked them to choose between fictional high-school-age applicants who varied along two dimensions: party affiliation (Democrat or Republican) and race (African American or Euro American).

The result was that, more than any other factor, it was the party cue that exerted the strongest impact on selection for the largest number of participants. Regardless of qualifications like GPA, and regardless of race, Democratic leaners showed a strong preference for the Democratic applicant, and Republican leaners showed the same – though somewhat less strongly – for the Republican applicant. Even though it was a non-partisan task, in other words, partisanship prevailed.

In the second experiment, they conducted what’s called an implicit association test with 2,000 participants. An implicit association test, Klein writes, measures the snap judgments your brain makes at speeds faster than conscious thought. Originally developed to measure racial bias, it requires the test taker to hit a key on the keyboard when certain words and images flash together. And based on the speed of the response, it exposes the kinds of instant judgments we make before we have time to think.
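The logic of the test can be illustrated with a toy scoring routine. The sketch below – in Python, with invented latency numbers – computes a simplified version of an IAT score: the gap in mean response time between the two pairing conditions, scaled by the variability of all responses. (The real IAT uses a more elaborate scoring algorithm; this is only meant to show how response speed becomes a bias measure.)

```python
import statistics

def iat_d_score(compatible_ms, incompatible_ms):
    # Simplified D-score: difference in mean latency between the
    # "incompatible" block (e.g., own party paired with "bad") and the
    # "compatible" block (own party paired with "good"), divided by the
    # pooled standard deviation of all latencies. Larger positive values
    # indicate a stronger implicit preference for the in-group pairing.
    mean_diff = statistics.mean(incompatible_ms) - statistics.mean(compatible_ms)
    pooled_sd = statistics.stdev(compatible_ms + incompatible_ms)
    return mean_diff / pooled_sd

# Invented response times in milliseconds: this hypothetical test taker
# responds faster when in-party cues are paired with positive words.
compatible = [620, 650, 600, 640, 610]
incompatible = [780, 820, 760, 800, 790]
print(round(iat_d_score(compatible, incompatible), 2))  # a clearly positive score
```

The point of the division by the pooled standard deviation is that a raw difference in milliseconds means little on its own; scaling it makes scores comparable across people who respond at different overall speeds.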

Figure 2 of Iyengar and Westwood, “Fear and Loathing Across Party Lines: New Evidence on Group Polarization.”

The results of the implicit association test are much the same as in the scholarship experiment: people who identify as Republicans and people who identify as conservative both code things associated with the Republican Party as “good” and things associated with the Democratic Party as “bad”; for Democrats and liberals, it is the inverse. The results suggest that it isn’t with the rational part of the mind – the part that responds to ideas and ideologies – that people make these judgments. We judge first, on the basis of partisan bias, and rationalize after the fact.

For Ezra Klein, the conclusion to be drawn from this is bleak: winning an argument, at least when you’re talking to co-partisans, is less about persuasion than about delegitimization. And for political candidates, it may not be worth campaigning across party lines at all. Like other forms of fandom – sports teams, for example – party and ideology have become powerful forms of personal identity. And this changes the playbook for the worse for cynical presidential candidates, policymakers, pundits, and anybody else looking to gain followers on the cheap.

But Klein’s conclusion isn’t the only way to read this data. One of the things that his article at Vox does is publish a version of the implicit association test that readers can take online. And this, when combined with the existing results, suggests a kind of opportunity.

Yes, according to Iyengar and Westwood’s study, American voters on the whole are plagued by what looks like the partisan version of racism: we’ve grown pretty strong biases against people and institutions we identify as being members of the out-group, and in favor of people and institutions we identify as being in our camp. But we also have the tools to reverse this trend. The fact is that the mechanism by which we make political decisions does not stop at our inborn biases. We have the capacity for rational decision-making, too. And now, understanding that bias is a fact, we can be alerted, rather than resigned, to its effect. And we can be vigilant about monitoring our decision-making processes more closely.

In other words: Iyengar and Westwood’s results can be seen not just as a sign of the times, but also as a call to action. And that’s what we here at the Institute would recommend. We would recommend that you, our readers, click through to take the implicit association test, and learn just how partisan you really are. And then use that data as a starting point to listen – really listen – to what candidates for political office are saying. It will certainly turn out that you disagree with many of the positions that candidates in the upcoming elections hold. But you may be surprised to learn that – adjusting for implicit bias – your disagreement across party lines is less uniform than you probably thought.

Ezra Klein isn’t wrong when he tells us that this sort of party identification is the stuff from which candidates generate loyalty. But it doesn’t have to be. What Iyengar and Westwood’s study does is expose the technique behind the trick. It gives us the means to opt out of partisan fandom. And by opting out, we can make better choices that, in the long run, may allow us to field better candidates, too.

Extreme Policy Positions: An Experiment

Back in February, we looked at some experimental data from political scientists Michael J. LaCour and Donald P. Green that offered insight into the less rational side of our political beliefs, and into the value of face-to-face conversations with the objects of our prejudices in moderating our positions and changing our minds. Taking the issue of gay marriage as a test case, their study, published in Science, concluded that a rational argument in favor of marriage equality, when combined with direct interaction with a flesh-and-blood gay activist, was the most effective route to changing minds – and to having those changes stick. And they concluded that the rational argument alone was inadequate to the task.

Today, we look at a different face of our less-than-rational political selves: unjustified confidence in our understanding of the issues. In a 2013 study published in Psychological Science, authors Philip M. Fernbach, Todd Rogers, Craig R. Fox, and Steven A. Sloman examine the mentality of extreme political beliefs, and some strategies for how they might be moderated, better opening the door to the possibility of compromise.

Fernbach and his colleagues look at what they see as a connection between extremity of position and depth of understanding. They contend that when people are required to confront their relative ignorance on a given political issue, they become more likely to abandon their extreme position, and to embrace more moderate positions instead. People tend to have unjustified confidence in their understanding of policies, they write, and when they are asked – specifically – to explain those policies, the illusion of understanding evaporates and they become more open to other sorts of views.

To test this, the authors conducted three experiments. In the first, they asked participants to rate how well they thought they understood six hot-button political issues – issues like raising the retirement age for Social Security and imposing unilateral sanctions on Iran for its nuclear program. They then asked the participants to explain the policies they claimed to support, and, after that, to re-rate both how well they thought they understood the policies and how extreme their positions were.

Across all six political issues, Fernbach and his colleagues found that asking people to explain how policies work decreased their reported understanding of those policies and led them to report more moderate attitudes toward those policies.
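The design of this first experiment amounts to a simple within-subject, pre/post comparison, which can be sketched as follows. All numbers are invented, and the 7-point scales are an assumption made purely for illustration:

```python
from statistics import mean

# Invented ratings from four hypothetical participants, before and after
# being asked to explain how a policy works (7-point scales assumed here
# only for illustration; higher means more understanding / more extreme).
pre_understanding  = [5.5, 6.0, 5.8, 6.2]
post_understanding = [4.1, 4.6, 4.4, 4.9]
pre_extremity  = [6.2, 5.8, 6.5, 6.0]
post_extremity = [5.1, 5.0, 5.6, 5.3]

# Mean within-subject drop after explaining: positive values mean that
# reported understanding fell and positions became more moderate.
understanding_drop = mean(p - q for p, q in zip(pre_understanding, post_understanding))
extremity_drop = mean(p - q for p, q in zip(pre_extremity, post_extremity))
print(understanding_drop, extremity_drop)
```

In the pattern Fernbach and his colleagues report, both drops come out positive: the act of explaining deflates both the illusion of understanding and the extremity of the position.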

In their second experiment, the researchers sought to determine whether it was having to explain the policies specifically that moderated participants’ positions, or whether some other in-depth discussion – like enumerating reasons why they held the policy attitude they did – would be adequate. Here, they had half of their participants explain, and half enumerate their rationale. And what they found was that the latter was inadequate: enumerating reasons did not lead to any change in position extremity at all.

Finally, Fernbach and his colleagues’ third experiment tested whether increased moderation on political issues would translate into less material support for political figures and organizations that advocate extreme positions. Participants were given the opportunity to donate money to organizations that supported their initial position. And the researchers found that, after being asked to explain that position, participants were indeed less likely to offer that kind of material support.

From the perspective of civility, what is interesting about this study is less the increase in political moderation than the question of understanding. Civil discourse – the kind of discourse where I can claim my needs even as I recognize that you have valid needs, too – does not necessarily require that we hold middle-of-the-road views. But it does require understanding. It requires that we understand that there is more than one valid point of view on most political issues. And it requires that we understand – at least a little bit – the intricacies of the policy position we would like to see implemented.

In their conclusion, Fernbach and his fellow researchers write that:

Previous research has shown that intensively educating citizens can improve the quality of democratic decisions following collective deliberation and negotiation. One reason for the effectiveness of this strategy may be that educating citizens on how policies work moderates their attitudes, increasing their willingness to explore opposing views and to compromise.

And this is really the point. It matters less that we all moderate our attitudes than that we understand the policies on which we claim to hold such strong opinions. The more we understand, the more it becomes clear just how little we understand. And it is from that place – from a self-reflective acknowledgment of our own ignorance – that we can begin to see that our adversaries may have a point, and that their point of view may deserve serious consideration, too.