The above image is from the always wonderful xkcd. It makes fun of the Turing test, in which a human judge tries to tell whether he's having a written conversation with a computer or with another human. More recently the economist Bryan Caplan suggested an Ideological Turing Test:
The Ideological Turing Test is a concept invented by American economist Bryan Caplan to test whether a political or ideological partisan correctly understands the arguments of his or her intellectual adversaries: the partisan is invited to answer questions or write an essay posing as his opposite number. If neutral judges cannot tell the difference between the partisan’s answers and the answers of the opposite number, the candidate is judged to correctly understand the opposing side.
The context here is liberal economist Paul Krugman who claimed “One side of the picture is open-minded and sceptical. We have views that are different, but they’re arrived at through paying attention. The other side has dogmatic views.” In response Caplan challenged Krugman to an Ideological Turing Test, believing Krugman did not understand his opponents as well as he thought. Naturally this did not go over well.
Noah Smith then entered the fray arguing “Against the Ideological Turing Test“:
To put it bluntly, ideologies are large parts bullshit to begin with, and so it’s possible to bullshit your way through ideological tests.
But this is a bit academic. The fact is, for reasons mentioned above, actual Ideological Turing Tests are impractical. Instead, what you usually hear is people saying to an intellectual opponent: “I bet you couldn’t pass an Ideological Turing Test.” But since public arguments are very, very far away from an actual ITT, this is just bluster. It’s a nerdy-sounding way to say “You’re too dumb to understand my point.”
Priceless response from Tyler Cowen on twitter:
That is, in failing to understand why his opponent considered an Ideological Turing Test useful, Noah Smith ironically proves the point. Smith himself is certain he would pass such a test, if such a thing were even feasible. Obviously I have my doubts about Smith.
Why bring this up? Because I'm watching the highly partisan debate unfold around two recent bestselling books. The first is Capital in the Twenty-First Century by Thomas Piketty. That book argues capitalism tends toward ever-increasing concentration of wealth in the top 0.1%, so Piketty's solution is much higher taxes on the rich. The debate around his thesis is bad enough, but recently some errors were found in the data underlying the book. So economists are really having at it. I was going to use as an example Brad DeLong insulting Russ Roberts on twitter as a troll, but it looks like DeLong deleted his tweet. In any case, DeLong apologized, which is still up. So much partisan anger. So little attempt at understanding.
But that's nothing compared to the debate about Nicholas Wade's book on race and genes. Arguing about the wealthy 1% is contentious, but it simply can't compare to race and genetics. Again, what's most noticeable is how little understanding each side has of the other's core arguments. Many comments amount to little more than claims that the other side is stupid.
What’s the solution? A good approach is to attempt the mental exercise of explaining your opponent’s views, which is the essence of the Ideological Turing Test. I suspect most people would do far worse than they suppose. But this approach works even if you are merely forced to explain the views you already hold. From Tom Stafford’s “The best way to win an argument“:
A little over a decade ago Leonid Rozenblit and Frank Keil from Yale University suggested that in many instances people believe they understand how something works when in fact their understanding is superficial at best. They called this phenomenon “the illusion of explanatory depth“. They began by asking their study participants to rate how well they understood how things like flushing toilets, car speedometers and sewing machines worked, before asking them to explain what they understood and then answer questions on it. The effect they revealed was that, on average, people in the experiment rated their understanding as much worse after it had been put to the test.
Perhaps, they figured, people who have strong political opinions would be more open to other viewpoints, if asked to explain exactly how they thought the policy they were advocating would bring about the effects they claimed it would.
So one group was asked to give reasons for why they held beliefs. A second group was asked to explain how their policy would work.
The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. People who had previously been strongly for or against carbon emissions trading, for example, tended to become more moderate – ranking themselves as less certain in their support or opposition to the policy.
An Ideological Turing Test is not just a test. It's a mental stretch to empathize with your opponent's position – a great learning experience. And if that's too much work, just outlining the implications of your own position can lead to greater self-knowledge. Writing this blog has certainly had that effect on me.