Pinker’s fallacy

Something’s been bothering me about the line of reasoning in Steven Pinker’s book The Better Angels of Our Nature, but it took me a while to pin down what. It seemed fairly clear that the book wasn’t pragmatically useful, since it advances a complacent thesis, but I couldn’t see how that made it actually wrong. Then I remembered the book that made such a splash recently: Nassim Taleb’s The Black Swan.

Taleb’s thesis, which has been reprinted all over the place, is that most predictive models fail to adequately account for rare, extreme variations. These “outlier” possibilities, though extremely unlikely by themselves, are not at all rare when you compound them. For example, in Northern California, school is canceled when there are downed electrical wires, damage to the school, snow, or certain other conditions. None of these things happen often, but I could be fairly certain that I’d get at least two unanticipated holidays every year, and sometimes more.
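To put rough numbers on that compounding, here is a minimal sketch; the daily probabilities and the 180-day school year below are invented for illustration, not taken from Taleb or from any real district. Even disruptions that each occur well under one percent of the time add up to a decent chance of at least one closure per year:

```python
# Rough illustration of how individually rare events compound.
# All of the numbers here are invented for the sake of the example.

daily_probs = {
    "downed power lines": 0.002,
    "building damage":    0.001,
    "snow":               0.003,
    "other closure":      0.002,
}
school_days = 180  # assumed length of a school year

# Probability that a given day passes with no disruption at all,
# treating the events as independent.
p_quiet_day = 1.0
for p in daily_probs.values():
    p_quiet_day *= (1 - p)

p_disrupted_day = 1 - p_quiet_day
expected_closures = school_days * p_disrupted_day
p_at_least_one = 1 - p_quiet_day ** school_days

print(f"chance any single day is disrupted: {p_disrupted_day:.3%}")
print(f"expected closures per year:         {expected_closures:.1f}")
print(f"chance of at least one closure:     {p_at_least_one:.1%}")
```

With these made-up figures, any particular day is overwhelmingly likely to be uneventful, yet one or two closures a year is about what you should expect.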

Pinker’s argument is that, because of technological and ideological progress, we are attacking each other on a vastly reduced scale. In all likelihood, if you derive a simple “violence rate” by dividing incidents by population, he is right. But there are certain scenarios, such as a global nuclear war, that would fly in the face of any previous trend. The correct approach to the Holocaust is not only to count the deaths, but also to consider the event’s implications. Beyond the obvious, negative significance of genocide, it showed how easy it was for one empire, acting mostly unilaterally, to commit mass murder with industrial technology.

We know all this instinctively. If somebody offers you a deal where 999 out of 1000 times you win $200, but 1 out of 1000 times you lose every dollar in the bank, you will decline. The fact that you probably just passed on $200 won’t even bother you much. Even if, when you lose, you only lose $180,000 — making the offer a good deal statistically — you will still decline, because you couldn’t survive getting unlucky.
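A minimal sketch of that arithmetic, using the figures from the example above (the assumption of 1,000 repeated plays is mine, added only to show how quickly the rare outcome stops being rare):

```python
# The bet from the example: win $200 with probability 999/1000,
# or lose the $180,000 in the bank with probability 1/1000.
p_win, win_amount = 0.999, 200
p_lose, loss_amount = 0.001, 180_000

# The expected value per play is positive, which is what makes the
# offer "a good deal statistically."
expected_value = p_win * win_amount - p_lose * loss_amount
print(f"expected value per play: ${expected_value:.2f}")  # about +$19.80

# But expectation ignores ruin. Over many plays (1,000 here, an
# assumed number), the unsurvivable outcome becomes likely.
plays = 1_000
p_ruin = 1 - (1 - p_lose) ** plays
print(f"chance of ruin over {plays:,} plays: {p_ruin:.1%}")  # about 63%
```

The average is favorable, but you only get to collect the average if you survive the unlucky draw.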

That’s why Pinker’s argument doesn’t sit right. If there is a “perfect storm” scenario that threatens most of humanity, then that’s reason enough to be vigilant. Whether it ever actually happens is beside the point; hopefully it doesn’t. When it comes to industrialized possibilities for violence, one black swan matters more than a hundred predictable, humane years.
