No sympathy for statistics


Sympathy and callousness: The impact of deliberative thought on donations to identifiable and statistical victims. I think the figure to the left really says it all. People give more money to identifiable individuals than to plain statistics, but adding statistics to a face actually suppresses giving! One reason I think people dislike data and statistics is that they’re rather stupid and can’t remember the context or process them analytically. But these data suggest another reason: numbers are emotionally deadening and probably don’t produce the sense of reward that one wants or needs. Putting an identifiable person into a statistical framework removes the emotional impulse as well. This is all rather disheartening when you think about the nature of representative democracy.


9 Comments

  1. I’ve heard it suggested that Bill Gates will have better luck helping Africa than other do-gooders because he’s been taking a typically nerdy analytical approach to the problem.

  2. This is not surprising. I can remember a couple of articles in the Economist over the last few years detailing studies which suggest that fear of being ripped off is a big limit on altruistic behavior. If I see a face, I think I can sense whether he’s needy or not. Throw in some statistics and I have an intuitive sense that there are some cons hiding in the numbers. We’d much rather not be made fools than help people.

  3. This could also be interpreted as rational behavior. $20 given to baby Jessica buys a helluva lot more marginal utility than $20 given to UNICEF. Giving $20 to your local VFW chapter may or may not be more “efficient” than giving $20 to a national veterans’ charity, but it will buy you a lot more goodwill in your local community. This might be a good argument for why compulsory giving (taxes) is essential: to promote efficiency above individual marginal utility in charitable-donation decision-making.

  4. Razib is depressingly right. People cannot relate emotionally to numbers. Sad.

  5. Basically, people care about persons and not about populations. Populations are made up of persons, but have none of the appealing traits of persons.  
     
    The very common flip side of this is that people who think statistically can be utterly ruthless in eliminating useless populations. If a given population is too large, for example, the most effective way to bring it to the proper size is to eliminate as many redundant individuals as possible according to some rational calculus. This is hardly an imaginary threat; it’s hardcore Malthusianism. (One criticism of the Nazis and Bolsheviks is just that they eliminated the wrong groups: productive kulaks and Jews.) 
     
    In short, unsentimental rationality has its down side. 
     
    I’m skeptical that Gates (or as far as that goes, Bono) will do an especially wonderful job. Ironically, Jeffrey Sachs, one of the anti-hunger brains, has a big black mark on his record based on his participation in the Soviet shock-treatment scheme, which was based on a theory contrary to the anti-hunger crusade.

  6. When I find something emotionally upsetting, I’ll often push myself into a more analytical mode of thinking as a defense. I think that’s what is happening here. Statistics push most of us into a more analytical, less emotional mode of thinking, and empathy is emotional. 
     
    When you really *get* some statistics-level atrocity on an emotional level, it can be pretty damned overwhelming. Historical horrors like the holocaust or slavery are overwhelming when you get them emotionally, which is why it’s a lot easier to abstract them out and think about numbers.

  7. As an aside, this point is brought up in Orson Scott Card’s Memory of Earth books (which I gather are a retelling of the Book of Mormon). There’s a point where both a human and an AI have let some really horrible things happen to a small group of people. The human never saw the people, so never got her empathy sense going for them, while the AI worked in terms of some evaluation of the well being of all the humans on the planet, and this small group’s hardship didn’t amount to enough to require any action.  
     
    And _Ringworld Engineers_ has a similar discussion, with a very intelligent being asserting that humans can’t understand large numbers of deaths in a meaningful way, but instead imagine their own death repeated many times.

  8. “One death is a tragedy, but a million deaths is a statistic.” — J. Stalin and/or some other guy.

  9. John’s right that people care about persons rather than ‘populations’, though I would put it that they don’t give ‘moral standing’ to abstractions, only to flesh-and-blood humans. That doesn’t mean they will always behave on the up and up with flesh-and-blood humans, but they will tend to do so more. One can see this in that quite a few people are perfectly willing to commit, say, insurance fraud, since being nice to an insurance company is like being nice to a rock or an equation; yet the same people would not cheat someone they considered a flesh-and-blood human they were dealing with, and they don’t see that they are cheating the other policyholders (especially if it is a mutual insurance company) when they do so. This causes a big problem for “working socialism”, since in such a system no individual deals directly with another; the state is always between them, and unless the state inspires some sort of Hobbesian fear in the members of society, they will never behave ‘morally’ in their dealings with it to the extent they do with flesh-and-blood humans. 
     
    I would also think that this is not a bug in the human moral sense, but a feature, maybe from a risk-aversion perspective. People who will give abstractions ‘moral standing’ are also the sort who are perfectly willing to shoot large numbers of kulaks with a perfectly clean conscience, because those kulaks stand in the way of a necessarily abstract utopia. So maybe it’s not so bad.
