Wanna get your nerd on?


Read Overcoming Bias. There is a non-trivial intersection of audiences between here & there.

17 Comments

  1. Every time I visit that site I find myself thinking of lyrics from a They Might Be Giants song: 
     
    Here’s hoping you don’t  
    Become a robot. 
    Clang! Clang! Clang! Clang! 
    Whoops! Too late!

  2. It’s by far one of my favorite sites. Anyone who hasn’t been reading it already has a lot of kicking themselves to do. I’m temporarily violating my self-imposed restriction on commenting on blogs (ignoring meatspace responsibilities is tempting enough) to plug OB because it’s just that great.

  3. includedmiddle, you remind me of John Sabotta. Reductionist doom!

  4. As far as the robot thing goes, sure. I can’t go along with the Evil Sorcerers thing, though.  
     
     If I thought that, I’d probably be hanging out there all the time. I have no desire to be a robot, but a robot with supernatural powers? That’s another story altogether.

  5. It always seems a little short on facts for me to find it interesting.

  6. Ditto.  
     
     Also, the idea that all – or even most – or even many – decisions can be made by means of mathematical formulas, which really is peddled on a regular basis over there (robots indeed), is simply obtuse to the point of absurdity. It’s a classic example of the drunk looking for his keys under the lamppost. Imagine if you tried to run a company, an army, etc., this way.  
     
    Actually, there is a historical parallel – its name is Robert McNamara. McNamara was not a Bayesian as far as I know, but this is just a detail. He was certainly an aficionado of quantitative, data-driven management. Whereas General Giap, at least so far as I know, relied on intuition, guesswork, rules of thumb, etc. The US military has been pretty down on “body counts” since then.

  7. that … decisions can be made by means of mathematical formulas 
     
    This proposition can be interpreted in a variety of ways, e.g.:  
     
    (1) That the decision-making processes of human beings “in the wild” may be completely described by “mathematical formulas”.  
     
    (2a) That the decision-making processes of an artificial intelligence could similarly be completely described by mathematical formulas.  
     
    (2b) That the task of designing an artificial intelligence is equivalent to the task of choosing “which formulas to use”.  
     
    (3a) That the decision-making processes of human beings can be improved by applying some of the formal discoveries of decision theory.  
     
    (3b) That the decision-making processes of human beings can be improved by making some specific model out of decision theory personally normative.

  8. Also, the idea that all – or even most – or even many – decisions can be made by means of mathematical formulas 
     All decisions are made by mathematical models. Those models have mathematical descriptions. Mathematical descriptions therefore encompass all methods of decision-making. 
     
    There are limitations to the ability to make decisions explicitly and rationally, but that is no cause to reject rationality completely, as you have done in your comment.

  9. I think Mitchell Porter’s teasing out of the various interpretations of mathematical modeling of decisions is good. As Robyn Dawes is fond of pointing out, though, most people are much more linear (as in ordinary regression) in their thinking than they themselves typically acknowledge. So, 3a and 3b seem like no-brainers and 1 might be closer to the truth than some would like. I think the real problem with some of the Overcoming Bias contributors is that they ignore their own advice about certain priors – namely, even if we have ‘complete describability’ via mathematical models of decision processes, why think that this will, of itself, yield the ability to duplicate these processes in other physical models? Informationalism is fine, but not to the exclusion of a sensitivity to physical facts and laws. If emotions are, as Antonio Damasio says, somatic responses to the chemical states of our cerebrospinal fluid, then without some massive chemical advances, we’re a long way from the immortality hoped for and hyped at Overcoming Bias. The attitude on this site (informationalism combined with physical savvy) seems much more realistic.
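
     A minimal, illustrative sketch of the Dawes point: fit an ordinary least-squares model to a judge’s holistic ratings and see how much a plain weighted sum of cues recovers. The cues, weights, and data below are invented for the example, not anyone’s actual study.

     ```python
     # Illustrative only: a Dawes-style linear model of a judge's ratings.
     # The cues and weights are fabricated; the point is that a simple
     # weighted sum often tracks holistic judgments surprisingly closely.
     import numpy as np

     rng = np.random.default_rng(0)

     n_cases, n_cues = 200, 4
     cues = rng.normal(size=(n_cases, n_cues))          # e.g. test scores, interview marks
     implicit_weights = np.array([0.5, 0.3, 0.15, 0.05])
     judgments = cues @ implicit_weights + rng.normal(scale=0.3, size=n_cases)

     # Ordinary least squares: recover the weights implicit in the judgments.
     fitted, *_ = np.linalg.lstsq(cues, judgments, rcond=None)
     predicted = cues @ fitted
     r = np.corrcoef(predicted, judgments)[0, 1]

     print("fitted weights:", np.round(fitted, 2))
     print("model-judge correlation:", round(r, 2))
     ```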

  10. Caledonian puts it most succinctly. 
     
    There’s a saying in the numerical analysis community that any computational problem can be defined as finding the maximum of a function – and that this is almost always the wrong way to do it. 
     
    Mitchell, your point (1) is a strawman except in the most abstract, Laplacian sense; (2a) and (2b) are true by definition; no one could possibly object to (3a); and no one could possibly believe in (3b). 
     
    Except that the OB people seem to. In this post, for example, Robin Hanson basically says that he doesn’t believe in intuition and he can’t imagine why anyone would. Perhaps the poor man has had some kind of hemispherectomy, and the right half of his skull is stuffed with cotton wool. 
     
    I am left with one inescapable conclusion: none of these academics has ever held any significant position of responsibility. They have never worked as subordinates in a productive organization, and they have never found themselves in a position where they had to make actual decisions that affected actual people. Hemispherectomies aside, I’m pretty confident that if they did, they would find themselves using their intuitive faculties much as the rest of us. 
     
    This idea of scientific policymaking is a fad. It’s an epiphenomenon of 20th-century metastatic government. It’s an increasingly threadbare way to disguise a fact that is obvious to anyone not within commuting distance of Washington, DC, which is that we are in fact ruled by a vast bureaucracy whose decisions are constructed by intellectual fashion and personal prejudice, and which employs a vast army of numerically deft flacks to justify these prejudices in reverse. If the Great Society didn’t discredit “social science,” I don’t know what could. 
     
    For most problems – certainly including the question Hanson and Roberts are debating – you have no meaningful numbers and no way to perform experiments. As a decisionmaker, you have no recourse but to actually think. It simply boggles the mind to realize that Yudkowsky, Hanson and company probably actually believe that, if Napoleon had performed the proper Bayesian incantations, he would have won at Waterloo. But I suppose people have believed even weirder things.

  11. Mencius, rather than assuming what jobs Robin Hanson has or has not held, why don’t you just read his bio? 
     
     Nowhere did anyone say anything about Napoleon winning at Waterloo, or that the fact that he didn’t talk about Bayesian probabilities indicates anything. Earlier you seemed to be saying Giap was successful because of intuition while McNamara failed because he didn’t use it. Isn’t all that ignoring the circumstances people were in and the plausibility of them attaining victory? Giap had more impressive victories against the French (Dien Bien Phu) and McNamara had to serve under LBJ. Using the same sort of logic we could say that Stalin’s strategy of murdering his best officers and believing everything Hitler said was one that others should imitate. 
     
    If the Great Society didn’t discredit “social science,” I don’t know what could. 
     Having a degree in economics gives you good job prospects in the private sector. Levitt has successfully been able to uncover cheating by teachers and has been hired by online gambling sites to help them detect it. The methods of social science work, astrology doesn’t. Once again, using your logic we could say that the Soviet Union discredited science or atheism or that Nazi Germany discredited the idea of “race” or anti-communism. 
     
    no meaningful numbers and no way to perform experiments. As a decisionmaker, you have no recourse but to actually think. 
    You’re right, numbers and experiments don’t have a goddamn thing to do with thinking. What the hell? I don’t think you understand Bayesianism all that well if you don’t see any connection between the subjective priors used and thinking. 
     
    Seriously, Mencius, you disappoint me more and more each day. And you consider Lawrence Auster an important thinker!

  12. I’d actually love to see the OB boys try to apply Bayesian decision theory to military leadership. Which, believe it or not, in many 19th-century pedagogical traditions was actually the canonical example of decisionmaking. Sadly, it looks like it’ll have to remain the dog that didn’t bark. 
     
    No – the “natural experiments” of McNamara versus Giap, or of the Office of Economic Opportunity versus sanity, cannot lead us to any general conclusions. That’s kind of the point. However, they can and should have led people to wonder a little about the methodological epistemology of quantitative management. 
     
    Of course Bayesianism is inherently quantitative. Bayes’ Theorem is a mathematical formula, for cripes’ sake. If you don’t have numbers, you can’t do math. 
     
    The peril of these quantitative, “scientific” management methodologies is the good old lamppost fallacy – it leads you to ignore intuitive styles of thinking that can actually find your keys in the dark, and for which you have a large built-in hardware accelerator. Instead, you are constantly subject to the temptation to pull nonexistent, irrelevant numbers out of your ass. Hence the “body count.” 
     
    Hats off to Steven Levitt if he did an honest day’s work once or twice. I’m not saying statistics are useless. I’m saying that no one in the private sector has taken quantitative decision theory seriously since McNamara gave us the Edsel or whatever.

  13. “natural experiments” 
    A natural experiment is something like the type described here where the independent variable is nearly random in its application, not like your examples. 
     
    I’m saying that no one in the private sector has taken quantitative decision theory seriously since McNamara gave us the Edsel or whatever. 
    Once again, the slightest bit of googling could enlighten you, but you prefer to remain ignorant. From Wikipedia: In 1946 McNamara joined Ford Motor Company, due to the influence of a Colonel he worked under named Charles “Tex” Thornton. Thornton had read an article in Life magazine which reported that the company was in dire need of reform. He was one of ten former WW II officers known as the “Whiz Kids”, who helped the company to stop its losses and administrative chaos by implementing modern planning, organization, and management control systems. Starting as manager of planning and financial analysis, he advanced rapidly through a series of top-level management positions. McNamara opposed Ford’s planned Edsel automobile and worked to stop the program even before the first car rolled off the assembly line. He eventually succeeded in ending the program in November 1960. McNamara also came close to terminating the Lincoln, forcing product planners to reinvent the car for 1961. On November 9, 1960, McNamara became the first president of Ford from outside the family of Henry Ford. McNamara received substantial credit for Ford’s expansion and success in the postwar period. 
     McNamara opposed the Edsel and his number-crunching worked pretty well for the company. It hasn’t gone out of style either, hence Ian Ayres’s “Supercrunchers”. Google uses Bayes like Microsoft uses if-statements. How popular is the use of Austrian economics in the private sector? 
     
    I’d actually love to see the OB boys try to apply Bayesian decision theory to military leadership. Which believe it or not, in many 19th-century pedagogical traditions was actually the canonical example of decisionmaking. Sadly, it looks like it’ll have to remain the dog that didn’t bark. 
     I think there are many specific forms of leadership they haven’t tackled. Hanson did, though, try to set up prediction markets for terrorism before the stupid public got all upset, and has advocated halving military spending for much the same reason he advocates halving health care spending. Much of what they discuss is very generalized and could be applied to military leadership though. 
     
    Of course Bayesianism is inherently quantitative. Bayes’ Theorem is a mathematical formula, for cripes’ sake. If you don’t have numbers, you can’t do math. 
    And math has nothing to do with thinking, which is why people with low IQs regularly win Fields Medals. 
     
    it leads you to ignore intuitive styles of thinking that can actually find your keys in the dark, and for which you have a large built-in hardware accelerator. 
    Our intuitions are very, very, very frequently wrong. Robin Hanson says they are valid evidence, but the Bayesians want to augment them. Not doing so is basically not thinking.
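
     A minimal sketch of what “augmenting” an intuition looks like in Bayesian terms: treat the gut feeling as a prior probability and revise it with the likelihood of the observed evidence. All the numbers here are made up for illustration.

     ```python
     # Illustrative Bayes update: an intuitive prior, revised by evidence.
     # All numbers are invented for the sketch.
     prior = 0.30                 # gut feeling: 30% chance the hypothesis is true
     p_evidence_if_true = 0.80    # how likely the observed data are if it is true
     p_evidence_if_false = 0.20   # how likely the same data are if it is false

     # Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
     p_evidence = prior * p_evidence_if_true + (1 - prior) * p_evidence_if_false
     posterior = prior * p_evidence_if_true / p_evidence

     print(f"posterior after the evidence: {posterior:.2f}")  # ~0.63
     ```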

  14. Did you read that Michael Perelman article you linked to, quoting Scheiber? It’s quite good.  
     
    There’s a difference between a natural experiment and a good natural experiment. The latter are quite rare. Once the cherries are picked, people start picking the dingleberries. Again: lamppost syndrome. 
     
    I wasn’t seriously accusing McNamara of being responsible for the Edsel. Google uses statistical methods for search, not for management decisions. GE uses statistical methods for quality control, not for management decisions. And what is Ian Ayres? Another professor.

  15. Did you read that Michael Perelman article you linked to, quoting Scheiber? It’s quite good. 
    Which article are you referring to? Could you link to it again? 
     
    There’s a difference between a natural experiment and a good natural experiment. The latter are quite rare. Once the cherries are picked, people start picking the dingleberries. Again: lamppost syndrome. 
     YOU are the one who referenced those incidents, so YOU are the one looking under lampposts. 
     
    I wasn’t seriously accusing McNamara of being responsible for the Edsel. 
    He wasn’t merely not responsible, he fought against it. His tenure at Ford was generally considered a successful one, so your point about his methods failing in the private sector was wrong. 
     
    And what is Ian Ayres? Another professor. 
     Supercrunchers isn’t just about academics. It discusses things like evidence-based medicine, which has a better track record than the intuition of doctors and is beginning to usurp their authority to make decisions. I haven’t read his book so I don’t know what he says about “boardrooms and government agencies”, but I do know that Yahoo and Marketocracy use one of Hanson’s obsessions, internal prediction markets. Ronald Coase had some explanations based on transaction costs for why firms aren’t as market-filled as Koch Industries, but I would emphasize your point that under current regulations corporate governance is nowhere near optimal efficiency. Just as doctors don’t want to cede authority to algorithms, I wouldn’t expect managers to be friendly toward it either.

  16. Okay, but the problem I have with the Overcoming Bias site is that they are all in favor of data-based decision-making, yet they never have any real data in their postings. I like data. They, however, seem to like the concept of data, but they don’t actually like data itself.

  17. They often discuss the results of studies on heuristics and biases and Robin Hanson loves RAND and other studies on health care. You’re right that much of it is general talk about Bayesian rationality rather than anything specific with data.
