Monopoly allows innovation to flourish


Updated

This may be old hat for some readers, but it’s worth reviewing and worth backing with some good new data. The motivation is the monopoly-haters’ idea that once a company comes to dominate its market, it will have no incentive to change things — after all, it has already captured most of the audience. The response is that industries where invention is part of a company’s raison d’etre attract dynamic people, including the executives.

And such people do not rest on their laurels once they’re free from competition — on the contrary, they exclaim, “FINALLY, we can breathe free and get around to all those weird projects we’d thought of, and not have to pander to the lowest common denominator just to stay afloat!” Of course, only some of those high-risk projects will become the next big thing, but a large number of trials is required to find highly improbable things. When companies are fighting each other tooth-and-nail, a single bad decision could sink them for good, which makes companies in highly competitive situations much more risk-averse. Conversely, when you control the market, you can make all sorts of investments that go nowhere and still survive — and it is this large number of attempts that boosts the expected number of successes.
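That last point is just expected-value arithmetic, and it is worth making concrete. A toy sketch (the success probability here is invented purely for illustration): if each high-risk project independently pans out with some small probability p, the expected number of hits grows linearly with the number of trials, and the chance of at least one hit approaches certainty only for a firm that can afford many bets.

```python
# Toy model of R&D as repeated long-shot bets (illustrative numbers only).
p = 0.01  # assumed chance that any single high-risk project becomes a breakthrough

for n_trials in (5, 50, 500):
    expected_hits = n_trials * p                 # grows linearly with trials
    p_at_least_one = 1 - (1 - p) ** n_trials     # 1 - P(every project fails)
    print(f"{n_trials:4d} trials: E[hits] = {expected_hits:.2f}, "
          f"P(at least one) = {p_at_least_one:.3f}")
```

With these made-up numbers, 5 trials produce a breakthrough about 5% of the time, while 500 trials produce one more than 99% of the time: that gap is the claimed advantage of a monopolist's deep pockets.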

With that said, let’s review just a little bit of history impressionistically, and then turn to a new dataset that confirms the qualitative picture.

Taking only a whirlwind tour through the pre-Information Age, we’ll just note that most major inventions could not have been born if the inventor had not been protected from competitive market forces — usually by a monopolistic and rich political entity. Royal patronage is one example. And before the education bubble, there weren’t very many large research universities where you could carry out research — for example, Oxford, Cambridge, and… well, that’s about it, stretching back 900 years. They don’t call it “the Ivory Tower” for nothing.

Looking a bit more at recent history, which is most relevant to any present debate about the pros and cons of monopolies, just check out the Wikipedia article on Bell Labs, the research giant of AT&T that many considered the true Ivory Tower during its heyday from roughly the 1940s through the early 1980s. From theoretical milestones such as the invention of information theory and cryptography, to concrete things like transistors, lasers, and cell phones, they invented the bulk of all the really cool shit since WWII. AT&T was sued for antitrust violations in 1974, settled in 1982, and the breakup took effect at the start of 1984. Notice that since then, not much has come out — not just from Bell Labs, but at all.

The same holds true for the Department of Defense, which invented the modern airliner and the internet, and made large theoretical contributions too. For instance, the groundwork for information criteria — one of the biggest ideas to arise in modern statistics, which tries to measure the discrepancy between our scientific models and reality — was laid by two mathematicians working for the National Security Agency (Kullback and Leibler). And despite all the crowing you hear about the Military-Industrial Complex, only a pathetic share of federal outlays actually goes to defense (which includes R&D) — most goes to human resources, AKA bureaucracy. Moreover, this trend goes back at least to the late 1960s. Here is a graph of the share of federal outlays going to defense vs. human resources (from here, Table 3.1; 2008 and beyond are estimates):
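As an aside on what Kullback and Leibler's idea actually measures: their divergence scores how far a model distribution Q sits from the true distribution P, and it is the notion of discrepancy at the heart of information criteria like AIC. A minimal sketch, with coin probabilities invented purely for illustration:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.

    Zero when the model Q matches reality P exactly; grows as
    the model diverges from the truth.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

truth = [0.7, 0.3]           # a biased coin (the "reality")
good_model = [0.65, 0.35]    # a close guess
bad_model = [0.5, 0.5]       # assumes a fair coin

print(kl_divergence(truth, good_model))  # ~0.006 nats
print(kl_divergence(truth, bad_model))   # ~0.082 nats
```

The better model sits closer to reality in KL terms; information criteria estimate exactly this kind of gap from data, with a penalty for model complexity.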


There are artificial peaks during WWII and the Korean War, although the defense share doesn’t decay very much during the 1950s and ’60s, the height of the Cold War and the Vietnam War. Since roughly 1968, though, the chunk going to actual defense has plummeted pretty steadily. This downsizing of the state began long before Thatcher and Reagan were elected — apparently, they were jumping on a bandwagon that had already gained plenty of momentum. The key point is that the state began to give up its quasi-monopolistic role in doling out R&D dollars.

Update: I forgot! There is a finer-grained category called “General science, space, and technology,” which is probably the R&D we care most about for present purposes. Here is a graph of the percent of all federal outlays that went to this category:


This picture is even clearer than that of overall defense spending. There’s a surge from the late 1950s up to 1966, a sharp drop until 1975, and a fairly steady level from then until now. This doesn’t alter the picture much, but removes some of the non-science-related noise from the signal. [End of update]

Putting together these two major sources of innovation — Bell Labs and the U.S. Defense Department — if our hypothesis is right, we should expect lots of major inventions during the 1950s and ’60s, even a decent amount during the 1940s and the 1970s, but virtually squat from the mid-1980s to the present. This reflects the time periods when they were more monopolistic vs. heavily downsized. What data can we use to test this?

Popular Mechanics just released a neat little book called Big Ideas: 100 Modern Inventions That Have Changed Our World. It includes roughly 10 items in each of 10 categories: computers, leisure, communication, biology, convenience, medicine, transportation, building / manufacturing, household, and scientific research. The list was compiled by a group of around 20 people working at museums and universities. You can always quibble with these lists, but the really obvious entries are unlikely to get left out. There is no larger commentary in the book — just a narrow description of how each invention came to be — so it was not conceived with any particular hypothesis about invention in mind. The list begins with the transistor in 1947 and goes up to the present.

Pooling inventions across all categories, here is a graph of when these 100 big ideas were invented (using 5-year intervals):


What do you know? It’s exactly what we’d expected. The only outliers are the late-1990s data points. But most of these seem to reflect the authors’ grasping at straws to find anything in the past quarter-century worth mentioning. For example, they already included Sony’s Walkman (1979), but they also included the MP3 player (late 1990s) — leaving out Sony’s Discman (1984), an earlier portable player of digitally stored music. And remember, each category only gets about 10 entries to cover 60 years. Also, portable e-mail gets an entry, even though they already include “regular” e-mail. And I don’t know what Prozac (1995) is doing in the list of breakthroughs in medicine. Plus they included the hybrid electric car (1997) — it’s not even fully electric!

Still, some of the recent ones are deserved, such as cloning a sheep and sequencing the human genome. Overall, though, the pattern is pretty clear — we haven’t invented jackshit for the past 30 years. With the two main monopolistic Ivory Towers torn down — one private and one public — it’s no surprise to see innovation at a historic low. Indeed, the last entries in the building / manufacturing and household categories date back to 1969 and 1974, respectively.

On the plus side, Microsoft and Google are pretty monopolistic, and they’ve been delivering cool new stuff at low cost (often for free — and good free, not “home brew” free). But they’re nowhere near as large as Bell Labs or the DoD were back in the good ol’ days. I’m sure that once our elected leaders reflect on the reality of invention, they’ll do the right thing and pump more funds into ballooning the state, as well as encouraging Microsoft, Google, and Verizon to merge into the next incarnation of monopoly-era AT&T.

Maybe then we’ll get those fly-to-the-moon cars that we’ve been expecting for so long. I mean goddamn, it’s almost 2015 and we still don’t have a hoverboard.


51 Comments

  1. hm. i’m generally skeptical about theories of innovation, period, at this point. i’m sure you’ll get a lot of criticisms in the comments so i’ll leave it at that ;-)

  2. ok, a small comment. i’ve seen theories, heard stories, which are good at explaining how particular institutional frameworks and such extract more juice out of particular technologies. making technologies better and more efficient. but game-changing technology invented out of the blue which transforms the whole economy seems more like an exogenous dynamic to the systems which we have a good grasp of.

  3. Think of a population genetic analogy: finding a suite of genes that dialed up your IQ, skin color, height, or whatever by a little bit, vs. a suite of genes that gave you some function that you didn’t even have before — binocular vision, a four-chambered stomach, etc. 
     
    Finding the first set isn’t too hard, and the process doesn’t take long. Finding the second set is much less likely and takes very long. It only took thousands of years to dial down Europeans’ and Asians’ level of skin pigmentation, but the basic machinery that gives skin its pigmentation took much longer to design. 
     
    In technology, we’ve only been tinkering around with how active one of our organs is — but I want a new organ, one that does something the existing ones don’t!

  4. a book on my desk, waiting to be read, is “Competition and Growth: Reconciling Theory and Evidence” by Philippe Aghion & Rachel Griffith, which from a quick scan goes into exactly this topic. I mention this only because you might be interested in it, if you’re interested in this subject.

  5. Isn’t it a matter of domestic monopolies competing in a bigger game? Defense spending needs an enemy to justify itself, after all.

  6. Search engines and the genome are pretty good for any thirty year period. A bit more than nothing in terms of innovation. 
     
    I think you could add in CDMA, giant magnetoresistance, autonomously driven cars (on the level of the wright brothers), the microarray, 4G programming languages (on the level of Leibniz’s fluxion notation), MP3/MP4 and psychologically based compression, the incarnation of Codd’s set theoretical work in relational databases, and the web browser.

  7. Surely it depends why the monopoly organisation got its monopoly. If it’s a state organization whose raison d’etre (at least partly) is innovation that’s different to if it’s a profit-making enterprise that happens to be better at making profits than competitors.

  8. I’m working on a business right now. We offer a rather innovative service that the monopoly in our town doesn’t. The monopoly doesn’t offer this service because doing so would force them to be at market price instead of the monopoly price.

  9. What does this mean for Kurzweil’s thesis? Is it a crock of sh*t? 
     
    Kurzweil does seem to engage in quite a bit of data fitting in the “exponential growth curves” in his book.

  10. I wonder how you would distinguish this from the hypothesis that technological developments have gotten harder over time.  
     
    Besides that, technology has been path-dependent. For example, the semiconductor industry has had thirty years of finding incremental ways to manufacture smaller circuits. Plausibly, the same research effort could have been applied to some fundamentally different trajectory, which would look more pathbreaking in retrospect, but would never have made economic sense at any step.

  11. That second chart is a little bitter for me. My dad was an Air Force scientist laid off in 1970. When I was a little kid I always thought I would be a scientist like my dad, but the next few years of penury convinced me it was a sucker’s game. We lived around families headed by high school graduates that had much more affluence and material possessions than us (and they hated NERDS!)
     
     The threat of nuclear annihilation aside, the Cold War was great: it kept a lot of people well employed and produced a lot of great technology.

  12. A nit about your data: perhaps it takes a while — a few decades — to realize an invention is a “big idea”. 
     
    The number of awarded patents (little monopolies) has skyrocketed recently.

  13. It’s also possible that radical innovation is following a logistic curve, and that we can’t expect a second cornucopia of technological delights until we remove some of our limitations (like those imposed by our current squishy gray matter). In this view, the diminished investment is a logical response to diminishing returns. 
    I actually am more inclined to agree with your assessment, though. When I look at how hard it is to shake loose a few million in funding for IEC fusion research, or cold fusion research, I have to think that we’ve become institutionally moribund. Not that I think either of those approaches is likely to pan out, but compared to something like the F-22 fighter they seem like absolute no-brainers for government funding.

  14. “A nit about your data: perhaps it takes a while — a few decades — to realize an invention is a “big idea”.” 
     
    I agree. I bet it took awhile to figure out that the laser could be used for all sorts of things. I would also add that the internet is a total game-changer. It is a recent arrival, and I would put it ahead of any other invention since 1945. Besides, any list that doesn’t include the Snuggie clearly must be taken with a grain of salt.

  15. Just to be sure, I hope no one forgets that the major complaint against monopolies is market control and manipulation while fixing prices to their advantage alone and blocking competition. I know that’s not the subject of this post but I wanted to make sure it was said.

  16. Chemdude, what’s a snuggie?

  17. I’m also reading Kurzweil’s “The Singularity is Near” now and his narrative of exponential acceleration does conflict with this post. I found it notable, though, that he says we are currently in the “fifth paradigm” of computing (integrated transistors), which has lasted a good deal longer than previous ones. His explanation is that a new paradigm only comes along when the old one has petered out, and we’ve still got a decade or so of extra juice left before we have to move on to 3D/molecular chips.

  18. It’s a blanket with sleeves. Some people use it to keep warm at night. I was being facetious :) 
     
    http://www.youtube.com/watch?v=2xZp-GLMMJ0

  19. I also think you have a real problem knowing what inventions are major ones until some time later, which biases that survey towards past inventions.

  20. A lot of inventions are invisible to the average person. New processes for manufacturing computer chips, hard drives, fiber optics, composites, and other goods are invisible to the public. They could walk thru a factory employing some of these new processes and see nothing unusual. The stuff that is really cool requires a lot of training to understand how it works. 
     
    The fact that DNA sequencing tech has advanced so far so fast strikes me as evidence that lot of innovation is going on. The fact that computer processing power has increased for decades along with hard drive capacities and fiber optic bandwidth is what enabled the wondrous internet. 
     
    I see big advances getting made with stem cells such as induced pluripotent stem cells. I expect stem cells and gene therapies to become very useful in the next 10 years. 
     
    I do not see a dearth of innovation. Quite the opposite in fact.

  21. 1) Your discussion on the general benefits of monopolies hinges on TWO institutions. 
     
    2) You didn’t try to quantify how much of the surge in innovation can be ascribed to the two institutions in question. 
     
    3) Microsoft is the typical example of a non-innovative monopoly. AFAIK they have “invented” exactly nothing (though they have been important in providing a standard platform, occasionally refining others’ innovations, and bringing them to the masses). Google owes its existence to an innovative breakthrough (the PageRank algorithm), but this clearly occurred before they turned into a near-monopoly (they are definitely NOT a monopoly, even now).

  22. I also think you have a real problem knowing what inventions are major ones until some time later, which biases that survey towards past inventions. 
     
    Yep. And even then, who decides whether or not the invention is major?
     
    PS, Snuggie = back-to-front dressing gown.

  23. I’m a little surprised that a post on a blog called “Gene Expression” thinks that innovation is slowing down, given the rapid pace of discovery and innovation in the life sciences right now. Not all of this innovation is leading to consumer goods terribly quickly, but how long did it take from the invention of the CRT to consumer television, from the invention of the transistor to transistor radios in every pocket? (That said, I do see a longer cycle from major breakthrough to mass consumer good than there used to be, but does that mean that there is less innovation, or that something else has changed?) 
     
    Prozac was claimed as the first SSRI, though Eli Lilly had to issue a retraction of that claim. But it was discovered in 1974, and first sold in 1986, so it doesn’t belong in 1995.

  24. We offer a rather innovative service that the monopoly in our town doesn’t. The monopoly doesn’t offer this service because doing so would force them to be at market price instead of the monopoly price. 
     
    Remember what the discussion is about, though — major inventions. The way you describe things, this isn’t going to be the next transistor. 
     
    I wonder how you would distinguish this from the hypothesis that technological developments have gotten harder over time.  
     
    My argument assumes the mechanism is ability to conduct a large vs. small number of trials (to uncover rare but big things). For most innovation, that requires near-monopoly of a market, but for others not so much — so, theoretical physics (string theory) should have made a lot more progress than applied physics or technology during the past 30 years. Less funding needed. 
     
    A nit about your data: perhaps it takes a while — a few decades — to realize an invention is a “big idea”. 
     
    In general this isn’t true, for the arts or sciences. It’s quite rare that a big artist or scientist isn’t recognized as a genius in their own lifetime, in general. Same with the work they create, including inventions. For example, it took less than 10 years for the group who invented the transistor to be awarded the Nobel Prize. It was obvious how huge that invention was. Same with telephones, cars, etc. — what person needed 30 years to see how important these things were? 
     
    Even with theoretical developments, this is true — only 14 years separating the discovery of cosmic background radiation and the Nobel to Penzias and Wilson (also working at Bell Labs). 
     
    Remember, this isn’t the public voting on the 100 Big Ideas — academics, curator-types at museums, and so on.  
     
    1) Your discussion on the general benefits of monopolies hinges on TWO institutions. 
     
    Wrong — it hinges on 100 inventions. We then see if they were produced during the more lax period of regulation, or the heavily de-regulating period when Bell Labs was divested and the state sector shrunk like crazy. 
     
    And in any case, those “two” institutions invented damn near everything worth mentioning in the past 60 years. If you didn’t already know that, I pointed it out and provided links. 
     
    2) You didn’t try to quantify how much of the surge in innovation can be ascribed to the two institutions in question. 
     
    Sure, but like I said it’s obvious, and so left as an exercise for the reader. Someone else can look firm-by-firm and see how Bell Labs and the federal government have had a much higher success rate than just about any other entity, during their heyday.  
     
    3) Microsoft is the typical example of a non-innovative monopoly. AFAIK they have “invented” exactly nothing (though they have been important in providing a standard platform, occasionally refining others’ innovations, and bringing them to the masses).  
     
    Well, no related company has invented anything then, if we’re sticking with big inventions. But where did the computer, internet, modem, information theory, UNIX, the C and C++ programming languages, the transistor, fiber optics systems, etc., come from? That’s right: those “two” institutions. 
     
    Also note that the title says that monopoly allows innovation to flourish — it’s necessary, not sufficient. 
     
    And anyway, Microsoft is way more innovative, on a smaller level, than other related companies (like, e.g., Apple). Note that nearly everyone prefers their products over the competitors’ — Windows OS, Internet Explorer, and the Office Suite being the most obvious examples. 
     
    Google owes its existence to an innovative breakthrough (the PageRank algorithm), but this clearly occurred before they turned into a near-monopoly (they are definitely NOT a monopoly, even now). 
     
    They existed before they were near-monopolistic — I think that’s a tautology. Their major contributions followed after they dominated the search engine market, though: Blogger, Google Maps, Google Books, etc.

  25. The way you describe things, this isn’t going to be the next transistor. 
     
    True, but I wonder what incentive a monopoly would have to introduce an innovation that lowered prices significantly without an accompanying increase in demand… A government monopoly might pursue that goal (since its raison d’etre would be innovation), but I don’t see a private monopoly going that route.

  26. “And anyway, Microsoft is way more innovative, on a smaller level, than other related companies (like, e.g., Apple). Note that nearly everyone prefers their products over the competitors’ — Windows OS, Internet Explorer, and the Office Suite being the most obvious examples.” 
     
    Um, no it isnt. You are talking about original ideas, not new implementations of old ones. 
     
    Microsoft is a largely reactive company which has used its muscle to copy other people’s ideas: Windows copies the original Mac OS, which was the first commercial GUI-only OS. Excel was designed to take on Lotus 1-2-3 and won because of its OS’s predominance and the failure of OS/2 (a failure by a large, then almost monopolistic company called IBM). Netscape had the first popular browser and MS takes that on with IE. MS writes T-SQL to compete with standard SQL, and tries to compete with Google on search, and Apple – the iPod – with the Zune. The entire Office suite takes on competitors who invented technologies elsewhere but were overtaken by MS’s monopolistic juggernaut.
     
    No innovation. No new ideas. Just copying. 
     
    Most of what it does, it does badly. Windows was a GUI-only OS in 1995, 12 years after the Mac was GUI-only, T-SQL is a mess, and IE is not used by anybody I know — probably only by people who do nothing to their computer. So its use is an artifact of moms and dads not changing stuff on the monopoly OS they get with the cheap Dell.
     
    Where MS does not have that advantage over other companies – like in search or phone OSes – it may as well throw in the towel. There is no there there.
     
    I have no idea how this relates to the rest of your thesis; although I agree that innovation is slowing down, I can’t agree on the cause. The MS comment, though, shows a lack of knowledge which puts the rest in doubt.

  27. No-one’s mentioned Schumpeter in this thread, so I thought I would. His treatment of monopoly in Capitalism, Socialism and Democracy is remarkably positive. He thinks monopolies tend to be temporary and self-correcting, and he endorses A.’s view that by sheltering short-term inefficiencies they can promote innovation. One of Schumpeter’s key ideas was that capitalism was sowing the seeds of its own destruction. Part of this was that, as the economy becomes more efficient, it destroys the sheltered niches that support innovations at a certain point in their development.

  28. Microsoft is way more innovative, on a smaller level, than other related companies (like, e.g., Apple)  
     
    Huh? Apple is the first company that married a modern frontend (Aqua) to a hard core backend (Unix/BSD). No one who knows how to code uses Windows over Macs. Check out MIT CS or Bay Area startups some day…the people who are writing the code for the next generation of applications are all on Macs. That means machine learning, physics, genetics, systems programmers, you name it.  
     
    With darwinports or Fink you can get all the power of the GNU tools. With Rails or Django you can run the exact same webapp on your local laptop that you’d run on the server. You have the same suite of poweruser tools on the frontend that you do on the backend — you don’t have to context switch. Developing where you deploy has huge advantages.  
     
    I pity the data analyst who tries to analyze TB scale data with Excel (!) rather than GNU cut/paste/awk/etc or the web designer who has to deal with IE’s CSS bugs [which is all of them].  
     
    The only things Windows is superior for are gaming apps. It is an operating system for children and people scared of the command line. But it is now time to put away childish things!  
     
    Another good ref:  
     
    http://www.paulgraham.com/microsoft.html

  29. “With the two main monopolistic Ivory Towers torn down — one private and one public — it’s no surprise to see innovation at a historic low.” 
     
    The missing piece of this argument is to show that the innovations of the 1940s-70s were predominantly the work of those two institutions – i.e., if you remove them from consideration, the curve is fairly flat. I doubt that’s entirely true, though no doubt there’s something in it. 
     
    There’s also the argument that we’ve picked the low-hanging fruit in some areas – in chemicals and pharmaceuticals, for example, the rate of interesting innovations seems to have declined, even though they are (largely) outside the defence/telecom sectors.

  30. The most important point here is whether large monopolistic organizations (such as the military or Bell Labs at its peak) innovate as cost-effectively as startups. I don’t think they do.  
     
    The most innovative consumer-facing big companies are Goog and Apple. And tons of their stuff comes from startup acquisitions rather than internal development.  
     
    That said, there are certain kinds of “big science” that are only feasible for large organizations with big budgets. Sequencing the genome is the canonical example. But then again, just look at the innovative shot in the arm that whole enterprise got once there were *two* horses in the race! Competition is a driver of innovation.  
     
    Their major contributions followed after they dominated the search engine market, though: Blogger, Google Maps, Google Books, etc.  
     
    First, dominating search required tons of innovation *and* acquisition. Kaltix’s block eigenvalue computation and Ori Allon’s contextual search were both acquisitions — and there are dozens more that don’t get any publicity.  
     
    Second, Blogger was an acquisition of Evan Williams’ company. Google Earth was an acquisition of Keyhole. Google Voice was an acquisition of Grand Central. Maybe half of their cool stuff sprung from acquisitions of innovative startups rather than in-house development.  
     
    This is not to take anything away from Google, which is probably the most innovative big company around, but most big cos aren’t Google. They have layer upon layer of Assistant to the Regional Manager.  
     
    Also, to address a few other points… 
     
    1. Microsoft and Google are pretty monopolistic, and they’ve been delivering cool new stuff at low cost  
     
    What innovation has MS come out with? Honestly interested here. The cool stuff that is under the MS brand are things like Project Natal and Photosynth, both of which (again) sprung from acquisitions of innovative startups.  
     
    2. Overall, though, the pattern is pretty clear — we haven’t invented jackshit for the past 30 years.  
     
    See above re: CDMA, etc.  
     
    3. we’ll just note that most major inventions could not have been born if the inventor had not been protected from competitive market forces  
     
    I kind of see where you’re coming from with this, but you take the thesis too far. The Industrial Revolution gave the commercial incentives to advance thermodynamics (viz. all the engines). The Information Era gave similar incentives to advance computer science and electrical engineering.  
     
    There is a distinction between “isolated from day-to-day market pressue” and “part of a monopoly”. The small university research lab doesn’t have monopolistic amounts of resources, but they do have time.

  31. By the way, for anyone interested in *why* Microsoft died intellectually, I recommend this WSJ profile. There is one key sentence:  
     
    http://online.wsj.com/article/SB121261241035146237.html?mod=googlewsj 
     
    Gradually, Mr. Ballmer made his imprint. He restructured the company to give more decision-making power to executives, and elevated people with general management experience into positions previously held by technology-focused executives. He also worked to settle Microsoft’s many lawsuits, taking a more conciliatory line than Mr. Gates typically had, Microsoft executives say. 
     
    Boom, that’s it right there. Ballmer hired MBA morons like this:  
     
    http://www.businessweek.com/bschools/content/jun2007/bs20070607_329811.htm 
     
    In my post-MBA job hunt, Microsoft was not the most obvious fit: I’m not a very technical guy. On my first day at Microsoft it took me 30 minutes just to find the latch to open my laptop (though I did successfully find the “on” button pretty quickly). I think that’s why my MBA at Kellogg has played such a vital part in my career development.
     
    Yeah, you *think* that he’s being self-deprecating. Read the rest and the smile will fall off your face. 
     
    Ballmer also killed key projects because they would cannibalize the co’s revenues. This is what led to the rise of Google Apps:  
     
     
    In one case, two vice presidents clashed over the future of NetDocs, a promising effort to offer software programs such as word processing over the Internet. The issue: Because NetDocs risked cannibalizing sales of Microsoft’s cash-cow Office programs, some executives wanted NetDocs killed.
     
    Messrs. Gates and Ballmer were unable to settle on a plan. First, NetDocs ballooned to a 400-person staff, then it got folded into the Office group in early 2001, where it died. 
     
     
    That in a nutshell is why monopolistic organizations almost never innovate themselves away.

  32. Just had a thought — Macs vs. Windows are kind of like our political climate.  
     
    The Democrats are an alliance of the technical overclass with the underclass. The Republicans roughly represent the middle class. And all the momentum is with the Democrats right now, because they appeal to both the unwashed masses *and* the college grads.  
     
    In the same way, Macs appeal to the computing underclass (grandma) and the overclass (MIT CS PhD, Bay Area hacker). Windows is the OS of the American sarariman middle class, who knows how to hit Ctrl-Alt-Delete but wouldn’t know a thread from a process if their life depended on it. And again, the momentum is only in one direction.

  33. Last thing. MS is the most technically reviled big company around today because they actively fsck web developers. At first MS did this on purpose, but now it’s become a habit, forcing web developers to respond in innovative ways. See:  
     
    fixoutlook.org 
    ie6update.com

  34. Since switching positions I’ve been required to code on a Mac. I don’t like it and would prefer to go back to XP. So I’m one counter-example to geecee’s claim, but I’m not a very good programmer.

  35. It is an operating system for children and people scared of the command line. 
     
    In other words, it is an OS for all but 1/10000 of the target audience for Windows, Macs, Linux, etc.

  36. I’m not so sure that Google has invented anything since PageRank. They tend to buy start-ups that invent interesting things instead. 
     
    It’s much easier to invent something outside of a corporate immune system.

  37. Jason — MapReduce, AJAX, GFS, BigTable are all serious breakthroughs. And those are just some of the published ones off the top of my head.  
     
    MapReduce and AJAX have literally spawned whole industries based on imitation. Hadoop is a MapReduce clone that became Yahoo’s backend and then MS’s backend (after the Powerset acquisition). And even nontechnical people understand the impact of AJAX.  
     
    Lots of other improvements are subtle but nontrivial. As one example, think about how Goog’s indexer seems to always have up-to-date versions of web pages. That is in part due to Grimes et al.’s statistical recurrence model, which uses ideas from Kalman filtering to predict when a page will next be updated based on its past history.  
     
    They have made too many breakthroughs in machine learning to name, but here’s a short list of apps which have some serious shit under the hood: Google Translate, Google Maps/Streetview, Gmail spam filtering, Goog 411… 
     
    Bottom line is that Goog is the big dog today in terms of CS innovation. If you know some of the people there, or what’s going on inside, there are things they are doing that no one else can do — both because of the concentration of talent and the scale they’re working at. For example, take a look at Barroso and Hölzle’s recent 108-page writeup on warehouse-scale machines:  
     
    http://everythingsysadmin.com/2009/05/warehouse-scale-machines-the-d.html 
     
    Or Jeff Dean’s talk on Goog’s backend:  
     
    http://glinden.blogspot.com/2008/06/jeff-dean-on-google-infrastructure.html 
     
    I believe in giving credit where it is due. Goog is still the most innovative big company in tech. Apple is a very solid second. After them I tend to start thinking of startups rather than MS, IBM, Yahoo, etc. 
     
    (With two caveats. First, search.yahoo.com has superior cross-keyword recommendations relative to Goog. Second, Bing’s image search is superior to Google’s. I’ve often wondered why someone doesn’t just put Orbitz or something like that inline on a search engine frontpage… I guess that day is quickly approaching.)
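
     For readers who haven’t seen it, the MapReduce programming model praised above can be sketched in a few lines of plain Python. This is just the classic word-count idiom run in a single process — a toy illustration of the map/shuffle/reduce structure, not anything resembling Google’s distributed implementation (no partitioning, fault tolerance, or GFS underneath):

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # map: emit an intermediate (word, 1) pair for every word in a document
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # shuffle: group intermediate pairs by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # reduce: collapse all counts for one word into a total
    return key, sum(values)

def mapreduce(docs):
    intermediate = chain.from_iterable(map_phase(d) for d in docs)
    return dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())

print(mapreduce(["the cat sat", "the cat ran"]))
# → {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}
```

     The point of the model is that the map and reduce functions are side-effect-free, so the framework can scatter them across thousands of machines; the shuffle step is where all the distributed-systems difficulty actually lives.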

  38. In other words, it is an OS for all but 1/10000 of the target audience for Windows, Macs, Linux, etc.  
     
    But it’s not that uncommon. Macs have at least 10% market share and are growing steadily. They are superior for anyone who does not play games. I say this as someone who used to write DirectX games in Windows for fun. The only thing better about Windows (and it is not totally trivial) is that you can resize windows from all four corners and that the maximize button works.  
     
    Also — I was intentionally being a mite provocative, but I am reading the tea leaves. MS has been alienating both techies *and* the general public. It is notorious for spyware, its PR has been bad, and it is *very* poor for scientific programming and server-side development. Client-side development is OK if you spend the time to learn the whole Windows stack, but there are so many advantages to developing in the cloud (especially incremental updates!) that this is a dying programming model.  
     
    Let me put it like this: I know of exactly *one* impressive startup (Stackoverflow.com) that uses MS technology. But it’s the exception that proves the rule. Similarly, in bioinformatics, only wet lab scientists use Excel for anything but the most trivial of calculations. The places that shape the future of genomics — NCBI, Whitehead, Wellcome Trust, ILMN — are either exclusively or preferentially Unix shops.  
     
    Basically, the people who build and program computers determine where the future of the industry is going. And that future is not on Windows.

    There was a time, back in the 1960′s, when 3/4 of US R&D was paid for by the US government, much of it by the Defense Department. This came to annoy people, and in 1972 Congress passed a little law called “The Mansfield Amendment” (Sen. Mike Mansfield, later ambassador to Japan if memory serves). TMA basically cut off funding for military R&D that was not specifically aimed at direct and immediate military use. Since then, R&D funding in the USA has fallen a bit (from about 2.7% to 2.3%), and 2/3 or so of R&D in this country is now funded by business. There seems to be general agreement that business-oriented R&D is much more aimed at the short term than military R&D was. It’s more D than R, in other words. 
     
    Also it seems to be much more trend-oriented. There’s wads of dough, relatively speaking, for neater and faster web applications and computer electronics, and I suppose energy research is getting some money. But astronautics hasn’t advanced much since the last moon shot in 1972, chemistry seems to have frozen in place except for medical uses, building construction has advanced in tiny steps, etc.  
     
    Again a lot of this can be traced to the Nixon Era, specifically to Nixon’s deliberate decision to boost medical and biological research spending at the expense of physical science and engineering. And for the life of me, I can’t say that was a totally BAD decision — current day medicine and biological understanding is probably 50 years ahead of where it would have been without Nixon’s action. You really wouldn’t want to try tackling AIDS or Alzheimer’s or some strains of influenza with a basic 1970 understanding of biology! 
     
    And there’s been a cost to this. Without access to very expensive apparatus, experimental and theoretical physicists have been marking time for much of the last 30 years. Nuclear engineering has stagnated; unconventional energy schemes such as tidal dams and OTEA have dropped from sight; huge delays in engineering projects of sorts have become commonplace (think of the various FAA modernization schemes which have been ongoing since the middle 1980s; think of the costs and delays which Boston’s Big Ditch has endured); indeed, virtually all forms of modern day engineering seem to have atrophied into mere shells of what they once were as analog devices and machinery are replaced by microchips. (Nothing against the microchip, but pitot tubes and concrete rebar and gears still have their uses too, even if they’ve ceased to be trendy enough for students to waste time on.)  
     
    Something else — in say 1950, engineers and scientists were in the top 10% of the population in income and educational level (and dare I say, social prestige). They were up there with lawyers and doctors and high-ranking military officers and civil servants; some of them were famous enough to have recognizable names! Up at the very top of things, think of Wernher von Braun, or Richard Feynman, or Stanislaw Ulam, or Robert Oppenheimer or Norbert Wiener or John von Neumann, or Jonas Salk. This doesn’t quite exist any more. The “famous” scientists are mostly known only to other scientists or laymen who try to follow popular accounts of the sciences. The “ordinary” scientists and engineers and mathematicians are no longer viewed as especially prominent for having college degrees; they are “white collar workers” — “educated” but far too plentiful in the modern job market to merit the sort of premium pay given MBAs or LLDs. Oddly, their work is more valued today than earlier, but because of this it is better managed — engineers today have no memory of the days when it was assumed 20% of one’s time was spent on tasks that seemed interesting to oneself. (No doubt they spend 20% of their time cruising the Internet, but I don’t think it’s quite the same thing.)  
     
    Remember the moment in 1986 when engineers at Thiokol tried to persuade their bosses that launching the Space Shuttle in icy weather could be dangerous, and the bosses decided to “put on their management hats” and ignore the lowly engineers? I think this is a terribly significant story but I can’t quite say how — I just use it as an example on the infrequent occasions I get asked about engineering as a career.  
     
    So. Some types of technical and scientific innovation have slowed up. This seems to be a side effect of major changes elsewhere in society; the implication is that speeding up innovation will take additional changes in society — virtually all of them changes no one is eager to accept. Not likely in what’s left of my lifetime, in other words — Good luck to the rest of you!

  40. By the way, I’ve often thought that it would be amusing if someone wrote a virus that installed Firefox, Chrome, Safari, or even IE8 and removed IE6/IE7 as the defaults. They would be heros…probably never have to buy a dinner again in tech circles.  
     
    (It might be even possible to uninstall Windows and install OSX on x86 platforms, but that might actually break a lot of people’s installs, which would be inconvenient).

  41. Interesting comment, mike. This bit jumped out at me, though:  
     
    virtually all forms of modern day engineering seem to have atrophied into mere shells of what they once were as analog devices and machinery are replaced by microchips.  
     
    There are significant advantages to doing as much as you can in software rather than hardware. Makes you more nimble.

    Geecee — yes, much can be done in software, and in many wondrous ways software is far more versatile than old-fashioned analog devices. But because it is so easy to use software for so much, there seems to be a tendency to do as much as possible in software, and ignore the sort of old crude stuff with cams and pistons and counterweights, etc., that used to occupy engineers’ thoughts.  
     
    With possible consequences — it’s been suggested that the Air France jet that fell into the Atlantic a couple of days back first ran into trouble when its pitot tubes — air pressure sensing devices that go back with little modification to the 1930′s — froze over at high altitude and low temperature. Pitot tubes aren’t sexy, they don’t get upgraded all that often and they don’t merit the attention that might have been given to say upgrades in the autopilot software. Till now, anyhow. 
     
    Another funny story, which got some play on the Internet a week or two back: it seems the US government has lost the knowledge of just how to make H-bombs. There are some materials and techniques that for various good reasons aren’t spelled out in textbooks, but are kept in people’s minds. Well and good, but for twenty years or so the US government actually hasn’t been making a lot of H-bombs, and the people with the secret know-how have been aging and retiring, without opportunity to pass their secret knowledge to younger co-workers. So now, at great expense, since we do need that kind of knowledge, we’ll have to run a batch of expensive experiments (and maybe send some Q-cleared FBI agents to old folks’ homes to question some irritated oldsters about just what they were doing on particular dates in 1959 and 1968 and so on). 
     
    Another bit of humor — NASA is about out of radioactive isotopes for powering spacecraft systems at Jupiter and further out in the solar system. Again, the USA hasn’t been building up stocks of Thorium and Neptunium and the like for several decades, so now that we need some, we’re having to start up the manufacturing process from scratch again, making all the old mistakes as we relearn things — just like the good old days of the 1950′s.  
     
    You get the idea, I trust? Computers are lovely, but they are not themselves sensors, or power sources, or nuclear catalysts. We shouldn’t ignore these things because they aren’t fashionable; we shouldn’t make the mistake of assuming there aren’t other phenomena to study and eventually master even if they aren’t capable of being simulated by microprocessors. Getting back to the main point, our preoccupation with computers may be giving us a restricted set of innovations and new technologies.

  43. Mike, excellent points. I had heard of the Fogbank fiasco…reminds me of how fragile civilization is (in Zimbabwe, they’ve gone from electricity back to candles in 15 years).  
     
    The low-level device engineering — i.e. the physical contact with the real world — is sorely needed for those fancy APIs. One example that comes to mind for me is the recent memristor — without innovation in the low-level device stuff, you’ll never be able to build a nice python interface on top of it.  
     
    I remember the first time I read Kailath’s book on Linear Systems…I was struck by (a) the elegance and (b) the difficulty of implementing the analog feedback in Watt’s governor. The first thing I thought of was how tightly coupled the physical system was to the control system, and how different this is from the modern division between “dumb” sensors/actuators and smart control. Relatedly, it took me a while to realize that IIR filter theory comes from constraints on what can be realized in an analog system. 
     
    Do you blog anywhere? I like to think we have a pretty good handle here at GNXP on modern CS/genetics/stats, but you seem to have some more old school engineering chops and I think it’d be educational to read your musings.
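
     To illustrate the IIR point raised above: a first-order IIR low-pass filter is the discrete-time analogue of a simple RC circuit, and its defining feature — feedback from its own past output — is exactly what a passive analog network gives you for free. A minimal sketch (the function name and the smoothing constant `alpha` are my own illustrative choices, not from Kailath):

```python
def iir_lowpass(samples, alpha=0.2):
    # First-order IIR low-pass: y[n] = (1 - alpha) * y[n-1] + alpha * x[n]
    # The feedback term y[n-1] makes the impulse response "infinite": one
    # input sample influences every later output, the way charge on an RC
    # circuit's capacitor decays but never exactly vanishes.
    y = 0.0
    out = []
    for x in samples:
        y = (1 - alpha) * y + alpha * x
        out.append(y)
    return out

# A step input rises monotonically toward 1.0, like an RC circuit charging.
print(iir_lowpass([1.0] * 5))
```

     The contrast with an FIR filter (a plain weighted sum over a finite window, with no feedback) is the digital echo of the analog realizability constraints mentioned above.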

  44. Public recognition of invention may be a thing of the past, but if people invent less, how come we have to replace our cell phones or computers every two years?

  45. My own recollections from the last 50 years: 
     
    “astronautics hasn’t advanced much since the last moon shot in 1972″ 
     
    The manned space program is extremely risk-averse and only uses flight-proven equipment, i.e., old stuff. NASA’s unmanned space program is much more innovative, e.g., the Mars explorers. Also the military is strongly pushing RPVs and robotics. Non-government astronautics has been progressing. 
     
    “chemistry seems to have frozen in place except for medical uses” 
     
    Computer modeling of chemical structures has significantly improved, e.g., quantum mechanics, transition states, charge distributions, i.e., smart material design. Chemistry plays a critical role in nanotech. Nanotech improves surface characteristics for new catalysts. The materials in modern products are stronger, lighter, and more durable than in prior decades… that is largely due to chemistry. 
     
    “Without access to very expensive apparatus, experimental and theoretical physicists have been marking time for much of the last 30 years.” 
     
    Physics has multi-billion dollar particle accelerators and fusion experiments that employ teams of hundreds of scientists. Some of the most expensive computers are largely devoted to physics. If progress in physics has been slow it is because the problems are extremely difficult and each generation of accelerator or fusion reactor costs far, far more. 
     
    There has been significant progress in optics, nanotech, superconductivity, quantum mechanics, and cosmology. 
     
    “Nuclear engineering has stagnated” 
     
    There are many new, innovative designs. The problem has been a public that fears nuclear power. That is changing. 
     
    “unconventional energy schemes such as tidal dams and OTEA have dropped from sight” 
     
    The Hawaiian OTEA research station was never able to produce competitively priced electricity. There are many ongoing tidal dam projects around the world. Likewise for geothermal. These projects have major engineering problems which is why windmills and solar power dominate the news. 
     
    “huge delays in engineering projects of sorts have become commonplace” 
     
    The political/legal environment slows down big projects. 
     
    In the 1950′s the US didn’t even have an interstate highway system. Today cities have massive infrastructure which requires maintenance and hampers new development, e.g., digging a ditch means cutting through existing pavement, plumbing, power lines, and communication lines. 
     
    “virtually all forms of modern day engineering seem to have atrophied into mere shells of what they once were” 
     
    Better materials, better design tools, etc. Buildings must withstand earthquakes and be energy efficient. Compare a modern bullet train to a 1950′s train. Compare modern mining equipment to what was available in 1950. 
     
    “Something else — in say 1950, engineers and scientists were in the top 10% of the population in income and educational level (and dare I say, social prestige). They were up there with lawyers and doctors and high ranking military officers and civil servants;” 
     
    In 1950 engineers and scientists weren’t at the top. The heyday followed the Sputnik scare, after which the US began a major push to promote science education and the space program began in earnest. The media was very favorable toward scientists and engineers (I Dream of Jeannie, Star Trek). This peaked in the late 1960′s and early 1970′s. By the late 1970′s aerospace engineers were out of work and the salaries and prestige of scientists were declining. The media turned against technology. The Sputnik period was atypical for US history.

  46. mike, I think Nixon made an unambiguously bad decision. Have you heard of the RAND health experiment? Health care delivers no benefits on the margin. And the more cutting edge a treatment is, the more expensive and dangerous it generally is. 
     
    As a contrast to the geecee/MM line about the march of hip Democratic elites, I give you Andrew Gelman on nasty, brutish and short Democrats. Admittedly, he wrote it in 1994, around the time when as a kid I thought it perfectly sensible to refer to myself as an “ultrarepublican”.

  47. The Sputnik period was atypical for US history. 
     
    Edison, Ford and Bell were living legends – American folk heroes decades before Sputnik. Now kids round out their top 5 Americans with Rosa Parks, Harriet Tubman and MLK. I really can’t imagine that people in 1960 would be all that wowed by what we have today. A color television in every room and you can pay your bills online – whoopdeedoo. Sure, lots and lots of incremental improvements, but not much in our everyday lives they didn’t already have. Compare that to 1910 to 1960 or 1860 to 1910 – the changes in people’s lives were spectacular – and the former period included the 15 year stretch of the Great Depression and WWII. 
     
    If you look at GDP statistics at the BEA site, you can see that our rate of growth has clearly shifted downwards over the last 40 years.

  48. The RAND health experiment was carried out about 30 years ago. Certainly medicine’s diagnostic abilities and, to a lesser extent, treatment abilities, have improved enough since then to warrant a new study. For example, CT scans, endoscopic techniques, laparoscopic surgical techniques, coronary angioplasty, and many current cancer therapeutics either did not exist then or were in their infancy. Death rates from most types of cancer have declined since the time of the RAND experiment suggesting some contribution of health care to patients’ benefit even if it is expensive. 
     
    There were also some statistical problems with the RAND data discussed by a U. Minnesota professor in 2007 in the journal Health Affairs. Here is a link to a review of that article: 
    http://blog.hcfama.org/?p=1210

  49. ziel: 
     
    You’ve got to be kidding. Give your 1960 man an iPhone, and show him Google, Google Maps, Wikipedia, Babelfish and Youtube. Then use the GPS/navigation computer to guide you home. Then show him your music collection (you know, in the little silver thing that looks like a fat metallic book of matches with a tiny TV screen on it, and that contains a couple hundred hours of music and a dozen TV shows). Help him buy a book or two online and have it shipped overnight. Or get him a Kindle and let him buy books that just appear instantly. Have him book a flight and hotel online. Give him a laptop with Word and Excel and (if he’s a techie) maybe Maple. Hell, give him a good calculator and inform him his slide rule is now a museum piece (along with his typewriter).  
     
    Any one of those technologies will let him know, unambiguously, that he’s living in Future Land without the flying cars. Our cars may look like smaller, longer-lived 1960s cars, our handguns may be little better than 1940s tech, and our houses may be flimsier (but less toxic) than in 1960, but our computers and consumer electronics rock.

  50. Should mention Boldrin and Levine’s Against Intellectual Monopoly, which takes a strong contrary view: They argue that patents or monopoly privileges of any sort are unneeded and counterproductive.  
     
    This has been David Levine’s big push within econ in the last decade or so, and he’s been quite successful in getting economists to soften their priors on the issue. It helps that he uses a mix of theory and data to tell his tale….And that he’s a great public speaker…. 
     
    It’s a big thesis, but here’s one key anecdote: Countries that tighten intellectual property rights don’t get a boost in innovation. This seems a pretty robust empirical fact. Tough to reconcile with “Innovation requires Monopoly.”  
     
    A free version of the book is in the homepage link.

    “Give your 1960 man an iPhone, and show him Google, Google Maps, Wikipedia, Babelfish and Youtube. Then use the GPS/navigation computer to guide you home.” 
     
    If the 1960′s man is into science fiction then he would expect to have a philosophical conversation with the computer. Although he might be expecting a mainframe. The calculator wouldn’t impress him at all.
