Monday, December 14, 2009

Who argues the most from authority?   posted by agnostic @ 12/14/2009 01:00:00 PM

Google results for +"nobel laureate" +X, where X is one of the following:

Chemistry: 317,000
Physics: 415,000
Medicine: 467,000
Economics: 484,000

Of course, there are more winners to refer to in Physics than in Economics, so we should control for that. Dividing the number of Google results by the number of winners gives these per capita rates:

Chemistry: 2032
Physics: 2231
Medicine: 2395
Economics: 7446
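The per capita figures above can be reproduced in a few lines of Python. Note that the winner counts below are back-derived from the post's own numbers (hits divided by rate), so treat them as assumptions rather than official Nobel tallies:

```python
# Google hits for +"nobel laureate" +field, from the counts above.
hits = {"Chemistry": 317_000, "Physics": 415_000,
        "Medicine": 467_000, "Economics": 484_000}

# Approximate laureate counts per field as of 2009 -- assumed values,
# back-derived from the per capita rates, not official tallies.
winners = {"Chemistry": 156, "Physics": 186,
           "Medicine": 195, "Economics": 65}

for field, n_hits in hits.items():
    print(f"{field}: {n_hits / winners[field]:.0f}")
```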

If the intellectual merit of a body of ideas is not so well established, you're more likely to deflect attention by reassuring everyone that, hey, it can't be that crazy -- after all, the guy is a Nobel laureate. Perhaps that's why physics ranks above chemistry here, what with string theory etc. taking it further into speculation compared to more grounded chemistry.





Saturday, October 24, 2009

When you can meet online, will colloquia disappear?   posted by agnostic @ 10/24/2009 07:08:00 PM

The other day I saw a flier for a colloquium in my department that sounded kind of interesting, but I thought "It probably won't be worth it," and I ended up not going. After all, anyone with an internet connection can find a cyber-colloquium to participate in, one drawn from a much wider range of topics (and so more likely to really grab your interest). Its participants are drawn from a much wider range of people (and so you're more likely to find experts on the topic, though also more know-nothings who follow crowds for the attention), and its lines of thought can extend far longer than an hour or so without fatiguing the participants.

So, this is something like the Pavarotti Effect of greater global connectedness: local opera singers go out of business because consumers would rather listen to a CD of Pavarotti. It's only after it becomes cheap to find the Pavarottis and distribute their work on a global scale that this type of "creative destruction" happens. Similarly, if academics migrated to email discussion groups or -- god help you -- even a blog in order to get whatever colloquia gave them, a far smaller number of speakers would be in demand. Why spend an hour of your time reading and commenting on the ideas of someone you see as a mediocre thinker when you could read and comment on someone you see as a superstar?

Sure, perceptions differ among the audience, so you could find two sustained online discussions that stood at opposite ends of an ideological spectrum -- say, biologists who want to see much more vs. much less fancy math enter the field. That will prevent one speaker from getting all the attention. But even here, there would be a small number of superstars within each camp, and most of the little guys who could've given a talk here or there before would not get their voices heard on the global stage. Just like the lousy local coffee shops that get displaced by Starbucks -- unlike the good locals that are robust to invasion -- they'd have to cater to a niche audience that preferred quirkiness over quality.

So the big losers would be the producers of lower-quality ideas, and the winners would be the producers of higher-quality ideas as well as just about all consumers. Academics wear both of these hats, but many online discussion participants might only sit in and comment rather than give talks themselves. It seems more or less like a no-brainer, but will things actually unfold as above? I still have some doubts.

The main assumption behind Schumpeter's notion of creative destruction is that the firms are competing and can either profit or get wiped out. If you find some fundamentally new and better way of doing something, you'll replace the old way, just as the car replaced the horse and buggy. If academic departments faced these pressures, the ones who made better decisions about whether to host colloquia or not would grow, while those who made poorer decisions would go under. But in general departments aren't going to go out of business -- no matter how low they may fall in prestige or intellectual output, relative to other departments, they'll still get funded by their university and other private and public sources. They have little incentive to ask whether it's a good use of money, time, and effort to host colloquia in general or even particular talks, and so these mostly pointless things can continue indefinitely.

Do the people involved with colloquia already realize how mostly pointless they are? I think so. If the department leaders perceived an expected net benefit, then attendance would be mandatory -- at least partial attendance, like attending a certain percent of all hosted during a semester. You'd be free to allocate your partial attendance however you wanted, just like you're free to choose your elective courses when you're getting your degrees -- but you'd still have to take something. The way things are now, it's as though the department head told its students, "We have several of these things called elective classes, and you're encouraged to take as few or as many as you want, but you don't actually have to." Not exactly a ringing endorsement.

You might counter that the department heads simply value making these choices entirely voluntary, rather than browbeat students and professors into attending. But again, mandatory courses and course loads contradict this in the case of students, and all manner of mandatory career enhancement activities contradict this in the case of professors (strangely, "faculty meetings" are rarely voluntary). Since they happily issue requirements elsewhere, it's hard to avoid the conclusion that even they don't see much point in sitting in on a colloquium. As they must know from first-hand experience, it's a better use of your time to join a discussion online or through email.

The fact that colloquia are voluntary gives hope that, even though many may persist in wasting their time, others will be freed up to communicate more effectively on some topic. Think of how dismal intellectual output was before the printing press made setting down and ingesting ideas cheaper, and before strong modern states made postal routes safe enough to transmit ideas cheaply. You could only feed at the idea-trough of whoever happened to be physically near you, and you could only get feedback on your own ideas from whoever was nearby. Even if you were at a "good school" for what you did, that couldn't have substituted for interacting with the cream of the crop from across the globe. Now, you're easily able to break free from local mediocrity -- hey, they probably see you the same way! -- and find much better relationships online.





Sunday, May 10, 2009

Measuring the shelf-life of student interest in their subjects, using Google Trends   posted by agnostic @ 5/10/2009 11:55:00 PM

To test how sensitive Google Trends is to fundamental changes in the thing you're asking about, I decided to see if it could pick up the seasonality of fruit availability. Sure enough, it does. Just check blueberries or pomegranate: when the fruits are plentiful, people are very interested in them; outside the peak season, interest plummets. Interestingly, you see something very similar for how much people are searching Google for intellectual topics, which is an indicator of how long their interest lasts.

I noticed a funny cyclical pattern awhile back when I searched Google Trends for slavery. I had a hunch but filed it away. Now I've looked into it, and it's what I thought -- it tracks the school year, specifically when mid-term and final papers are due during the fall and spring semesters. There's a sharp drop during Christmas vacation, and a steady low level during summer vacation. To show that this is true, you see the same pattern for postmodernism, Freud, Foucault, semiotics -- plus Darwin, evolutionary psychology, differential equations, and linear algebra.
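You can check for a school-year cycle without eyeballing the graphs: correlate a monthly series with itself shifted by twelve months. The sketch below uses a synthetic series standing in for a Google Trends export -- the monthly values are made up, shaped to be high during the fall and spring semesters and low over summer and Christmas:

```python
import math

# Synthetic monthly interest values, Jan..Dec, repeated over three years.
# High during the fall and spring semesters, low in summer and December.
year = [70, 75, 80, 85, 60, 30, 25, 25, 65, 75, 85, 40]
series = year * 3

def lag_corr(xs, lag):
    """Pearson correlation between the series and itself shifted by `lag`."""
    a, b = xs[:-lag], xs[lag:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

print(lag_corr(series, 12))  # 1.0 here: the toy series repeats exactly each year
```

A real Trends export would give a lag-12 correlation well below 1 but still high if the school-year cycle dominates.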

That shows how long the average student, after exposure to some body of thought, retains interest in it over their lifetime -- about a day or so after the exam is done. And intellectual merit hardly seems to matter -- real stuff like differential equations doesn't seem any stickier than the snake oil of postmodernism. If you're an educator who's ever suffered from the delusion that you can inspire lasting interest in your subject, these graphs should wake you up. Sure, there's that one student whose eagerness for the subject is just a bit creepy (unless she's a cute girl, of course), but most of your students will treat your class like they do the movies they see in the theater -- or the Malcolm Gladwell books their parents read -- which provide brief fun but are forgotten a week or so later.

This more or less contradicts the proponents of higher ed for everyone, of a core curriculum, and of similar policies that are based on the assumption that students retain anything at all. After their Harvard undergrad educations, most alumni had no clue what causes the seasonal change in weather. (They tend to say that it's due to the elliptical orbit of Earth around the Sun -- summer when it's closest and winter when it's farthest away.) If they're just going to flush the course's content once the semester is over, why make them take the course in the first place? Except for the school to get their money, and for the professor to keep his job through high enrollment.

"But higher ed is not just pre-professional training -- it's about cultivating the garden of their mind!" Well, if the average student were at all intellectually curious, maybe. But most aren't -- once their final paper is in, flusssshhh! To revisit the topic of the education bubble, most arts and humanities majors could cruise through undergrad in two years tops, unless they were dead set on becoming academics, in which case they'd really need to absorb a lot more information. But if you're majoring in history or English in order to go to law school, who cares if you only surveyed one period of English poetry, rather than from Beowulf to the Beats? Obvious exceptions are technical or professional majors, such as engineers needing to know calculus, statistics, etc., which might take them three years to complete.

The cold hard reality, shown by the Google Trends data above, is that just about all students are going to junk everything they ever learned in college once they're done with the course -- not even once they graduate. Therefore, having them schlump around all day in these throwaway courses only wastes their time, money, and energy, which could be spent producing stuff. Aside from signaling that they haven't gone braindead or really fucked up their work ethic after high school, a college degree doesn't mean much, unless it's a technical one. So, give them a year or two to prove this, and then get them out into the real world. They'll probably come out the other end of college with healthier livers to boot.





Thursday, March 05, 2009

Will information criteria replace p-values in common use? Some trends   posted by agnostic @ 3/05/2009 12:30:00 AM

P-values come from null hypothesis testing, where you test how likely your observed data (and more extreme data) are under the assumption that the null hypothesis is true. As such, they do not allow us to decide which of a variety of hypotheses or models is true. The probability they encode refers to the observed data under an assumption -- it does not refer to the hypotheses on the table.

Using information criteria allows us to decide between a variety of hypotheses or models about how the world works. They formalize Occam's Razor by rewarding models that show a good fit to the observed data, while penalizing models that have lots of parameters to estimate (i.e., those that are more complex). Whichever one best balances this trade-off wins.
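As a concrete (hypothetical) illustration: for least-squares fits with Gaussian errors, the AIC reduces, up to an additive constant, to n*ln(RSS/n) + 2k, where RSS is the residual sum of squares and k the number of estimated parameters. The toy comparison below pits an intercept-only model against a simple linear regression on data that are truly linear; the linear model's better fit outweighs its extra parameter, so it wins:

```python
import math
import random

random.seed(0)
xs = [i / 10 for i in range(30)]
ys = [2.0 + 1.5 * x + random.gauss(0, 0.3) for x in xs]  # truly linear data

def linreg_rss(xs, ys):
    """Residual sum of squares of an ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def aic(rss, n, k):
    # Gaussian log-likelihood folded in: AIC = n*ln(RSS/n) + 2k + constant.
    return n * math.log(rss / n) + 2 * k

n = len(xs)
my = sum(ys) / n
rss_mean = sum((y - my) ** 2 for y in ys)  # intercept-only: k = 2 (mean, sigma)
rss_line = linreg_rss(xs, ys)              # line: k = 3 (slope, intercept, sigma)
print(aic(rss_mean, n, 2), aic(rss_line, n, 3))  # linear model has lower AIC
```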

Although I'm not a stats guy -- I'm much more at home cooking up models -- I've been told that the broader academic world is becoming increasingly hip to the idea of using information criteria, rather than insist on null hypothesis testing and reporting of p-values. So, let's see what JSTOR has to say.

I did an advanced search of all articles for "p value" and for "Akaike Information Criterion" (the most popular one), looking at 5-year intervals just to save me some time and to smooth out the year-to-year variation. I start when the AIC is first mentioned. For the prevalence of each, I end in 2003, since there's typically a 5-year lag before articles end up in JSTOR, and estimating the prevalence requires a good guess about the population size. For the ratio of the size of one group to the other, I go up through 2008, since this ratio does not depend on an accurate estimate of the total number of articles. From 2004 to 2008, there are 4132 articles with "p value" and 927 with "Akaike Information Criterion," so the estimate of the ratio isn't going to be bad even with fewer articles available during this time.

Intervals are represented by their mid-point. Someone else can do a better job of searching year by year, perhaps restricting the search to social science journals to see if real headway is being made. (It would be uninteresting to see a rise in the popularity of information criteria in statistics journals.) Here are the trends in the use of each, as well as the ratio of p-value to AIC:

[Graph: prevalence of "p value" and "Akaike Information Criterion" in JSTOR articles, and their ratio, mid-1970s through 2003]
It's promising that both are increasing over the past 30-odd years, since that means more people are bothering to be quantitative. Still, less than 5% of articles mention p-values or information criteria -- some of that is due to the presence of arts and humanities journals, but there's still a big slice of the hard and soft sciences that needs to be converted. Also encouraging is the steady decline in the dominance of p-values over the AIC: they're still about 4.5 times as commonly used in academia at large, but that's down from about 15.5 times as common in the mid-1970s, a 71% decline. Graduate students and young professors -- the writing is on the wall. Aside from being intellectually superior, information criteria will give you a competitive edge in the job market, at least in the near future. After that, they will be required.
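The arithmetic behind those ratios is worth making explicit, using the 2004-2008 article counts quoted above:

```python
# p-value vs. AIC article counts in JSTOR, 2004-2008 (from the post).
p_articles, aic_articles = 4132, 927

ratio_now = p_articles / aic_articles
print(round(ratio_now, 1))  # -> 4.5, i.e. "about 4.5 times as common"

ratio_then = 15.5  # the post's mid-1970s figure
decline = (ratio_then - ratio_now) / ratio_then
print(round(100 * decline))  # -> 71, the "71% decline"
```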





Friday, January 25, 2008

What needs to change in academia?   posted by the @ 1/25/2008 06:08:00 PM



