Error in the age of personal genomics

Over at Genetic Future, Dr. Daniel MacArthur points out some errors in deCODE’s interpretation services. Dr. MacArthur presumably knows his maternity, though if the X chromosome results were correct one would guess that Dr. MacArthur is actually adopted and that his mother might be a Lumbee Indian.
But it makes me wonder how confused people are going to be due to problems with false results. In particular, as these technologies become very cheap many families will make recourse to them. Sometimes this will highlight “extrapair paternity events,” but sometimes there will be errors, and siblings may face a period of uncertainty in relation to possibly discordant results. The likelihood of a false result creating an unexpected situation depends on several probabilities: the error rate of the test and the base rate of extrapair paternity events (which varies from demographic to demographic and family to family). I guess we’ll have more data in the near future….
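To make that conditioning concrete, here is a minimal back-of-the-envelope sketch; the base rate and error rate are invented numbers purely for illustration, not estimates from any study:

```python
# Hypothetical numbers, purely for illustration: given a discordant result,
# how likely is it to reflect a real extrapair paternity event rather than error?
base_rate = 0.02      # assumed prior probability of an extrapair paternity event
error_rate = 0.001    # assumed probability the test reports the wrong conclusion

# P(test reports non-paternity) = real events detected + errors among true fathers
p_discordant = base_rate * (1 - error_rate) + (1 - base_rate) * error_rate

# Bayes' theorem: P(real event | discordant result)
p_real = base_rate * (1 - error_rate) / p_discordant
print(f"P(real event | discordant result) = {p_real:.3f}")

# With these numbers most discordant results are real, but as the base rate
# shrinks or the error rate grows, the share of false alarms rises quickly.
```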

When mammoths roamed (rarely)

Brian Switek, The extended twilight of the mammoths:

So, if the team’s analysis is correct, both mammoths and horses lived in the interior of Alaska between about 11,000 and 7,000 years ago. This is significantly more recent than the youngest fossil remains of horses and mammoths, dated between 15,000 and 13,000 years ago. There are at least two factors that might contribute to this disparity. The first is that fossils from this more recent time were preserved but have not yet been found. More likely, though, is that the populations of both mammoths and horses had dwindled to the point where fossil preservation was becoming increasingly unlikely. There were so few of them that the death of an individual in circumstances amenable to preservation was becoming rarer and rarer.
Either way, this discovery has important implications for the extinction of horses and mammoths in North America. Based upon the fossil data alone it had been hypothesized that both disappeared around the time that humans became established in North America.* Some have taken this association to suggest that humans engaged in a blitzkrieg in which naive New World megamammals were quickly dispatched by the human hunters. If the new evidence is correct, though, humans did not wipe out horses and mammoths overnight. Instead humans lived alongside dwindling populations in Alaska for thousands of years. Likewise, these new findings also contradict the favored hypothesis of one of the study’s authors, Ross MacPhee, who previously proposed that some kind of “hyperdisease” carried by humans (or animals that traveled with humans) quickly wiped out these animals. The pattern of extinction was obviously more protracted.

This seems about right. Excuse the analogy, but it sometimes seems that models of human-caused extinction of mega-fauna portray ancient hunter-gatherers as Einsatzgruppen, and the mega-fauna as Jews and Communists. Though genocides of human populations in the concerted manner of the Germans against the Jews, Gypsies and other groups during World War II have occurred periodically, more often what we see is a slow wearing down and attrition of marginal groups at the hands of dominant ones.
It seems a plausible model that when mega-fauna were plentiful hunters would focus on them, but once the mega-fauna became rare the return on investment would naturally decrease and it would become rational to shift to other prey organisms. This implies that many mega-fauna likely persisted in isolated pockets as relict populations, and may have been killed off only much later, or perhaps even succumbed to a natural environmental calamity. In another era the last herds of wild horses would probably have gone extinct due to drought, or perhaps been hunted down by a random group of humans who had no idea that they were reducing the biological diversity of the planet.
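Here is a minimal sketch of that prey-switching logic; every parameter is invented for illustration, and none of it is drawn from the archaeological or ecological literature:

```python
# Toy prey-switching model: hunters take mega-fauna only while the animals are
# common enough to be worth pursuing, then switch to other prey. All parameters
# are invented purely for illustration.
growth_rate = 0.005       # assumed intrinsic yearly growth of the mega-fauna herd
harvest_rate = 0.03       # assumed extra yearly mortality while hunters target them
switch_threshold = 500    # herd size below which hunting them stops paying off

population = 10_000.0
for year in range(3001):
    if year % 500 == 0:
        status = "hunted" if population > switch_threshold else "mostly ignored"
        print(f"year {year:4d}: ~{population:7.0f} ({status})")
    rate = growth_rate - (harvest_rate if population > switch_threshold else 0.0)
    population *= 1 + rate

# Instead of a rapid extinction, the herd crashes and then hovers for millennia
# near the switching threshold as a relict population, one that a later drought
# or a chance band of hunters could finish off.
```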

On scales of norms

There has been more blogospheric discussion on the topic of my post Doing the right thing, doing the legal thing. Megan McArdle, who started the discussion, has a long post elaborating on her objections to strategic defaults. Steve Waldman has two good posts up. Finally, at The Big Money Daniel Gross concludes:

Of course, corporate managers and financiers don’t suffer from these neuroses. Do you think billionaire investor Sam Zell feels any guilt or shame because his buyout of the Tribune Co., which had $12.9 billion in debt, ended in a Chapter 11 filing last December? Rather than worry about whether Americans will take cues from modest homeowners who make a tough decision not to stay current on debt, perhaps we should worry about middle-class Americans taking cues from billionaires and Fortune 500 companies who make the rational decision not to stay current on debt.

As I admitted earlier, as a matter of description a society where people are relatively trustworthy, and do the right thing as opposed to the legal thing, is a happier place than one where trust is in short supply. The problem is that these societies don’t emerge out of thin air, but are created through vigilant policing of norms, and frankly a level of interpersonal nosiness or homogeneity which would probably be considered uncouth or retrograde today.
Perhaps what we saw in the last generation was the slow but steady exhaustion of values which arose in the context of small towns and urban neighborhoods. With the decline of the small town and the decay of close-knit urban neighborhoods, perhaps the modern state is one where atomistic rational actors are intent on doing what they can get away with because of the anonymity which is the normal course of existence. The power of modern media means that Americans, no matter where they live, see how the high and mighty live and how they comport themselves in their peer groups. The “community” has now expanded into cyberspace, and norms are not just created by interaction with those whom you meet face to face.
Instead of demanding that Americans stay true to the values of old, and basic decency, I think it is perhaps time to engage with the future which we are facing. The horse of old-time values has left the barn. Perhaps more transparency in personal records, and immediate access by anyone to information about anyone, will bring back some accountability to the choices one makes.

Being black as a state of mind

A few days ago I pointed out that actors with visible Asian ancestry, such as Keanu Reeves, Mark-Paul Gosselaar and Dean Cain, can play white characters, while those with visible African ancestry cannot (I will leave it open to debate whether Rashida Jones or Jennifer Beals violate that rule). On the other hand, it does seem that people with no visible African ancestry can identify as black American due to the norm of hypodescent. For example, consider Walter White, who identified as a black man despite his visibly white European appearance. White used his ability to “pass” during his long Civil Rights career, since he could operate “undercover.”
One of the unsurprising things which modern genomics is uncovering is that though the median African American has European admixture on the order of ~20%, there is a wide variance. Consider this plot (I’ve re-edited it a bit; the figure can be found in this paper):


Less religion = more religious activism?

Tom Rees:

It seems that when Christianity is popular, Christians are content with the idea of a firewall separating Church and State. It’s only when Christianity begins to lose its influence over the population at large that Christians begin to campaign for the State to adopt a Christian character.
Looking at survey data from 18 Western countries, they found:
- The fewer Christians in a country, the greater the support among Christians for a greater public role for religion (as shown in the graph).
- The polarization of views between Christians and non-religious on a public role for religion is greatest in countries where there are fewest Christians.

The relation is illustrated with a nice scatterplot:
[Scatterplot from Achterberg (2009)]
Some of this can be attributed to specific factors in Europe relating to religious pluralism. Consider my coblogger Martin Rundkvist’s reflections on carolling. Even if a society is very secular, if the dominant religious orientation is uniform, then its background assumptions suffuse one’s daily life. One can therefore be a “cultural” Catholic or Lutheran, with an attachment to the exoteric forms associated with the religion, without being a believer. But when you have religious pluralism thrown into the mix people are going to disagree strenuously about exoteric forms. This applies even to the post-religious; an American atheist from a Jewish background may have a different attitude toward Christmas than an American atheist from a Catholic background. In other words, as European societies have become less Christian over the past generation, they’ve also had to face more religious pluralism. Christians will become more assertive and aggressive in direct response to Europe’s growing Muslim community, which wishes to contest the tacit monopoly that Christianity has long had in Europe as the Faith.
But another issue which might be at work is that as nominal or marginal believers fall away, the individuals who remain committed Christians are more religious and exhibit more fidelity to their identity than before. This may result in a group of Christians who are much more cohesive and can engage in collective action out of proportion to their numbers. Whereas before the more marginal and nominal members of the community might have served as a check on excessive activism, today those individuals may no longer be part of the Christian community at all.
The power of an organized Christian community is clear in a society such as South Korea. Though only around 30% of the population is Christian, with almost half the population having no religious affiliation at all, Christians have been over-represented in positions of power. The growth of the Christian religion has been rapid, but has slowed over the past 15 years. It seems possible that it may be nearing its “natural limit.” But that does not mean that it won’t be influential in the years to come.

The fossil record is spotty

Spatial Organization of Hominin Activities at Gesher Benot Ya’aqov, Israel:

The spatial designation of discrete areas for different activities reflects formalized conceptualization of a living space. The results of spatial analyses of a Middle Pleistocene Acheulian archaeological horizon (about 750,000 years ago) at Gesher Benot Ya’aqov, Israel, indicate that hominins differentiated their activities (stone knapping, tool use, floral and faunal processing and consumption) across space. These were organized in two main areas, including multiple activities around a hearth. The diversity of human activities and the distinctive patterning with which they are organized implies advanced organizational skills of the Gesher Benot Ya’aqov hominins.

ScienceDaily has a summary:

Evidence of sophisticated, human behavior has been discovered by Hebrew University of Jerusalem researchers as early as 750,000 years ago — some half a million years earlier than has previously been estimated by archaeologists.
The discovery was made in the course of excavations at the prehistoric Gesher Benot Ya’aqov site, located along the Dead Sea rift in the southern Hula Valley of northern Israel, by a team from the Hebrew University Institute of Archaeology. Analysis of the spatial distribution of the findings there reveals a pattern of specific areas in which various activities were carried out. This kind of designation indicates a formalized conceptualization of living space, requiring social organization and communication between group members. Such organizational skills are thought to be unique to modern humans.
Attempts until now to trace the origins of such behavior at various prehistoric sites in the world have concentrated on spatial analyses of Middle Paleolithic sites, where activity areas, particularly those associated with hearths, have been found dating back only to some 250,000 years ago.
The new Hebrew University study, a report on which is published in Science magazine, describes an Acheulian (an early stone tools culture) layer at Gesher Benot Ya’aqov that has been dated to about 750,000 years ago. The evidence found there consists of numerous stone tools, animal bones and a rich collection of botanical remains.

500,000 years! This could be wrong, who knows? My network of priors in this area is too thin to evaluate the probability (I leave that to someone like John Hawks). Rather, it is important to remember that the fossil record gets really thin the further back you go. Hominins were never common to begin with, at least before the recent past. There is a huge fossil gap in China, for example, between early Homo erectus and the Holocene. I bring this up because John Horgan has been arguing that there is no evidence for “war” before the rise of agriculture, based on Brian Ferguson’s research. Part of the issue might be how you define war. But another issue might be that this is a case where the sample sizes over time are small enough that you might actually miss a lot.
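As a crude illustration of that sampling problem (the numbers are made up, not estimates from Ferguson or anyone else): even if some fixed fraction of preserved skeletons showed violent trauma, a thin record can easily turn up none at all.

```python
# Hypothetical numbers, purely for illustration: if a fraction p of preserved
# skeletons show violent trauma, how often does a small sample contain none?
def prob_no_evidence(p_violent: float, n_fossils: int) -> float:
    """Probability that n independent fossils all lack signs of violence."""
    return (1 - p_violent) ** n_fossils

for n in (5, 20, 100, 500):
    print(f"n = {n:3d} fossils: P(no evidence of violence) = {prob_no_evidence(0.02, n):.2f}")

# With a 2% trauma rate, a sample of 20 fossils misses it about two-thirds of
# the time; only with hundreds of specimens does "no evidence" begin to look
# like evidence of absence.
```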
Citation: Nira Alperson-Afil, Gonen Sharon, Mordechai Kislev, Yoel Melamed, Irit Zohar, Shosh Ashkenazi, Rivka Rabinovich, Rebecca Biton, Ella Werker, Gideon Hartman, Craig Feibel, and Naama Goren-Inbar, “Spatial Organization of Hominin Activities at Gesher Benot Ya’aqov, Israel,” Science 326 (5960), 1677 (18 December 2009). [DOI: 10.1126/science.1180695]

Canada & North American theocracy

In the comments Europeans often point out that nations we Americans consider very secular, such as the United Kingdom and the Netherlands, actually provide state subsidy to religious schools. Part of the issue here is that Americans have a caricature of Europeans in mind, just as Europeans often have a caricature of Americans. Though in terms of their personal beliefs most Europeans are more secular than most Americans, that does not mean that we Americans can infer from this the particulars of how Europeans organize the relationship between church & state. When the American republic was founded a proactive effort was made to separate the national government from any particular church or religion (a precedent which was eventually followed by the states in the early 19th century). At that time even nations with a reputation for religious tolerance, such as the Netherlands, arguably treated their minorities as what we would recognize today as “dhimmis” (see Divided by Faith: Religious Conflict and the Practice of Toleration in Early Modern Europe).

And yet despite the lack of national promotion of one particular sect America remains an exceptionally religious nation, at least by belief. Most European societies took different tracks (I think one major confusion by Americans is the idea that there is one European outlook on particular questions). Which brings me to a weird historical oddity I recently stumbled upon: Newfoundland had a purely sectarian public school system until 1997. You can read about it here. This system seems to be what the Catholic Church would have preferred in the 19th century for the United States, as the public school system was strongly tinged with Protestant presuppositions (e.g., reading Protestant Bibles). In the United States the Church lost. In Newfoundland it looks like they obtained a satisfactory compromise.

(the title is a joke, Canadians supposedly have a sense of humor)

Addendum: Before a Canadian points this out, yes, I’m aware that some provinces still allow for tax-support of sectarian schools.

Brain size & microcephaly genes

Microcephaly Genes Associated With Human Brain Size:

Highly significant associations were found between cortical surface area and polymorphisms in possible regulatory regions near the gene CDK5RAP2. This gene codes for a protein involved in cell-cycle regulation in neuronal progenitor cells — cells that migrate to the cerebral cortex during the second trimester of gestation and eventually become fully functioning neurons. The cerebral cortex is the outer layer of the brain, often referred to as “gray matter.” The most highly developed part of the human brain, the cerebral cortex is responsible for higher cognitive functions, such as thinking, perceiving, producing and understanding language, some of which is considered uniquely human.

Similar but less significant findings were made for polymorphisms in two other microcephaly genes, known as MCPH1 and ASPM. All findings were exclusive to either males or females but the functional significance of this sex-segregated effect is unclear.

“One particularly interesting feature of this new discovery is that the strongest links with cortical area were found in regulatory regions, rather than coding regions of the genes,” said Andreassen. “One upshot of this may be that in order to further understand the molecular and evolutionary processes that have determined human brain size, we need to focus on regulatory processes rather than further functional characterization of the proteins of these genes. This has huge implications for future research on the link between genetics and brain morphology.”

It wouldn’t be the first time that genes connected to pathologies turned out to be useful in illuminating normal human variation. The paper should show up on the PNAS site at some point.

Crime way down. Who exactly knows stuff?

Despite recession, crime keeps falling:

In times of recession, property crimes, in particular, are expected to rise.

They haven’t.

Overall, property crimes fell by 6.1 percent, and violent crimes by 4.4 percent, according to the six-month data collected by the FBI. Crime rates haven’t been this low since the 1960’s, and are nowhere near the peak reached in the early 1990’s.

Who expected crime to increase? Did you? I did. But I didn’t know anything about crime statistics over time so I was working off naive intuition. Did social scientists expect this? I recall a lot of worry in the media about a year ago that the crime drop which started in the 1990s would be reversed, and I shared the worry. Here’s Matt Yglesias worrying last January:

I think this is worth worrying about. One thing we know about crime is that when wages and employment levels for low-skill workers are high, crime goes down. Another is that mass incarceration works – increase the number of beds in prison and the number of sentence-years handed out and the crime rate drops. But the first of these is the reverse of what happens in a recession, and the second we’ve already pushed well past the limit of cost-effectiveness (see here) and it’s inconceivable to me that you could actually push this far enough to compensate for the declining economy in the context of declining state budgets.

It’s easy to find national Uniform Crime Reports data back to 1960, along with unemployment rates. Quick correlations between the unemployment rate and each crime rate for 1960-2008 are below (a sketch of how one might compute them follows the list):

Violent Crime Aggregated 0.37
Murder 0.52
Rape 0.37
Robbery 0.53
Assault 0.24

Property Crime Aggregated 0.53
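Here is a minimal sketch of the kind of quick calculation involved, assuming the two series sit in local CSV files; the file names and column names below are hypothetical placeholders, not real datasets:

```python
import pandas as pd

# Hypothetical file and column names, standing in for the FBI UCR crime-rate
# series and the annual unemployment rate, 1960-2008.
crime = pd.read_csv("ucr_crime_rates_1960_2008.csv", index_col="year")
unemployment = pd.read_csv("unemployment_rate_1960_2008.csv", index_col="year")

merged = crime.join(unemployment["unemployment_rate"], how="inner")

# Pearson correlation of each crime-rate column with the unemployment rate
correlations = merged.drop(columns="unemployment_rate").corrwith(merged["unemployment_rate"])
print(correlations.round(2))
```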

Over this time period, then, one would have had a modest expectation for crime to rise with unemployment. But poking around the ICPSR I came across Eric Monkkonen’s data sets on homicide in New York City going back to the 19th century. Below are homicides per capita by year between 1900 and 2000. The second chart is log-transformed.


It seems that there’s another “Depression Paradox” here. The economic distress of the Great Depression seems to have been associated with less crime, while the economic exuberance of the 1920s led to more crime. So if I constrained the time series to 1920-1940 the correlations might be quite different.

All things being equal, the recent past is a better guide to the near future than the more distant past. But it’s important to remember that history does sometimes work in cycles, and the deeper past can occasionally give us insights which the recent past cannot. One could construct a tentative model whereby basal crime rates reflect cultural norms, and once norms and crime hit a particular “equilibrium” it may take a bit of a “shock” to shift out of the stable state.
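A minimal sketch of that “stable equilibrium plus shock” idea, using an arbitrary toy dynamic; none of the numbers are calibrated to anything:

```python
# Toy bistable dynamic: a crime index drifts toward whichever of two "norm"
# equilibria (low = 2, high = 8) it is currently closest to; the middle value
# (5) is an unstable tipping point. All values are arbitrary.
def drift(x: float) -> float:
    return -0.01 * (x - 2) * (x - 5) * (x - 8)

rate = 2.5                      # start near the low-crime equilibrium
for year in range(1, 101):
    rate += drift(rate)
    if year == 50:
        rate += 4.0             # a one-time "shock," e.g. a large social disruption
    if year % 25 == 0:
        print(f"year {year:3d}: crime index ~{rate:.2f}")

# Small perturbations decay back toward the old equilibrium, but a large enough
# shock pushes the system past the tipping point and it converges on the
# high-crime state instead.
```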

Louisiana should be the happiest and New York the least

Objective Confirmation of Subjective Measures of Human Well-Being: Evidence from the U.S.A.:

A huge research literature, across the behavioral and social sciences, uses information on individuals’ subjective well-being. These are responses to questions–asked by survey interviewers or medical personnel–such as “how happy do you feel on a scale from 1 to 4?” Yet there is little scientific evidence that such data are meaningful. This study examines a 2005-2008 Behavioral Risk Factor Surveillance System random sample of 1.3 million United States citizens. Life-satisfaction in each U.S. state is measured. Across America, people’s answers trace out the same pattern of quality of life as previously estimated, using solely nonsubjective data, in a literature from economics (so-called “compensating differentials” neoclassical theory due originally to Adam Smith). There is a state-by-state match (r = 0.6, P < 0.001) between subjective and objective well-being. This result has some potential to help to unify disciplines.

Basically they constructed an index of quality of life based on objective metrics. They then compared how those metrics related to surveyed subjective happiness, and came up with this 50 state scatterplot:
The correlation being ~0.6, about 36% of the variation in subjective assessments of life satisfaction can be predicted by variation in the presumed objective predictors of happiness. Note that underlying demographic variables are controlled for here. As I said above, Louisiana is ranked such that it should be the happiest state and New York the least. Here’s the full list of 50 states:
