Virginity & heritability

Genes may time loss of virginity:

As genetic determinism goes, the new findings are modest. Segal’s team found that genes explain a third of the differences in participants’ age at first intercourse – which was, on average, a little over 19 years old. By comparison, roughly 80% of variations in height across a population can be explained by genes alone.

On the other hand, conservative social mores might delay a teen’s first sexual experience, causing scientists to low-ball the effect of genes. Indeed, Segal’s team noticed a less pronounced genetic effect among twins born before 1948, compared with those who came of age in the 1960s or later.

As for the specific genes involved, another team previously found that a version of a gene encoding a receptor for the neurotransmitter dopamine is associated with age at first intercourse. Others have linked the same version of the gene – called DRD4 – to impulsive, risk-taking behaviour.

The paper is Age at first intercourse in twins reared apart: Genetic influence and life history events.

FuturePundit notes:

The team found a weaker effect from genes with people born before 1948. This supports an argument I’ve made here previously: the breakdown of old cultural constraints on behavior frees up people to follow genetically driven desires and impulses. We become more genetically driven as external constraints weaken.

When you reduce the strength of environmental parameters in the equation, heritable ones naturally become more salient. By the same logic, one can argue that in a perfect meritocracy there would be much stronger genetic sorting by class (via assortative mating, etc.). A minimal sketch of the variance-partition argument is below.
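
Here is a rough illustration, assuming a simple additive model in which heritability is the share of phenotypic variance attributable to genes; the numbers are purely illustrative and not from Segal’s paper.

```python
# Toy variance-partition sketch (illustrative numbers only).
# Under a simple additive model, heritability is h^2 = V_G / (V_G + V_E):
# shrinking the environmental variance V_E raises h^2 even if V_G is unchanged.

def heritability(v_g: float, v_e: float) -> float:
    """Narrow-sense heritability under an additive variance partition."""
    return v_g / (v_g + v_e)

v_g = 1.0  # genetic variance (arbitrary units)

print(heritability(v_g, v_e=2.0))  # ~0.33: strong external constraints on behavior
print(heritability(v_g, v_e=1.0))  # 0.50: weaker constraints, same genetic variance
```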

Related: DRD4 and virginity.

How big does the N need to be?

Estimating the number of unseen variants in the human genome:

…Consistent with previous descriptions, our results show that the African population is the most diverse in terms of the number of variants expected to exist, the Asian populations the least diverse, with the European population in-between. In addition, our results show a clear distinction between the Chinese and the Japanese populations, with the Japanese population being the less diverse. To find all common variants (frequency at least 1%) the number of individuals that need to be sequenced is small (∼350) and does not differ much among the different populations; our data show that, subject to sequence accuracy, the 1000 Genomes Project is likely to find most of these common variants and a high proportion of the rarer ones (frequency between 0.1 and 1%). The data reveal a rule of diminishing returns: a small number of individuals (∼150) is sufficient to identify 80% of variants with a frequency of at least 0.1%, while a much larger number (> 3,000 individuals) is necessary to find all of those variants. Finally, our results also show a much higher diversity in environmental response genes compared with the average genome, especially in African populations.

The details here matter for genetic architecture, especially for complex traits such as height & IQ.
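
To see the rule of diminishing returns in the abstract above, here is a toy binomial sampling model of my own (not the authors’ method): a variant at population frequency p is counted as found if at least one copy appears among the 2n chromosomes of n sequenced individuals.

```python
# Toy model of variant discovery (not the paper's estimator).
# A variant at frequency p is "found" if at least one of the 2n sampled
# chromosomes carries it: P(detect) = 1 - (1 - p)^(2n).

def detection_prob(p: float, n_individuals: int) -> float:
    return 1.0 - (1.0 - p) ** (2 * n_individuals)

for n in (150, 350, 3000):
    common = detection_prob(0.01, n)   # variant at 1% frequency
    rare = detection_prob(0.001, n)    # variant at 0.1% frequency
    print(f"n={n}: p=1% detected {common:.3f}, p=0.1% detected {rare:.3f}")

# Common variants are essentially guaranteed to show up in a few hundred
# individuals; variants right at the 0.1% threshold are mostly missed at
# n = 150 and only approach complete ascertainment in the thousands.
```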

Women are the genetic future

Dan MacArthur has a post up where he discusses 23andMe’s outreach to “mommy bloggers.” This makes economic sense for any firm in this field. There’s only so much money to be made from telling blue-eyed nerds that they carry the gene for blue eyes. To use a computer analogy, the way to get the Apple II of personal genomics would be to convince millions of pregnant women of the utility of your tools.

Evolving to become more miserable?

In A Farewell to Alms, Gregory Clark provides data on interest rates to show that Europeans gradually developed lower time preferences. In other words, they were more likely to delay gratification and plan for the future — paying back loans, for example. He also interprets data on wills as showing that most people of English descent today are the genetic legacy of the middle class, the poor and the aristocracy mostly having failed to reproduce themselves. That leaves us with a society where the average person maximizes their long-term material welfare much better than their counterparts would have in the Middle Ages or before. There appears to be something of a drawback, though: doing so makes you more miserable over the long term.

John Tierney recently reviewed a series of studies on how the intensity of guilt and regret changes over time. Read the most recent article for free here, which contains five related studies. The journal article and Tierney’s write-up are brief and straightforward, so I won’t belabor the details here. Basically, in the short term, indulgence-driven guilt stings more than prudence-driven regret, and this motivates us toward virtuous behavior, such as delaying material gratification. In the long term, though, guilt has faded away and regret over missing out on life’s pleasures weighs more heavily on our mind.

Oddly, then, maximizing long-term material well-being minimizes long-term hedonic well-being. If the big shift to low time preferences was as recent as Clark suggests — during the Modern and especially Industrial period — then perhaps our brain’s pleasure or reward system hasn’t had enough time to rewire itself to make us feel warm and fuzzy about having saved, abstained, and done the prudent thing in the past. Rather, since all other human groups before the big change, and certainly other primate groups, had very high time preferences, the reward system is probably designed to make us feel happy as we pore over a mental photo album that’s stuffed with memories of irresponsible fun and indulgence.

Hey, no one ever said that changing the world and getting shit done was going to be emotionally uplifting.

I’d like to see follow-up studies focus on individual differences in how strongly people are motivated by guilt vs. regret. Most personality questionnaires measure something called excitement seeking or novelty seeking, as well as impulsiveness. We might predict that impulsive and excitement-seeking people are more motivated by avoiding regret than avoiding guilt, which leads them toward indulging more in the present. You could re-do all five of the studies in the article above, but using personality traits as predictor variables. If different parts of the brain light up when we feel guilt vs. regret, you could see if impulsive and excitement-seeking people showed greater responses to regret-based scenarios than guilt-based scenarios. (E.g., they read a story about someone else feeling these emotions, they reflect on an episode from their own lives, they see pictures of the faces of others expressing these emotions, and so on.)

On an applied level, if you suffer from “hyperopia” — planning too much for your material future — you can push yourself to indulge merely by reflecting on how, in 20 years, you may regret missing out on having fun now. If you remind yourself that “You’ll regret it if you don’t,” then you won’t find yourself sighing later on about that more exciting trip you should have taken your son on, that year of working in a more fulfilling city for less pay, or that student who made a pass at you that you should have slept with.

Connections between Mendelian diseases and natural variation

I’ve written before about a pattern emerging from genome-wide association studies–genes in which mutations cause rare extreme forms of a phenotype often harbor common variation that influences natural, non-disease variation in that same phenotype. A pair of new studies on variation in cardiac repolarization (summarized here) provides an additional example of this pattern.

It’s worth noting that this was something of an obvious hypothesis–candidate gene association studies often targeted genes known to cause Mendelian disorders when mutated. In retrospect, the reason these studies were often inconclusive was a simple lack of power.
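
To put a rough number on “lack of power”, here is a back-of-the-envelope sketch with hypothetical effect sizes of my own choosing: under a normal approximation, the power of a two-sided test at level alpha for a standardized effect d with n samples is roughly Phi(sqrt(n)·d − z_{1−alpha/2}).

```python
# Back-of-the-envelope power calculation (illustrative effect sizes, not
# taken from the studies discussed). Normal approximation: power of a
# two-sided level-alpha test ~ Phi(sqrt(n) * d - z_{1 - alpha/2}).
from math import sqrt
from statistics import NormalDist

def approx_power(d: float, n: int, alpha: float = 0.05) -> float:
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(sqrt(n) * d - z)

# Common variants of small effect need samples in the thousands; a few
# hundred subjects, typical of early candidate-gene studies, is not enough.
print(round(approx_power(d=0.05, n=300), 2))   # ~0.14
print(round(approx_power(d=0.05, n=5000), 2))  # ~0.94
```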


Facebook is not a revolution

A follow-up to my earlier post on information technology, In The Age Of Facebook, Researcher Plumbs Shifting Online Relationships:

“You can ask somebody, ‘Of your 300 Facebook friends how many are actually friends?’ and people will say, ‘Oh, 30 or 40 or 50,’” said Baym. “But what having a lot of weak-tie relationships is giving you access to are a lot of resources that you wouldn’t otherwise have. Because we do tend to cluster in relationships with strong ties to people that are pretty similar to ourselves. So they don’t necessarily know a whole lot that we don’t know. They haven’t necessarily been a lot of places that we haven’t been. They can’t volunteer to show us around Sydney, Australia, or give advice on a good reading on a topic. So there are all of these little bits of information and wisdom and social support that people can provide each other when they have a weak-tie relationship — and they can really open up access to resources that we wouldn’t have otherwise.”

The 30-50 number should be familiar, as it is in the same range as what ethologists such as Robin Dunbar have been reporting for years in terms of how many friendships a human can plausibly manage. Social technology has limits in terms of how much it can leverage our innate capabilities. On the other hand, it seems plausible that the “long tail” of weak acquaintances can yield some utility by leaking more information into one’s social network from the outside. Quantitative shifts in network structure and scope on the margins may very well lead to qualitative changes in human societies, but I don’t think we’ve really thought through the substantive ramifications in much detail.

More skepticism of natural selection

In the wake of last week’s paper, it looks like another one is coming down the pipe, Hundreds of Natural-Selection Studies Could be Wrong, Study Demonstrates:

“These statistical methods have led many scientists to believe that natural selection acted on many more genes in humans than it did in chimpanzees, and they conclude that this is the reason why humans have developed large brains and other morphological differences,” said Nei. “But I believe that these scientists are wrong. The number of genes that have undergone selection should be nearly the same in humans and chimps. The differences that make us human are more likely due to mutations that were favorable to us in the particular environment into which we moved, and these mutations then accumulated through time.”
Nei said that to obtain a more realistic picture of natural selection, biologists should pair experimental data with their statistical data whenever possible. Scientists usually do not use experimental data because such experiments can be difficult to conduct and because they are very time-consuming.

Good luck getting experimental data on humans! In any case, the paper will be out in PNAS later this week; it doesn’t look like it’s on the website yet.

Paternity rates by population

A few years ago a paper came out, How Well Does Paternity Confidence Match Actual Paternity?:

Evolutionary theory predicts that males will provide less parental investment for putative offspring who are unlikely to be their actual offspring. Cross-culturally, paternity confidence (a man’s assessment of the likelihood that he is the father of a putative child) is positively associated with men’s involvement with children and with investment or inheritance from paternal kin. A survey of 67 studies reporting nonpaternity suggests that for men with high paternity confidence rates of nonpaternity are (excluding studies of unknown methodology) typically 1.9%, substantially less than the typical rates of 10% or higher cited by many researchers. Further cross-cultural investigation of the relationship between paternity and paternity confidence is warranted.

I’ve referred to this paper before, but I thought it might be useful to post the rates for various populations. See the original paper for sources & discussion. Note that the second set of results, from paternity-testing laboratories, is obviously subject to selection bias.


One gene controlling ant behavior?

Single Gene Shapes the Toil of Ants’ Fighter and Forager Castes:

Researchers studying the social behavior of ants have found that a single gene underlies both the aggressive behavior of the ant colony’s soldiers and the food gathering behavior of its foraging caste.
The gene is active in soldier ants, particularly in five neurons in the front of their brain, where it generates large amounts of its product, a protein known as PKG. The exact amount of the protein in the ants’ brains is critical to their behavior.

The article goes on to talk about correlates of PKG variation in humans….