Speech and motor feedback
I am really enjoying Buzsaki’s book. The chapter I’m in now is about perturbations of the intrinsic network activity of the nervous system. He examines some of the earliest-developing examples of intrinsic neural activity: the self-generated activity that helps organize the retina-to-visual-cortex mapping, and the twitches and kicks of the developing fetus. He is building the point that a nervous system that cannot generate motor output cannot really perceive the world in a meaningful way.
“Only through movement can distances be measured and incorporated into our sensory scheme. For an immobile observer, direction, distance, and location of sensory information are incomprehensible and meaningless concepts.”
At another level of analysis, he notes that a baby’s babbles are self-organized output that is then perturbed by parents, who react to babbles that chance upon fragments of their own words. Perturbation can reshape the weights of the network and organize them in terms of experience.
It made me wonder about the FOXP2 story. There has been some debate about where exactly the deficit lies in the KE family, the family with a heritable language disorder. Some have suggested that the main effect of FOXP2 loss is to reduce the fine motor control of the face and jaw (and tongue?) that would allow for speech production. I wonder if this motor loss might propagate backwards to a more general loss of language function. Buzsaki covers some studies in which developing rats are immobilized in one way or another. The inability to produce motor output prevents the formation of coherent sensory maps, presumably because the spontaneous motor programs activate sets of muscles that cause corresponding sets of sensory receptors to be co-activated. At a coarse level, if you move your arm, it might run into something, and now the parts of your arm that are near each other receive coherent, correlated sensory input. Cells that fire together wire together, and a somatotopic sensory map is born.
So I think you can see the analogy I am imagining. Since the fine motor control output in KE family members is dysfunctional, the fine sensory perceptions that motor control should map to are also disturbed. I suppose this is one way that a developmental defect that leads to a specific motor problem could affect perception of language as well. Thinking this way would probably still require some sort of magic trick where the sensory patterning fails fairly deep in the hierarchy of language perception processing.
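To make the map-formation idea concrete for myself, here is a toy sketch of it in code (my own illustration, not anything from Buzsaki’s book): spontaneous “twitches” co-activate neighboring skin receptors, and a plain Hebbian rule with a winner-take-all readout lets downstream cells settle on contiguous patches of receptors. The patch size, learning rate, and winner-take-all step are all assumptions I made up for the sketch.

```python
# A minimal sketch of "cells that fire together, wire together":
# spontaneous twitches co-activate neighboring skin receptors, and a simple
# Hebbian rule lets downstream "map" cells come to prefer contiguous patches.
# All parameters here are illustrative assumptions, not anything from the book.

import numpy as np

rng = np.random.default_rng(0)

n_receptors = 20      # skin receptors laid out along a line (e.g., an arm)
n_map_cells = 5       # downstream map cells
n_twitches = 2000     # spontaneous motor events
lr = 0.05             # Hebbian learning rate

# Random initial feedforward weights, receptors -> map cells
W = rng.random((n_map_cells, n_receptors))

for _ in range(n_twitches):
    # A twitch bumps a random patch of 3 adjacent receptors into something,
    # so neighboring receptors are co-activated.
    center = rng.integers(1, n_receptors - 1)
    x = np.zeros(n_receptors)
    x[center - 1:center + 2] = 1.0

    # Winner-take-all response of the map cells
    y = W @ x
    winner = np.argmax(y)

    # Hebbian update for the winner, normalized to keep weights bounded
    W[winner] += lr * x
    W[winner] /= np.linalg.norm(W[winner])

# After training, each map cell's strongest weights tend to cluster on a
# contiguous stretch of receptors: a crude somatotopic map built from
# self-generated movement.
for i, w in enumerate(W):
    print(f"map cell {i}: prefers receptors {np.argsort(w)[-3:]}")
```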
I wanted to make sure I was right about the motor dispute, so I pulled out the latest review. No discussion of intrinsic oscillations and language development, but in case y’all want an update, here are some key areas researchers are focusing on:
1) Finding other natural mutations in the FOXP2 gene that lead to a phenotype similar to that of the original KE family.
2) Using chromatin immunoprecipitation on chip (ChIP-chip; I dread the day they come up with a version of this specifically for our primate cousins) to discover molecular targets downstream of FOXP2. FOXP2 is a transcription factor, which means it binds to DNA. ChIP-chip allows you to lock FOXP2 onto whatever DNA it is associated with at the moment, pull those specific pieces of DNA out of solution, and identify which pieces you pulled out by seeing whether they stick to DNA of known sequence spotted on a microarray (see the sketch after this list).
3) Pursuing the observation that FOXP2 is actively regulated (the cells make more or less of the protein) during different types of vocal behavior in birds.
4) Pursuing the recent discovery that mice make ultrasonic vocalizations. If we can draw clear analogies between these vocalizations and birdsong or language, then we can bring to bear the full arsenal of transgenic manipulations available in the mouse model.
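Since item 2 is the one I had to look up, here is its readout step in toy form: for each probe of known sequence on the array, you compare the signal from the FOXP2-pulled-down (IP) sample against plain input DNA and keep the probes that are strongly enriched. The probe names, intensities, and two-fold cutoff below are invented for illustration; real ChIP-chip analysis uses proper statistics, but the logic is the same.

```python
# Toy sketch of the ChIP-chip readout: compare the immunoprecipitated (IP)
# signal to the input-DNA signal for each known probe on the array, and call
# probes with a big IP/input ratio as candidate FOXP2-bound regions.
# All probe names, signal values, and the cutoff below are invented.

import math

# (probe_id, ip_signal, input_signal) -- hypothetical microarray intensities
probes = [
    ("promoter_A", 820.0, 150.0),
    ("promoter_B", 310.0, 290.0),
    ("intergenic_C", 95.0, 210.0),
    ("promoter_D", 640.0, 120.0),
]

LOG2_RATIO_CUTOFF = 1.0  # i.e., at least 2-fold enrichment in the IP sample

enriched = []
for probe_id, ip, inp in probes:
    log2_ratio = math.log2(ip / inp)
    if log2_ratio >= LOG2_RATIO_CUTOFF:
        enriched.append((probe_id, round(log2_ratio, 2)))

# These are the DNA regions you'd report as candidate FOXP2 targets.
print(enriched)  # -> [('promoter_A', 2.45), ('promoter_D', 2.42)]
```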
Maybe this review was biased towards a certain approach, but I’m not seeing much here in terms of further characterizing the developmental phenotype associated with FOXP2 mutations. Is it not ethical or something? Did the KE family stop breeding? I think it might be worth taking a look at their very early EEGs. Finding a network-level signature for the disorder could provide an intermediate-level phenotype and obviate the need to justify analogies between birdsong, mouse vocalizations, and human speech.

IIRC, the affected family members have impairments in both the cognitive and motor parts of language, but I don’t think the motor deficit is causing the cognitive deficit. You could get around orofacial motor impairment by using sign language. After some googling, I couldn’t tell whether or not people have tried to teach sign language to the affected KE family members, but that seems like an obvious solution. If it had worked, we would’ve heard about it by now, so probably there’s more to the story than motor difficulties.
The paper about ultrasonic “songs” in mice made quite a splash, but like many things in science, it was not really that new.
It has been known for 30+ years that rats make ultrasonic vocalizations, that their variation & quality are correlated with reproductive fitness, and that their production varies depending on the hormonal state of potential mates. This is essentially what the mouse paper was all about, yet the basic phenomenon had been in print for quite a long time.
That’s not to say mice (& rats) have nothing to tell us about language. I think they do, particularly about the role of the preoptic, hypothalamic, and midbrain centers involved in the production of emotional utterances. These areas are involved in rodent vocalizations, and may be in humans too, particularly for non-verbal output such as laughter, crying, moaning, etc.
As for Buzsaki’s remarks about motor output being important in sensory map development, I think he is absolutely right, and I don’t think it stops with the more well-known topographic maps seen in cortex and other laminar structures. I am willing to bet that emotional expressions also serve to wire up the networks involved in the recognition of social cues. There may be maps that lie between, say, the olfactory cortex and the preoptic/hypothalamic/midbrain output nuclei, but they don’t have an intelligible topology, so far as we presently know.