
Tuesday, July 10, 2007

Why are blondes more attractive than brunettes?

As a young brown-eyed, brown-haired girl growing up in Orange County, CA, I found this "stereotype" repeatedly, bewilderingly, validated. Although I defended myself with Van Morrison and a sizeable artillery of blonde-jokes, behind my façade of self-assurance I continued to wonder: why are blonde hair and blue eyes "prettier"? Now, as a slightly more mature brunette with a more comprehensive understanding of natural selection, I still find the question intriguing. Why did Europeans evolve to prefer blonde hair and blue eyes? What do these features indicate about health and fertility?

Psychology Today has an excerpt from the book Why Beautiful People Have More Daughters, by Alan S. Miller and Satoshi Kanazawa, which explores "Ten Politically Incorrect Truths About Human Nature," including the mystery of the "blonde bombshell":
Long before TV—in 15th- and 16th- century Italy, and possibly two millennia ago—women were dying their hair blond. Women's desire to look like Barbie—young with small waist, large breasts, long blond hair, and blue eyes—is a direct, realistic, and sensible response to the desire of men to mate with women who look like her. There is evolutionary logic behind each of these features.

Blond hair is unique in that it changes dramatically with age. Typically, young girls with light blond hair become women with brown hair. Thus, men who prefer to mate with blond women are unconsciously attempting to mate with younger (and hence, on average, healthier and more fecund) women. It is no coincidence that blond hair evolved in Scandinavia and northern Europe, probably as an alternative means for women to advertise their youth, as their bodies were concealed under heavy clothing.

Women with blue eyes should not be any different from those with green or brown eyes. Yet preference for blue eyes seems both universal and undeniable—in males as well as females. One explanation is that the human pupil dilates when an individual is exposed to something that she likes. For instance, the pupils of women and infants (but not men) spontaneously dilate when they see babies. Pupil dilation is an honest indicator of interest and attraction. And the size of the pupil is easiest to determine in blue eyes. Blue-eyed people are considered attractive as potential mates because it is easiest to determine whether they are interested in us or not.

The irony is that none of the above is true any longer. Through face-lifts, wigs, liposuction, surgical breast augmentation, hair dye, and color contact lenses, any woman, regardless of age, can have many of the key features that define ideal female beauty. And men fall for them. Men can cognitively understand that many blond women with firm, large breasts are not actually 15 years old, but they still find them attractive because their evolved psychological mechanisms are fooled by modern inventions that did not exist in the ancestral environment.
The article also explains why men prefer women with small waists and large breasts (both are correlated with levels of estrogen and progesterone, indicating greater fecundity), why beautiful people have more daughters (physical attractiveness is more important for girls than boys, and parents can bias the sex ratio depending on the traits they can offer), why men sexually harass women (it's about respect), and why most suicide bombers are Muslim (the 72 virgins waiting patiently in heaven aren't trivial). Some of the hypotheses are a little dubious to me, but it's an interesting read nonetheless.

Link to the article.

Friday, May 4, 2007

The evolution of the central nervous system

The first animals to evolve on Earth did not have nervous systems. They spent their days and nights sitting on the sea floor, passively filtering water for food particles, a lifestyle that continues in their extant descendants: the sponges. Their lifestyle does not require movement or the ability to sense and respond to the environment, making a nervous system a useless ornamentation.

Clearly, movement and sensation can be beneficial, and the next group of animals to evolve performs these deeds, and quite beautifully. This lineage gave rise to the jellyfish and comb jellies of today, which control their movements using a network of neurons distributed diffusely throughout their bodies (called a "nerve net"). They do not have brains; they do not even have clusters of neurons or major nerve trunks. Thus, although they can sense touch and detect chemicals, they cannot decipher where a given stimulus is. For this reason, these animals respond to stimuli with a reflexive movement of the entire body, regardless of where on the body the stimulus was detected.

Then about 590 million years ago, give or take a large margin of error, the Earth witnessed the dawn of bilateral animals; that is, animals exhibiting one, and only one, plane of symmetry, endowing them with not only a top and a bottom, but a left and a right, and a front and a back (as opposed to jellyfish, which have only a top and a bottom). "Bilateria," as this group is called, includes most extant animal species, from leeches and butterflies to sea squirts and kittens.

One of the consequences of bilateralism is cephalization, which is the evolutionary trend toward concentrating sensory structures and nervous tissue at the anterior (front) end, which is whichever end tends to lead during movement. This cluster of nervous tissue (e.g., the brain, or "ganglia" in most other species) needs to connect to the rest of the body in order to control movement and collect additional sensory information, thus necessitating some sort of nerve trunk (e.g. the vertebrate spinal cord). The net result of this concentration of nervous tissue in discrete parts of the body is a centralized nervous system (CNS).

How and when did the centralization of the nervous system occur? We know that jellyfish and comb jellies do not have centralized nervous systems, but extant Bilateria do (I'll get to the exception, hemichordates, later on). To appreciate the complexity of reconstructing this evolutionary history, it's important to understand a little about how the Bilateria are classified. Recent phylogenetic studies have grouped Bilateria into three main branches: the deuterostomes, which consist primarily of chordates (e.g. jawless fish, bony fish, amphibians, birds, mammals) and echinoderms (e.g. sea stars, sand dollars, sea cucumbers), and two branches of protostomes, Ecdysozoa (insects and roundworms such as nematodes) and Lophotrochozoa (molluscs and annelid worms such as earthworms and leeches). At some point in history, these three bilateral branches shared a common ancestor, and the name of this hypothetical beast is Urbilateria (the slide on the right is from a presentation I gave on this paper).

Urbilateria represents a crucial position in our evolutionary history, marking not only the origin of Bilateria, but also the last common ancestor of the deuterostomes and protostomes. Thus when trying to understand the evolution of Bilateria-specific traits, such as a CNS, it is of critical importance to paint a clear and comprehensive picture of Urbilateria.

In order to reconstruct ancestral forms such as Urbilateria, whose fossil remains are unidentified or unknown, biologists draw inferences from comparative analysis of different extant lineages. For example, a characteristic present in fruit flies, nematodes, and mice was most likely present in their last common ancestor (Urbilateria). Conversely, a characteristic present in one lineage (e.g. mice) but not the others (fruit flies and nematodes) was most likely an evolutionary innovation of the former (mouse) lineage. One crucial point is that the characteristics must bear some sort of similarity between lineages, or a common evolutionary origin is unlikely. For example, our eyes are markedly different from the compound eyes of fruit flies, so our eyes must have evolved independently from those of the fruit fly, indicating that Urbilateria did not have eyes (although it may have had some sort of light sensor/photodetector).
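To make this comparative logic concrete, here is a minimal sketch in Python; the trait list and the crude "shared across all three branches means ancestral" rule are illustrative assumptions, not an actual phylogenetic method.

```python
# A toy illustration of parsimony reasoning over the three bilaterian branches.
# Trait data are invented for illustration; real analyses use many characters
# and explicit tree models.

LINEAGES = {"deuterostomes", "ecdysozoans", "lophotrochozoans"}

traits = {
    # trait: branches in which a comparable structure is found
    "centralized nerve cord": {"deuterostomes", "ecdysozoans", "lophotrochozoans"},
    "camera-type eye": {"deuterostomes"},  # e.g. the vertebrate eye
}

def inferred_in_urbilateria(present_in: set) -> str:
    """Crude parsimony call: shared across all three branches -> likely ancestral;
    confined to one branch -> likely a lineage-specific innovation."""
    if present_in == LINEAGES:
        return "likely present in Urbilateria"
    if len(present_in) == 1:
        return f"likely an innovation of the {next(iter(present_in))}"
    return "ambiguous without more lineages or an outgroup"

for trait, present_in in traits.items():
    print(f"{trait}: {inferred_in_urbilateria(present_in)}")
```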

What about the nervous system of Urbilateria? How was it organized (if at all), and how complex was it? Which characteristics of the nervous systems of today's animals are novelties of their personal evolutionary lines, and which have been retained these hundreds of millions of years? The extant species of Bilateria have CNSs, but there is one key difference: in deuterostomes, the central nerve cord is on the dorsal (back) side, while in both branches of protostomes, the nerve cord is on the ventral (belly) side. This difference has made the reconstruction of Urbilateria's nervous system quite difficult and controversial, but there are two main hypotheses.

The first is called the inversion hypothesis, and was proposed by the zoologist Anton Dohrn in 1875. He theorized that Urbilateria was a worm-like creature with a ventral nerve cord, and that the protostomes retained this orientation, whereas an ancestor of the deuterostomes turned itself upside down and gave rise to animals with dorsal nerve cords. The other hypothesis is known as the gastroneuralia-notoneuralia hypothesis, and predicts that Urbilateria had a diffuse nervous system. After branching off from one another, the deuterostomes and protostomes independently evolved CNSs on opposite sides of the body.

For over a century, these hypotheses have been reasserted and rejected time and time again, as people find striking similarities (supporting a common origin) and crucial differences (supporting an independent origin) between the development of the CNS in vertebrates and fruit flies. So what is the significance of these similarities and differences? What does it all mean? Here we get to the main rub of evolutionary biology: in essence, it is a question of probability, and is exasperatingly subjective. The overall probability of an independent versus common origin has to be estimated, and that estimation is based on putative homologies between different branches. Thus, one must determine the degree of similarity at which two structures may be deemed the result of evolutionary conservation.

Not unexpectedly, when one is tracing hundreds of millions of years of evolutionary history, there are many question marks. One concept that is important to grasp is that those hundreds of millions of years have not treated all lineages equally; some animals have evolved more than others. Across Bilateria, the rates of gene loss and gene alteration are remarkably asymmetrical. By comparing the genomes of Bilateria with genomes of animals that diverged before Bilateria (e.g. jellyfish), one can determine how much a Bilaterian animal has evolved from the last common ancestor. For example, a gene present in humans and jellyfish but not fruit flies must have been present in Urbilateria and lost in the lineage that gave rise to fruit flies. By doing this sort of analysis among man, fruit flies, and nematodes, geneticists discovered that the genome of man had changed the least of the three. Both fruit flies and nematodes, then, underwent periods of rapid molecular evolution and extensive gene loss, meaning that we retain many genes that were lost along their lineages.
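Extending the sketch above, an outgroup that diverged before Bilateria (a jellyfish, say) is what lets you tell a loss in one lineage from a gain in another. The function below is a cartoon of that reasoning with a single outgroup; real comparative genomics uses many genomes and explicit models.

```python
# Cartoon of outgroup logic for presence/absence of a gene.
# A gene found in humans and jellyfish but not flies was most likely present
# in Urbilateria and lost along the fly lineage.

def classify_gene(in_human: bool, in_fly: bool, in_outgroup: bool) -> str:
    if in_human and not in_fly:
        return ("present in Urbilateria, lost in the fly lineage"
                if in_outgroup
                else "possibly a gain along the lineage leading to humans")
    if in_fly and not in_human:
        return ("present in Urbilateria, lost in the human lineage"
                if in_outgroup
                else "possibly a gain along the lineage leading to flies")
    return "shared (or absent) in both; uninformative about differential loss"

print(classify_gene(in_human=True, in_fly=False, in_outgroup=True))
# -> present in Urbilateria, lost in the fly lineage
```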

For those familiar with research in developmental biology, something a bit disturbing is now evident. The model organisms used in this field are fruit flies, nematodes, mice, zebrafish, African clawed frogs, and sea urchins. The first two are members of Ecdysozoa, while the rest are deuterostomes, leaving the entire branch of Lophotrochozoa unexplored. Given the rapid rate of evolution of the Ecdysozoa, it's clear that piecing together evolution by merely comparing vertebrates to insects is severely limited. In fact, comparisons with pre-Bilaterian animals have demonstrated that the Lophotrochozoans retain more ancestral features than either the Ecdysozoans or the deuterostomes, and show more gene conservation with vertebrates than any Ecdysozoan does.

In the most recent issue of Cell, Alexandru Denes of Detlev Arendt's lab explored the CNS of a member of this hitherto ignored branch, Platynereis dumerilii, an adorable marine ragworm that has deviated little from the Urbilaterians, prompting Dr. Arendt to call it a "living Urbilateria." Platynereis lives in the same environment that Urbilateria would have (shallow marine waters), has a "prototypical" invertebrate nerve cord (arranged like a rope ladder instead of a hollow tube like ours), and undergoes an "ancient" type of cell division in its earliest stages as a blastula (spiral cleavage).

The development of all animals with any sort of tissue specification (i.e. anything but sponges) is controlled by genes that specify what a given cell in the embryo will become (e.g. skin, muscle, nerve, etc). The developing nervous system then undergoes further specification, in which "neural specification genes" divide the primordial CNS into different regions; the progeny of one region will control muscle movement, the progeny of another will convey sensory information, etc. The group mapped these genes using a method called in situ hybridization, and found that between vertebrates and Platynereis, the expression domains of eight different genes have largely identical spatial relations to each other (the picture on the left is the authors' cartoon version of the expression patterns; the bottom represents the midline of the CNS, and moving up corresponds to moving laterally, toward the edge of the CNS). Moreover, similar neuron types emerge from the corresponding domains, and send axons to similar destinations! The CNS of Platynereis is actually more similar to the vertebrate CNS than the latter is to the fruit fly's, indicating that the CNSs of both vertebrates and Platynereis are likely more similar to the ancestral form than that of the fly is.
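To make "largely identical spatial relations" concrete, here is a toy comparison of the midline-to-lateral order of expression domains. The gene names are placeholders rather than the paper's actual set, and the "fly" ordering is a hypothetical rearrangement included only to show what a mismatch would look like.

```python
# Toy comparison of mediolateral gene-domain order between animals.
# A conserved *order* of expression domains from the midline outward is the
# kind of evidence that suggests homology of the corresponding CNS regions.

vertebrate_order = ["geneA", "geneB", "geneC", "geneD"]   # midline -> lateral
platynereis_order = ["geneA", "geneB", "geneC", "geneD"]  # midline -> lateral
fly_order = ["geneA", "geneC", "geneB", "geneD"]          # hypothetical mismatch

def same_spatial_relations(order1, order2):
    """True if both animals express their shared genes in the same midline-to-lateral order."""
    shared1 = [g for g in order1 if g in order2]
    shared2 = [g for g in order2 if g in order1]
    return shared1 == shared2

print(same_spatial_relations(vertebrate_order, platynereis_order))  # True
print(same_spatial_relations(vertebrate_order, fly_order))          # False
```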

The striking similarities between these two animals, separated by hundreds of millions of years of evolution, imply a common evolutionary origin from an equally complex ancestral pattern. In other words, Urbilateria must have had these same sets of genes, in the same spatial orientation, patterning its nervous system, which must have likewise been centralized. Of course, this is still a matter of probability, but it seems highly unlikely that such a complex arrangement of genes could have been recruited independently to specify evolutionarily unrelated cell populations.

The question now is how this inversion happened. Flipping over is not trivial; there are many differences between up and down, and this potential burden is one of the major impediments to acceptance of the inversion hypothesis. "Down" is the sea floor, which combined with gravity becomes the source of friction; "Up" is the source of falling objects and other dangers, as well as sunlight. An inversion of the body axis would have thus involved a significant change in lifestyle, which is difficult to imagine, but not unprecedented.

Many animals swim "upside down," such as brine shrimp and upside-down catfish, both of which are more efficient feeders in this orientation. Perhaps in a few hundred million years, natural selection will reshape the other anatomical details to fit these new lifestyles; dorsal will become ventral, and zoologists will classify them as having their nerve cords on the opposite side from their closest relatives. To some extent, this process has already begun with the catfish. Most fish have light underbellies and dark backs, to aid in camouflage. The upside-down catfish, on the other hand, actually has a dark "underbelly" and a light "back," to help it camouflage in its preferred orientation. Thus, for the catfish, swimming upside down began as a behavioral change, which then opened the way for genetic changes (i.e. the reversed coloration, and a propensity, and eventually an instinct, for swimming on its back).

It is not unreasonable, then, that the evolutionary trend towards a dorsalized nerve cord began as a change in a behavioral trait, and was later followed by genetic evolution. A similar sequence of events was likely involved in the animal invasion of land and the evolution of flight.

Saturday, April 7, 2007

Color me fantastic

For some background on vision, particularly color vision, read my previous post.

Millions of years ago, before the dinosaurs roamed, the remote ancestors of mammals probably had magnificent color vision. When the dinosaurs came to ecological power, these ancestors (mammal-like reptiles) were banished from the daylight hours and became creatures of the night. During this period, their visual systems evolved to be maximally sensitive to light, allowing their proficiency at color discrimination to deteriorate in exchange. Although many mammals have returned to reclaim their diurnal dominion, their color vision still pales (get it?) in comparison to that of other vertebrates, such as reptiles, fish, and birds, whose ancestors did not suffer this period of exile.

Most mammals (some primates, including humans, are exceptions) rely on a dichromatic visual system, with cones that respond optimally to either short (S) or medium (M) wavelengths. (For more on this, see the primer.) Our remote ancestors, along with the aforementioned other vertebrates, had three, four, or possibly more types of cones. A broader range of cone types dramatically enhances the ability to distinguish colors, but this capacity was unnecessary during our ancestors' tenure as nocturnal animals and was subsequently lost.

Our more proximal ancestors, apes and Old World (African) monkeys, regained a third cone, the long (L) wavelength-responsive cone, an adaptation that was instrumental to our evolutionary success. This cone allowed primates to distinguish red from green, which would have been advantageous for purposes of foraging. There are two major questions that arise from this trichromatic revival: How did it happen? and How did our higher visual systems adapt to interpret/perceive this new dimension of sensory experience?

The first is a question of genetics, which I'll answer briefly before delving into the neuroscience. Outside of the lab, new genes generally aren't added to the genome de novo; they arise from duplications of extant genes. Over time (on an evolutionary timescale), mutations change one or both copies of the gene such that they end up coding for different proteins. The genes that code for the M and L cones are very similar, and are on the same chromosome; it is thus likely that when Old World monkeys regained trichromacy, the gene for the M cone duplicated and mutated to form a gene that coded for the L cone (the divergence of color sensitivities would have been favored by natural selection). It is further believed that the M cone originated from a duplication/mutation/divergence combo from the S cone (which is on a different chromosome).

So that's how it happened on a genetic/peripheral level, but what about the mind? What good is this increased diversity in the retina if the brain is only capable of processing a dichromatic visual world? How much time had to pass before we could not only see this new dimension of light, but interpret it as well?

A very exciting paper came out recently in Science, in which a group led by Gerald Jacobs at UCSB (the king of color vision, I'm told) used genetically engineered mice to model this crucial adaptation to trichromacy. Mice are, like most mammals and our ancestors, dichromatic, with only S and M cones. The group furnished the mice with the gene for the human L cone, resulting in mice with all the retinal components of trichromatic vision. They then sought to determine whether the mouse nervous system would be able to capitalize on this new information, thus endowing the mouse with an enhanced sensory experience. Alternatively, the visual system may be genetically wired such that it could only handle inputs from the two existing classes of cones, implying that the homologous primate adaptation would have required generations for adaptive rewiring.

To address this question, the group wanted to see if the "trichromatic" mice were able to distinguish between two colors that their dichromatic siblings (and ancestors) would have considered the same. To do this, they put the mice in front of three colored panels, on which two different wavelengths or intensities were displayed (the mice should be able to distinguish, for example, a dim green light from a bright red light...for background, see the primer). The mice were trained to identify which of the three panels was illuminated with a different color than the other two; a correct choice was rewarded with drops of soymilk. After about 17,000 tests (literally), most of the mice with the S, M, and human L cones were able to learn the task, and chose correctly 80% of the time! (Mice with only the S and M cones still performed no better than chance after a similar number of trials.)
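For a sense of how far 80% correct is from guessing on a three-choice task (chance is one in three), here is a quick back-of-the-envelope check; the block of 300 test trials is my own illustrative number, not a figure from the paper.

```python
# Back-of-the-envelope: is 80% correct plausibly just luck when chance is 1/3?
# The 300-trial block size is an illustrative assumption.
import math

p_chance, p_observed, n = 1 / 3, 0.80, 300
se = math.sqrt(p_chance * (1 - p_chance) / n)  # standard error under pure guessing
z = (p_observed - p_chance) / se               # normal approximation to the binomial
print(f"z = {z:.1f}")  # roughly 17 standard errors above chance -- not luck
```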

This demonstrates, quite wonderfully, that "the mammalian brain is sufficiently plastic that it can extract and compare a new dimension of sensory input." It follows, they speculate, that the primate that first inherited an L cone would have immediately reaped the benefits, and would likely have enjoyed a selective advantage.

I think this is quite a marvelous find, and the paper made me happy and excited, but I am also not entirely surprised by the result. It is merely a reminder of the essence of evolution, in that it tends to be guided by everyday biological mechanisms; things that seem special or mysterious in our natural history and our living world are made possible by these same mechanisms. I've previously discussed this idea that the parsimony of evolution would allow novel adaptations in the periphery to exploit previously evolved central circuitry. For this study, I think it's helpful to supplement this idea with a brief discussion of the striking plasticity of the developing nervous system.

When an animal is developing, the pathways of the visual system are highly sensitive to certain visual stimuli, and grow and refine under the guidance of the animal's experience. Some of you may have heard of experiments in which kittens were raised in boxes in which their only visual stimuli were black and white vertical stripes. Their visual pathways developed such that neurons in the visual processing areas responded primarily to vertical lines, and failed to respond to horizontal lines when presented later in life. In a sense, their brains were not able to see horizontal planes.

It seems to follow that if an animal is endowed with a "new" element of visual stimulation, rather than deprived of one, its developing visual pathways would accommodate this richer visual world by wiring up accordingly. As the brain of a dichromat was already capable of comparing information from two different classes of cones to compute a color representation, it may not have been too demanding a request to factor a third into the equation. Importantly, this adaptation would take place during the development of the first animal that possessed the new cone, as opposed to over an evolutionary timescale.
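One way to picture why a third cone might be computationally incremental rather than revolutionary: the same kind of difference ("opponent") comparison a dichromatic brain already performs can simply be applied to one more pair of cone signals. The sketch below is a cartoon of that idea, not a model of real retinal circuitry; the cone values and pairings are invented.

```python
# Cartoon of opponent comparisons (not real retinal wiring). A dichromat
# compares two cone signals; a trichromat adds one more comparison of the
# same form, which opens up a second chromatic dimension (red vs. green).

def opponent(a: float, b: float) -> float:
    """Signed difference between two cone signals -- the basic chromatic comparison."""
    return a - b

s, m, l = 0.2, 0.7, 0.9  # made-up cone "catches" for one patch of light

dichromat_channels = [opponent(s, m)]                   # one chromatic dimension
trichromat_channels = [opponent(s, m), opponent(l, m)]  # two chromatic dimensions

print(dichromat_channels)
print(trichromat_channels)
```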

This brings up an intriguing thought: what would happen if a mutation arose that endowed humans with a fourth cone? Our resulting percept of the world is a bit difficult to fathom...would we be able to distinguish light mixtures that normal people cannot? What would these colors look like? What if this new cone was able to detect light outside our visible spectrum--ultraviolet light, like bees and some birds, or infrared light, like rattlesnakes?

Friday, March 9, 2007

These spinal networks were made for walking

The invasion of the land by animals was an astonishing evolutionary feat, necessitating a number of substantial changes to the body: limbs with digits, structures for obtaining oxygen from the air, a relatively waterproof covering to prevent dehydration, and sturdy structures to support the body in a medium much less buoyant than water, to name a few. When these pilgrims first bridged the immense gulf between land and water, almost every system in the vertebrate body underwent substantial modifications, but what about the nervous system?

The different optical and acoustic properties of water and air required significant adaptations of our visual and auditory systems, but these adaptations were largely peripheral: the lens changed shape to accommodate the different refractive indices, and the bones of the middle ear evolved from bones of the jaw. Perhaps the most significant behavioral modification (which would thus require notable rewiring of the neural circuitry) was the transition from swimming to walking. Did animals need to invent completely new pathways to support a wider repertoire of locomotion?

First, it's necessary to have a general understanding of the neural basis of locomotion: central pattern generators (CPGs). I posted on CPGs a little while ago; the basic idea is that they are networks of neurons, located in the spinal cord, that coordinate all of the muscles involved in locomotion without input from the brain. I focused on bipedal motion, but CPGs control rhythmic locomotory movements in all vertebrates, including those that swim and fly. Thus a more focused question is: when animals transitioned from swimming to walking, could the same CPGs that controlled aquatic locomotion handle the different coordination needed between a body and its limbs for walking?

An excellent paper was published today in Science that explored this question using a robotic salamander (named Salamandra robotica because scientists are pretty bad at naming things). Salamanders are considered the modern vertebrates most similar to the first terrestrial vertebrates, and are thus often used as a model system for studying the evolution of new anatomical structures for terrestrial (vs. aquatic) locomotion.

Salamanders can rapidly switch from swimming (using undulations similar to those of primitive fish) to walking (using diagonally opposed limbs that move together while the body forms an S-shape, like an alligator). Viewed from above, the body moves as a traveling wave during swimming and as a standing wave during walking, and the neural activity along the spinal cord is likely to mirror this difference.

The group, led by physicist Auke Jan Ijspeert and neurobiologist Jean-Marie Cabelguen, designed Salamandra robotica with an electronic "spinal cord" to determine whether the same spinal network could produce both swimming and stepping patterns, and how it might transition between the two.

This spinal cord was controlled by an algorithm that incorporated essential known or speculated attributes of salamander locomotion. First, the group knew (from a study they did in 2003) that the transition between standing and traveling waveforms can be elicited simply by changing the strength of the excitatory drive from a specific region of the brainstem. In that experiment, a weak drive induced the slow, standing wave of the walking gait, while a stronger drive induced the traveling wave of the swimming motion. Second, the authors reasoned that there are two fundamental sets of CPGs controlling salamander locomotion: the body CPG, distributed along the spinal cord, and the limb CPGs, one at each limb.
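The paper's actual controller is built from amplitude-controlled oscillators with carefully fitted parameters, but the flavor of the idea can be captured with a toy chain of coupled phase oscillators; every equation, threshold, and number below is my own illustrative choice, not the authors' model. A single "drive" value sets the frequency, flags whether the limbs are engaged (represented here only as a boolean), and switches the body between an in-phase pattern (standing wave, walking-like) and a phase-lagged pattern (traveling wave, swimming-like).

```python
# Toy chain of coupled phase oscillators, loosely in the spirit of salamander
# CPG models; all parameters are illustrative, not from the paper.
import math

N_BODY = 8    # body segments along the "spinal cord"
DT = 0.01     # integration time step (s)

def simulate(drive, t_end=4.8):
    """One drive parameter sets frequency, limb engagement, and intersegmental phase lag."""
    limbs_engaged = drive < 3.0          # toy threshold: strong drive saturates the limbs
    freq = 0.5 + 0.4 * drive             # oscillation frequency grows with drive
    # Desired phase lag between neighboring segments:
    #   0 lag  -> segments in phase      -> standing wave (walking-like)
    #   >0 lag -> phase travels caudally -> traveling wave (swimming-like)
    lag = 0.0 if limbs_engaged else 2 * math.pi / N_BODY

    phases = [0.0] * N_BODY
    for _ in range(int(t_end / DT)):
        new_phases = []
        for i in range(N_BODY):
            coupling = 0.0
            if i > 0:                    # pull toward the desired lag vs. rostral neighbor
                coupling += math.sin(phases[i - 1] - phases[i] - lag)
            if i < N_BODY - 1:           # and vs. caudal neighbor
                coupling += math.sin(phases[i + 1] - phases[i] + lag)
            new_phases.append(phases[i] + DT * (2 * math.pi * freq + 4.0 * coupling))
        phases = new_phases
    bends = [round(math.sin(p), 2) for p in phases]  # instantaneous bend of each segment
    return limbs_engaged, bends

for drive in (2.0, 4.5):
    limbs_engaged, bends = simulate(drive)
    mode = ("walking-like: limbs engaged, segments in phase" if limbs_engaged
            else "swimming-like: limbs tucked, traveling wave down the body")
    print(f"drive={drive} -> {mode}; segment bends: {bends}")
```

Run as is, the low-drive case ends with every segment bending identically (the standing-wave pattern), while the high-drive case shows a phase lag down the chain, mirroring the walking-to-swimming switch described below.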

With these parameters in place, Salamandra robotica set forth on her quest to traverse land and sea, to test whether her "primitive" swimming circuit (the body CPG) would be able to coordinate with the "newer" circuits of her phylogenetically recent limbs to produce the waddling gait of her living inspiration. Watch the results:



So mechanistically, how does she do it? The group found that at low drive levels, both sets of CPGs are active; the limbs alternate appropriately and are coordinated with the movements of the body. At higher drive levels, the limb CPGs are overwhelmed, so the limbs tuck in as the body CPG takes over.

One interpretation here is that the group is good at building robots, so Salamandra robotica did exactly what they wanted it to do. Another interpretation, the one that got this study into Science and into my blog (quite selective, really), is that the spinal locomotor network controlling trunk movements has remained essentially unchanged during the evolutionary transition from aquatic to terrestrial locomotion.

I think this experiment was highly innovative. As you might imagine, it is quite difficult to study evolution in a controlled laboratory setting (global warming suffers from similar drawbacks, but that's a whole 'nother post), so using robotics as an experimental model is quite promising. The core finding, however, was not surprising to me.

The transition from water to land necessitated a daunting number of anatomical changes and, looking back 370 million years, it seems an insurmountable divide. But the success of evolution hinges on the fact that it occurs gradually, and rarely involves unusual or extraordinary biological processes. It is thus logical that a common, underlying neural mechanism for propulsion can produce a variety of movements; we see this in modern humans, as well. As I pointed out in my earlier post, the same CPGs--in fact, the same neurons in the same CPG--are used for walking, running, hopping, and skipping. The fact that locomotion is largely independent of conscious control strengthens this rationale; it is much more straightforward to make small adjustments to the system by tinkering "downstream." Thus, when animals adapted to terrestrial locomotion, they used the most efficient (and thus most likely to be successful) strategy: recruiting the same neural circuits used for aquatic locomotion.

Which is not to say that this finding is any less wonderful. It is simply a powerful reminder that, as evolutionary biologist Neil Shubin wrote, "the ancient world was transformed by ordinary mechanisms of evolution, with genes and biological processes that are still at work, both around us and inside our bodies." This is, in his words, "something sublime."

For more information, The Neurophilosopher has an excellent, more detailed post on this paper.

Sunday, February 25, 2007

The evolution of language acquisition

The animal kingdom is full of noisy beasts. Screeching monkeys, barking dogs, squawking chickens…animals use sounds to communicate with each other, both friends and foes. These sounds are instinctual, and although the animals may learn, through observation, when a particular sound is appropriate, the quality of the sounds is not learned; the vocalizations are acoustically innate. What about human vocalizations? Excepting expressions of intense emotions (crying, screaming, laughing), most forms of human vocalizations, particularly speech, are acquired through a learning process. Babies learn to speak by perceiving sounds, imitating them, and then modifying their own vocal output in a process called “vocal learning.”

Vocal learning, or acquiring vocalizations through imitation rather than instinct, is not unique to humans, but is extremely rare in the animal kingdom. In fact, only three other distantly related groups of mammals (elephants, bats, and cetaceans (whales and dolphins)) and three distantly related groups of birds (hummingbirds, parrots, and songbirds) are capable of vocal learning.

The group that has received the most attention from neuroscientists, with respect to vocal learning, is the songbirds. A growing body of research focuses on the brain structures that allow birds to perceive, learn, and generate songs, and the hope is that this research may contribute to understanding the neural mechanisms underlying human language acquisition. These structures, which include seven “vocal brain nuclei,” are involved in networks that mediate both the perception and production of sounds, allowing birds to hear themselves, hear others, and control the acoustic structure of their vocal output.

By looking at genes that are up- or down-regulated in response to singing behavior, researchers (notably Dr. Erich Jarvis at Duke University) found that these seven nuclei are strikingly similar in location, connectivity, and behaviorally driven gene expression across all three bird groups, and they speculate that similar brain structures are at play in other vocal learners.

This image shows comparable vocal and auditory brain areas among vocal learning birds and humans. Left hemispheres are shown, as that is the dominant side for human language. From Jarvis, Ann. N.Y. Acad. Sci 1026: 749-777 (2004).


Because human brain lesion and brain imaging studies do not allow for high-resolution analysis, the neuroanatomy of comparable vocal nuclei in humans has not been demonstrated. It is interesting to note, however, that like birds, humans have brain regions in the cerebrum that control the acoustic structure of their vocal behavior. For example, Broca’s area plays a selective role in speech production; humans with damage to this region have difficulty speaking, but have little or no deficit in comprehension. Somewhat surprisingly, the neuroanatomy of similar structures in other vocal learning mammals (elephants, bats, cetaceans) has not been examined.

Given that parrots and hummingbirds diverged from one another about 65 million years ago, and that birds evolved 50-100 million years after mammals, the evolution of vocal learning is perplexing. Not only did this complex behavior evolve in phylogenetically disparate groups, but also, in every species that has been examined, it is governed by similar neural structures. How did these striking similarities evolve?

Dr. Jarvis suggests three hypotheses. The first is convergent evolution; that is, similar structures evolved independently in all vocal learners. This implies significant constraints on how these structures can evolve to mediate this complex behavior. (For comparison, the “eye” is believed to have evolved independently eleven times, but each time has produced dramatically different structures…compare a human eye to those of a spider).

The second hypothesis is that all vocal learners came from a common ancestor, and that vocal learning was thus lost in all non-learners. Given that these animals diverged hundreds of millions of years ago, this would imply that the capacity for vocal learning was lost independently by every other animal; non-human primates would have lost this capacity multiple times before humans evolved with the trait intact. The third hypothesis modifies the first, positing that all animals have rudimentary neural structures for vocal learning, but that these structures have been independently amplified in vocal learners.

All three of these hypotheses are possible alternatives, and all are constrained by the rarity of vocal learning in the animal kingdom. Vocal learning has clear benefits: by permitting the modification of sounds, it allows for innovative and flexible communication. This system may be the foundation upon which spoken language in humans was established, and certainly contributes to reproductive success in songbirds. Further, these attributes may allow animals to maximize sound propagation in novel environments (e.g. if an animal must adjust from living in an open savannah to living on a heavily forested mountain). If such a useful behavior could evolve seven independent times, why didn’t it evolve more often? Alternatively, if vocal learning was present early on, why would most animals have lost the capacity?

Jarvis’s answer: predation. The ability to make novel, varied sounds, and to maximize their propagation, is an excellent way to advertise one’s presence to potential predators. Thus, for the majority of animals, the benefits of vocal learning are far outweighed by the hazards it brings.

Only seven known groups of distantly related animals are capable of vocal learning--this in itself is fascinating. The fact that seven similar brain structures have evolved in three of these learners only adds to the excitement. Have the brains of mammalian vocal learners evolved similar mechanisms for perceiving and producing sounds? More importantly, have humans? Did this ability to imitate and improvise sounds lead to the evolution of human language?