The Gestural Origins of Language
Human language may have evolved from manual gestures, which survive today as a "behavioral fossil" coupled to speech
Gesture on the Brain
Like spoken language, sign language among deaf people appears to depend critically on the left side of the brain. Indeed, left-sided brain damage can produce deficits in signing that parallel the deficits in spoken language. For example, anterior lesions in the vicinity of Broca's area produce deficits in expressive signing, whereas more posterior lesions result in deficits in the comprehension of signing. In a study of brain activity as measured by functional magnetic resonance imaging, Helen J. Neville and her colleagues at the University of Oregon confirmed that both Broca's and Wernicke's areas, the two main language-mediating areas in the left side of the brain, are activated in deaf signers while they watch sentences in American Sign Language (ASL). This activity was similar to that observed in hearing people when they listened to spoken sentences. Unexpectedly, the deaf signers showed much more activity in the right side of the brain than the hearing people did, perhaps reflecting the more prominent spatial component of sign language (spatial processing is known to be predominantly a right-hemisphere function).
A gestural origin for language may well explain the close association between handedness and the cerebral asymmetry for language. Nearly all right-handers are dominant in the left cerebral hemisphere for speech. Among left-handers the relation is less clear, with perhaps 60 percent being left dominant for speech and the remainder about equally divided between those who are right dominant and those with bilateral representation in the brain. Doreen Kimura, now at Simon Fraser University, has also noted that right-handers tend to gesture with their right hands when they speak, whereas left-handers tend to gesture with both hands. A possible explanation for this pattern of association, suggested by genetic models of handedness proposed independently by Marian Annett of the University of Leicester and Christopher McManus of University College London, is that there may be an allele (a variant of a gene) predisposing left-brain dominance for both speech and hand control. In those lacking this allele, handedness becomes a matter of chance, and handedness and cerebral dominance for speech are decoupled. Selection of the left-brain dominance allele may well have occurred during a period in hominid evolution when vocalization began to emerge as an accompaniment to gesture, eventually to replace it as the dominant mode.
Of course, there are other interpretations of the anatomical relation between hand gestures and the language areas in the brain. Elizabeth Bates of the University of California at San Diego suggests that language is a parasitic system overlaid on areas of the brain that originally evolved to do more basic kinds of sensorimotor work. Indeed, the areas that serve language continue to do the same nonlinguistic work today: "They have not given up their day job," according to Bates. These areas would include the motor regions of the frontal cortex, together with sensory regions that mediate both the perception of sound and the multiple experiences that go into what we call "meaning." From this sensorimotor point of view, language and gesture are planned and executed together all the time because they run off the same neural systems, and the planning of language inevitably "leaks" over into gesture, which is thus a mere by-product.
In this view, hand gestures are something of a "fifth wheel" in the evolution of language, only along for the ride. I would argue, however, that the richness of human sign languages and hand gestures belies such a superfluous evolutionary origin. Gestures are not haphazardly associated with speech; they can convey information in a systematic way. Indeed, in sign languages, gestures carry information entirely independent of the spoken word. Rather than being a "fifth wheel," perhaps gestures are the remains of the "unicycle" on which language first evolved.