Marisol M. Gutierrez

Problem Search

Brain and Language:

A Perspective From Sign Language

Discover Magazine, August 1998

In an article by Daphne Bavelier, David Corina, and Helen Neville, the brain's left-hemisphere dominance over language was called into question. The question they posed was this: to what extent is the organization of the language areas influenced by the language experience of the individual? Studies using spoken language have shown that the network governing language is dominated by the left side of the brain. These areas include the inferior prefrontal cortex (Broca's area), the posterior superior temporal lobe (Wernicke's area), the anterior superior temporal lobe, and the middle prefrontal cortex. Bavelier and her colleagues argue that the findings of these studies are biased. They contend that relying on spoken language to study the brain's language organization makes it unclear whether left-hemisphere dominance reflects the need to analyze sequential structures, the building blocks of language, or the need to process the acoustic signals of speech.

In contrast to spoken languages, sign languages use the spatial location and motion of the hands to convey linguistic information. By comparing spoken and sign languages, one can determine which structures are truly common to all human languages and which aspects of the brain's language organization are determined by the environment in which the language is learned. Spoken and sign languages share the same abstract properties, such as phonology, syntax, semantics, and pragmatics, but they differ in how these properties are conveyed. In sign language, for example, phonology is built from the positions and shapes of the signer's hands relative to his or her body, whereas in spoken language it is built from acoustic properties, such as nasality. Likewise, grammatical devices such as case marking and word order are replaced in sign language by spatial mechanisms.

Since the right hemisphere is known to play a large role in visuospatial processing, one might expect it to play a larger role in sign language than the left hemisphere does. Yet in cases where native signers have suffered damage to Wernicke's area, they experience deficits in comprehension just as native speakers do with damage to the same area. Similarly, when Broca's area is damaged, deaf signers have difficulty producing signs; their signing becomes non-fluent, often reduced to single signs, yet, as expected, their comprehension remains intact. Indeed, imaging studies of healthy native signers have shown that sign production does rely on Broca's area and on other relevant areas of the left hemisphere.

This is not to say that the right hemisphere plays no role in language, whether spoken or signed. The right side of the brain contributes to spoken language by inferring emotion in speech and by making it possible to appreciate jokes and puns. So both sides of the brain contribute to language, though in different ways: the right hemisphere is necessary for broad interpretation across sentences and the global meaning of language, while the left hemisphere is responsible for fine-grained language processing and literal meaning. The right hemisphere does, however, appear to play a larger role in sign language than in spoken language. During comprehension of sign language there is greater participation by the right hemisphere than during comprehension of written language. So, when processing sentences, the left hemisphere does not dominate in sign language the way it does in spoken language.

This finding supports the view that the brain's organization for language may be altered by the constraints of the language itself. What is not clear, according to Bavelier and her colleagues, is the functional role of certain right-hemisphere areas in sign language. There are areas on the right side of the brain that participate in sign language but do not process visuospatial information. There are also patients who cannot recognize an object visually, though they can recognize it through other senses, yet who can still understand the sign for that object.

Perhaps right-hemisphere participation is greater in sign language than in spoken language because sign language places higher demands on visuospatial processing, and these supplemental areas are recruited to meet that demand. Whatever the reason, it is clear that the left-hemisphere language areas are common to all languages and that the final organization of the language system is partly determined by the language experience of the individual.