The Functionalist's Dilemma

George Lakoff

Language, Consciousness, Culture: Essays on Mental Structure. Ray Jackendoff. xxvi + 403 pp. The MIT Press, 2007. $36.

Science, as Thomas Kuhn famously observed, does not progress linearly. Old paradigms remain as new ones begin to supplant them. And science is very much a product of the times.

The symbol-manipulation paradigm for the mind spread like wildfire in the late 1950s. Formal logic in the tradition of Bertrand Russell dominated Anglo-American philosophy, with W. V. O. Quine as the dominant figure in America. Formalism reigned in mathematics, fueled by the Bourbaki tradition in France. Great excitement was generated by the Church-Turing thesis that Turing machines, formal logic, recursive functions and Emil Post's formal languages are equivalent. The question naturally arose: Could thought be characterized as a symbol-manipulation system?

The idea of artificial intelligence developed out of an attempt to answer that question, as did the information-processing approach to cognitive psychology of the 1960s. The mind was seen as computer software, with the brain as hardware. The software was what mattered. Any hardware would do: a digital computer or the brain, which was called wetware and seen (incorrectly) as a general-purpose processor. The corresponding philosophy of mind, called functionalism, claimed that you could adequately study the mind independently of the brain by focusing on the mind's functions as carried out by the manipulation of abstract symbols.

The time was ripe for Noam Chomsky to adapt the symbol-manipulation paradigm to linguistics. Chomsky's metaphor was simple: A sentence was a string of symbols. A language was a set of such strings. A grammar was a set of recursive procedures for generating such sets. Language was syntacticized—placed mathematically within a Post system, with abstract symbols manipulated in algorithmic fashion by precise formal rules. Because the rules could not look outside the system, language had to be "autonomous"—independent of the rest of the mind. Meaning and communication could play no role in the structure of language. The brain was irrelevant. This approach was called generative linguistics, and it continues to have adherents in many linguistics departments in the United States.

In the mid-1970s, another paradigm shift occurred. Neuroscience burst onto the intellectual stage. Cognitive science expanded beyond formalist cognitive psychology to include neural models. And cognitive linguistics emerged, whose proponents (including me) see language and thought not as an abstract symbol-manipulation system but as physically embodied and reflecting both the specific properties and the limitations of our brains and bodies. Cognitive linguistics has been steadily developing into a rigorously formulated neural theory of language based on neural-computation theory and actual developments in neuroscience.

Ray Jackendoff's new book, Language, Consciousness, Culture, is set solidly within the old generative-linguistics paradigm. In it, Jackendoff staunchly defends functionalism and the symbol-manipulation paradigm. "Some neuroscientists say we are beyond this stage of inquiry, that we don't need to talk about 'symbols in the head' anymore. I firmly disagree," he notes. He goes on to argue that the symbolic representations given by linguists are simply right, and he takes the brain to be irrelevant. Interestingly, he does not cite the major work arguing the opposite, Jerome Feldman's 2006 book, From Molecule to Metaphor. Feldman shows how the analyses of language and thought done by cognitive linguists can be characterized in terms of neural computation. But, as Jackendoff says, "Cognitive Grammarians . . . have been steadfastly ignored by mainstream generative linguistics." Just as Kuhn would have predicted.

All this creates a dilemma for Jackendoff. He sees the limitations of the functionalist paradigm and rails correctly against Chomsky's syntacticization of meaning, but he stays with a version of symbolic logic, in which meaning is also syntacticized by a formal logical syntax.

Jackendoff has read widely in cognitive science and neuroscience, while "steadfastly ignoring" the literature of cognitive and neural theories of language, which answers many of the questions he raises, although in a paradigm he refuses to consider. He sees correctly that the cognitive and brain sciences ought to be taken seriously by philosophers and social scientists, but his forays into social, moral and political ideas are limited by his functionalist approach.

Take the question of meaning. In 1963, I proposed a theory of generative semantics in which a version of formal logic became an input to generative grammars. I was later joined in this enterprise by James D. McCawley and John Robert Ross, two of Chomsky's best-known students. Among our tenets were that conceptual structure is generative, that it is prior to and independent of language, and that it is inaccessible to consciousness. Jackendoff argued strongly against this position at the time, but in this book, only 40 years later, he accepts these tenets, while keeping Chomsky's idea that syntactic structure is independent of meaning. Jackendoff adopts a parallel-structure theory in which he holds both ideas at once. As we did then, he now declares that Chomsky's syntactocentrism is a "scientific mistake." Yet, as a Chomskyan syntactician, he has to keep a version of the "scientific mistake"—an autonomous syntax for grammar alongside his autonomous syntax for meaning (a kind of symbolic logic).

In the 1960s, Charles J. Fillmore proposed a theory of "case grammar" in which there were semantic roles (agent, patient, experiencer and so on) and principles mapping these roles to grammar. This idea was accepted in cognitive linguistics and has been developed over the past 40 years by Fillmore and many others in the theory of grammatical constructions, in which semantics is directly paired with syntactic form. Jackendoff adopts a version of this theory without mentioning Fillmore. Laudable, if a little late.

In 1975, Fillmore began the development of "frame semantics," expanding the notion in great detail over the next three decades. Conceptual framing has become central in cognitive linguistics worldwide and is widely applied, as in my work on political analysis over the past decade. Jackendoff accepts a much less precise and less worked-out version of frames set forth by Erving Goffman and Marvin Minsky in the mid-1970s, but he "steadfastly ignores" Fillmore's elaborate research and its widespread application.

In 1997, Srini Narayanan, in his dissertation at the University of California, Berkeley, worked out a neural computational account of actions and events, which generalizes to the semantics of aspect (event structure) in linguistics and actually computes the logic of aspect. In Language, Consciousness, Culture, Jackendoff tries to adapt Chomsky's syntactic structures to action structure, which Patricia Greenfield of UCLA first attempted in the 1960s. Jackendoff's account, coming a decade after Narayanan's, doesn't characterize actions nearly as well, does not compute the logic of actions, does not characterize the semantics of aspect and does not fit the theory of neural computation. But it is gratifying to see Jackendoff trying to link motor actions to linguistics (as Chomsky never would), in an attempt to break out of the functionalist mold without leaving it.

Jackendoff is asking questions well beyond the Chomskyan enterprise, and in some cases he approaches what cognitive linguists have achieved. But one place he gets it very wrong is conceptual metaphor.

Mark Johnson and I wrote Metaphors We Live By (1980) almost three decades ago. Since then hundreds of researchers have developed a whole field of study around the subject. In our 1999 book Philosophy in the Flesh, Johnson and I elaborated in great detail on Narayanan's neural computational theory of metaphor.

In the neural theory, conceptual metaphor arises in childhood when experiences regularly occur together, activating different brain regions. Activation repeatedly spreads along neural pathways, progressively strengthening synapses in pathways between those brain regions until new circuitry is formed linking them. The new circuitry physically constitutes the metaphor, carrying out a neural mapping between frame circuitry in the regions and permitting new inferences. The conceptual metaphor MORE IS UP (as in "prices rose," "the temperature fell") is learned because brain regions for quantity and verticality are both activated whenever you pour liquid into a glass or build any pile. AFFECTION IS WARMTH (as in "She's a warm person," or "She's an ice queen") is learned because when you are held affectionately as a child by your parents, you feel physical warmth. Hundreds of such primary metaphors are learned early in life. Complex metaphors are formed by neural bindings of these primary metaphors. And metaphorical language expresses both primary and complex metaphors.

Because we first experience governance within the family, one widespread primary metaphor is A GOVERNING INSTITUTION IS A FAMILY, with authority based on parental authority. Within the literal Family Frame, rights and obligations arise from what is allowed and required, given the desires and responsibilities of parents and children. Children want to be fed and taken care of, and parents are required to provide for them. Children have other desires that may be allowed or forbidden. Parents may require certain things of children.

Under the metaphor A GOVERNING INSTITUTION IS A FAMILY, what is required by an authority is called an obligation, and what is allowed or has to be provided by an authority is called a right. The metaphor applies at various levels, so there are higher governing institutions, such as societies, nature or the universe, and metaphorical authorities, such as social or moral norms, natural laws and God. At each level, the logic of family-based authority is metaphorically duplicated for rights and obligations, with authorities at a lower level subject to authority at a higher level. No special metaphors unique to rights and obligations are needed. Other independently existing primary metaphors flesh out the complexities: Because ACHIEVING A DESIRED PURPOSE IS GETTING A DESIRED OBJECT, rights are seen as metaphorical possessions, which can be given to you, held onto or lost. Because requirements can be difficult and DIFFICULTIES ARE BURDENS, we speak of "taking on" or "undertaking" obligations.

You would never know any of this from reading Jackendoff's brief discussion of whether rights and obligations are understood metaphorically. He reaches the conclusion he has to reach: that no conceptual metaphor at all is used in understanding rights and obligations. This is not surprising, because here, as elsewhere, he has largely ignored the cognitive linguistics literature.

Had the discussion of rights and obligations in Language, Consciousness, Culture appeared in the late 1960s, it would have been seen as excellent. But coming out nearly 40 years later, it is inadequate, because it fails to explain why we reason about rights and obligations as we do, both in the West and elsewhere in the world. The neural-metaphorical understanding gives a correct account of the data plus an explanation grounded in biology. Such explanations are lacking throughout the book because Jackendoff still holds to functionalism.

For a cognitive linguist like myself, reading Jackendoff's book is both painful and hopeful—painful because he keeps trying to do interesting and important intellectual work while being stuck in a paradigm that won't allow it, and hopeful because he may help the transition from a brain-ignoring symbol-manipulation paradigm to a brain-based neural theory of thought and language. I wish that other linguists, both generative and cognitive, had his scope and intellectual ambition.
