Friday, December 28, 2007
Lakoff, George. "The Functionalist's Dilemma [Review of Jackendoff's LANGUAGE, CONSCIOUSNESS, CULTURE]." AMERICAN SCIENTIST (January-February 2008).
Language, Consciousness, Culture: Essays on Mental Structure by Ray Jackendoff. Cambridge, MA: MIT Press, 2007. xxvi + 403 pp. $36.

Science, as Thomas Kuhn famously observed, does not progress linearly. Old paradigms remain as new ones begin to supplant them. And science is very much a product of the times.

The symbol-manipulation paradigm for the mind spread like wildfire in the late 1950s. Formal logic in the tradition of Bertrand Russell dominated Anglo-American philosophy, with W. V. O. Quine as the dominant figure in America. Formalism reigned in mathematics, fueled by the Bourbaki tradition in France. Great excitement was generated by the Church-Turing thesis that Turing machines, formal logic, recursive functions and Emil Post's formal languages are equivalent. The question naturally arose: Could thought be characterized as a symbol-manipulation system? The idea of artificial intelligence developed out of an attempt to answer that question, as did the information-processing approach to cognitive psychology of the 1960s. The mind was seen as computer software, with the brain as hardware. The software was what mattered. Any hardware would do: a digital computer or the brain, which was called wetware and seen (incorrectly) as a general-purpose processor.

The corresponding philosophy of mind, called functionalism, claimed that you could adequately study the mind independently of the brain by focusing on the mind's functions as carried out by the manipulation of abstract symbols.

The time was ripe for Noam Chomsky to adapt the symbol-manipulation paradigm to linguistics. Chomsky's metaphor was simple: A sentence was a string of symbols. A language was a set of such strings. A grammar was a set of recursive procedures for generating such sets. Language was syntacticized—placed mathematically within a Post system, with abstract symbols manipulated in algorithmic fashion by precise formal rules.
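Chomsky's metaphor—a grammar as recursive procedures that generate a set of symbol strings—can be made concrete with a toy example. The miniature grammar below is my own illustrative invention (not taken from the review, or from Chomsky or Post); it shows how a handful of rewrite rules, applied recursively, generate a language as a set of strings:

```python
import itertools

# A toy rewrite grammar: each nonterminal maps to a list of productions,
# each production a sequence of symbols. Any symbol without an entry is
# a terminal word. (Illustrative invention, not from the review.)
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["linguist"], ["grammar"]],
    "V":  [["studies"]],
}

def generate(symbol="S"):
    """Recursively expand a symbol into every terminal string it derives."""
    if symbol not in GRAMMAR:          # terminal: yield the word itself
        yield [symbol]
        return
    for production in GRAMMAR[symbol]:
        # Expand each symbol of the production, then combine the pieces.
        expansions = [list(generate(sym)) for sym in production]
        for combo in itertools.product(*expansions):
            yield [word for part in combo for word in part]

# The "language" of this grammar: the set of all strings it generates.
sentences = [" ".join(words) for words in generate()]
```

Because this grammar has no recursion back to "S", its language is finite (four sentences, e.g. "the linguist studies the grammar"); adding a rule that embeds "S" inside a production would make the generated set infinite—the property that made recursive procedures attractive as a model of language. Note that the rules mention only symbols, never meanings, which is the point Lakoff develops next.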
Because the rules could not look outside the system, language had to be "autonomous"—independent of the rest of the mind. Meaning and communication could play no role in the structure of language. The brain was irrelevant. This approach was called generative linguistics, and it continues to have adherents in many linguistics departments in the United States.

In the mid-1970s, another paradigm shift occurred. Neuroscience burst onto the intellectual stage. Cognitive science expanded beyond formalist cognitive psychology to include neural models. And cognitive linguistics emerged, whose proponents (including me) see language and thought not as an abstract symbol-manipulation system but as physically embodied and reflecting both the specific properties and the limitations of our brains and bodies. Cognitive linguistics has been steadily developing into a rigorously formulated neural theory of language based on neural-computation theory and actual developments in neuroscience.

Ray Jackendoff's new book, Language, Consciousness, Culture, is set solidly within the old generative-linguistics paradigm. . . .

Read the rest of the review here: http://www.americanscientist.org/BookReviewTypeDetail/assetid/56419.