What can linguists learn about the nature of language when we look at it from a computational perspective, and take this perspective seriously?
Much of current syntactic theory assumes a computational viewpoint, directly or indirectly. But is this viewpoint meant literally, or only metaphorically? If literally, how do current theoretical issues relate to long-standing (and more recent) discussions about, for instance, generative capacity? Is generative capacity really so easy to dismiss as 'not part of I-language'? What can we learn by looking at linguistic phenomena in terms of the general problems of constructing and interpreting structured signals as linear sequences? And what is the role of probabilistic facts, which lie at the core of many computational systems? (Etc.)
In the first part of this multidisciplinary seminar we will aim to build a common grounding in relevant concepts: the generative capacity of grammars and the Chomsky hierarchy, (Shannon) information, computational learning theory, and even some basics of animal communication and cognition (bearing in mind that we are attempting to understand a biological implementation of a computational 'organ'; if that is right, there should be 'model organisms' out there to help focus the task, at least in principle).
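To make the notion of generative capacity concrete, here is a minimal sketch (not from the course materials, just a standard textbook illustration): the language { aⁿbⁿ : n ≥ 0 } is generated by a context-free grammar (S → a S b | ε) but is not regular, so no finite-state machine recognizes it exactly. The function and variable names below are illustrative choices, not part of any course reading.

```python
def cfg_generate(n):
    """Generate a^n b^n by unfolding the context-free rule S -> a S b | empty n times."""
    return "a" * n + "b" * n


def recognize_anbn(s):
    """Recognize a^n b^n with an unbounded counter.

    The counter is exactly the resource a finite-state machine lacks:
    a DFA with k states cannot distinguish a^k from a^(k+1).
    """
    count = 0
    i = 0
    # Consume the initial run of a's, counting them.
    while i < len(s) and s[i] == "a":
        count += 1
        i += 1
    # Consume the following run of b's, decrementing the count.
    while i < len(s) and s[i] == "b":
        count -= 1
        i += 1
    # Accept only if the whole string was consumed and the counts match.
    return i == len(s) and count == 0
```

For example, `recognize_anbn(cfg_generate(5))` is `True`, while strings like `"aab"` or `"abab"` are rejected. The point of the sketch is only to show where the boundary between regular and context-free sits, one of the distinctions the Chomsky hierarchy formalizes.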
In the second part of the course we will look at how the ideas from the first part relate to current research on language, perhaps even giving us a natural grounding for some of that research.
Among the things we're interested in looking at:
We're hoping for a semester in which we strengthen our ability to discuss linguistic issues in computational terms, and then apply that ability in open-minded discussions that lead to creative research ideas and innovative final projects.