Summary of Discussion of Language

By Nicholas Plummer, Michael Colloton, Akeed Habeeb (some additions and comments by Frawley)

Language is domain-specific, with sub-domains: it has different representations and procedures for syntax, phonology, and semantics. There are no general procedures shared across all domains and modules [except in the sense that language is implemented on the general-purpose computer of the mind-brain, and so there are some general constraints, like left-to-right processing]. The structure of language is not traceable to (the operations of a putative) general intelligence: a person can have a low I.Q. but very good language. This dissociation shows that language is not equivalent or reducible to thought.

Summary of Presentation by Prof. Idsardi: Phonology
Phonology is the domain of language that involves the study of sounds that convey meaning. These sounds can give meaning as to "Who?" (speaker recognition), "What?" (the message), "Where?" (location), and "How?" (emotion).

Phonology generally studies different languages and how they compare to one another with respect to tone, meter, and stress, and how these features convey meaning. One of the most prominent ideas to come out of this study is that much information is actually conveyed through inaudible structures in the sounds/productions of languages. These inaudible structures are made evident through chunks, groups, and constituents that leave their effects on audible features. [That is, the sound system of language is an abstract structure, and what is, technically, not expressly said is the overall organizing system.] Additionally, this information is not always consciously explainable: native speakers simply know which set of sounds is "better," or more "acceptable," but not really why this is so. Native speakers are not taught this outright; their intuitions are just "there." Noam Chomsky refers to this development of rich abstract knowledge from finite, untutored input as the argument from the "poverty of the stimulus."

Summary of Presentation by Prof. Frawley: Semantics
There are two kinds of meaning: truth-conditional meaning, the domain of semantics, and non-truth-conditional meaning (or significance), the domain of pragmatics. The latter studies, among other things, implicature: non-truth-conditional inferences triggered by an exchange. These vary by culture, yet still show some regularities.

One of the major tools in the study of semantics is entailment, necessary truth-conditional inference. Entailment elucidates such things as ambiguity and vagueness. Ambiguity refers to terms that have two intrinsic meanings. For example, apple could denote either a computer or a fruit. [And so you can assert each meaning with no contradiction: it is an apple (the computer), not an apple (the fruit).] Vagueness refers to multiple non-intrinsic meanings of a term, decidable from context. The color of an apple (fruit) is vague, as revealed by the inability to differentiate color by asserting the term in a positive and a negative context: it is an apple, not an apple, does not differentiate red apples from green ones.

How do we describe the semantic structure of language? Two general ways: through conceptual structures and through formal structures. The former requires some kind of intrinsic content to the interpretation of well-formed expressions; the latter construes the interpretation mechanism as a purely formal, logical procedure.

An example of conceptual structure:

There is a difference in language between countable and uncountable things. Countable things are conceptually construed as bounded; uncountable stuff is construed as unbounded. Things or actions can be bounded as completed. Prepositional notions can also rely on the bounded/unbounded distinction: across indicates a bounded relation.

An example of formal structure:

The phrase every dog means that whatever we are saying must apply to all dogs: it denotes the intersection of each dog's property set. If we say a dog, then whatever we are talking about can apply to any dog: the union of all dogs' property sets. Seen this way, meaning is a logical interpretation procedure that ranges over conceptual content: we do not care what properties count as dog properties, for example -- just that there is some set. For interpretation, it matters less what counts as a dog -- which can be language-specific -- than the way language uses formal operations to interpret expressions.
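The set-theoretic reading above can be sketched in a few lines of code. This is a minimal illustration with hypothetical individuals and property sets (the names and properties are invented for the example, not drawn from the lecture): "every dog" ranges over the properties shared by all dogs (intersection), while "a dog" ranges over the properties held by at least one dog (union).

```python
# Hypothetical individuals, each mapped to a set of properties.
dogs = {
    "fido": {"dog", "barks", "brown"},
    "rex": {"dog", "barks", "black"},
}

# "every dog" -> properties true of ALL dogs (intersection of the sets).
every_dog = set.intersection(*dogs.values())

# "a dog" -> properties true of SOME dog (union of the sets).
a_dog = set.union(*dogs.values())

print(every_dog)  # {'dog', 'barks'}
print(sorted(a_dog))  # ['barks', 'black', 'brown', 'dog']
```

Note that the interpretation procedure never inspects what the properties mean; it only performs formal operations over sets, which is exactly the point of the formal approach.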

Summary of Presentation by Prof. Phillips: Syntax
Language is a communication technique that involves transmitting the rich content of one's thoughts and mental representations through a very simple medium, that of speaking. From this speech, the listener then interprets the relatively simple representations of speech into a rich set of mental thoughts and representations of their own. Ideally, these received representations survive the medium intact and are the same as, or sufficiently similar to, the speaker's mental representations for the message to get communicated. Although it seems unlikely that this communication should work, it does, and it does so with extreme frequency.

Language is very structured, and syntax helps provide this structure. Sentence structure can be delineated through sentence structure trees, which use a finite set of symbols to impose formal representations on language. One very interesting property of these sentence structure trees is recursion, which involves embedding identical categories of elements inside each other and provides for an infinite number of possible sentences, of theoretically infinite length.
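Recursion of this kind can be sketched with a toy rule in which a sentence (S) may contain another S inside it. This is an illustrative fragment, not a real grammar: the rule S -> "Mary said that" S, with "it rained" as the base case, already generates sentences of unbounded length from a finite rule set.

```python
def sentence(depth):
    """Embed a clause inside another clause `depth` times.

    Base case: a simple sentence.
    Recursive case: a sentence containing another sentence.
    """
    if depth == 0:
        return "it rained"
    return "Mary said that " + sentence(depth - 1)

print(sentence(0))  # it rained
print(sentence(2))  # Mary said that Mary said that it rained
```

Because the recursive rule can reapply without limit, the finite symbol set yields infinitely many well-formed sentences, which is the point the tree representation makes formal.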

Note that, just as in phonology, syntax uses abstract, unspoken organization to guide sentence construction. Consider the following ungrammatical sentence: *What did you see the boy who was reading? This sentence is impossible as a way to question the object of the verb in the embedded relative clause (you saw the boy who was reading what). Evidently, all languages disallow movement of items out of complex noun phrases, such as nouns modified by relative clauses. This kind of constraint is abstract, untaught, and universal. It is thus like other abstract features of language that appear to underlie the overall organization of language as a modular system.

Language Acquisition
Language is acquired in a regular pattern, through regular phases, and without explicit instruction (from positive instances only). Moreover, decisions about the essential structure of language stop at some point (the grammar is no longer revised), and so language is learnable.

Crying is the baby's first means of communication, followed by cooing, where the baby practices using the musculature of the mouth. A great deal of control over the musculature and nerve connections of the mouth and throat is needed to produce well-formed expressions. Through practice, the baby begins to babble; the child's speech has become digitized into distinct segments. After about a year the child can say one word but can understand about a hundred: there is an asymmetry between the production of words and the comprehension of them. Output is not in direct correlation with input, as can be seen from babies saying nouns first (even in SOV languages).

The idea of learnability is the idea that Input --> Program --> Output. The program is the Universal Grammar (UG), and the output is competence: the phonology, syntax, and semantics. Input is taken in from the parents as varied, impoverished, and positive-only "motherese" and is processed by the child's Universal Grammar. The UG is abstract, representational, and structural. Properties of the Program/Output are not directly traceable to the Input.

We can see the abstract structure in children's decisions about language by the regularities in their errors: substitutions of stops for continuants in phonology; acquisition of derivational morphology before inflectional morphology; deletion of pronominal subjects in syntax (pro-drop), and lexical innovations to fill in gaps by assigning forms to expected combinations of meaning units (cold this up!).