By Nicholas Plummer, Michael Colloton, Akeed Habeeb (some additions and comments by Frawley)
Language is domain-specific, with its own sub-domains: it has distinct representations and procedures for syntax, phonology, and semantics. There are no general procedures shared across all domains and modules [except in the sense that language is implemented on the general-purpose computer of the mind-brain, and so there are some general constraints, like left-to-right processing]. The structure of language is not traceable to (the operations of a putative) general intelligence: a person can have a low I.Q. but very good language. This dissociation shows that language is not equivalent or reducible to thought.
Phonology studies the sound systems of languages and how they compare with respect to tone, meter, and stress, and how these features convey meaning. One of the most prominent ideas to come out of this study is that much information is actually conveyed through inaudible structures in the sounds/productions of languages. These inaudible structures are made evident through chunks, groups, and constituents that leave their effects on audible features. [That is, the sound system of language is an abstract structure, and what is, technically, not expressly said is the overall organizing system.] Moreover, this information is not always consciously explainable: a native speaker simply knows which set of sounds is "better," or more "acceptable," but not really why. Native speakers are not taught this outright; their intuitions are just "there." Noam Chomsky refers to this development of rich, abstract knowledge from finite, untutored input as the "poverty of the stimulus."
One of the major tools in the study of semantics is entailment, necessary truth-conditional inference. Entailment elucidates such things as ambiguity and vagueness. Ambiguity refers to terms that have two intrinsic meanings. For example, apple could denote either a computer or a fruit. [And so you can assert each meaning with no contradiction: it is an apple (the computer), not an apple (the fruit).] Vagueness refers to multiple non-intrinsic meanings of a term, decidable from context. The color of an apple (fruit) is vague, as revealed by the inability to differentiate color by asserting each in a positive and negative context: it is an apple, not an apple, does not differentiate red apples from green ones.
How do we describe the semantic structure of language? Two general ways: through conceptual structures and through formal structures. The former requires some kind of intrinsic content to the interpretation of well-formed expressions; the latter construes the interpretation mechanism as a purely formal, logical procedure.
An example of conceptual structure:
There is a difference in language between countable and uncountable things. Countable things can be conceptually construed as bounded entities, and uncountable stuff as unbounded. Things or actions can be bounded as completed. Prepositional notions can also rely on the bounded/unbounded distinction: across indicates a bounded relation.
An example of formal structure:
The phrase every dog means that whatever we are saying must apply to all dogs: it denotes the intersection of each dog's property set. If we say a dog, then whatever we are talking about can apply to any dog: the union of all dogs' property sets. Seen this way, meaning is a logical interpretation procedure that ranges over conceptual content: we do not care what properties count as dog properties, for example -- just that there is some set. For interpretation, it matters less what counts as a dog -- which can be language-specific -- than the way language uses formal operations to interpret expressions.
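This set-theoretic picture can be sketched in a few lines of Python (the sets DOGS, BARKERS, and SLEEPERS below are illustrative inventions, not from the notes). An equivalent way to state the two interpretations: "every dog Ps" holds when the dog set is contained in the P set, and "a dog Ps" holds when the two sets overlap. Note that the interpretation procedure never inspects what the sets contain, only how they relate:

```python
# A minimal sketch: quantifier meaning as purely formal set operations.
# Individuals are strings; a "property" is just a set of individuals.

DOGS = {"fido", "rex", "spot"}       # whatever counts as a dog -- we only need some set
BARKERS = {"fido", "rex", "spot"}    # the individuals that bark
SLEEPERS = {"rex"}                   # the individuals that sleep

def every(restrictor, scope):
    """'every dog barks' is true iff the dog set is contained in the barker set."""
    return restrictor <= scope       # subset test

def a(restrictor, scope):
    """'a dog sleeps' is true iff the dog set and the sleeper set overlap."""
    return bool(restrictor & scope)  # non-empty intersection

print(every(DOGS, BARKERS))   # True
print(every(DOGS, SLEEPERS))  # False
print(a(DOGS, SLEEPERS))      # True
```

The point of the sketch is exactly the one in the notes: swap in any other sets for DOGS or BARKERS and the interpretation procedure is unchanged; only the formal relations between sets matter.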
Language is highly structured, and syntax helps provide this structure. Sentence structure can be delineated through sentence structure trees, formal representations built from a finite set of symbols and rules. One very interesting property of these trees is recursion, the embedding of identical elements inside each other, which provides for an infinite number of possible sentences of, theoretically, infinite length.
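Recursion can be illustrated with a toy grammar sketch (the rule and vocabulary here are invented for illustration). A single rule of the form S -> NP "thinks" S embeds a sentence inside a sentence, so one recursive rule already generates unboundedly many, unboundedly long sentences from a finite vocabulary:

```python
# A minimal sketch of syntactic recursion using the toy rule S -> "the cat thinks" S,
# with the base case S -> "the dog barks".

def sentence(depth):
    """Build a sentence with `depth` levels of embedding."""
    if depth == 0:
        return "the dog barks"
    return "the cat thinks " + sentence(depth - 1)

for d in range(3):
    print(sentence(d))
# the dog barks
# the cat thinks the dog barks
# the cat thinks the cat thinks the dog barks
```

Since `depth` can be any non-negative integer, the finite rule set yields an infinite set of distinct sentences, which is the point the notes make about recursion.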
Note that just as in phonology, syntax uses abstract, unspoken organization to guide sentence construction. Consider the following ungrammatical sentence: *What did you see the boy who was reading? This sentence is impossible as a way to question the object of the verb in the embedded relative clause (you saw the boy who was reading what). Evidently all languages disallow movement of items out of complex noun phrases, like nouns modified by relative clauses. This kind of constraint is abstract, untaught, and universal. It is thus like other abstract features of language that appear to underlie the overall organization of language as a modular system.
Crying is the baby's first means of communication, followed by cooing, in which the baby practices using the musculature of the mouth. Producing well-formed expressions requires a great deal of control over the musculature and nerve connections of the mouth and throat. Through practice, the baby begins to babble: the child's speech has become digitized into discrete units. After about a year the child can say one word but can understand about a hundred; there is an asymmetry between production of words and comprehension of them. Output is not in direct correlation with input, as can be seen in babies saying nouns first (even in SOV languages).
The idea of learnability is the idea that Input --> Program --> Output. The program is Universal Grammar, and the output is competence: the phonetics, syntax, and semantics. Input is taken in from the parents as varied, impoverished, and positive "motherese" and is sent to the child's Universal Grammar. The UG is abstract, representational, and structural. Properties of the Program and Output are not directly traceable to the Input.
We can see the abstract structure in children's decisions about language in the regularities of their errors: substitutions of stops for continuants in phonology; acquisition of derivational morphology before inflectional morphology; deletion of pronominal subjects in syntax (pro-drop); and lexical innovations to fill gaps by assigning forms to expected combinations of meaning units (cold this up!).