Friday, March 21, 2014

Linguistics in the 20th century

I read some Chomsky first and then, with diminishing respect, went looking for something better at the dear old Concord Library and discovered Whorf. It is interesting how much energy the former spent defeating academic adversaries rather than addressing the topics at hand.

It is too bad they lived in an age before it was possible to imagine teaching a computer to understand some limited part of a natural language. In narrow-world language processing, the question is how to capture meaning and extract it automatically from examples of natural language. The idea is that if you have a sufficiently well defined "world" object, its member variables can set themselves from the text. So you have to put your money where your mouth is: write software that fills in data objects from language samples, then do something with the objects. To do it successfully, you have to confront the notion of word meaning at a mathematical level. I am trying to do that with best models and the proto semantics.
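
To make that concrete, here is a minimal sketch of the idea - a toy "world" object whose member variables set themselves from a sentence. The OrderWorld class and its regex patterns are invented just for this post, a toy and not the real thing:

```python
import re
from dataclasses import dataclass
from typing import Optional

# A toy "narrow world" object: a simple order. Its member variables
# are the slots that must set themselves from a language sample.
@dataclass
class OrderWorld:
    person: Optional[str] = None   # who wants something
    item: Optional[str] = None     # the thing that is wanted
    quantity: int = 1

    def fill_from(self, text: str) -> None:
        """Scan one sentence and let the slots set themselves."""
        m = re.search(r"(\w+) wants?\b", text)
        if m:
            self.person = m.group(1)
        m = re.search(r"wants? (?:(a|an|two|three) )?(\w+)", text)
        if m:
            numbers = {"two": 2, "three": 3}
            self.quantity = numbers.get(m.group(1), 1)
            self.item = m.group(2)

world = OrderWorld()
world.fill_from("Alice wants two coffees")
print(world)  # OrderWorld(person='Alice', item='coffees', quantity=2)
```

The point is only the shape of the exercise: the "world" defines what meaning is recoverable, and the software has to actually fill it in from raw sentences.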

But the controversies of the 20th century did not go away. I hope Whorf would approve of my scheme of proto semantic shapes, filled with words having native context for each speaker of the language. I hope he would be sympathetic to my view that many of our primary abstract words develop from childhood games that are culture specific. And I wonder if he would also approve of my idea that the cultural differences are built on top of simpler meaning entities that may be universal.

For example, do not all cultures include "person", "place", "thing", "want", "ask", and action verbs, as well as things like transformation, sequence, and grouping? Note that for programming a single language it does not really matter whether these are universal or not; a sketch of such primitives follows below.
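
Such a primitive vocabulary could be declared as types, with culture-specific words layered on top. The names here are purely illustrative, invented for this post rather than taken from any real proto semantics:

```python
from enum import Enum, auto

# Candidate universal primitives, declared as types that a
# single-language program could build on. Names are illustrative.
class Primitive(Enum):
    PERSON = auto()
    PLACE = auto()
    THING = auto()
    WANT = auto()
    ASK = auto()
    TRANSFORM = auto()
    SEQUENCE = auto()
    GROUP = auto()

# A culture-specific word binds native context onto a primitive,
# in the spirit of a proto semantic shape filled per speaker.
culture_layer = {
    "home": (Primitive.PLACE, "childhood house, warmth"),
    "tag":  (Primitive.TRANSFORM, "playground game: chaser and chased swap roles"),
}
```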

But there is something in what Chomsky says that I feel is very true. The words in my mind are encoded with musculature - what you could call the phonemes [or is it morphemes?]. As I dozed off last night, the word "only" split off an "-ly", which brought a sensation of deep meaning. It would not be surprising at all if the concepts that use "-ly" were physically manifested in my brain as a thinking mechanism that includes those muscles. But that is the implementation of meaning, not its form or content. So yes, Chomsky, meaning is there in the physical implementation of language, and the use of such muscles is critically important - the same way my computer uses silicon and larger objects like transistors to perform the operations of a program. But it is the program itself that is of most interest, with the elusive "meaning" we want to understand.

Separately, the algebraic rules of grammar are necessary for extracting subtleties of meaning. They apply during parsing and during the ordering of the input into narrative structures. But how important are they in practice? Aren't single concepts usually described with adjacent words? It is a tough subject; a toy illustration follows below.
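
As a crude test of the adjacent-words intuition (this is not a parser, and it uses no grammar rules at all), treat each adjacent pair of words as a candidate single concept:

```python
# Each adjacent word pair is a candidate single concept.
def adjacent_concepts(sentence: str):
    words = sentence.lower().split()
    return list(zip(words, words[1:]))

print(adjacent_concepts("the red ball rolled away"))
# [('the', 'red'), ('red', 'ball'), ('ball', 'rolled'), ('rolled', 'away')]
```

Pairs like ("red", "ball") fall out for free; what grammar buys you is everything that adjacency misses, such as which noun a distant clause modifies.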
Update: My son David mentions that Saussure wrote that the forms and sounds of words cannot be connected to their meanings. I agree completely. At the same time, a system that stores meanings in an efficient way will use a strong degree of parallelism between the word forms/sounds and the word meanings.
