Friday, July 28, 2023
Free speech dies in a "secure" world
Thursday, July 27, 2023
Computer eloquence is not a priority (for me)
Who cares how eloquent a computer is when all that matters is that its information is useful?
Saturday, July 22, 2023
Speech without comprehension
Where can I be snide if not on my own blog? So, the AI rage these days is the wonderful language mimicry of ChatGPT and OpenAI. What a deep understanding of sentence structure provides (brought to you by Chomsky and the other dolts north of the Charles) is that computers can now produce fine sentences and paragraphs with no understanding: the form of the question determines the form of the answer, with tidbits of internet-scraped content inserted in the appropriate blank spaces of the sentence structure.
Beautifully articulated prose with no understanding is what you get from Chomsky himself. Imagine believing that the transistor explains the software written on a computer! Or that internet protocols are all that is needed to understand internet content! Or that the meaning of "red" could somehow be embedded in grammar and syntax. Anyway, Chomsky's stupidity is now leveraged on the world and engineers are frantically trying to figure out how "large language models" are supposed to do things like order a pizza. They cannot.
But the old "assistants" like Siri and Alexa work fine using keywords. My approach has always been to do better keyword matching, extended to include narrative pattern matching.
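To make the distinction concrete, here is a minimal sketch of what I mean. The function names, patterns, and phrases are my hypothetical illustrations, not anyone's actual assistant code: plain keyword matching fires on the presence of any trigger word, while a narrative pattern requires the words to occur in order, which captures some of the shape of what was said.

```python
# Minimal sketch: keyword matching vs. narrative pattern matching.
# All patterns and phrases here are hypothetical illustrations.

def keyword_match(utterance, keywords):
    """Old-style assistant trigger: any keyword present fires the command."""
    words = set(utterance.lower().split())
    return any(k in words for k in keywords)

def narrative_match(utterance, pattern):
    """Narrative pattern: the keywords must appear in order, gaps allowed."""
    words = utterance.lower().split()
    i = 0
    for k in pattern:
        try:
            i = words.index(k, i) + 1   # each keyword must come after the last
        except ValueError:
            return False
    return True

print(keyword_match("please set a timer for ten minutes", {"timer", "alarm"}))
print(narrative_match("set a timer for ten minutes", ["set", "timer", "minutes"]))
print(narrative_match("minutes of timer set", ["set", "timer", "minutes"]))
```

The last call returns False even though every keyword is present, which is exactly the extra discrimination that word order buys you over a bag of keywords.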
Anyway, something genuinely ironic is beginning to emerge in the latest ChatGPT developments. They are starting to market "structured commands" as a kind of input template "to save the user having to re-type the same things over and over". I think that is not really what is going on. It seems obvious that the attempts to make ChatGPT work are going to end up introducing keywords. Some poor bloke thinks all they need to do is structure the command, let ChatGPT do something with that input, then write a program that acts on the basis of what ChatGPT produces as output. The irony is this: it will not take them long to realize that ChatGPT added no value and can be eliminated as an unnecessary intermediary between structured commands and structured output.
In the end, Chomsky's work will have the lasting value of making computers seem literate. But for language understanding, there has been no progress from that direction.
I am proud I went to BU: My university was south of the Charles River. We were phenomenologists in Boston, while they were logical positivists in Cambridge. It explains why they are still trying to write programs to discover reality over there.
Wednesday, July 19, 2023
Barbara Waksman in 2010
Continuing the posting of nice pictures of Barb. We don't have too many from when she was in hygiene school:
Friday, July 14, 2023
Personal goals for guitar playing
It takes a level of skill to play the guitar with grace. It takes another level to play with athleticism. In the end (and in private) you want to be able to play with abandon.
Saturday, July 8, 2023
Knapping ideas - mental arrowheads
Very conscious of getting decrepit, I am frantically trying to finish certain works on paper to leave behind as my "arrowheads". Hopefully someone will look carefully along "the beach", and find what I wrote.
Saturday, July 1, 2023
Fixing the narrative pattern <--> topic correspondence
What needs to get fixed is my thinking about it. Sadly, every narrative pattern needs to correspond to a unique topic, and I must deal with equivalence of narratives and of topics.
E.g. I now must think of (X) as indicating a sub-narrative on the one hand, and a topic that is part of another topic on the other. Sadly, I now have to figure out why a sub-topic is equivalent to its container when the containing "parent" adds no additional parts or attributes beyond X. Or I have to live with them being different.
Update: This is WRONG. Narratives are too complicated to be topic structures. And ((X)) cannot be a different topic from X. Trying to set up topic "equivalence" is a nightmare. Also, the whole point of the ledger is to handle multiple topics. X, Y, Z, ... cannot be a topic; it is a narrative that becomes a ledger. It is like this: a sequence of points is not a point, and it is a mistake to think in that direction.
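A sketch of the distinction I am arguing for above, assuming nothing about the real system: the class names and fields are hypothetical. A topic is a single point; a narrative is an ordered sequence of topics and is not itself a topic; and the narrative unrolls into a ledger that records its topics one by one.

```python
# Hypothetical sketch: a topic is a point, a narrative is a sequence of
# topics, and a ledger is what the narrative becomes when its topics are
# recorded in order. Not the actual system's data model.
from dataclasses import dataclass

@dataclass(frozen=True)
class Topic:
    name: str                 # a single "point": one topic, e.g. X

@dataclass
class Narrative:
    topics: list              # an ordered sequence X, Y, Z, ... of Topics

    def to_ledger(self):
        """A narrative is not a topic; it unrolls into ledger entries."""
        return [(i, t.name) for i, t in enumerate(self.topics)]

n = Narrative([Topic("X"), Topic("Y"), Topic("Z")])
print(n.to_ledger())  # [(0, 'X'), (1, 'Y'), (2, 'Z')]
```

The point of keeping the two types separate is that nothing ever forces a Narrative to masquerade as a Topic, so the "equivalence" question never arises.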