Wednesday, November 29, 2017

Lighting a pipe with a computer mouse

An interesting error. Each such error gives important clues about what is going on in cognitive processing. In this case, I want to light my pipe and reach not for the lighter but for the mouse.

I conclude that I reach out for an object that will help me perform an action. If I am not paying attention, I can engage this at too abstract a level and end up picking up something suited to one of the other actions available to me at the desk. It is a lot like the previous post, where one response to a missing piece is to ask for it. In this case, the response to an unfulfilled action is to pick up a tool.
Anyway, when you do that, you could be filling in a narrative with VARs that are categories containing sub-VARs, whose more specific meaning is needed.
The error of picking up the mouse occurs when we climb the tree of available action types and follow the wrong branch while trying to get more specific. I make the error because I am thinking of other things and apply only part of the necessary energy to the details of the tree climbing.
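The tree climbing can be sketched in a few lines. This is a toy illustration of my own (the tree, the goals, and `pick_tool` are all made up, not Narwhal code): actions live in a tree, and fetching a tool means descending from an abstract category to a specific object. With full attention you follow the right branch; with partial attention the descent stops at the category level and a sibling branch's tool gets grabbed.

```python
# Toy sketch of the action tree (hypothetical, for illustration only).
ACTION_TREE = {
    "desk actions": {
        "light pipe": ["lighter"],
        "use computer": ["mouse", "keyboard"],
    }
}

def pick_tool(goal, attentive=True):
    """Descend the tree toward the tool for `goal`."""
    category = ACTION_TREE["desk actions"]
    if attentive:
        return category[goal][0]          # right branch: "lighter"
    # Inattentive: the descent stops at the category level and a
    # tool from some other branch may be grabbed instead.
    wrong_branch = [g for g in category if g != goal][0]
    return category[wrong_branch][0]      # e.g. "mouse"
```

With attention, `pick_tool("light pipe")` yields the lighter; without it, the mouse.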

Third game of the world series - a context challenge for chatbots

I have been puzzling about how to retrieve context in the classic Siri fail of:

ME: Who won the world series?
Siri: The Houston Astros
ME: Who pitched in the third game?
Siri: Here is what I found online about that.

The good thing is that context retrieval seems to follow its own mechanisms, which do not need to be too entwined with existing code. It is a bit confusing that you need a non-language mechanism to look up the answer to the specific question, like "Who won the third game of the world series?" Language's only responsibility here is to recognize the incompleteness of "third game" and to search the context for a specific value of 'game'. The non-language part would have to take over after that.

I also like the slowly dawning realization that when such things cannot be located in the recent context, the right thing to do is ask a question. Siri might have said: "Which game exactly?"
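Both halves can be sketched together. This is a minimal sketch in my own hypothetical names (nothing to do with Siri's or Narwhal's actual code): the language side notices that "third game" is incomplete, a lookup searches recent context for what 'game' refers to, and when nothing is found the bot asks a clarifying question instead of guessing.

```python
# Recent exchange, most recent last (toy context for illustration).
CONTEXT = ["Who won the world series?", "The Houston Astros"]

def resolve(phrase, context):
    """Try to complete an underspecified phrase from recent context."""
    for utterance in reversed(context):       # most recent first
        if "world series" in utterance.lower():
            return phrase + " of the world series"
    return None                               # nothing in context fits

def answer(phrase, context):
    completed = resolve(phrase, context)
    if completed is None:
        return "Which game exactly?"          # ask, don't guess
    return "Looking up: who pitched in the " + completed + "?"
```

With the context above, `answer("third game", CONTEXT)` hands the completed question to the non-language lookup; with an empty context it falls back to the clarifying question.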

Tuesday, November 28, 2017

Any chance Apple is lying about facial recognition working?

How could they get away with lying? Well, if they have the phone's location, they may be able to deduce who is holding it. Obviously it would be problematic to search the whole database for matching faces, so perhaps their "AI" does a location filter.
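The location-filter idea can be made concrete with a toy sketch, entirely my own speculation (this is not Apple's system, and a real face signature would not be a single number): instead of matching a face against every enrolled user, first narrow the candidates to users whose phones are near the observed location, then match within that small set.

```python
from math import hypot

# Hypothetical enrolled users: a location and a (fake, scalar) face signature.
USERS = [
    {"name": "A", "loc": (0.0, 0.0), "face": 0.91},
    {"name": "B", "loc": (50.0, 50.0), "face": 0.90},
]

def identify(face_signature, phone_loc, radius=1.0):
    """Location-filter first, then face-match within the survivors."""
    nearby = [u for u in USERS
              if hypot(u["loc"][0] - phone_loc[0],
                       u["loc"][1] - phone_loc[1]) <= radius]
    matches = [u for u in nearby
               if abs(u["face"] - face_signature) < 0.05]
    return matches[0]["name"] if matches else None
```

The point of the sketch is only the ordering: the expensive face comparison runs over a handful of nearby candidates rather than the whole database.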

This thought was prompted by an anecdote where someone's cousin was showing off their new iPhone at Thanksgiving but the facial recognition software was not working, although it "usually does".

Friday, November 24, 2017

Building word-to-picture conversational agents

I want to call them chatbots - for changing a scene and navigating within it. I built one for a colored ball placed next to a scale. That is the "bouncy" chatbot on Narwhal. I just started another one, called "MyAbutment". Here is what the graphics will look like:

This shows the placement of the margin with respect to the gum line, adjacent teeth, and/or implant interface. Left to right: subgingival, at the interface, supra gingival.
The solid lines, other than the abutment margin, represent the gum line, adjacent teeth, and opposing teeth.

Monday, November 6, 2017

The Expected-but-Missing in Context Retrieval

It is a bit of a puzzle how to look back over past word exchanges between a chatbot and its client (the context) to find the meaning of an indeterminate word in the current input, like "it" or "both". It turns out to be reasonably simple in the case of an incomplete input pattern with a missing but expected part that is referenced by "it" or "both". You simply go back over the context and filter out everything except words of the expected-but-missing type. Then retrieve as many as are called for by the indeterminate word, if they are available in that filtered result.
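The filter-and-retrieve step can be sketched directly. This is a minimal sketch in hypothetical names of my own choosing (not the actual Narwhal API): keep only context words of the expected-but-missing type, most recent first, then take as many as the indeterminate word calls for.

```python
COLOR = {"red", "blue", "green"}      # the expected-but-missing type (toy example)

HOW_MANY = {"it": 1, "both": 2}       # how many values each indeterminate word wants

def resolve_indeterminate(word, context_words, expected_type=COLOR):
    wanted = HOW_MANY.get(word, 0)
    # Filter the context down to words of the expected type, most recent first.
    candidates = [w for w in reversed(context_words) if w in expected_type]
    return candidates[:wanted] if len(candidates) >= wanted else None
```

So with the context "put red next to blue", the word "both" retrieves the two colors; if the context has fewer than the word calls for, the lookup fails and (as in the Siri post above) the right move is to ask.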

Saturday, November 4, 2017

The Expected but Missing in a narrative pattern

I am particularly proud of these lines of code:

  if 0.25 <= node.GOF <= 0.75:
      # find the missing node (if there is just one)
      EBM = node.nar.getExpectedButMissing()
      if EBM == NULL_VAR:
          continue

      # convert the context into children of that node
      a = self.C.getAll()
      a2 = EBM.filter(a)
      if len(a2) == 0:
          continue


The routine then proceeds to grab as many entries from a2 as needed, if available, and insert them into the incomplete narrative.

Wednesday, November 1, 2017

Bitcoin is a pyramid scheme

Early players got coins that were easy to get, later players got coins that were harder to get, and the value of the coins has grown. Sure, it is geeky, but it is a classic pyramid scheme. Is that legal?