Wednesday, August 31, 2016

The close integration of different perceptions within our experience

About dogs and crushes on girls.
Scientists have found that dogs perceive kind words differently from the same words spoken without kindness. Nor are dogs fooled by kind-sounding gibberish. Instead, based on MRI scans of their right and left hemispheres while being spoken to, it appears that dogs use both hemispheres in perceiving the positive reinforcement of praise. This has numerous interesting consequences. I see it as suggesting that the spoken language of humans shares an emotional component with the language of dogs, just as we seem to share musical scales with birds. And these things come naturally to all the species involved.

Written language is a poor thing. Bereft of the emotional content carried by tone and enunciation, it simply stands to remind us of what the spoken language was like. In fact we must fill it in and "speak to ourselves while reading" to get a full sense of what is written. Still, we might get it wrong, as with sarcasm, or when reading a script out loud. This non-emotional content is huge for humans but may matter differently for dogs and birds.

We had that conversation about "common coding" theory, which says that the muscles used to say words are connected with the memories associated with those words, and with how those memories are stored. The consequence, based on the dog observation, is that we share similar musculature with dogs.

Perhaps separating thought and musculature creates a false dichotomy. Perhaps language and its function of communication are too closely tied to everything else that we act on, perceive, and experience. I don't know but I want to mention one more example.

When you have a strong crush on a girl and are trying hard to put her out of your mind, physical sensation breaks through the barrier. Physical touch becomes a trigger for recalling the person. Could it be that there is a pattern of 'woman' so built in to my act of physical sensation that each touch speaks her name? If that were so, then presumably even the most primitive creatures would experience love. It is not much fun.

Wednesday, August 24, 2016

Port Huron Float Down - illegal "floaters" from USA to Canada, Aug 24 2016

The question comes up as to whether the Americans who were 'rescued' were also tagged before they were 'released' back into America. This in turn leads to the speculation of using "tag and recapture" to estimate the number of Americans in terms of the ratio of re-captured individuals.
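For the record, the usual tag-and-recapture arithmetic (the Lincoln-Petersen estimator) is simple enough to sketch; the numbers below are entirely made up for illustration.

def lincoln_petersen(tagged_first, caught_second, recaptured):
    # Estimate population size from a tag-and-recapture experiment:
    #   tagged_first  - number tagged and released in the first round
    #   caught_second - number caught in the second round
    #   recaptured    - number in the second round that carry a tag
    if recaptured == 0:
        raise ValueError("no recaptures: estimate is unbounded")
    return tagged_first * caught_second / recaptured

# Hypothetical numbers: 1500 floaters tagged on release,
# 200 picked up later, 3 of them with tags.
print(lincoln_petersen(1500, 200, 3))   # 100000.0 Americans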

Tuesday, August 23, 2016

Spell Hacker

I have a premise for a story: a guy living in a world of magic, who does not himself have magical ability. But it turns out that spells are like software and almost invariably are buggy. So this fellow learns how to hack the bugs in spells. For example, he and his group are immobilized by a spell but he leans slightly forward and back a few times and cracks himself out, then thinks: "let's see...there usually is a bush somewhere with a...there's one...[he looks underneath the bush]....there it is! This spell is a piece of crap." He fiddles with something....there is a "poof"...and his friends are freed as well.

Friday, August 19, 2016

Road layout determines driving behavior independent of traffic density

You could say that driving behavior consists of speeding up, slowing down, and changing lanes. I note - while driving past Route 20 going south on 128 at 1:30 PM Friday - that people are making the same lane changes that, later in the day, gunk up the traffic, but which do not at this lower density. The behavior is still there; it just does not affect traffic at this density.
On the other hand, speed changes and opportunistic lane changes (the ones that aren't necessary) are related to density. 

Thursday, August 18, 2016

How do we take the measure of a word?

I was stuck on this for a long time but I now believe the answer is:
 - Within a hierarchy of word categories that surround the word in a particular semantic frame. The location of a word in this hierarchy is thoroughly analogous to how numbers are located in smaller and smaller half-intervals of [0,1]. But words, unlike numbers, are not infinitely subdivisible.
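A toy sketch of the analogy (the category names and the helper below are my own inventions, not Narwhal code): choosing a branch at each level of the hierarchy is like choosing a sub-interval of [0,1], except the subdivision bottoms out.

def interval_for_path(path, branching=2):
    # Map a path of branch choices (each 0..branching-1) to a sub-interval of [0,1].
    lo, hi = 0.0, 1.0
    for choice in path:
        width = (hi - lo) / branching
        lo, hi = lo + choice * width, lo + (choice + 1) * width
    return lo, hi

# 'collie' as, say, animal(0) -> mammal(0) -> dog(1) -> collie(0):
print(interval_for_path([0, 0, 1, 0]))   # (0.125, 0.1875) -- and it stops here;
# unlike numbers, the word is not subdivided further.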

Wednesday, August 17, 2016

How to handle a polarity change deep inside the sub-narrative tree?

A totally idiosyncratic question, but the answer is theoretical: it postulates that a narrative can have only a positive or a negative value. You are not waiting for an ambiguous good/bad to be resolved later; you are always operating with a current value.
So the answer to the question is: whenever a deep polarity change occurs, the whole evolving narrative should get its polarity changed as well.
Update: It is particularly interesting to suppose a collection of alternative interpretations that widens and narrows on the way through a text but with one component - the value - being shared by all the alternatives. This 'value' is always current and is not projected as a possible future. 
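A toy sketch of that rule (the class and method names are mine, not Narwhal's): a polarity change found anywhere in the tree flips the root.

class Narrative:
    def __init__(self, polarity=True, children=None):
        self.polarity = polarity        # True = positive, False = negative
        self.children = children or []
        self.flipped = False            # set when a polarity change is detected here

    def contains_flip(self):
        return self.flipped or any(c.contains_flip() for c in self.children)

    def apply_deep_flip(self):
        # The rule: a polarity change anywhere in the sub-narrative tree
        # changes the polarity of the whole evolving narrative.
        if self.contains_flip():
            self.polarity = not self.polarity

# Usage: a flip three levels down flips the root.
leaf = Narrative(); leaf.flipped = True
root = Narrative(children=[Narrative(children=[leaf])])
root.apply_deep_flip()
print(root.polarity)   # False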

Thursday, August 11, 2016

Owl hoots, the clave, and narrative patterns

I was having a sleepless night and tried to turn it to good use thinking about how to implement the "moving topic" in Narwhal. I had three good ideas, which I will write more about later. The basic idea of the "moving topic" is that you move through the words of a sentence with a collection of hypothetical meanings that gets added to and narrowed down as you go.
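A deliberately over-simple sketch of the "moving topic" (not Narwhal code; the real idea both widens and narrows the collection, whereas here it is seeded by the first word and only narrowed afterward; the tiny lexicon is invented):

LEXICON = {
    "hot":   {"weather", "food", "temper"},
    "soup":  {"food"},
    "today": {"weather", "food", "temper"},
}

def moving_topic(words):
    hypotheses = set()
    for w in words:
        candidates = LEXICON.get(w, set())
        # seed with the first word's candidates, then narrow on each word after
        hypotheses = candidates if not hypotheses else hypotheses & candidates
    return hypotheses

print(moving_topic(["hot", "soup", "today"]))   # {'food'}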

Then there is the clave (or "stick"), which Glenway Fripp has been telling me about. Apparently it serves as a fixed rhythmic pattern - whether or not you hear it - and in Latin American music this fixed pattern replaces the regular "beat" of European music. This idea of an irregular rhythm being the basis for variation (of the overlaid melody, not of the clave "beat" itself) has been on my mind since I started hearing about it in July.

And then around 4:30 AM a Great Horned Owl started hooting. Its call has a three-note rhythm at the beginning, followed by a more complex rhythmic sequence of hoots. The latter hoots (as I listened carefully to them) varied in 'attack', 'duration', 'intensity', and 'inflection', and in enough different ways that I was not sure whether the owl was repeating itself. A casual listener would conclude the song was the same each time, with only subtle [and presumably meaningless] differences in the vocalization of each hoot.

I bet that is not right. I bet that three-note prelude allows sync'ing to it and guarantees the rhythm expectation (of any listener) for the more variable part of the pattern to follow. I also bet that if you recorded the same owl hoot over and over, you would see a very deterministic variation in the second sequence of notes. As a computer programmer, I know how much information can be encoded in a binary sequence. And with the clave-like sequence of owl hoots, given the idea of an evolving narrative pattern, it totally makes sense that the hoot variations would be within a fixed rhythmic sequence, but with more and more inflection variation towards the end of the sequence.

There is absolutely no mathematical basis for the assumption that all owl hoot sequences are the same, or that their songs are aesthetic but without content. There is plenty of room in that data for subtle meanings. In fact there is plenty of room for a collection of hypothetical narratives to be narrowed down to a final confirmed meaning.
Update: The point is that if the theory is that birds sing for aesthetic reasons and to show off - but not to communicate - then that theory has to explain a change in variability part way through the song. I don't see how it can.

Saturday, August 6, 2016

Value propagation in verb and adjective narratives

We have relations like
A -v-> B
and
A _v/ B
The question is: how do the values associated with A, v, and B become a value for the whole expression? Letting 0 = bad and 1 = good, we can write rules like the following.
Always ignore the value of A and use:
value( A -v-> B ) = value(B) + value(v) +1
value( A _v/ B ) = value(A)*value(B)
For fun you can write:
1 - val( A -v-> B ) = val(A) + val(B)
    val( A _v/ B ) = val(A) * val(B)
Also, I think we should use:
val( A, B ) = val(B)
and
val( A::B ) = val(B)
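A quick sketch of these rules in code (my own reading: in particular I take the "+1" in the verb rule to be arithmetic mod 2 so the result stays in {0,1}; the function names are just placeholders):

def value_verb(val_A, val_v, val_B):
    # value( A -v-> B ): the value of A is ignored; combine verb and object values
    return (val_B + val_v + 1) % 2

def value_adjective(val_A, val_B):
    # value( A _v/ B ) = value(A) * value(B)
    return val_A * val_B

def value_comma(val_A, val_B):
    # val( A, B ) = val(B)
    return val_B

def value_scope(val_A, val_B):
    # val( A::B ) = val(B)
    return val_B

# e.g. a bad verb acting on a bad thing comes out good (a double negative):
print(value_verb(1, 0, 0))   # 1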

Friday, August 5, 2016

GOF - the goodness of fit formula [a work in progress]

I think this is right, and took the time to draw it. (Actually it is not; see below.)

We consider all possible assignments of words of text to slots of a narrative pattern, including not using some of the slots. We are interested in assignments that use as many slots as possible. Let p_used be the number of slots used in a given assignment and let |text| be the number of words of text. Let "delta i" be the difference between the first and last indices of words that are used, plus one. Then a measure of goodness of fit (or "GOF") for the assignment is:

GOF = ( p_used / |text| ) * ( p_used / delta i )

This rewards having extra slots that match but does not punish for extra slots that are un-matched. To do that, pre-multiply by pattern length divided by text length.
Update: as usual there is a bit of confusion as something like this starts to finalize. In fact the formula confuses the number of slots used, out of the total available in the narrative pattern, with the number of text indices used (in the many-to-many mapping of slots to indices), out of the available indices in the text. If we take p_used in the above formula to be the number of text token indices used in pattern matching, then the missing piece that penalizes un-used slots is the factor (u/n), where u is the number of slots of the pattern that are used and n is the total available number of slots. So, less elegantly but more correctly, we can let
|p_used|= num pattern slots used 
|p| = num pattern slots available
|text_used| = num text token indices used
|text| = num text token indices available
di = (last index used - first index used + 1)
then define
GOF = ( |p_used|/|p| )  *  (|text_used|/|text|) * (|text_used|/ di)
where the first factor measures how much of the pattern is used, the second measures how much of the text is consumed, and the third measures how clustered that use is. But note that the "use" of the pattern may be smeared out over the text.

Currently (in Sept): I am favoring one optimization involving u = numSlotsUsed() of a narrative, n = the total number of slots available, r = words read, and f = lastwordread - firstwordread + 1. The formula for goodness of fit is gof = (u/n)*(r/f). A different version, used in recursion, is simply 'u', along with trying to read as many words as possible.
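A quick sketch of the gof = (u/n)*(r/f) computation in plain Python (not the Narwhal implementation; the argument names just follow the definitions above):

def gof(num_slots_used, num_slots_total, text_indices_used):
    # text_indices_used: sorted list of text token indices consumed by the match
    if num_slots_total == 0 or not text_indices_used:
        return 0.0
    u_over_n = num_slots_used / num_slots_total
    r = len(text_indices_used)                              # words read
    f = text_indices_used[-1] - text_indices_used[0] + 1    # span: last - first + 1
    return u_over_n * (r / f)

# e.g. 3 of 4 slots used, matching tokens at positions 2, 3, and 5:
print(gof(3, 4, [2, 3, 5]))   # 0.75 * (3/4) = 0.5625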

Tuesday, August 2, 2016

Things rich people say - continuing the series

Overheard:
"...the neighborhood was kind of run down, I mean for a million bucks you expect a little more.."

see also here.

More about the 'story' noun type

In principle this may not be too hard to add to the proto semantics. 
Verbs of 'story'
person "tell" story   (write: person-tell->story)
person "listen to" story (equivalently: story-listened to by->person )
thing "evokes" story  (thing-evokes->story)

Adjectives of 'story'
There is one adjective qualifying a story as "about" another narrative pattern. So we write
story _/ narrative

There are also some semantic rules about plans of action being converted to story and vice versa.
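A toy sketch of how these relations could be written down as data (the names below are placeholders of mine, not the proto-semantics notation itself):

from collections import namedtuple

Relation = namedtuple("Relation", ["subject", "link", "object"])

STORY_RELATIONS = [
    Relation("person", "tell",      "story"),      # person -tell-> story
    Relation("person", "listen to", "story"),      # story -listened to by-> person
    Relation("thing",  "evokes",    "story"),      # thing -evokes-> story
    Relation("story",  "about",     "narrative"),  # story _/ narrative (the adjective)
]

for r in STORY_RELATIONS:
    print(f"{r.subject} -{r.link}-> {r.object}")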