Tuesday, October 30, 2018
Tornadoes in Massachusetts
Pretty rare; I think there were two in the last five years. Funny thing: I was in both of them. In Concord I was about 1/4 mile away, and in Woods Hole my house was right underneath the thing as it dissipated. Quite a puff of wind.
Thursday, October 11, 2018
Little Adventures in Woods Hole
Each day has a little something different. Hopefully that continues. Yesterday it was seeing false albacore: turquoise kite shapes, gone in the blink of an eye, disdaining my lure in passing. Today it was a weird giant needlefish:
Other recent moments of fun: losing two lures in as many minutes on my first fishing venture, by dinghy, in Buzzards Bay.
Also: 5 deer in the yard one time, 18 turkeys another time. Gray seals in the water of Lackeys Bay behind Nonamesett.
Also: my first bluefish and how to find clams at Ram Island (with a friend of a friend named Tyler):
Sunday, October 7, 2018
The stirring of new ideas
Talking with youths (Tyler Boone and David Levy, and a bit with the "cool kids" out at Intuition Machines, who are friends of my son David) gets me back to thinking harder about hierarchies. My "merge split append" algorithm (which might be called a "silver" algorithm, versus the final "golden" algorithm) does its work using an existing hierarchy.
So I have been thinking that the way you use a hierarchy must be pretty similar to the way the hierarchy was created in the first place. The mechanisms involved in creating the hierarchy are becoming interesting. To which end, this figure: I am striving to understand how dPerception/dValue can be an independent variable in a learning formula.
I'll figure it out. Gimme a little while.
Hebbs versus Persig
Let us start with separate individual entities (vagueness about this is a problem, but in some versions, the entities are somebody's idea of a "neuron") that are bound together progressively by a Hebbian principle of "what fires together, wires together" [very poetic]. The result is that clusters arise as groups of associated individuals. But we can arrive at the same groupings by a "Persigian" principle that starts with a single large group of all the individuals, then splits up that group, a bit at a time.
I just realized something cool about the difference between a bottom-up Hebbian grouping and a top-down Persigian grouping: Persigian grouping leaves behind a sequence of larger groups, so, as a procedure, it produces a hierarchy in the course of generating the final grouping. The Hebbian procedure does not. In other words, all the looser associations that are a byproduct of the top-down procedure are absent from a bottom-up procedure.
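The contrast can be made concrete with a little sketch. Everything here is invented for illustration (the items, the "fires together" relation, and the split rule are mine, not anything from the post); the only point being demonstrated is the structural one above: the top-down procedure records every intermediate group as it goes, while the bottom-up procedure keeps only the final partition.

```python
def hebbian_grouping(items, related):
    """Bottom-up: wire together pairs that 'fire together'.
    Only the final clusters survive; no intermediate groups are kept."""
    groups = [{x} for x in items]
    merged = True
    while merged:
        merged = False
        for i in range(len(groups)):
            for j in range(i + 1, len(groups)):
                if any(related(a, b) for a in groups[i] for b in groups[j]):
                    groups[i] |= groups.pop(j)
                    merged = True
                    break
            if merged:
                break
    return groups  # the final partition, and nothing else

def persigian_grouping(items, split):
    """Top-down: start with one big group of everything and split
    recursively. Every intermediate group gets recorded, so a hierarchy
    falls out as a byproduct of reaching the same final partition."""
    hierarchy = []

    def recurse(group):
        hierarchy.append(set(group))
        left, right = split(group)
        if left and right:  # split() returns an empty half to say "stop"
            recurse(left)
            recurse(right)

    recurse(set(items))
    return hierarchy  # final clusters plus all the looser associations
```

Run both on the same toy input (grouping integers by parity) and they agree on the final clusters, but only the Persigian version hands back the larger enclosing group as well.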