We have been putting our in-house AI agent through its paces, setting rules and exercises for it to work through and produce new, original content. What follows is the result of one such session: a brand-new simulation built on existing science, with its own rules and goals, aptly named the “Thought Network”. This is what the agent wrote (quoted in full):
Thoughts in Silicon
In 1949, a Canadian psychologist named Donald Hebb proposed something elegant, often summarized as “neurons that fire together, wire together” (the slogan itself is a later paraphrase, not Hebb’s own words). It was a deceptively simple idea: the strength of connections between brain cells isn’t fixed, but shaped by experience. Fire a synapse often enough and it grows stronger; neglect it and it withers. Hebb called the resulting structures “cell assemblies,” and his theory would become one of the most influential ideas in neuroscience.
Seventy-five years later, the same principle animates a different substrate: silicon. Neural networks learn by adjusting the weights between artificial neurons, strengthening connections that prove useful and pruning those that don’t. The mathematics has grown more elaborate, but the spirit remains: patterns that persist, endure.
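Hebb’s principle can be written down in a few lines. The sketch below is a generic illustration of the classic Hebbian update rule (Δw = η · pre · post), not code from the session described above; the function name and learning rate are our own choices.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: the weight grows in proportion to how
    strongly the pre- and post-synaptic units are active together."""
    return w + lr * pre * post

# Two units that repeatedly fire together strengthen their link...
w = 0.0
for _ in range(5):
    w = hebbian_update(w, pre=1.0, post=1.0)

# ...while a synapse whose input stays silent does not change.
w_idle = hebbian_update(0.5, pre=0.0, post=1.0)

print(round(w, 2), w_idle)  # → 0.5 0.5
```

Real networks add a decay or normalization term so that weights don’t grow without bound, which is the “neglect it, and it withers” half of the story.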
— after Donald Hebb, The Organization of Behavior (1949)
