Rant alert!
I’m not a scientist, but I was an avid and curious reader. As a programmer I had to watch for the smallest detail, such as the semicolon at the end of a line of code, and for the macro: what my client’s business was all about. Most important, I had fun doing it. Maybe the most interesting subject I discovered was the science of Complexity, and my favorite Complexity scientist is Stuart Kauffman, the author of At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, available at Amazon.
Increasing entropy is one of the fundamental laws of the Universe; it means increasing disorder, increasing randomness. The only way for a system to become organized is to acquire energy from outside, as the Earth gets energy from the Sun. After a few billion years, random molecules get organized into life and humans. Atoms like to get together, provided there is enough energy to get them to do it. Stuart Kauffman calls it “order for free.”
Kauffman contends that complexity itself triggers self-organization, or what he calls “order for free”: if enough different molecules pass a certain threshold of complexity, they begin to self-organize into a new entity, a living cell. Kauffman uses the analogy of a thousand buttons on a rug: join two buttons randomly with thread, then another two, and so on. At first you have isolated pairs; later, small clusters; but suddenly, at around the 500th repetition, a remarkable transformation occurs, much like the phase transition when water abruptly turns to ice, and the buttons link up in one giant network.
Source: At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, NASA/ADS.
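For the curious: Kauffman’s buttons are what mathematicians call a random graph, and the sudden linking-up is the giant-component phase transition that Erdős and Rényi proved happens at about n/2 threads for n buttons. Here is a minimal sketch of the experiment in Python; the function and parameter names are my own:

```python
import random
from collections import Counter

def largest_cluster(n_buttons, n_threads, seed=0):
    """Sew n_threads random threads between n_buttons buttons and
    report the size of the largest connected cluster."""
    random.seed(seed)
    parent = list(range(n_buttons))        # union-find forest

    def find(i):                           # root of button i's cluster
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for _ in range(n_threads):
        a, b = random.sample(range(n_buttons), 2)
        parent[find(a)] = find(b)          # a thread merges two clusters

    return max(Counter(find(i) for i in range(n_buttons)).values())

# Largest cluster among 1000 buttons as threads accumulate;
# watch it take off right around the 500-thread mark.
for threads in (100, 300, 500, 700, 1000):
    print(threads, "threads ->", largest_cluster(1000, threads), "buttons")
```

At 100 or 300 threads the largest cluster stays small; by 700 it has swallowed roughly half the buttons. That is the phase transition Kauffman is pointing at.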
The brain is no exception. In my most unscientific way I came to the idea (long before I discovered Complexity) that the way the brain works is by pattern matching. It stores huge numbers of patterns: sounds, images, memories, faces, voices, smells; and it has the ability to match incoming data against stored data. That’s how one recognizes a face among millions almost instantly. The deeper the memories (more data), the faster the recognition. I don’t have a clue how it’s done, but it’s probably not much different from how ants and bees organize themselves. Anything that does not work dies, and we get evolution of what works.
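To make “match incoming data against stored data” concrete, here is a toy nearest-neighbour sketch in Python. The stored “faces” are random vectors I made up for illustration; this is a cartoon of pattern matching, not a model of the brain:

```python
import numpy as np

def recognize(incoming, stored, labels):
    """Return the label of the stored pattern closest to the incoming one."""
    distances = np.linalg.norm(stored - incoming, axis=1)
    return labels[int(np.argmin(distances))]

rng = np.random.default_rng(0)
stored = rng.normal(size=(100_000, 64))          # 100,000 remembered "faces"
labels = [f"person-{i}" for i in range(100_000)]

# A noisy view of face #42 still matches back to person-42.
query = stored[42] + rng.normal(scale=0.2, size=64)
print(recognize(query, stored, labels))          # -> person-42
```

One brute-force scan over 100,000 memories takes milliseconds on a laptop; presumably the brain does something massively parallel and far cleverer, but the principle of comparing a new pattern against everything stored is the same.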
Punch Line
That’s probably how neural networks and machine learning work. When I started as a programmer, not everything was code; some computing machines were programmed using plugboards:
https://www.google.com/search?q=plugboards&tbm=isch
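Since I just claimed that’s how machine learning works, here is a minimal perceptron in Python, the 1950s ancestor of today’s neural networks, learning a pattern from data rather than being programmed with it. The toy dataset and learning rate are my own illustration:

```python
import numpy as np

# Two clouds of 2-D points, one per class; purely illustrative data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.5, size=(50, 2)),   # class 0
               rng.normal(+1.0, 0.5, size=(50, 2))])  # class 1
y = np.array([0] * 50 + [1] * 50)

w, b = np.zeros(2), 0.0          # weights and bias start at nothing
for epoch in range(20):
    for xi, yi in zip(X, y):
        pred = int(w @ xi + b > 0)
        err = yi - pred          # -1, 0, or +1
        w += 0.1 * err * xi      # nudge the weights toward the data
        b += 0.1 * err

acc = np.mean([int(w @ xi + b > 0) == yi for xi, yi in zip(X, y)])
print("weights:", w, "bias:", b, "accuracy:", acc)
```

No one wires the rule in; the weights organize themselves from the data, which is the whole point.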
Almost everything that can be done by code can be done by hardware. Back at Bletchley Park, where Alan Turing built the electromechanical Bombes that broke the Enigma code, there were debates over whether vacuum tubes (valves in British English), since replaced by transistors, were reliable enough for computing machinery; for the later Colossus machine, vacuum tubes won out. My first computer had 2000 triodes.
tri·ode | ˈtrīōd |
noun
a vacuum tube having three electrodes.
• a semiconductor rectifier having three connections.
The Dictionary
Conclusion
It’s the data, Stupid!
The Captain