Neural networks and physical systems with emergent collective computational abilities. J. J. Hopfield in Proc. Natl. Acad. Sci. 79:2554 (1982).

What the paper says!?
This is a seminal paper where Hopfield shows that an Ising-type system can be "trained" to behave as an associative memory (now known as a "Hopfield network"). It answers in the affirmative the question of whether collective phenomena in systems of simple interacting neurons can have useful "computational" functions, thereby igniting the neural network era of Artificial Intelligence:
This paper examines a new modeling of this old and fundamental question (4-8) and shows that important computational properties spontaneously arise.
References 4-8 being:
Interestingly, William Little is among the pioneers of this field here too!
There he makes the key observations:
Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons).
(First line of the abstract), as well as:
He also puts emphasis on elements whose actual relevance is less clear, such as synchronicity, symmetric pairing of the neurons, etc.
The Hopfield network as described by Hopfield:
It differs from the perceptron (an earlier attempt) by its all-to-all couplings. This accounts for the new results:
All our interesting results arise as consequences of the strong back-coupling.
It is also asynchronous, so less like a clocked computer, and it deals with abstract encoding as opposed to signal processing.
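Concretely, in the paper's description of the dynamics, each neuron i readjusts its state at random times, independently of all the others: it sets V_i → 1 if Σ_{j≠i} T_ij V_j exceeds its threshold U_i (taken as 0) and V_i → 0 otherwise, so there is no global clock ordering the updates.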
The training is Hebbian-inspired, with the weights of the connections defined by the information to be encoded:
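As a minimal sketch of this storage prescription (T_ij = Σ_s (2V_i^s − 1)(2V_j^s − 1) with T_ii = 0, as given in the paper) together with the asynchronous recall described above, here is a short Python illustration; the variable names and the toy demo are mine, not the paper's:

import numpy as np

rng = np.random.default_rng(0)

N, n = 100, 5                               # neurons, number of stored memories
memories = rng.integers(0, 2, size=(n, N))  # the V^s patterns, entries in {0, 1}

# Storage prescription: T_ij = sum_s (2 V_i^s - 1)(2 V_j^s - 1), with T_ii = 0
S = 2 * memories - 1
T = S.T @ S
np.fill_diagonal(T, 0)

def recall(V, steps=20000):
    # Asynchronous dynamics: a randomly chosen neuron thresholds its input field.
    V = V.copy()
    for _ in range(steps):
        i = rng.integers(N)
        V[i] = 1 if T[i] @ V > 0 else 0
    return V

# Corrupt one stored memory and let the network relax back to it.
probe = memories[0].copy()
probe[rng.choice(N, size=15, replace=False)] ^= 1
restored = recall(probe)
print("wrong bits before recall:", int(np.sum(probe != memories[0])))
print("wrong bits after recall :", int(np.sum(restored != memories[0])))

With only n = 5 memories for N = 100 neurons, the corrupted pattern is typically restored exactly; pushing n toward the capacity discussed below degrades this.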
The size of the problems tackled:
Results of the simulation (fairly fast).
He also observed chaotic wandering in a small region of state space. He notes that «Simulations with N = 100 were much slower and not quantitatively pursued.»
The maximal information stored for N=100 neurons occurred at about n=13 possible memories.
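As a rough consistency check (my arithmetic, not a figure from the paper): the paper estimates that about 0.15 N states can be simultaneously remembered before errors in recall become severe, and 0.15 × 100 = 15 is indeed close to the observed optimum of n ≈ 13; since each memory is an N-bit word, this amounts to roughly 13 × 100 ≈ 1300 bits held in the ≈ 10^4 couplings T_ij.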
On limitations and new memories:
On the brain's performance in retaining memories:
Quantitative results:
As a formal example of something to be stored in memory, Hopfield seems to find nothing more vivid than Statistics of the Two-Dimensional Ferromagnet. Part I. H. A. Kramers and G. H. Wannier in Phys. Rev. 60:252 (1941):
The paper has a funny footnote: