
Neural networks and physical systems with emergent collective computational abilities. J. J. Hopfield in Proc. Natl. Acad. Sci. 79:2554 (1982).  What the paper says!?

Screenshot 20241023 005448.png

This is a seminal paper where Hopfield shows that an Ising-type system can be "trained" to behave as an associative memory (now known as a "Hopfield network"). This answers in the affirmative the question of whether collective phenomena in systems of simple interacting neurons have useful "computational" functions, thereby igniting the neural network era of Artificial Intelligence:

This paper examines a new modeling of this old and fundamental question (4-8) and shows that important computational properties spontaneously arise.

References 4-8 being:

Screenshot 20241106 150234.png

Interestingly, William Little is among the pioneers of this field here too!

There he makes the key observations:

Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons).

(the first line of the abstract), as well as:

  • this is inspired by neurobiology.
  • this can be implemented with electronic circuits.
  • this is independent of the details of the model.
  • the network can be trained (any desired solution can be made a stable point).

He also puts emphasis on elements whose actual relevance is less clear, such as synchronicity, symmetric pairing of the neurons, etc.

Screenshot 20241106 152558.png

The Hopfield network as described by Hopfield:

Screenshot 20241106 151311.png

It differs from earlier attempts such as the perceptron by its all-to-all couplings. This accounts for the new results:

All our interesting results arise as consequences of the strong back-coupling.

It is also asynchronous, so less like a clocked computer, and it deals with abstract encoding as opposed to signal processing.
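
To make this concrete, here is a minimal sketch of the asynchronous, zero-threshold dynamics on 0/1 neurons, V_i going to 1 if the input sum_j T_ij V_j exceeds 0 and to 0 otherwise, given some coupling matrix T. This is a sketch in Python with numpy; the function name and the random-sweep scheduling are choices made here, not the paper's, which lets each neuron update at random times:

import numpy as np

def run_asynchronously(T, V, max_sweeps=20, rng=None):
    """Asynchronous, zero-threshold update of 0/1 neurons with couplings T."""
    rng = np.random.default_rng() if rng is None else rng
    V = np.array(V, dtype=int)              # work on a copy of the initial state
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(V)):   # random order stands in for random-in-time updates
            new_state = int(T[i] @ V > 0)   # V_i -> 1 if sum_j T_ij V_j > 0, else 0
            if new_state != V[i]:
                V[i] = new_state
                changed = True
        if not changed:                     # no neuron wants to flip: a stable point is reached
            break
    return V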

The training is Hebbian-inspired, with the weights of the connections defined by the information to be encoded:

Screenshot 20241106 152038.png
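
A matching sketch of this storage prescription, T_ij = sum over memories s of (2 V_i^s − 1)(2 V_j^s − 1), with T_ii = 0 (same caveats as the sketch above):

import numpy as np

def store(memories):
    """Couplings T_ij = sum_s (2 V_i^s - 1)(2 V_j^s - 1), with T_ii = 0."""
    S = 2 * np.asarray(memories, dtype=int) - 1   # map 0/1 states to -1/+1; shape (n, N)
    T = S.T @ S                                   # sum of outer products over the stored states
    np.fill_diagonal(T, 0)                        # no self-coupling
    return T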

The size of the problems tackled:

Screenshot 20241106 160433.png

Results of the simulation (convergence is fairly fast):

Screenshot 20241106 160610.png

He also observed chaotic wandering in a small region of state space. He notes that «Simulations with N = 100 were much slower and not quantitatively pursued.»

The maximal information stored for N=100 neurons occurred at about n=13 possible memories.
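
As a rough illustration of this capacity (not the paper's exact protocol), one can reuse the two sketches above to store n random memories in an N = 100 network, corrupt one of them by flipping a few bits, run the dynamics and count how often the memory is recovered exactly; recall is essentially perfect for a handful of memories and degrades as n grows towards and beyond roughly 0.15 N:

import numpy as np

def recall_rate(N=100, n=13, flips=5, trials=20, seed=0):
    """Fraction of trials in which a corrupted memory flows back to itself."""
    rng = np.random.default_rng(seed)
    successes = 0
    for _ in range(trials):
        memories = rng.integers(0, 2, size=(n, N))   # n random 0/1 memories
        T = store(memories)                          # storage rule sketched above
        target = memories[0]
        probe = target.copy()
        flipped = rng.choice(N, size=flips, replace=False)
        probe[flipped] = 1 - probe[flipped]          # corrupt a few bits
        successes += np.array_equal(run_asynchronously(T, probe, rng=rng), target)
    return successes / trials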

On limitations and new memories:

Screenshot 20241106 161458.png

On the brain's ability to retain memories:

Screenshot 20241106 161624.png

Quantitative results:

Screenshot 20241106 162433.png

As a formal example of something to be stored in memory, Hopfield seems to have no more vivid example than Statistics of the Two-Dimensional Ferromagnet. Part I. H. A. Kramers and G. H. Wannier in Phys. Rev. 60:252 (1941):

Screenshot 20241104 221000.png

The paper has a funny footnote:

Screenshot 20241104 220706.png ("Not all advertisement is bad!")