Abstract

Bidirectionality, forward and backward information flow, is introduced in neural networks to produce two-way associative search for stored stimulus-response associations (Ai, Bi). Two fields of neurons, FA and FB, are connected by an n × p synaptic matrix M. Passing information through M gives one direction; passing information through its transpose M^T gives the other. Every matrix is bidirectionally stable for bivalent and for continuous neurons. Paired data (Ai, Bi) are encoded in M by summing bipolar correlation matrices. The bidirectional associative memory (BAM) behaves as a two-layer hierarchy of symmetrically connected neurons. When the neurons in FA and FB are activated, the network quickly evolves to a stable state of two-pattern reverberation, or pseudoadaptive resonance, for every connection topology M. The stable reverberation corresponds to a local minimum of the system energy. An adaptive BAM allows M to rapidly learn associations without supervision. Stable short-term memory reverberations across FA and FB gradually seep pattern information into the long-term memory connections M, allowing input associations (Ai, Bi) to dig their own energy wells in the network state space. The BAM correlation encoding scheme is extended to a general Hebbian learning law. Then every BAM adaptively resonates in the sense that all nodes and edges quickly equilibrate in a local minimum of the system energy. A sampling adaptive BAM results when many more training samples are presented than there are neurons in FA and FB, each sample presented only for a brief pulse of learning so that learning does not fully or even nearly converge. Learning tends to improve with sample size. Sampling adaptive BAMs can learn some simple continuous mappings and can rapidly abstract bivalent associations from several noisy gray-scale samples.
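The encoding and recall scheme summarized above can be sketched in a few lines of Python (an illustrative reconstruction, not the authors' simulation code; the two stored pairs are small toy patterns):

```python
# Bivalent BAM sketch: store pairs as a sum of bipolar outer products,
# then recall by thresholded passes through M (forward) and M^T (backward).

def bipolar(v):                       # binary {0,1} -> bipolar {-1,+1}
    return [2 * x - 1 for x in v]

def encode(pairs):                    # M = sum_i X_i^T Y_i
    n, p = len(pairs[0][0]), len(pairs[0][1])
    M = [[0] * p for _ in range(n)]
    for A, B in pairs:
        X, Y = bipolar(A), bipolar(B)
        for i in range(n):
            for j in range(p):
                M[i][j] += X[i] * Y[j]
    return M

def threshold(act, prev):             # zero activation keeps the old state
    return [1 if v > 0 else 0 if v < 0 else s for v, s in zip(act, prev)]

def recall(M, A):                     # iterate A -> M -> B -> M^T -> A ...
    n, p = len(M), len(M[0])
    B = [0] * p
    while True:
        B = threshold([sum(A[i] * M[i][j] for i in range(n)) for j in range(p)], B)
        A2 = threshold([sum(B[j] * M[i][j] for j in range(p)) for i in range(n)], A)
        if A2 == A:
            return A, B               # stable two-pattern reverberation
        A = A2

pairs = [([1, 0, 1, 0, 1, 0], [1, 1, 0, 0]),
         ([1, 1, 1, 0, 0, 0], [1, 0, 1, 0])]
M = encode(pairs)
assert recall(M, [1, 0, 1, 0, 1, 0]) == ([1, 0, 1, 0, 1, 0], [1, 1, 0, 0])
```

Presenting a stored A pattern reverberates to its paired B pattern, and the reverse pass through M^T restores A, giving the stable two-pattern reverberation described in the abstract.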

© 1987 Optical Society of America


References


  1. T. Kohonen, “Correlation Matrix Memories,” IEEE Trans. Comput. C-21, 353 (1972).
  2. T. Kohonen, Self-Organization and Associative Memory (Springer-Verlag, New York, 1984).
  3. J. A. Anderson, J. W. Silverstein, S. A. Ritz, R. S. Jones, “Distinctive Features, Categorical Perception, and Probability Learning: Some Applications of a Neural Model,” Psychol. Rev. 84, 413 (1977).
  4. G. A. Carpenter, S. Grossberg, “A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine,” Comput. Vision Graphics Image Process. 37, 54 (1987).
  5. S. Grossberg, “Adaptive Pattern Classification and Universal Recoding, II: Feedback, Expectation, Olfaction, and Illusions,” Biol. Cybern. 23, 187 (1976).
  6. S. Grossberg, “A Theory of Human Memory: Self-Organization and Performance of Sensory-Motor Codes, Maps, and Plans,” Prog. Theor. Biol. 5, 000 (1978).
  7. S. Grossberg, “How Does a Brain Build a Cognitive Code?,” Psychol. Rev. 87, 1 (1980).
  8. S. Grossberg, Studies of Mind and Brain: Neural Principles of Learning, Perception, Development, Cognition, and Motor Control (Reidel, Boston, 1982).
  9. S. Grossberg, The Adaptive Brain, I and II (North-Holland, Amsterdam, 1987).
  10. B. Kosko, “Bidirectional Associative Memories,” IEEE Trans. Syst. Man Cybern. SMC-00, 000 (1987).
  11. B. Kosko, “Fuzzy Associative Memories,” in Fuzzy Expert Systems, A. Kandel, Ed. (Addison-Wesley, Reading, MA, 1987).
  12. S. Grossberg, “Contour Enhancement, Short Term Memory, and Constancies in Reverberating Neural Networks,” Stud. Appl. Math. 52, 217 (1973).
  13. W. S. McCulloch, W. Pitts, “A Logical Calculus of the Ideas Immanent in Nervous Activity,” Bull. Math. Biophys. 5, 115 (1943).
  14. B. Kosko, “Fuzzy Entropy and Conditioning,” Inf. Sci. 40, 165 (1986).
  15. J. J. Hopfield, “Neural Networks and Physical Systems with Emergent Collective Computational Abilities,” Proc. Natl. Acad. Sci. U.S.A. 79, 2554 (1982).
  16. M. A. Cohen, S. Grossberg, “Absolute Stability of Global Pattern Formation and Parallel Memory Storage by Competitive Neural Networks,” IEEE Trans. Syst. Man Cybern. SMC-13, 815 (1983).
  17. D. B. Parker, “Learning Logic,” Invention Report S81-64, File 1, Office of Technology Licensing, Stanford U. (Oct. 1982).
  18. D. B. Parker, “Learning Logic,” TR-47, Center for Computational Research in Economics and Management Science, MIT (Apr. 1985).
  19. D. E. Rumelhart, G. E. Hinton, R. J. Williams, “Learning Internal Representations by Error Propagation,” ICS Report 8506, Institute for Cognitive Science, U. California San Diego (Sept. 1985).
  20. P. J. Werbos, “Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences,” Ph.D. Dissertation in Statistics, Harvard U. (Aug. 1974).
  21. B. Kosko, C. Guest, “Optical Bidirectional Associative Memories,” Proc. Soc. Photo-Opt. Instrum. Eng. 758 (1987).
  22. J. J. Hopfield, “Neurons with Graded Response Have Collective Computational Properties Like Those of Two-State Neurons,” Proc. Natl. Acad. Sci. U.S.A. 81, 3088 (1984).
  23. S. Grossberg, “Adaptive Pattern Classification and Universal Recoding, I: Parallel Development and Coding of Neural Feature Detectors,” Biol. Cybern. 23, 121 (1976).
  24. R. Hecht-Nielsen, “CounterPropagation Networks,” in Proceedings, First International Conference on Neural Networks (IEEE, New York, 1987).



Figures (6)

Fig. 1. Asynchronous BAM recall. Approximately six neurons update per snapshot. The associated spatial patterns (S,E), (M,V), and (G,N) are stored. Field FA contains 140 neurons; field FB, 108. Perfect recall of (S,E) is achieved when recall is initiated with a 40% noise-corrupted version of (S,E).
Fig. 2. Matrix–vector multiplier BAM.
Fig. 3. BAM volume reflection hologram.
Fig. 4. Sampling adaptive BAM noisy training set. Forty-eight randomly generated gray-scale noise patterns are presented to the system. Unlike in simple heteroassociative storage, no sample is presented long enough for learning to fully or nearly converge. Twenty-four of the samples are noisy versions of the bipolar association (Y,W); twenty-four are noisy versions of (B,Z). Three samples are displayed from each training set. Samples are presented four at a time: four from the (Y,W) training set, then four from the (B,Z) training set, then the next four from the (Y,W) training set, and so on. Both fields FA and FB contain forty-nine neurons, so the samples violate the storage capacity m ≪ min(n,p) for simple heteroassociative storage.
Fig. 5. Sampling adaptive BAM associative recall and abstraction. A new noisy version of Y is presented to field FA. Initial BAM STM activation across FA and FB is random. The BAM converges to the pure bipolar association (Y,W) it has never experienced but has abstracted from the noisy training samples in Fig. 4.
Fig. 6. Sampling adaptive BAM STM superimposition and associative recall. A new noisy version of Z is presented to field FB. This time the bipolar association (Y,W) recalled in Fig. 5 is reverberating in STM. This thought is soon crowded out of STM by the environmental stimulus Z. Again the BAM converges to the unobserved pure bipolar association, this time (B,Z), abstracted from the noisy training samples.

Equations (42)


M = A_1^T B_1 + \cdots + A_m^T B_m.
A_i M = A_i A_i^T B_i + \sum_{j \ne i} (A_i A_j^T) B_j = B_i.
A \to M \to B, \quad B \to M^T \to A', \quad A' \to M \to B', \quad B' \to M^T \to A'', \quad \ldots, \quad A_f \to M \to B_f, \quad B_f \to M^T \to A_f.
a_i = \begin{cases} 1 & \text{if } B M_i^T > 0, \\ 0 & \text{if } B M_i^T < 0, \end{cases}
b_j = \begin{cases} 1 & \text{if } A M_j > 0, \\ 0 & \text{if } A M_j < 0, \end{cases}
E(A, B) = -\tfrac{1}{2} A M B^T - \tfrac{1}{2} B M^T A^T = -A M B^T = -\sum_i \sum_j a_i b_j m_{ij},
E(A, B) = -A M B^T - I A^T + T A^T - J B^T + S B^T.
E(A, B) \ge -\sum_i \sum_j |m_{ij}|.
\Delta E = -\Delta A\, M B^T = -\sum_i \Delta a_i \sum_j b_j m_{ij} = -\sum_i \Delta a_i\, B M_i^T.
M^T = (A_1^T B_1)^T + \cdots + (A_m^T B_m)^T = B_1^T A_1 + \cdots + B_m^T A_m.
M = X_1^T Y_1 + \cdots + X_m^T Y_m,
X_i M = (X_i X_i^T) Y_i + \sum_{j \ne i} (X_i X_j^T) Y_j = n Y_i + \sum_{j \ne i} (X_i X_j^T) Y_j = \sum_j c_{ij} Y_j,
\frac{1}{n} H(A_i, A_j) \sim \frac{1}{p} H(B_i, B_j),
c_{ij} \gtrless 0 \quad \text{iff} \quad H(A_i, A_j) \lessgtr n/2.
c_{ij} = X_i X_j^T = (\text{number of common elements}) - (\text{number of different elements}) = [n - H(A_i, A_j)] - H(A_i, A_j) = n - 2 H(A_i, A_j).
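The sign law above rests on the identity c_ij = n − 2H(A_i, A_j); a quick check with randomly generated patterns (pure Python; the pattern length and sample count are arbitrary choices for the illustration):

```python
import random

def bipolar(v):                    # binary {0,1} -> bipolar {-1,+1}
    return [2 * x - 1 for x in v]

def hamming(u, v):                 # H(A_i, A_j): number of differing bits
    return sum(a != b for a, b in zip(u, v))

n = 16
random.seed(0)
for _ in range(100):
    Ai = [random.randint(0, 1) for _ in range(n)]
    Aj = [random.randint(0, 1) for _ in range(n)]
    cij = sum(x * y for x, y in zip(bipolar(Ai), bipolar(Aj)))  # X_i X_j^T
    assert cij == n - 2 * hamming(Ai, Aj)   # c_ij = n - 2 H(A_i, A_j)
```

Each agreeing bit contributes +1 to the bipolar correlation and each disagreeing bit contributes −1, which is exactly (n − H) − H.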
A_i X_i^T \gtrless c_{ij} \quad \text{iff} \quad H(A_i, A_j) \lessgtr n/2.
A_1 = (1\ 0\ 1\ 0\ 1\ 0) \quad B_1 = (1\ 1\ 0\ 0), \qquad A_2 = (1\ 1\ 1\ 0\ 0\ 0) \quad B_2 = (1\ 0\ 1\ 0).
X_1 = (1\ {-1}\ 1\ {-1}\ 1\ {-1}) \quad Y_1 = (1\ 1\ {-1}\ {-1}), \qquad X_2 = (1\ 1\ 1\ {-1}\ {-1}\ {-1}) \quad Y_2 = (1\ {-1}\ 1\ {-1}).
X_1^T Y_1 = \begin{pmatrix} 1 & 1 & -1 & -1 \\ -1 & -1 & 1 & 1 \\ 1 & 1 & -1 & -1 \\ -1 & -1 & 1 & 1 \\ 1 & 1 & -1 & -1 \\ -1 & -1 & 1 & 1 \end{pmatrix}, \quad X_2^T Y_2 = \begin{pmatrix} 1 & -1 & 1 & -1 \\ 1 & -1 & 1 & -1 \\ 1 & -1 & 1 & -1 \\ -1 & 1 & -1 & 1 \\ -1 & 1 & -1 & 1 \\ -1 & 1 & -1 & 1 \end{pmatrix}.
M = \begin{pmatrix} 2 & 0 & 0 & -2 \\ 0 & -2 & 2 & 0 \\ 2 & 0 & 0 & -2 \\ -2 & 0 & 0 & 2 \\ 0 & 2 & -2 & 0 \\ -2 & 0 & 0 & 2 \end{pmatrix}.
A_1 M = (4\ 2\ {-2}\ {-4}) \to (1\ 1\ 0\ 0) = B_1, \quad A_2 M = (4\ {-2}\ 2\ {-4}) \to (1\ 0\ 1\ 0) = B_2,
B_1 M^T = (2\ {-2}\ 2\ {-2}\ 2\ {-2}) \to (1\ 0\ 1\ 0\ 1\ 0) = A_1, \quad B_2 M^T = (2\ 2\ 2\ {-2}\ {-2}\ {-2}) \to (1\ 1\ 1\ 0\ 0\ 0) = A_2,
A M = (2\ {-2}\ 2\ {-2}) \to (1\ 0\ 1\ 0) = B_2,
A^c M = ({-2}\ 2\ {-2}\ 2) \to (0\ 1\ 0\ 1) = B_2^c,
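This worked example can be verified numerically, together with the energy E(A,B) = −AMB^T: the stored pairs sit in deeper energy wells than mixed pairings (an illustrative pure-Python check):

```python
def bipolar(v):                    # binary {0,1} -> bipolar {-1,+1}
    return [2 * x - 1 for x in v]

A1, B1 = [1, 0, 1, 0, 1, 0], [1, 1, 0, 0]
A2, B2 = [1, 1, 1, 0, 0, 0], [1, 0, 1, 0]

# M = X1^T Y1 + X2^T Y2
X1, Y1, X2, Y2 = bipolar(A1), bipolar(B1), bipolar(A2), bipolar(B2)
M = [[X1[i] * Y1[j] + X2[i] * Y2[j] for j in range(4)] for i in range(6)]

def E(A, B):                       # E(A, B) = -A M B^T
    return -sum(A[i] * M[i][j] * B[j] for i in range(6) for j in range(4))

assert M[0] == [2, 0, 0, -2]       # first row of the stored matrix
assert [sum(A1[i] * M[i][j] for i in range(6)) for j in range(4)] == [4, 2, -2, -4]
assert (E(A1, B1), E(A2, B2)) == (-6, -6)   # stored pairs: energy minima
assert (E(A1, B2), E(A2, B1)) == (-2, -2)   # mixed pairs lie higher
```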
\dot a_i = -a_i + \sum_j S(b_j)\, m_{ij} + I_i,
\dot b_j = -b_j + \sum_i S(a_i)\, m_{ij} + J_j,
\dot a_i = -a_i + (A_i - a_i)\left[ S(a_i) + I_i^E \right] - a_i \left[ \sum_j m_{ij} S(b_j) + I_i^I \right],
\dot b_j = -b_j + (B_j - b_j)\left[ S(b_j) + J_j^E \right] - b_j \left[ \sum_i m_{ij} S(a_i) + J_j^I \right].
\begin{pmatrix} 0 & M \\ M^T & 0 \end{pmatrix},
E(A, B) = \sum_i \int_0^{a_i} S'(x_i)\, x_i\, dx_i - \sum_i \sum_j S(a_i) S(b_j) m_{ij} - \sum_i S(a_i) I_i + \sum_j \int_0^{b_j} S'(y_j)\, y_j\, dy_j - \sum_j S(b_j) J_j.
\dot E = -\sum_i S'(a_i)\, \dot a_i \left[ -a_i + \sum_j S(b_j) m_{ij} + I_i \right] - \sum_j S'(b_j)\, \dot b_j \left[ -b_j + \sum_i S(a_i) m_{ij} + J_j \right] = -\sum_i S'(a_i)\, \dot a_i^2 - \sum_j S'(b_j)\, \dot b_j^2 \le 0,
\dot m_{ij} = -m_{ij},
\dot m_{ij} = -m_{ij} + a_i b_j.
\dot m_{ij} = -m_{ij} + S(a_i) S(b_j).
m_{ij} = S_e(a_i)\, S_e(b_j).
-1 \le S(a_i) S(b_j) \le 1.
\dot m_{ij} + m_{ij} = 1,
m_{ij}(t) = e^{-t} m_{ij}(0) + \int_0^t e^{s - t}\, ds = e^{-t} m_{ij}(0) + (1 - e^{-t}) \to 1 \ \text{as } t \text{ increases, for any initial } m_{ij}(0).
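A numeric sanity check of this exponential approach to 1 (Euler integration of the law above; the step size and horizon are arbitrary choices):

```python
import math

def simulate(m0, dt=0.001, T=10.0):
    """Euler-integrate mdot = 1 - m (i.e., mdot + m = 1) from m(0) = m0."""
    m, t = m0, 0.0
    while t < T:
        m += dt * (1.0 - m)
        t += dt
    return m

for m0 in (-1.0, 0.0, 2.0):
    # closed form: m(T) = exp(-T) m0 + (1 - exp(-T))
    exact = math.exp(-10.0) * m0 + (1.0 - math.exp(-10.0))
    assert abs(simulate(m0) - exact) < 1e-2
    assert abs(simulate(m0) - 1.0) < 1e-3   # converges to 1 for any m0
```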
E(A, B, M) = F + \tfrac{1}{2} \sum_i \sum_j m_{ij}^2,
\frac{d}{dt}\left[ -S(a_i) S(b_j) m_{ij} \right] = -\dot m_{ij}\, S(a_i) S(b_j) - S'(a_i)\, \dot a_i\, m_{ij} S(b_j) - S'(b_j)\, \dot b_j\, m_{ij} S(a_i).
\dot E = -\sum_i \sum_j \dot m_{ij} \left[ S(a_i) S(b_j) - m_{ij} \right] - \sum_i S'(a_i)\, \dot a_i^2 - \sum_j S'(b_j)\, \dot b_j^2 = -\sum_i \sum_j \dot m_{ij}^2 - \sum_i S'(a_i)\, \dot a_i^2 - \sum_j S'(b_j)\, \dot b_j^2 \le 0,
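A small Euler simulation illustrates this joint equilibration of nodes and synapses under the signal-Hebb law with decay (an illustrative sketch: the logistic signal function, zero inputs, dimensions, and step size are all choices made for the example, and the signal-integral terms use the closed form for the logistic):

```python
import math
import random

S = lambda x: 1.0 / (1.0 + math.exp(-x))        # logistic signal function

def energy(a, b, m):
    """Lyapunov function for the adaptive BAM with inputs I, J set to zero.
    For the logistic signal, integral_0^v S'(x) x dx = v S(v) - ln(1+e^v) + ln 2."""
    def node_term(v):
        return v * S(v) - math.log(1.0 + math.exp(v)) + math.log(2.0)
    E = sum(node_term(ai) for ai in a) + sum(node_term(bj) for bj in b)
    E += sum(0.5 * m[i][j] ** 2 - S(a[i]) * S(b[j]) * m[i][j]
             for i in range(len(a)) for j in range(len(b)))
    return E

random.seed(1)
n, p, dt, steps = 5, 4, 0.01, 3000
a = [random.uniform(-1, 1) for _ in range(n)]
b = [random.uniform(-1, 1) for _ in range(p)]
m = [[random.uniform(-1, 1) for _ in range(p)] for _ in range(n)]

e0 = energy(a, b, m)
for _ in range(steps):                           # Euler integration
    da = [-a[i] + sum(S(b[j]) * m[i][j] for j in range(p)) for i in range(n)]
    db = [-b[j] + sum(S(a[i]) * m[i][j] for i in range(n)) for j in range(p)]
    dm = [[-m[i][j] + S(a[i]) * S(b[j]) for j in range(p)] for i in range(n)]
    a = [a[i] + dt * da[i] for i in range(n)]
    b = [b[j] + dt * db[j] for j in range(p)]
    m = [[m[i][j] + dt * dm[i][j] for j in range(p)] for i in range(n)]

assert energy(a, b, m) < e0                      # energy decreased
# at equilibrium the signal-Hebb law gives m_ij -> S(a_i) S(b_j)
assert all(abs(m[i][j] - S(a[i]) * S(b[j])) < 0.05
           for i in range(n) for j in range(p))
```

All nodes and edges settle together in an energy local minimum, the adaptive resonance described in the abstract.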
\dot m_{ij} = (I_i - m_{ij})\, b_j,
