Abstract
A neural net capable of restoring continuous-level library vectors from memory is considered. As with Hopfield's neural net content-addressable memory, the vectors in the memory library are used to program the neural interconnects. Given a portion of one of the library vectors, the net extrapolates the remainder. Sufficient conditions for convergence are stated, and the effects of processor inexactitude and net faults are discussed. A more efficient computational technique for performing the memory extrapolation (at the cost of fault tolerance) is derived. The special case of table lookup memories is addressed specifically.
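The extrapolation described in the abstract can be illustrated with a small sketch. The details below are assumptions, not taken from the paper: the interconnect matrix is taken to be the orthogonal projection onto the span of the library vectors (computed here via a pseudo-inverse), and the restoration alternates between projecting onto that subspace and re-clamping the known components of the partial vector. The function name `extrapolate` and all dimensions are illustrative.

```python
import numpy as np

# Library of continuous-level vectors, stored as the columns of F.
rng = np.random.default_rng(0)
F = rng.standard_normal((8, 3))

# Interconnect matrix programmed from the library vectors: the
# orthogonal projection onto the subspace they span (assumption).
T = F @ np.linalg.pinv(F)

def extrapolate(partial, known, iters=2000):
    """Restore a full library vector from its known components.

    `partial` holds correct values at the indices in `known`; the
    remaining entries may be arbitrary.  Each iteration projects onto
    the library subspace, then re-clamps the known components
    (alternating projections).
    """
    v = partial.copy()
    for _ in range(iters):
        v = T @ v             # project onto span of the library
        v[known] = partial[known]  # restore the given components
    return v

# Usage: hide the last five entries of library vector 0, then recover them.
target = F[:, 0]
known = np.arange(3)          # the first three components are given
probe = np.zeros(8)
probe[known] = target[known]
restored = extrapolate(probe, known)
```

Under the (generic) assumption that the known components uniquely determine a library-subspace vector, the alternating-projection iteration converges to the stored vector; this matches the abstract's note that convergence holds only under sufficient conditions.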
© 1987 Optical Society of America