Robert Manson Sawko, Malgorzata Zimon
SIAM/ASA JUQ
A boundary layer method is presented for accelerating the solution of the differential equations that govern the dynamics of an analog relaxation neural net in the high-gain limit. The inverse of the gain parameter in an analog neuron's transfer function serves as a small parameter, in terms of which the net dynamics separate into two time scales. This separation yields economies in the numerical treatment of the associated differential equations, which constitute the acceleration in question. Illustrative computations are presented. © 1993.
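The abstract describes a singular-perturbation (boundary-layer) treatment of analog relaxation dynamics in the high-gain limit. As a minimal sketch of that two-time-scale idea, assuming a Hopfield-style net du/dt = -u + W g(u) + I with transfer function g(u) = tanh(u/eps) and eps the inverse gain, the code below resolves the fast boundary-layer transient with a step proportional to eps and then advances the slow outer dynamics with an eps-independent step. The network size, weights, and step counts are illustrative assumptions, not the paper's.

```python
# Minimal sketch (not the paper's code): two-time-scale integration of a
# Hopfield-style analog relaxation net in the high-gain limit. The gain
# enters through its inverse eps; weights and steps are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 8
W = rng.standard_normal((n, n))
W = 0.5 * (W + W.T)           # symmetric coupling, no self-coupling
np.fill_diagonal(W, 0.0)
I = rng.standard_normal(n)    # external inputs
eps = 1e-3                    # inverse gain: the small parameter

def g(u):
    """High-gain sigmoid transfer function g(u) = tanh(u / eps)."""
    return np.tanh(u / eps)

def rhs(u):
    """Relaxation dynamics du/dt = -u + W g(u) + I."""
    return -u + W @ g(u) + I

def integrate(u, dt, steps):
    """Explicit Euler with a fixed step (illustrative only)."""
    for _ in range(steps):
        u = u + dt * rhs(u)
    return u

u0 = 0.1 * rng.standard_normal(n)

# Boundary-layer (inner) phase: the fast transient evolves on the O(eps)
# time scale, so it is resolved with a correspondingly small step ...
u_inner = integrate(u0, dt=0.1 * eps, steps=200)

# ... after which the outer (slow) dynamics are advanced with a step set by
# the O(1) relaxation scale, independent of eps -- the source of the speed-up.
u_outer = integrate(u_inner, dt=0.05, steps=400)

print("neuron outputs after two-phase integration:", g(u_outer))
```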
Simeon Furrer, Dirk Dahlhaus
ISIT 2005
Karthik Visweswariah, Sanjeev Kulkarni, et al.
IEEE International Symposium on Information Theory - Proceedings
Kafai Lai, Alan E. Rosenbluth, et al.
SPIE Advanced Lithography 2007