Implementing Plastic Weights in Neural Networks using Low Precision Arithmetic
2009 (English). In: Neurocomputing, ISSN 0925-2312, Vol. 72, no. 4-6, pp. 968-972. Article in journal (Refereed). Published.
In this letter, we develop a fixed-point arithmetic, low-precision implementation of an exponentially weighted moving average (EWMA) for use in a neural network with plastic weights. We analyze the proposed design both analytically and experimentally, and we evaluate its performance in an attractor neural network application. The EWMA in the proposed design has a constant relative truncation error, which is important for avoiding round-off errors in applications with slowly decaying processes, e.g. connectionist networks. We conclude that the proposed design offers greatly improved memory and computational efficiency compared to a naive implementation of the EWMA's difference equation, and that it is well suited for implementation in digital hardware.
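To illustrate the round-off problem the abstract refers to, the following is a minimal sketch of the *naive* fixed-point implementation of the EWMA difference equation y[n] = (1 - λ)·y[n-1] + λ·x[n] that the paper improves upon. It is not the paper's constant-relative-error design; the choice λ = 2^-k (so the multiply becomes a right shift) and the function names are assumptions made for illustration.

```python
def ewma_step(y, x, k):
    """One naive fixed-point EWMA update with lam = 2**-k.

    Integer-only arithmetic: the update term (x - y) >> k truncates
    toward zero, so for slowly decaying processes (large k) small
    differences |x - y| < 2**k produce no update at all -- the
    round-off error the paper's design is built to avoid.
    """
    return y + ((x - y) >> k)

def ewma(samples, k, y0=0):
    """Run the naive fixed-point EWMA over a sequence of integer samples."""
    y = y0
    out = []
    for x in samples:
        y = ewma_step(y, x, k)
        out.append(y)
    return out
```

For example, with k = 4 a step input of 1024 moves the average by 64 per sample, but a step input of 15 never moves it at all, since 15 >> 4 = 0: the filter output is stuck, which is exactly the truncation failure mode for slowly decaying (large-k) integrators.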
Place, publisher, year, edition, pages
2009. Vol. 72, no. 4-6, pp. 968-972.
Keywords: Exponentially weighted moving average, Fixed-point arithmetic, Leaky integrator, Low precision variables, Neural networks, Plastic weights
Identifiers
URN: urn:nbn:se:kth:diva-6240
DOI: 10.1016/j.neucom.2008.04.007
ISI: 000263372000030
Scopus ID: 2-s2.0-58149468833
OAI: oai:DiVA.org:kth-6240
DiVA: diva2:10892
Conference: 2nd International Work-Conference on the Interplay Between Natural and Artificial Computation, La Manga del Mar Menor, Spain, June 18-21, 2007