Learning Convolutive Features for Storage and Transmission between Networked Sensors
2015 (English). In: IEEE International Joint Conference on Neural Networks / [ed] IEEE, IEEE, 2015, pp. 1-8. Conference paper (Refereed)
Discovering an efficient representation that reflects the structure of a signal ensemble is a requirement of many Machine Learning and Signal Processing methods, and is of growing importance in sensing systems. Such a representation can be constructed by Convolutive Non-negative Matrix Factorization (CNMF), which finds parts-based convolutive representations of non-negative data. However, convolutive extensions of NMF have not yet considered storage efficiency as a side constraint during the learning procedure. To address this challenge, we describe a new algorithm that fuses ideas from 1) the parts-based learning literature and 2) the integer sequence compression literature. The resulting algorithm, Storable NMF (SNMF), enjoys the merits of both techniques: it retains the good-approximation properties of CNMF while also taking into account the size of the symbol set used to express the learned convolutive factors and activations. We argue that CNMF is not as amenable as SNMF to transmission and storage in networked sensing systems. We demonstrate that SNMF yields a compression ratio ranging from 10:1 up to 20:1, depending on the signal, which gives rise to a corresponding bandwidth saving for networked sensors.
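The CNMF model underlying the abstract approximates a non-negative matrix V (features x time) by a sum of time-shifted factor products, V ≈ Σ_t W[t] · shift(H, t). The sketch below is a generic Smaragdis-style CNMF with multiplicative KL-divergence updates, written purely for illustration; it is not the paper's SNMF (which additionally constrains the symbol set of the learned factors), and all function names and parameters here are my own.

```python
import numpy as np

def shift(X, t):
    """Shift the columns of X by t steps (t > 0 right, t < 0 left), zero-padding."""
    out = np.zeros_like(X)
    if t == 0:
        out[:] = X
    elif t > 0:
        out[:, t:] = X[:, :X.shape[1] - t]
    else:
        out[:, :t] = X[:, -t:]
    return out

def cnmf(V, rank, T, iters=200, seed=0):
    """Minimal convolutive NMF: V (F x N) ~= sum_t W[t] @ shift(H, t),
    with per-shift bases W (T x F x rank) and activations H (rank x N),
    fit by multiplicative updates for the KL divergence."""
    rng = np.random.default_rng(seed)
    F, N = V.shape
    W = rng.random((T, F, rank)) + 0.1   # non-negative init
    H = rng.random((rank, N)) + 0.1
    eps = 1e-9
    ones = np.ones_like(V)
    for _ in range(iters):
        # ratio V / reconstruction, the driving term of the KL updates
        Vh = sum(W[t] @ shift(H, t) for t in range(T)) + eps
        R = V / Vh
        for t in range(T):                # update each shifted basis
            Ht = shift(H, t)
            W[t] *= (R @ Ht.T) / (ones @ Ht.T + eps)
        Vh = sum(W[t] @ shift(H, t) for t in range(T)) + eps
        R = V / Vh
        # update H, averaging contributions across all shifts
        num = sum(W[t].T @ shift(R, -t) for t in range(T))
        den = sum(W[t].T @ shift(ones, -t) for t in range(T)) + eps
        H *= num / den
    return W, H
```

A usage example: `W, H = cnmf(V, rank=3, T=2)` factorizes a non-negative spectrogram-like matrix V into 3 convolutive parts of temporal extent 2; because the updates are multiplicative, W and H stay non-negative throughout.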
Place, publisher, year, edition, pages
IEEE, 2015. pp. 1-8.
Series: IEEE International Joint Conference on Neural Networks (IJCNN), ISSN 2161-4393
Research subject: Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-173806
DOI: 10.1109/IJCNN.2015.7280827
ISI: 000370730603125
Scopus ID: 2-s2.0-84951002606
ISBN: 978-1-4799-1959-8
OAI: oai:DiVA.org:kth-173806
DiVA: diva2:854951
International Joint Conference on Neural Networks 2015
QC 20160411
Available from: 2015-09-18 Created: 2015-09-18 Last updated: 2016-04-11 Bibliographically approved