Learning Convolutive Features for Storage and Transmission between Networked Sensors
KTH, School of Electrical Engineering (EES), Communication Networks. KTH, School of Electrical Engineering (EES), Centres, ACCESS Linnaeus Centre.
2015 (English). In: IEEE International Joint Conference on Neural Networks / [ed] IEEE, IEEE, 2015, p. 1-8. Conference paper, published paper (refereed).
Abstract [en]

Discovering an efficient representation that reflects the structure of a signal ensemble is a requirement of many Machine Learning and Signal Processing methods, and is gaining prevalence in sensing systems. This type of representation can be constructed by Convolutive Non-negative Matrix Factorization (CNMF), which finds parts-based convolutive representations of non-negative data. However, convolutive extensions of NMF have not yet considered storage efficiency as a side constraint during the learning procedure. To address this challenge, we describe a new algorithm that fuses ideas from 1) the parts-based learning and 2) the integer sequence compression literature. The resulting algorithm, Storable NMF (SNMF), enjoys the merits of both techniques: it retains the good-approximation properties of CNMF while also taking into account the size of the symbol set used to express the learned convolutive factors and activations. We argue that CNMF is not as amenable to transmission and storage in networked sensing systems as SNMF. We demonstrate that SNMF yields a compression ratio ranging from 10:1 up to 20:1, depending on the signal, which gives rise to a similar bandwidth saving for networked sensors.
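The CNMF model the abstract refers to generalises standard NMF by convolving a stack of basis matrices with a shared activation matrix; with a single time shift it reduces to ordinary NMF under the KL divergence. As a hedged illustration of that base model only (this is standard multiplicative-update NMF, not the paper's SNMF algorithm, and all names here are illustrative), a minimal sketch:

```python
import numpy as np

def nmf_kl(V, rank, iters=200, seed=0):
    """Multiplicative-update NMF under the KL divergence:
    V (non-negative, n x m) is approximated by W @ H with
    W (n x rank) and H (rank x m) kept non-negative throughout."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 1e-3
    H = rng.random((rank, m)) + 1e-3
    for _ in range(iters):
        WH = W @ H + 1e-9                                   # current reconstruction
        H *= (W.T @ (V / WH)) / (W.T @ np.ones_like(V) + 1e-9)
        WH = W @ H + 1e-9
        W *= ((V / WH) @ H.T) / (np.ones_like(V) @ H.T + 1e-9)
    return W, H

# Toy non-negative data: factors stay non-negative by construction,
# which is what makes the representation "parts-based".
V = np.random.default_rng(1).random((8, 20))
W, H = nmf_kl(V, rank=3)
```

The convolutive extension replaces the single `W` with a sequence `W_t` applied to time-shifted copies of `H`; SNMF additionally constrains the symbol set of the learned factors so they compress well for storage and transmission.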

Place, publisher, year, edition, pages
IEEE , 2015. p. 1-8
Series
IEEE International Joint Conference on Neural Networks (IJCNN), ISSN 2161-4393
National Category
Computer Systems
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-173806
DOI: 10.1109/IJCNN.2015.7280827
ISI: 000370730603125
Scopus ID: 2-s2.0-84951002606
ISBN: 978-1-4799-1959-8 (print)
OAI: oai:DiVA.org:kth-173806
DiVA id: diva2:854951
Conference
International Joint Conference on Neural Networks 2015
Projects
EOLAS
Note

QC 20160411

Available from: 2015-09-18. Created: 2015-09-18. Last updated: 2016-04-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
de Fréin, Ruairí
By organisation
Communication Networks
ACCESS Linnaeus Centre
Computer Systems

Search outside of DiVA

Google
Google Scholar
