Feature Reuse For A Randomization Based Neural Network
Liang, Xinyue. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0003-4406-536X
Skoglund, Mikael. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-7926-5081
Chatterjee, Saikat. KTH, School of Electrical Engineering and Computer Science (EECS), Centres, ACCESS Linnaeus Centre; Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0003-2638-6047
2021 (English). In: 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), Institute of Electrical and Electronics Engineers (IEEE), 2021, p. 2805-2809. Conference paper, Published paper (Refereed)
Abstract [en]

We propose a feature reuse approach for an existing multi-layer randomization based feedforward neural network. The feature representation is directly linked among all the necessary hidden layers. For the feature reuse at a particular layer, we concatenate features from the previous layers to construct a large-dimensional feature for the layer. The large-dimensional concatenated feature is then efficiently used to learn a limited number of parameters by solving a convex optimization problem. Experiments show that the proposed model improves the performance in comparison with the original neural network without a significant increase in computational complexity.
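The abstract describes concatenating the features of all previous layers into one large feature vector, then learning the layer's output parameters by solving a convex problem. The following is a minimal NumPy sketch of that general idea, not the authors' exact algorithm: it assumes ReLU hidden layers with random (untrained) weights, as is typical for randomization-based networks, and uses closed-form ridge regression as the convex solver. All function names and hyperparameters are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def feature_reuse_net(X, T, n_layers=3, n_hidden=50, lam=1e-2, seed=0):
    """Hypothetical sketch of feature reuse in a randomization-based
    feedforward network: keep the input and every hidden representation,
    concatenate them into one large feature, and fit a linear output map
    by ridge regression (a convex problem with a closed-form solution)."""
    rng = np.random.default_rng(seed)
    features = [X]          # feature reuse: retain every layer's representation
    h = X
    for _ in range(n_layers):
        # Random hidden weights are drawn once and never trained
        W = rng.standard_normal((h.shape[1], n_hidden)) / np.sqrt(h.shape[1])
        h = relu(h @ W)
        features.append(h)
    # Large-dimensional concatenated feature from all layers
    Z = np.concatenate(features, axis=1)
    # O = argmin_O ||Z O - T||_F^2 + lam ||O||_F^2  (closed-form solution)
    O = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ T)
    return Z @ O            # predictions on the training inputs

# Toy usage with random data (shapes only; no claim about accuracy)
X = np.random.default_rng(1).standard_normal((50, 8))
T = np.random.default_rng(2).standard_normal((50, 3))
Y = feature_reuse_net(X, T)
```

Because only the linear output map is learned, the number of trained parameters stays limited even as the concatenated feature grows, which is consistent with the abstract's claim of no significant increase in computational complexity.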

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2021, p. 2805-2809
Keywords [en]
Multi-layer neural network, randomization based neural network, convex optimization, feature reuse
National Category
Telecommunications
Identifiers
URN: urn:nbn:se:kth:diva-305415
DOI: 10.1109/ICASSP39728.2021.9413424
ISI: 000704288403012
Scopus ID: 2-s2.0-85114863008
OAI: oai:DiVA.org:kth-305415
DiVA, id: diva2:1615817
Conference
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), June 6-11, 2021, held online
Note

Part of proceedings: ISBN 978-1-7281-7605-5, QC 20230118

Available from: 2021-12-01. Created: 2021-12-01. Last updated: 2023-01-18. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Liang, Xinyue; Skoglund, Mikael; Chatterjee, Saikat

