High-dimensional neural feature using rectified linear unit and random matrix instance
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0002-8534-7622
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0003-1285-8947
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0002-7926-5081
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0003-2638-6047
2020 (English). In: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, Institute of Electrical and Electronics Engineers (IEEE), 2020, pp. 4237-4241. Conference paper, published paper (peer-reviewed)
Abstract [en]

We design a ReLU-based multilayer neural network to generate a rich high-dimensional feature vector. The feature guarantees a monotonically decreasing training cost as the number of layers increases. We design the weight matrix in each layer to extend the feature vectors to a higher-dimensional space while providing a richer representation in the sense of training cost. Linear projection to the target in the higher-dimensional space leads to a lower training cost if a convex cost is minimized. An ℓ2-norm convex constraint is used in the minimization to improve the generalization error and avoid overfitting. The regularization hyperparameters of the network are derived analytically to guarantee a monotonic decrease of the training cost, and therefore eliminate the need for cross-validation to find the regularization hyperparameter in each layer.
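The core idea in the abstract — expand features to a higher dimension with a random matrix and ReLU at each layer, then fit a convex, ℓ2-regularized linear projection to the targets — can be sketched as below. This is a minimal illustration, not the authors' exact construction: in particular, the paper's specific weight design and analytically derived regularization (which provide the monotone-decrease guarantee) are replaced here by plain Gaussian random matrices and a fixed ridge parameter, so the costs printed are not guaranteed to decrease layer by layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Rectified linear unit, applied elementwise."""
    return np.maximum(x, 0.0)

def train_cost(features, targets, lam):
    """Fit an l2-regularized (ridge) linear projection to the targets
    and return the mean-squared training cost. This is the convex
    minimization step; the closed form is W = (F^T F + lam I)^{-1} F^T T."""
    d = features.shape[1]
    W = np.linalg.solve(features.T @ features + lam * np.eye(d),
                        features.T @ targets)
    residual = features @ W - targets
    return float(np.mean(residual ** 2))

# Toy data (illustrative sizes): 200 samples, 10-dim input, 3-dim target.
X = rng.standard_normal((200, 10))
T = rng.standard_normal((200, 3))

features = X
costs = []
for layer in range(4):
    costs.append(train_cost(features, T, lam=1e-2))
    # Random matrix instance mapping features to a HIGHER dimension,
    # followed by ReLU — the "high-dimensional neural feature".
    R = rng.standard_normal((features.shape[1], 2 * features.shape[1]))
    features = relu(features @ R)

print(costs)
```

In the paper, the per-layer weight matrix is designed (rather than purely random) so that the optimal linear predictor of the previous layer is representable in the new feature space, which is what makes the training cost provably non-increasing; the sketch above omits that construction for brevity.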

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2020. pp. 4237-4241
Series
International Conference on Acoustics Speech and Signal Processing ICASSP, ISSN 1520-6149
Keywords [en]
Rectified linear unit, random matrix, convex cost function
Identifiers
URN: urn:nbn:se:kth:diva-292709
DOI: 10.1109/ICASSP40776.2020.9054736
ISI: 000615970404097
Scopus ID: 2-s2.0-85091295366
OAI: oai:DiVA.org:kth-292709
DiVA id: diva2:1544039
Conference
IEEE International Conference on Acoustics, Speech, and Signal Processing, MAY 04-08, 2020, Barcelona, SPAIN
Note

QC 20210710

Available from: 2021-04-14. Created: 2021-04-14. Last updated: 2022-06-25. Bibliographically checked.

Open Access in DiVA

Full text is missing in DiVA

Other links

Publisher's full text
Scopus

Person

Javid, Alireza M.; Venkitaraman, Arun; Skoglund, Mikael; Chatterjee, Saikat
