Sensitivity Analysis for Deep Learning: Ranking Hyper-parameter Influence
Univ Reading, Dept Comp Sci, Reading, Berks, England.
Univ Reading, Dept Comp Sci, Reading, Berks, England. ORCID iD: 0000-0002-9256-1192
KTH, School of Engineering Sciences (SCI), Mathematics (Dept.). ORCID iD: 0000-0001-6306-6777
Univ Cambridge, Cambridge, England.
2021 (English). In: 2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI 2021), Institute of Electrical and Electronics Engineers (IEEE), 2021, pp. 512-516. Conference paper, Published paper (Refereed).
Abstract [en]

We present a novel approach to ranking Deep Learning (DL) hyper-parameters through the application of Sensitivity Analysis (SA). DL hyper-parameter tuning is crucial to model accuracy; however, choosing optimal values for each parameter is time- and resource-intensive. SA provides a quantitative measure by which hyper-parameters can be ranked in terms of their contribution to model accuracy. Learning rate decay ranked highest: model performance was sensitive to this parameter regardless of architecture or dataset. The influence of a model's initial learning rate was found to be low, contrary to the literature. Additionally, the importance of a parameter is closely linked to model architecture: shallower models were susceptible to hyper-parameters affecting the stochasticity of the learning process, whereas deeper models were sensitive to hyper-parameters affecting convergence speed. Furthermore, the complexity of the dataset can affect the margin of separation between the sensitivity measures of the most and least influential parameters, making the most influential hyper-parameter an ideal candidate for tuning compared to the others.
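The record contains no full text or code; the following is a minimal, hypothetical sketch of the general approach the abstract describes: treat validation accuracy as a black-box function of the hyper-parameters and rank them with an SA method. The choice of Morris screening, the SALib library, and the hyper-parameter names and bounds below are illustrative assumptions, not the paper's documented setup.

# Illustrative sketch only (assumptions: SALib's Morris screening, a toy
# accuracy surrogate, hypothetical hyper-parameter names and bounds).
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze

# Hypothetical hyper-parameter space; names and bounds are not from the paper.
problem = {
    "num_vars": 3,
    "names": ["initial_learning_rate", "learning_rate_decay", "batch_size_exp"],
    "bounds": [[1e-4, 1e-1], [0.0, 1e-2], [4, 8]],  # batch size = 2**exp
}

def train_and_evaluate(lr, decay, batch_exp):
    # Placeholder for a real training run returning validation accuracy;
    # this deterministic toy surrogate just makes the example run end-to-end.
    return 0.9 - 40.0 * decay + 0.05 * np.sin(100.0 * lr) - 0.01 * batch_exp

# Sample Morris trajectories through the space and evaluate "accuracy"
# at each sampled hyper-parameter configuration.
X = morris_sample.sample(problem, N=10, num_levels=4)
Y = np.array([train_and_evaluate(*row) for row in X])

# mu_star (mean absolute elementary effect) ranks each hyper-parameter
# by its contribution to variation in model accuracy.
Si = morris_analyze.analyze(problem, X, Y, num_levels=4)
for name, mu_star in sorted(zip(problem["names"], Si["mu_star"]),
                            key=lambda t: -t[1]):
    print(f"{name}: mu* = {mu_star:.4f}")

In this sketch, hyper-parameters with larger mu* values account for more of the variation in accuracy and would be prioritised for tuning, mirroring the ranking idea in the abstract.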

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2021, pp. 512-516
Series
Proceedings - International Conference on Tools With Artificial Intelligence, ISSN 1082-3409
Keywords [en]
Sensitivity Analysis, Deep Learning, Hyper-parameter Tuning, Hyper-parameter rank, Hyper-parameter Influence
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-309319
DOI: 10.1109/ICTAI52525.2021.00083
ISI: 000747482300075
Scopus ID: 2-s2.0-85123932953
OAI: oai:DiVA.org:kth-309319
DiVA, id: diva2:1641530
Conference
IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI), November 1-3, 2021, held online (Washington, DC, USA)
Note

QC 20220302

Part of proceedings ISBN: 978-1-6654-0898-1

Available from: 2022-03-02. Created: 2022-03-02. Last updated: 2022-06-25. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Martino, Ivan

Search in DiVA

By author/editor
Ojha, Varun; Martino, Ivan
By organisation
Mathematics (Dept.)
Computer Sciences
