Robust classification using hidden Markov models and mixtures of normalizing flows
2020 (English) In: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP), Institute of Electrical and Electronics Engineers (IEEE), 2020, article id 9231775. Conference paper, published paper (refereed)
Abstract [en]
We test the robustness of a maximum-likelihood (ML) classifier when the sequential data it observes are corrupted by noise. The hypothesis is that a generative model combining the state transitions of a hidden Markov model (HMM) with neural-network-based probability distributions for the hidden states of the HMM can provide robust classification performance. The combined model is called a normalizing-flow mixture model based HMM (NMM-HMM). It can be trained using a combination of expectation-maximization (EM) and backpropagation. We verify the improved robustness of NMM-HMM classifiers in an application to speech recognition.
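The combined model described above can be illustrated with a minimal sketch: each HMM state emits observations from a mixture of simple elementwise affine normalizing flows (a toy stand-in for the deeper flows used in the paper), and sequences are scored by the forward algorithm for ML classification. All model sizes, parameter values, and function names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def affine_flow_logpdf(x, mu, log_sigma):
    """Log-density under an elementwise affine flow x = mu + exp(log_sigma) * z,
    z ~ N(0, I). A toy stand-in for a deeper normalizing flow."""
    z = (x - mu) * np.exp(-log_sigma)                    # invert the flow
    base = -0.5 * np.sum(z ** 2 + np.log(2.0 * np.pi))   # log N(z; 0, I)
    return base - np.sum(log_sigma)                      # + log|det dz/dx|

def mixture_logpdf(x, log_w, mus, log_sigmas):
    """log sum_k w_k p_k(x) over flow components, via logsumexp."""
    comp = np.array([affine_flow_logpdf(x, m, s) for m, s in zip(mus, log_sigmas)])
    a = np.max(comp + log_w)
    return a + np.log(np.sum(np.exp(comp + log_w - a)))

def hmm_loglik(X, pi, A, log_w, mus, log_sigmas):
    """Forward algorithm in the log domain: log p(X) under one class's model."""
    S = len(pi)
    log_b = lambda t: np.array(
        [mixture_logpdf(X[t], log_w[s], mus[s], log_sigmas[s]) for s in range(S)])
    alpha = np.log(pi) + log_b(0)
    for t in range(1, len(X)):
        a = alpha.max()                                  # stabilize logsumexp
        alpha = np.log(np.exp(alpha - a) @ A) + a + log_b(t)
    a = alpha.max()
    return a + np.log(np.sum(np.exp(alpha - a)))

# Illustrative two-class setup: each class is an NMM-HMM with 2 states,
# 2 flow components per state, 2-dimensional observations.
rng = np.random.default_rng(0)
D, S, K = 2, 2, 2

def toy_model(center):
    pi = np.array([0.6, 0.4])
    A = np.array([[0.7, 0.3], [0.4, 0.6]])
    log_w = np.log(np.full((S, K), 1.0 / K))
    mus = center + rng.normal(0.0, 0.1, size=(S, K, D))
    log_sigmas = np.zeros((S, K, D))
    return pi, A, log_w, mus, log_sigmas

model_a, model_b = toy_model(0.0), toy_model(4.0)
X = rng.normal(0.0, 1.0, size=(20, D))          # sequence near class A's mean
scores = [hmm_loglik(X, *m) for m in (model_a, model_b)]
pred = int(np.argmax(scores))                   # ML classification rule
```

In the paper's setting, the flow parameters would be trained by backpropagation inside the M-step of EM; here they are fixed by hand only to show the scoring and classification path.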
Place, publisher, year, edition, pages: Institute of Electrical and Electronics Engineers (IEEE), 2020, article id 9231775
Series
IEEE International Workshop on Machine Learning for Signal Processing, ISSN 2161-0363
Keywords [en]
Generative models, Hidden Markov models, Neural networks, Speech recognition, Backpropagation, Learning systems, Maximum likelihood, Maximum principle, Mixtures, Signal processing, Trellis codes, Combined model, Expectation Maximization, Generative model, Hidden state, Mixture model, Robust classification, Sequential data, State transitions
National Category
Probability Theory and Statistics; Signal Processing
Identifiers: URN: urn:nbn:se:kth:diva-291596, DOI: 10.1109/MLSP49062.2020.9231775, ISI: 000630907800045, Scopus ID: 2-s2.0-85096485816, OAI: oai:DiVA.org:kth-291596, DiVA id: diva2:1539538
Conference: 30th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2020, virtual, Espoo, 21 September 2020 through 24 September 2020
Note: QC 20210324
Part of conference proceedings: ISBN 9781728166629
2021-03-24, 2021-03-24, 2024-05-02. Bibliographically approved