Monte Carlo Filtering Objectives
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL; AI Lab, Volvo Car Corporation. ORCID iD: 0000-0002-8117-5982
AI Lab, Volvo Car Corporation.
Chalmers University of Technology, Gothenburg, Sweden.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL. ORCID iD: 0000-0003-0579-3372
2021 (English). In: IJCAI International Joint Conference on Artificial Intelligence, International Joint Conferences on Artificial Intelligence, 2021, p. 2256-2262. Conference paper, Published paper (Refereed)
Abstract [en]

Learning generative models and inferring latent trajectories have been shown to be challenging for time series, because flexible generative models have intractable marginal likelihoods; this can be addressed with surrogate objectives for optimization. We propose Monte Carlo filtering objectives (MCFOs), a family of variational objectives for jointly learning parametric generative models and amortized adaptive importance proposals for time series. MCFOs extend the choice of likelihood estimators beyond the Sequential Monte Carlo estimators used in state-of-the-art objectives, possess important properties that reveal the factors governing the tightness of the objectives, and allow for gradient estimates with lower bias and variance. We demonstrate that the proposed MCFOs and gradient estimators lead to efficient and stable model learning, that the learned generative models explain the data well, and that the importance proposals are more sample-efficient on various kinds of time series data.
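
For intuition, below is a minimal sketch of the kind of objective the abstract refers to: the expected log of an unbiased Monte Carlo estimator of the marginal likelihood, here instantiated with a bootstrap particle filter on a toy one-dimensional linear-Gaussian state-space model. The toy model, the bootstrap proposal, and all names in the sketch are illustrative assumptions, not details taken from the paper.

import numpy as np

def bootstrap_pf_loglik(x, trans_std, obs_std, n_particles=64, rng=None):
    # Estimate log p(x_{1:T}) for a 1-D linear-Gaussian state-space model
    # with a bootstrap particle filter (proposal = transition model).
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, size=n_particles)  # draw z_1 from the prior
    loglik = 0.0
    for t in range(len(x)):
        if t > 0:
            # Propagate resampled particles through the transition model.
            particles = particles + rng.normal(0.0, trans_std, size=n_particles)
        # Importance weights from the observation likelihood p(x_t | z_t).
        logw = -0.5 * ((x[t] - particles) / obs_std) ** 2 - np.log(obs_std * np.sqrt(2.0 * np.pi))
        # The log of the average weight estimates log p(x_t | x_{1:t-1});
        # the max is subtracted for numerical stability (log-sum-exp trick).
        m = logw.max()
        loglik += m + np.log(np.mean(np.exp(logw - m)))
        # Multinomial resampling with normalized weights.
        probs = np.exp(logw - m)
        probs /= probs.sum()
        particles = rng.choice(particles, size=n_particles, p=probs)
    return loglik

# Averaging repeated log-estimates gives a stochastic lower bound on
# log p(x_{1:T}) (Jensen's inequality), which is what makes this estimator
# usable as a surrogate training objective.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0.0, 0.5, size=20)) + rng.normal(0.0, 0.1, size=20)
estimates = [bootstrap_pf_loglik(x, trans_std=0.5, obs_std=0.1, rng=rng) for _ in range(10)]
print("Monte Carlo objective estimate:", np.mean(estimates))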

Place, publisher, year, edition, pages
International Joint Conferences on Artificial Intelligence, 2021, p. 2256-2262
Keywords [en]
Artificial intelligence, Learning systems, Monte Carlo methods, Generative model, Gradient estimates, Marginal likelihood, Monte Carlo filtering, Objective estimations, Optimisations, Property, Sequential Monte Carlo, State of the art, Time series
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-316354
Scopus ID: 2-s2.0-85125449391
OAI: oai:DiVA.org:kth-316354
DiVA, id: diva2:1687747
Conference
30th International Joint Conference on Artificial Intelligence, IJCAI 2021, Virtual/Online, 19-27 August 2021
Note

Part of proceedings: ISBN 978-0-9992411-9-6

QC 20220816

Available from: 2022-08-16. Created: 2022-08-16. Last updated: 2022-08-16. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Scopus

Authority records

Chen, Shuangshuang; Björkman, Mårten
