Conditional mutual information-based contrastive loss for financial time series forecasting
Wu, Hanwei. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0003-2579-2107
Gattami, Ather. RISE Research Institutes of Sweden, Stockholm, Sweden. ORCID iD: 0000-0003-4298-3634
Flierl, Markus. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Information Science and Engineering. ORCID iD: 0000-0002-7807-5681
2020 (English). In: Proceedings ICAIF '20: The First ACM International Conference on AI in Finance, Association for Computing Machinery (ACM), 2020. Conference paper, published paper (refereed)
Abstract [en]

We present a representation learning framework for financial time series forecasting. One challenge of applying deep learning models to financial forecasting is the shortage of training data: direct trend classification with deep neural networks trained on small datasets is prone to overfitting. In this paper, we propose to first learn compact representations from time series data, then use the learned representations to train a simpler model for predicting time series movements. We consider a class-conditioned latent variable model and train an encoder network to maximize the mutual information between the latent variables and the trend information, conditioned on the encoded observed variables. We show that conditional mutual information maximization can be approximated by a contrastive loss. The problem is thereby transformed into a classification task: determining whether two encoded representations are sampled from the same class. This amounts to pairwise comparisons of the training datapoints, which improves the generalization ability of the encoder network. We use deep autoregressive models as our encoder to capture long-term dependencies in the sequence data. Empirical experiments indicate that our proposed method has the potential to advance state-of-the-art performance.
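The pairwise formulation in the abstract, deciding whether two encoded representations belong to the same trend class, maps naturally onto a binary contrastive objective. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the bilinear critic, the `PairwiseContrastiveLoss` name, and the GRU standing in for the paper's deep autoregressive encoder are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairwiseContrastiveLoss(nn.Module):
    """Binary discrimination of representation pairs: same trend class or not.

    A sketch of the contrastive approximation to conditional mutual
    information maximization described in the abstract. The bilinear
    critic is an illustrative choice, not taken from the paper.
    """
    def __init__(self, embed_dim):
        super().__init__()
        self.critic = nn.Bilinear(embed_dim, embed_dim, 1)  # scores a pair of embeddings

    def forward(self, z, labels):
        # z: (batch, embed_dim) encoded windows; labels: (batch,) trend classes
        i, j = torch.triu_indices(z.size(0), z.size(0), offset=1)  # all unordered pairs
        logits = self.critic(z[i], z[j]).squeeze(-1)  # critic score per pair
        same = (labels[i] == labels[j]).float()       # 1 if the pair shares a class
        return F.binary_cross_entropy_with_logits(logits, same)

# Toy usage: a GRU stands in for the paper's deep autoregressive encoder.
encoder = nn.GRU(input_size=5, hidden_size=32, batch_first=True)
x = torch.randn(16, 60, 5)       # 16 windows of 60 time steps, 5 features each
y = torch.randint(0, 2, (16,))   # binary up/down trend labels
_, h = encoder(x)                # h: (1, 16, 32) final hidden state
loss = PairwiseContrastiveLoss(32)(h.squeeze(0), y)
loss.backward()                  # gradients flow into both critic and encoder
```

In the two-stage setup the abstract describes, such a pretraining step would be followed by freezing the encoder and feeding its outputs to a simpler downstream classifier for trend prediction.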

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2020.
Keywords [en]
Classification (of information), Deep neural networks, Equivalence classes, Finance, Signal encoding, Time series, Compact representation, Conditional mutual information, Financial time series forecasting, Learning frameworks, Learning models, Overfitting problem, Small datasets, Time-series data, Training data, Forecasting
National subject category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-313547
DOI: 10.1145/3383455.3422550
Scopus ID: 2-s2.0-85095337230
OAI: oai:DiVA.org:kth-313547
DiVA id: diva2:1669305
Conference
ICAIF '20: The First ACM International Conference on AI in Finance, New York, NY, USA, October 15-16, 2020
Note

Part of ISBN 9781450375849

Available from: 2022-06-14. Created: 2022-06-14. Last updated: 2022-06-25. Bibliographically reviewed.

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text (via DOI), Scopus
