Minimax Two-Stage Gradient Boosting for Parameter Estimation
Lakshminarayanan, Braghadeesh. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Decision and Control Systems (Automatic Control). ORCID iD: 0009-0008-4893-0473
Rojas, Cristian R. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Decision and Control Systems (Automatic Control). ORCID iD: 0000-0003-0355-2663
2023 (English). In: 2023 62nd IEEE Conference on Decision and Control (CDC 2023), Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 1189-1194. Conference paper, published paper (refereed).
Abstract [en]

Parameter estimation is an important sub-field of statistics and system identification. Various methods for parameter estimation have been proposed in the literature, among which the Two-Stage (TS) approach is particularly promising due to its ease of implementation and reliable estimates. Among the statistical frameworks used to derive TS estimators, the minimax framework is attractive because it depends only mildly on prior knowledge about the parameters to be estimated. However, the existing implementation of the minimax TS approach currently has limited applicability due to its heavy computational load. In this paper, we overcome this difficulty by using a gradient boosting machine (GBM) in the second stage of the TS approach. We call the resulting algorithm the Two-Stage Gradient Boosting Machine (TSGBM) estimator. Finally, we test the proposed TSGBM estimator on several numerical examples, including models of dynamical systems.
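The abstract only outlines the method, so the following is a rough Python sketch of the generic two-stage idea it describes, not the paper's actual algorithm or code: simulate datasets for many candidate parameters, compress each dataset into a small statistic vector, then fit a regressor from statistics back to parameters. Here scikit-learn's GradientBoostingRegressor plays the second-stage learner; the toy model y_t = theta * u_t + e_t, the compression statistics, and the uniform parameter set are all illustrative assumptions.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
N, M = 50, 2000  # samples per dataset, number of training datasets

# Known input signal for a toy linear model y = theta * u + noise (illustrative only).
u = rng.standard_normal(N)

def compress(y):
    # Compress a dataset into a small, fixed-size feature vector:
    # least-squares estimate of theta plus two summary statistics.
    return np.array([y @ u / (u @ u), y.mean(), y.std()])

# Stage 1: draw parameters from the admissible set and simulate a dataset for each.
theta_train = rng.uniform(-2.0, 2.0, M)
X = np.stack([compress(th * u + 0.5 * rng.standard_normal(N)) for th in theta_train])

# Stage 2: fit a gradient boosting machine mapping compressed data to the parameter.
gbm = GradientBoostingRegressor(n_estimators=200, max_depth=3)
gbm.fit(X, theta_train)

# Apply the trained estimator to a fresh dataset generated with theta = 1.3.
y_new = 1.3 * u + 0.5 * rng.standard_normal(N)
print(f"estimated theta: {gbm.predict(compress(y_new).reshape(1, -1))[0]:.3f}")

Note that this sketch trains the GBM under an ordinary average (least-squares) risk; in the paper's minimax setting the second stage would instead target worst-case risk over the parameter set, for which the plain fit above is only a stand-in.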

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 1189-1194.
Keywords [en]
estimation theory, Gradient Boosting, statistical decision theory, Two-Stage approach
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:kth:diva-343732
DOI: 10.1109/CDC49753.2023.10383385
ISI: 001166433800144
Scopus ID: 2-s2.0-85184808110
OAI: oai:DiVA.org:kth-343732
DiVA id: diva2:1839927
Conference
62nd IEEE Conference on Decision and Control (CDC 2023), Singapore, December 13-15, 2023
Note

QC 20240222

Part of ISBN 979-8-3503-0124-3

Available from: 2024-02-22. Created: 2024-02-22. Last updated: 2024-04-04. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text; Scopus

Authority records

Lakshminarayanan, Braghadeesh; Rojas, Cristian R.
