Model selection with covariance matching based non-negative lasso
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering.
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-6855-5868
2020 (English). In: Signal Processing, ISSN 0165-1684, E-ISSN 1872-7557, Vol. 170, article id 107431. Article in journal (Refereed). Published.
Abstract [en]

We consider the problem of model selection for high-dimensional linear regression in the context of support recovery when multiple measurement vectors are available. Here, we assume that the regression coefficient vectors share a common support and that the elements of the additive noise vector are potentially correlated. Accordingly, to estimate the support, we propose a non-negative Lasso estimator based on covariance matching techniques. We provide deterministic conditions under which the support estimate of our method is guaranteed to match the true support. Further, we use the extended Fisher information criterion to select the tuning parameter of our non-negative Lasso. We also prove that the extended Fisher information criterion finds the true support with probability one as the number of rows in the design matrix grows to infinity. Numerical simulations confirm that our support estimate is asymptotically consistent. Finally, the simulations also show that the proposed method is robust to high correlation between columns of the design matrix.
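The abstract outlines the method at a high level: form a covariance-matching regression from the multiple measurement vectors and estimate the common support with a non-negative Lasso. The Python sketch below illustrates that general idea only, under assumed choices; it is not the estimator analyzed in the paper. In particular, the rank-one covariance-matching design, the noise level, the fixed tuning parameter alpha, and the coefficient threshold are all illustrative here, whereas the paper selects the tuning parameter with the extended Fisher information criterion and accounts for correlated noise.

```python
# Illustrative sketch only (assumed formulation, not the paper's exact estimator):
# support recovery via a non-negative lasso on a covariance-matching regression.
# Model: Y = X B + E with L measurement vectors and a common row support of B.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, L, k = 40, 100, 60, 3                     # rows, columns, measurement vectors, sparsity
X = rng.standard_normal((n, p))                 # design matrix
true_support = np.sort(rng.choice(p, size=k, replace=False))
B = np.zeros((p, L))
B[true_support] = rng.standard_normal((k, L))   # coefficient vectors with a common support
Y = X @ B + 0.1 * rng.standard_normal((n, L))   # i.i.d. noise here; the paper allows correlation

# Covariance-matching target: the sample covariance of the measurements.
R_hat = Y @ Y.T / L

# Matching design: column j is vec(x_j x_j^T), so non-negative weights on these
# rank-one atoms approximate R_hat, and the non-zero weights indicate the support.
A = np.stack([np.outer(X[:, j], X[:, j]).ravel() for j in range(p)], axis=1)
r = R_hat.ravel()

# Non-negative lasso: positive=True enforces the sign constraint. The tuning
# parameter alpha is fixed ad hoc for this demo.
nn_lasso = Lasso(alpha=0.5, positive=True, fit_intercept=False, max_iter=10000)
nn_lasso.fit(A, r)

support_estimate = np.flatnonzero(nn_lasso.coef_ > 1e-8)
print("true support:     ", true_support)
print("estimated support:", support_estimate)
```

Under the non-negativity constraint, the fitted weights on the rank-one atoms can be read as approximate per-coefficient signal powers, which is what makes a covariance-level formulation compatible with a Lasso-type penalty.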

Place, publisher, year, edition, pages
Elsevier, 2020. Vol. 170, article id 107431
Keywords [en]
Covariance matching, Extended Bayesian information criterion, Generalized least squares, High-dimensional inference, Model selection, Non-negative lasso, Regularization, Sparse multiple measurement vector model
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-267783
DOI: 10.1016/j.sigpro.2019.107431
ISI: 000515206600030
Scopus ID: 2-s2.0-85077691312
OAI: oai:DiVA.org:kth-267783
DiVA id: diva2:1411745
Note

QC 20200304

Available from: 2020-03-04. Created: 2020-03-04. Last updated: 2020-03-16. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records BETA

Jansson, Magnus

Search in DiVA

By author/editor
Owrang, Arash; Jansson, Magnus
By organisation
Information Science and Engineering
In the same journal
Signal Processing