Predicting Graph Signals Using Kernel Regression Where the Input Signal is Agnostic to a Graph
Venkitaraman, Arun. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0003-1285-8947
Chatterjee, Saikat. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0003-2638-6047
Händel, Peter. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Information Science and Engineering. ORCID iD: 0000-0002-2718-0262
2019 (English). In: IEEE Transactions on Signal and Information Processing over Networks, ISSN 2373-776X, Vol. 5, no. 4, p. 698-710. Article in journal (refereed), published.
Abstract [en]

We propose a kernel regression method to predict a target signal lying over a graph when an input observation is given. The input and the output could be two different physical quantities. In particular, the input may not be a graph signal at all, or it could be agnostic to an underlying graph. We use a training dataset to learn the proposed regression model by formulating it as a convex optimization problem, where we use a graph-Laplacian-based regularization to enforce that the predicted target is a graph signal. Once the model is learnt, it can be used directly and independently on a large number of test data points, one by one, to predict the corresponding targets. Our approach employs kernels between the various input observations, and as a result the kernels are not restricted to be functions of the graph adjacency/Laplacian matrix. We show that the proposed kernel regression exhibits a smoothing effect, while simultaneously achieving noise reduction and graph smoothness. We then extend our method to the case when the underlying graph may not be known a priori, by simultaneously learning an underlying graph and the regression coefficients. Using extensive experiments, we show that our method provides good prediction performance in adverse conditions, particularly when the training data are limited in size and noisy. In graph signal reconstruction experiments, our method is shown to provide good performance even for highly under-determined subsampling.
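As an illustration of the regression model described in the abstract, the sketch below implements one plausible reading of it in Python: kernel ridge regression with an added graph-Laplacian smoothness penalty on the predicted target. It is a minimal sketch under assumptions made for this example (an RBF kernel, a closed-form solve in the Laplacian eigenbasis, and hyperparameters alpha, beta, gamma); it is not taken from the paper and need not match the authors' exact formulation.

    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # Gaussian (RBF) kernel between the rows of A and B (assumed kernel choice).
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return np.exp(-gamma * d2)

    def fit(X, T, L, alpha=1e-2, beta=1e-1, gamma=1.0):
        # Learn W in the model T_hat = K W by minimising
        #   ||T - K W||_F^2 + alpha*tr(W^T K W) + beta*tr((K W) L (K W)^T),
        # where K is the kernel matrix of the training inputs X (one row per sample)
        # and L is the graph Laplacian over the N nodes (columns of T).
        M, N = T.shape
        K = rbf_kernel(X, X, gamma)            # (M, M) kernel of training inputs
        lam, U = np.linalg.eigh(L)             # graph eigenbasis: L = U diag(lam) U^T
        T_rot = T @ U                          # targets expressed per graph frequency
        W_rot = np.empty((M, N))
        for j in range(N):
            # The stationarity condition decouples per graph frequency lam[j]:
            #   (K + alpha*I + beta*lam[j]*K) w_j = T_rot[:, j]
            W_rot[:, j] = np.linalg.solve(
                K + alpha * np.eye(M) + beta * lam[j] * K, T_rot[:, j])
        return W_rot @ U.T                     # rotate back to node domain: (M, N)

    def predict(W, X_train, X_new, gamma=1.0):
        # Predict a graph signal for each new input (rows of X_new), one by one or in batch.
        return rbf_kernel(X_new, X_train, gamma) @ W

Usage, under the same assumptions: W = fit(X_train, T_train, L); T_hat = predict(W, X_train, X_test). The inputs X need not be graph signals themselves; only the predicted targets are tied to the graph through the Laplacian penalty.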

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2019. Vol. 5, no. 4, p. 698-710
Keywords [en]
Kernel, Signal processing, Mathematical model, Machine learning, Training, Predictive models, Image reconstruction, Linear model, regression, kernels, graph signal processing, graph-Laplacian
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-264148
DOI: 10.1109/TSIPN.2019.2936358
ISI: 000492993200007
Scopus ID: 2-s2.0-85071685668
OAI: oai:DiVA.org:kth-264148
DiVA id: diva2:1376313
Note

QC 20191209

Available from: 2019-12-09. Created: 2019-12-09. Last updated: 2022-06-26. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Venkitaraman, Arun; Chatterjee, Saikat; Händel, Peter
