Publications (10 of 102)
Liu, D., Vu, M. T., Chatterjee, S. & Rasmussen, L. K. (2019). ENTROPY-REGULARIZED OPTIMAL TRANSPORT GENERATIVE MODELS. In: 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP): . Paper presented at 44th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), MAY 12-17, 2019, Brighton, ENGLAND (pp. 3532-3536). IEEE
ENTROPY-REGULARIZED OPTIMAL TRANSPORT GENERATIVE MODELS
2019 (English) In: 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), IEEE, 2019, p. 3532-3536. Conference paper, Published paper (Refereed)
Abstract [en]

We investigate the use of the entropy-regularized optimal transport (EOT) cost in developing generative models to learn implicit distributions. Two generative models are proposed. One uses the EOT cost directly in a one-shot optimization problem, and the other uses the EOT cost iteratively in an adversarial game. The proposed generative models show improved performance over contemporary models on sample-based test scores.
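
The EOT cost itself is typically computed with Sinkhorn iterations. Below is a minimal, hedged sketch (not the paper's implementation) of evaluating an entropy-regularized transport cost between a batch of generated samples and a batch of data samples; the function name and the squared-Euclidean ground cost are illustrative assumptions.

```python
import numpy as np

def eot_cost(x, y, eps=0.05, n_iters=200):
    # x: (n, d) generated samples, y: (m, d) data samples, uniform weights assumed.
    n, m = len(x), len(y)
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared-Euclidean ground cost
    K = np.exp(-C / eps)                                       # Gibbs kernel (assumes well-scaled data)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                                   # Sinkhorn iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]                            # entropy-regularized transport plan
    return float(np.sum(P * C))                                # transport cost under the plan
```

In a generative-model setting, such a cost would be differentiated with respect to the generator parameters (the one-shot formulation) or combined with a learned critic/feature map (the adversarial game), as the paper describes.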

Place, publisher, year, edition, pages
IEEE, 2019
Series
International Conference on Acoustics Speech and Signal Processing ICASSP, ISSN 1520-6149
Keywords
Optimal transport, generative models
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-261047 (URN), 10.1109/ICASSP.2019.8682721 (DOI), 000482554003151 (), 2-s2.0-85068999197 (Scopus ID), 978-1-4799-8131-1 (ISBN)
Conference
44th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), MAY 12-17, 2019, Brighton, ENGLAND
Note

QC 20191001

Available from: 2019-10-01 Created: 2019-10-01 Last updated: 2019-10-01. Bibliographically approved
Zaki, A., Mitra, P. P., Rasmussen, L. K. & Chatterjee, S. (2019). Estimate exchange over network is good for distributed hard thresholding pursuit. Signal Processing, 156, 1-11
Estimate exchange over network is good for distributed hard thresholding pursuit
2019 (English) In: Signal Processing, ISSN 0165-1684, E-ISSN 1872-7557, Vol. 156, p. 1-11. Article in journal (Refereed), Published
Abstract [en]

We investigate an existing distributed algorithm for learning sparse signals or data over networks. The algorithm is iterative and exchanges intermediate estimates of a sparse signal over a network. This learning strategy, based on exchanging intermediate estimates over the network, requires only limited communication overhead. Our objective in this article is to show that the strategy learns well in spite of the limited communication. In pursuit of this objective, we first provide a restricted isometry property (RIP)-based theoretical analysis of the convergence of the iterative algorithm. Then, using simulations, we show that the algorithm provides competitive performance in learning sparse signals compared with an existing alternative distributed algorithm that exchanges more information, including observations and system parameters.
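
As a rough illustration of "estimate exchange", the sketch below alternates a local gradient-plus-hard-thresholding step with neighbour averaging through a mixing matrix. It is a simplified, IHT-flavoured stand-in and assumes a doubly stochastic mixing matrix W; the paper's hard thresholding pursuit additionally solves a least-squares problem on the selected support, which is omitted here.

```python
import numpy as np

def hard_threshold(x, k):
    # keep the k largest-magnitude entries of x, zero the rest
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = x[idx]
    return z

def distributed_sparse_learning(A_list, y_list, W, k, step=0.5, n_iters=100):
    # A_list[i], y_list[i]: local measurements at node i; W: doubly stochastic mixing matrix
    n_nodes = len(A_list)
    p = A_list[0].shape[1]
    X = np.zeros((n_nodes, p))                  # one sparse estimate per node
    for _ in range(n_iters):
        for i in range(n_nodes):                # local gradient step + hard thresholding
            g = A_list[i].T @ (y_list[i] - A_list[i] @ X[i])
            X[i] = hard_threshold(X[i] + step * g, k)
        X = W @ X                               # exchange: average neighbours' estimates
    return X
```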

Place, publisher, year, edition, pages
Elsevier, 2019
Keywords
Sparse learning, Distributed algorithm, Greedy pursuit algorithm, RIP analysis
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:kth:diva-240987 (URN), 10.1016/j.sigpro.2018.10.010 (DOI), 000453494200001 (), 2-s2.0-85055577903 (Scopus ID)
Note

QC 20190110

Available from: 2019-01-10 Created: 2019-01-10 Last updated: 2019-06-11. Bibliographically approved
Venkitaraman, A., Frossard, P. & Chatterjee, S. (2019). KERNEL REGRESSION FOR GRAPH SIGNAL PREDICTION IN PRESENCE OF SPARSE NOISE. In: 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP): . Paper presented at 44th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), MAY 12-17, 2019, Brighton, ENGLAND (pp. 5426-5430). IEEE
KERNEL REGRESSION FOR GRAPH SIGNAL PREDICTION IN PRESENCE OF SPARSE NOISE
2019 (English) In: 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), IEEE, 2019, p. 5426-5430. Conference paper, Published paper (Refereed)
Abstract [en]

We propose kernel regression for predicting output vectors that are smooth over a given graph, in the presence of sparse noise. Sparse noise models training outputs corrupted by either missing samples or large perturbations. Sparse noise is handled through appropriate use of the l1-norm along with the l2-norm in a convex cost function. To optimize the cost function, we propose an iteratively reweighted least-squares (IRLS) approach that is suitable for kernel substitution (the kernel trick) because a closed-form solution is available. Simulations using real-world temperature data show the efficacy of the proposed method, particularly for limited-size training datasets.
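
A minimal sketch of the kind of IRLS update described, assuming a simplified stand-in cost ||y - K*alpha - s||^2 + gam*alpha'K*alpha + lam*||s||_1 (our own choice, not necessarily the paper's exact objective): the l1 term on the sparse-noise component s is replaced by a reweighted l2 term, so each iteration has closed-form updates compatible with the kernel trick.

```python
import numpy as np

def kernel_regression_sparse_noise(K, y, gam=1e-2, lam=1.0, delta=1e-6, n_iters=50):
    # K: (n, n) kernel Gram matrix of the inputs, y: (n,) training outputs with sparse noise.
    n = len(y)
    s = np.zeros(n)                  # estimate of the sparse-noise component
    w = np.ones(n)                   # IRLS weights for the reweighted-l2 surrogate of ||s||_1
    for _ in range(n_iters):
        # closed-form kernel ridge update for the regression coefficients
        alpha = np.linalg.solve(K + gam * np.eye(n), y - s)
        r = y - K @ alpha
        # minimiser of ||r - s||^2 + lam * s^T diag(w) s (diagonal, hence elementwise)
        s = r / (1.0 + lam * w)
        # reweight so the quadratic surrogate tracks the l1 norm of s
        w = 1.0 / (np.abs(s) + delta)
    return alpha, s
```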

Place, publisher, year, edition, pages
IEEE, 2019
Series
International Conference on Acoustics Speech and Signal Processing ICASSP, ISSN 1520-6149
Keywords
Kernel regression, graph signal processing, Sparse noise, graph-Laplacian, iteratively reweighted least squares
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:kth:diva-261058 (URN), 10.1109/ICASSP.2019.8682979 (DOI), 000482554005132 (), 2-s2.0-85068972923 (Scopus ID), 978-1-4799-8131-1 (ISBN)
Conference
44th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), MAY 12-17, 2019, Brighton, ENGLAND
Note

QC 20191002

Available from: 2019-10-02 Created: 2019-10-02 Last updated: 2019-10-02. Bibliographically approved
Venkitaraman, A., Chatterjee, S. & Händel, P. (2019). On Hilbert transform, analytic signal, and modulation analysis for signals over graphs. Signal Processing, 156, 106-115
On Hilbert transform, analytic signal, and modulation analysis for signals over graphs
2019 (English) In: Signal Processing, ISSN 0165-1684, E-ISSN 1872-7557, Vol. 156, p. 106-115. Article in journal (Refereed), Published
Abstract [en]

We propose a Hilbert transform and analytic signal construction for signals over graphs. This is motivated by the popularity of the Hilbert transform, analytic signal, and modulation analysis in conventional signal processing, and by the observation that complementary insight is often obtained by viewing conventional signals in the graph setting. Our definitions of the Hilbert transform and analytic signal use a conjugate-symmetry-like property exhibited by the graph Fourier transform (GFT), resulting in a 'one-sided' spectrum for the graph analytic signal. The resulting graph Hilbert transform is shown to possess many interesting mathematical properties and to highlight anomalies/discontinuities in the graph signal and the nodes across which signal discontinuities occur. Using the graph analytic signal, we further define amplitude, phase, and frequency modulations for a graph signal. We illustrate the proposed concepts with applications to synthesized and real-world signals. For example, we show that the graph Hilbert transform can indicate the presence of anomalies, and that the graph analytic signal and the associated amplitude and frequency modulations reveal complementary information in speech signals.
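
For orientation, the conventional construction that the paper generalises: on a directed cycle graph the GFT (defined via the cyclic shift) coincides with the DFT, and the analytic signal is obtained by making the spectrum one-sided. The sketch below implements only this classical special case; the paper's graph construction, based on the conjugate-symmetry-like property of the GFT, is not reproduced here.

```python
import numpy as np

def analytic_signal(x):
    # One-sided DFT spectrum construction for a real 1-D signal x (the cycle-graph special case).
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)                     # spectral mask: keep DC (and Nyquist), double positives
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    xa = np.fft.ifft(X * h)             # analytic signal
    hilbert_x = xa.imag                 # Hilbert transform of x
    envelope = np.abs(xa)               # amplitude modulation (AM) envelope
    phase = np.unwrap(np.angle(xa))     # phase; its finite difference gives the FM
    return xa, hilbert_x, envelope, phase
```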

Place, publisher, year, edition, pages
Elsevier, 2019
Keywords
Graph signal processing, Analytic signal, Hilbert transform, Demodulation, Anomaly detection
National Category
Control Engineering
Identifiers
urn:nbn:se:kth:diva-240988 (URN), 10.1016/j.sigpro.2018.10.016 (DOI), 000453494200011 (), 2-s2.0-85056192636 (Scopus ID)
Note

QC 20190110

Available from: 2019-01-10 Created: 2019-01-10 Last updated: 2019-06-11. Bibliographically approved
Venkitaraman, A., Chatterjee, S. & Händel, P. (2019). Predicting Graph Signals Using Kernel Regression Where the Input Signal is Agnostic to a Graph. IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 5(4), 698-710
Predicting Graph Signals Using Kernel Regression Where the Input Signal is Agnostic to a Graph
2019 (English) In: IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, ISSN 2373-776X, Vol. 5, no 4, p. 698-710. Article in journal (Refereed), Published
Abstract [en]

We propose a kernel regression method to predict a target signal lying over a graph when an input observation is given. The input and the output can be two different physical quantities. In particular, the input may not be a graph signal at all, or it may be agnostic to an underlying graph. We use a training dataset to learn the proposed regression model by formulating it as a convex optimization problem, where a graph-Laplacian-based regularization enforces that the predicted target is a graph signal. Once the model is learned, it can be applied directly and independently to a large number of test data points to predict the corresponding targets. Our approach employs kernels between the input observations, and as a result the kernels are not restricted to be functions of the graph adjacency/Laplacian matrix. We show that the proposed kernel regression exhibits a smoothing effect, simultaneously achieving noise reduction and graph smoothness. We then extend the method to the case where the underlying graph may not be known a priori, by simultaneously learning an underlying graph and the regression coefficients. Using extensive experiments, we show that our method provides good prediction performance in adverse conditions, particularly when the training data are limited in size and noisy. In graph signal reconstruction experiments, our method provides good performance even for highly underdetermined subsampling.
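
A hedged sketch of the flavour of model described: training targets Y (rows are graph signals) regressed on a kernel matrix K between arbitrary input observations, with ridge and graph-Laplacian penalties. The objective and the closed form below are our own simplified reconstruction, not necessarily the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def graph_kernel_regression(K, Y, L, alpha=1e-2, beta=1e-1, jitter=1e-8):
    # K: (n, n) kernel matrix between training inputs (inputs need not be graph signals)
    # Y: (n, p) training targets, each row a signal over a p-node graph with Laplacian L
    n = K.shape[0]
    Kj = K + jitter * np.eye(n)          # keep K well conditioned / invertible
    Kinv = np.linalg.inv(Kj)
    # Stationarity of ||Y - K C||_F^2 + alpha tr(C^T K C) + beta tr((K C) L (K C)^T)
    # rearranges to the Sylvester equation (I + alpha K^{-1}) C + C (beta L) = K^{-1} Y.
    C = solve_sylvester(np.eye(n) + alpha * Kinv, beta * L, Kinv @ Y)
    return C                             # prediction for a new input with kernel row k_*: k_* @ C
```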

Place, publisher, year, edition, pages
IEEE, 2019
Keywords
Kernel, Signal processing, Mathematical model, Machine learning, Training, Predictive models, Image reconstruction, Linear model, regression, kernels, graph signal processing, graph-Laplacian
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-264148 (URN), 10.1109/TSIPN.2019.2936358 (DOI), 000492993200007 (), 2-s2.0-85071685668 (Scopus ID)
Note

QC 20191209

Available from: 2019-12-09 Created: 2019-12-09 Last updated: 2019-12-09. Bibliographically approved
Liang, X., Javid, A. M., Skoglund, M. & Chatterjee, S. (2018). DISTRIBUTED LARGE NEURAL NETWORK WITH CENTRALIZED EQUIVALENCE. In: 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP): . Paper presented at 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) (pp. 2976-2980). IEEE
DISTRIBUTED LARGE NEURAL NETWORK WITH CENTRALIZED EQUIVALENCE
2018 (English) In: 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), IEEE, 2018, p. 2976-2980. Conference paper, Published paper (Refereed)
Abstract [en]

In this article, we develop a distributed algorithm for learning a large neural network that is both deep and wide. We consider a scenario where the training dataset is not available at a single processing node but is distributed among several nodes. We show that a recently proposed large neural network architecture called the progressive learning network (PLN) can be trained in a distributed setup with centralized equivalence; that is, we obtain the same result as if all the data were available at a single node. We train the PLN in the distributed setup using the alternating direction method of multipliers (ADMM), a distributed convex optimization method.
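
The centralized-equivalence ingredient is global-variable consensus ADMM. The sketch below shows it for a plain least-squares update, as an illustrative simplification; the paper applies ADMM to the constrained least-squares problems arising in each PLN layer.

```python
import numpy as np

def admm_consensus_least_squares(A_list, y_list, rho=1.0, n_iters=100):
    # Each node i holds (A_list[i], y_list[i]); jointly solve min_w sum_i ||y_i - A_i w||^2
    # by global-variable consensus ADMM, matching the centralized solution at convergence.
    n_nodes = len(A_list)
    p = A_list[0].shape[1]
    W = np.zeros((n_nodes, p))           # local copies of the weights
    U = np.zeros((n_nodes, p))           # scaled dual variables
    z = np.zeros(p)                      # global consensus variable
    for _ in range(n_iters):
        for i in range(n_nodes):         # local closed-form updates (only z is shared)
            Ai, yi = A_list[i], y_list[i]
            W[i] = np.linalg.solve(Ai.T @ Ai + rho * np.eye(p),
                                   Ai.T @ yi + rho * (z - U[i]))
        z = np.mean(W + U, axis=0)       # consensus step (an average over the nodes)
        U = U + W - z                    # dual update
    return z
```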

Place, publisher, year, edition, pages
IEEE, 2018
Keywords
Distributed learning, neural networks, data parallelism, convex optimization
National Category
Communication Systems
Identifiers
urn:nbn:se:kth:diva-237152 (URN), 10.1109/ICASSP.2018.8462179 (DOI), 000446384603029 (), 2-s2.0-85054237028 (Scopus ID)
Conference
2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)
Note

QC 20181025

Available from: 2018-10-25 Created: 2018-10-25 Last updated: 2019-08-20. Bibliographically approved
Venkitaraman, A., Chatterjee, S. & Händel, P. (2018). Extreme learning machine for graph signal processing. In: 2018 26th European Signal Processing Conference (EUSIPCO): . Paper presented at 26th European Signal Processing Conference, EUSIPCO 2018, Rome, Italy, 3 September 2018 through 7 September 2018 (pp. 136-140). European Signal Processing Conference, EUSIPCO, Article ID 8553088.
Extreme learning machine for graph signal processing
2018 (English) In: 2018 26th European Signal Processing Conference (EUSIPCO), European Signal Processing Conference, EUSIPCO, 2018, p. 136-140, article id 8553088. Conference paper, Published paper (Refereed)
Abstract [en]

In this article, we improve extreme learning machines for regression tasks using a graph-signal-processing-based regularization. We assume that the target signal for prediction or regression is a graph signal. Under this assumption, we use the regularization to enforce that the output of an extreme learning machine is smooth over a given graph. Simulation results with real data confirm that this regularization helps significantly when the available training data is limited in size and corrupted by noise.
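
A minimal sketch of a graph-regularized extreme learning machine under assumed penalties: a random, untrained hidden layer followed by output weights obtained from a regularized least-squares problem whose graph-smoothness term uses the Laplacian L. The exact objective and solve below are our own illustrative choices, not the paper's implementation.

```python
import numpy as np

def graph_regularized_elm(X, Y, L, n_hidden=200, alpha=1e-2, beta=1e-1, seed=0):
    # X: (n, d) inputs; Y: (n, p) targets, each row a signal over a p-node graph; L: (p, p) Laplacian.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    p = Y.shape[1]
    Win = rng.standard_normal((d, n_hidden))   # random, untrained input weights (the ELM part)
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ Win + b)                   # (n, n_hidden) hidden-layer features
    # Output weights minimise ||Y - H W||_F^2 + alpha ||W||_F^2 + beta tr((H W) L (H W)^T),
    # solved here by vectorising the normal equations with Kronecker products.
    HtH = H.T @ H
    M = np.kron(np.eye(p), HtH + alpha * np.eye(n_hidden)) + beta * np.kron(L, HtH)
    w = np.linalg.solve(M, (H.T @ Y).ravel(order="F"))
    W = w.reshape(n_hidden, p, order="F")
    return Win, b, W                           # predict with np.tanh(X_new @ Win + b) @ W
```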

Place, publisher, year, edition, pages
European Signal Processing Conference, EUSIPCO, 2018
Series
European Signal Processing Conference, ISSN 2219-5491
National Category
Signal Processing
Identifiers
urn:nbn:se:kth:diva-241525 (URN), 10.23919/EUSIPCO.2018.8553088 (DOI), 000455614900028 (), 2-s2.0-85059801757 (Scopus ID), 9789082797015 (ISBN)
Conference
26th European Signal Processing Conference, EUSIPCO 2018, Rome, Italy, 3 September 2018 through 7 September 2018
Note

QC 20180123

Available from: 2019-01-23 Created: 2019-01-23 Last updated: 2019-02-01. Bibliographically approved
Venkitaraman, A., Chatterjee, S. & Händel, P. (2018). MULTI-KERNEL REGRESSION FOR GRAPH SIGNAL PROCESSING. In: 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP): . Paper presented at 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) (pp. 4644-4648). IEEE
MULTI-KERNEL REGRESSION FOR GRAPH SIGNAL PROCESSING
2018 (English) In: 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), IEEE, 2018, p. 4644-4648. Conference paper, Published paper (Refereed)
Abstract [en]

We develop a multi-kernel regression method for graph signal processing, where the target signal is assumed to be smooth over a graph. In multi-kernel regression, an effective kernel function is expressed as a linear combination of many basis kernel functions. We estimate the linear weights to learn the effective kernel function using appropriate regularization based on graph smoothness. We show that the resulting optimization problem is convex and propose an accelerated projected-gradient-descent-based solution. Simulation results using real-world graph signals show the effectiveness of the multi-kernel approach over a standard kernel-based approach.
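
One simple alternating scheme in the spirit of the abstract, shown as a hedged sketch: kernel ridge coefficients for the current effective kernel, followed by a projected gradient step on the nonnegative kernel weights. The paper's formulation additionally includes graph-smoothness regularization and an accelerated projected gradient, both omitted here.

```python
import numpy as np

def multi_kernel_weights(K_list, Y, alpha=1e-2, step=1e-3, n_iters=100):
    # K_list: list of (n, n) basis kernel matrices; Y: (n, p) training targets.
    M = len(K_list)
    n = Y.shape[0]
    w = np.full(M, 1.0 / M)                              # kernel weights, kept on the simplex
    for _ in range(n_iters):
        Kw = sum(wm * Km for wm, Km in zip(w, K_list))   # effective kernel K(w)
        C = np.linalg.solve(Kw + alpha * np.eye(n), Y)   # kernel ridge coefficients for K(w)
        R = Y - Kw @ C                                   # fit residual
        # gradient of the squared fit error with respect to each weight (C held fixed)
        g = np.array([-2.0 * np.sum(R * (Km @ C)) for Km in K_list])
        w = np.maximum(w - step * g, 0.0)                # projected (nonnegativity) gradient step
        if w.sum() > 0:
            w = w / w.sum()
    return w
```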

Place, publisher, year, edition, pages
IEEE, 2018
Keywords
Graph signal processing, kernel regression, convex optimization
National Category
Signal Processing
Identifiers
urn:nbn:se:kth:diva-237154 (URN), 10.1109/ICASSP.2018.8461643 (DOI), 000446384604162 (), 2-s2.0-85054280684 (Scopus ID)
Conference
2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)
Note

QC 20181025

Available from: 2018-10-25 Created: 2018-10-25 Last updated: 2019-08-20. Bibliographically approved
Javid, A. M., Chatterjee, S. & Skoglund, M. (2018). Mutual Information Preserving Analysis of a Single Layer Feedforward Network. In: Proceedings of the International Symposium on Wireless Communication Systems: . Paper presented at 15th International Symposium on Wireless Communication Systems, ISWCS 2018, 28 August 2018 through 31 August 2018. VDE Verlag GmbH
Mutual Information Preserving Analysis of a Single Layer Feedforward Network
2018 (English) In: Proceedings of the International Symposium on Wireless Communication Systems, VDE Verlag GmbH, 2018. Conference paper, Published paper (Refereed)
Abstract [en]

We construct a single-layer feedforward network and analyze the constructed system using information-theoretic tools such as mutual information and the data processing inequality. We derive a threshold on the number of hidden nodes required to achieve good classification performance. Classification performance is expected to saturate as the number of hidden nodes is increased beyond the threshold. The threshold is further verified by experimental studies on benchmark datasets. Index terms: neural networks, mutual information, extreme learning machine, invertible function.

Place, publisher, year, edition, pages
VDE Verlag GmbH, 2018
Keywords
Data handling, Information theory, Learning systems, Wireless telecommunication systems, Benchmark datasets, Classification performance, Extreme learning machine, Feed-forward network, Hidden nodes, Mutual information, Single layer, Network layers
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-247133 (URN), 10.1109/ISWCS.2018.8491242 (DOI), 2-s2.0-85056724949 (Scopus ID), 9781538650059 (ISBN)
Conference
15th International Symposium on Wireless Communication Systems, ISWCS 2018, 28 August 2018 through 31 August 2018
Note

QC 20190403

Available from: 2019-04-03 Created: 2019-04-03 Last updated: 2019-04-03. Bibliographically approved
Ghayem, F., Sadeghi, M., Babaie-Zadeh, M., Chatterjee, S., Skoglund, M. & Jutten, C. (2018). Sparse Signal Recovery Using Iterative Proximal Projection. IEEE Transactions on Signal Processing, 66(4), 879-894
Sparse Signal Recovery Using Iterative Proximal Projection
2018 (English) In: IEEE Transactions on Signal Processing, ISSN 1053-587X, E-ISSN 1941-0476, Vol. 66, no 4, p. 879-894. Article in journal (Refereed), Published
Abstract [en]

This paper is concerned with designing efficient algorithms for recovering sparse signals from noisy underdetermined measurements. More precisely, we consider minimization of a nonsmooth and nonconvex sparsity promoting function subject to an error constraint. To solve this problem, we use an alternating minimization penalty method, which ends up with an iterative proximal-projection approach. Furthermore, inspired by accelerated gradient schemes for solving convex problems, we equip the obtained algorithm with a so-called extrapolation step to boost its performance. Additionally, we prove its convergence to a critical point. Our extensive simulations on synthetic as well as real data verify that the proposed algorithm considerably outperforms some well-known and recently proposed algorithms.
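
A hedged sketch of the iterative proximal-projection structure: a proximal step of a sparsity-promoting penalty followed by projection onto the error-constraint set {x : ||y - Ax||_2 <= eps}. Soft thresholding stands in for the nonconvex penalties considered in the paper, the extrapolation step is omitted, and the closed-form projection assumes A has orthonormal rows.

```python
import numpy as np

def soft_threshold(x, t):
    # proximal operator of t*||x||_1, used here as a stand-in sparsity-promoting prox
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def iterative_proximal_projection(A, y, eps, lam=0.1, n_iters=200):
    # Recover a sparse x from y ~ A x subject to ||y - A x||_2 <= eps.
    # Assumes A has (approximately) orthonormal rows, so the projection below is exact.
    x = A.T @ y
    for _ in range(n_iters):
        x = soft_threshold(x, lam)        # proximal step
        r = y - A @ x
        nr = np.linalg.norm(r)
        if nr > eps:                      # projection onto {x : ||y - A x||_2 <= eps}
            x = x + (1.0 - eps / nr) * (A.T @ r)
    return x
```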

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2018
Keywords
Sparse signal recovery, compressed sensing, SL0, proximal splitting algorithms, iterative sparsification-projection
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:kth:diva-223260 (URN), 10.1109/TSP.2017.2778695 (DOI), 000423703600003 (), 2-s2.0-85037644363 (Scopus ID)
Note

QC 20180216

Available from: 2018-02-16 Created: 2018-02-16 Last updated: 2018-02-16. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-2638-6047