Generic Variance Bounds on Estimation and Prediction Errors in Time Series Analysis: An Entropy Perspective
2019 (English). In: 2019 IEEE Information Theory Workshop, ITW 2019, Institute of Electrical and Electronics Engineers (IEEE), 2019, article id 8989240. Conference paper, published paper (refereed)
Abstract [en]
In this paper, we obtain generic bounds on the variances of estimation and prediction errors in time series analysis via an information-theoretic approach. In general, the error bounds are determined by the conditional entropy of the data point to be estimated or predicted given the side information or past observations. In addition, we show that the prediction error bounds are achieved asymptotically if and only if the "innovation" is asymptotically white Gaussian. When restricted to Gaussian processes and 1-step prediction, our bounds reduce to the Kolmogorov–Szegő formula and the Wiener–Masani formula known from linear prediction theory.
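For context, the following is a minimal LaTeX sketch of the standard entropy-power bound and the Kolmogorov–Szegő formula of the kind referenced in the abstract; the precise statements and conditions in the paper may differ, and the normalization chosen here for the spectral density $S(\omega)$ is an assumption of this sketch, not taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Standard entropy-power lower bound on estimation error (a sketch;
% the paper's own bounds and conditions may be stated differently).
For any estimator $\hat{X}(Y)$ of a real-valued $X$ from side
information $Y$,
\begin{equation*}
  \mathbb{E}\big[(X-\hat{X})^2\big]
  \;\ge\; \frac{1}{2\pi e}\, e^{2h(X \mid Y)},
\end{equation*}
since $h(X-\hat{X}) \ge h(X-\hat{X} \mid Y) = h(X \mid Y)$ and the
Gaussian maximizes differential entropy for a given variance.

% Kolmogorov--Szeg\H{o} formula for one-step linear prediction
% (assumed normalization: the autocovariance satisfies
% $\gamma(k)=\frac{1}{2\pi}\int_{-\pi}^{\pi} S(\omega)e^{ik\omega}\,d\omega$).
For a stationary Gaussian process with spectral density $S(\omega)$,
the minimal one-step prediction error variance is
\begin{equation*}
  \sigma_\infty^2
  = \exp\!\left( \frac{1}{2\pi}
    \int_{-\pi}^{\pi} \log S(\omega)\, d\omega \right).
\end{equation*}
\end{document}
```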
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2019. Article id 8989240.
Keywords [en]
Computational complexity, Entropy, Error analysis, Forecasting, Harmonic analysis, Information theory, Conditional entropy, Error bound, Estimation and predictions, Gaussian Processes, Information-theoretic approach, Linear prediction, Prediction errors, Side information, Time series analysis
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-274783
DOI: 10.1109/ITW44776.2019.8989240
ISI: 000540384500101
Scopus ID: 2-s2.0-85081100218
OAI: oai:DiVA.org:kth-274783
DiVA id: diva2:1446316
Conference
2019 IEEE Information Theory Workshop, ITW 2019, 25-28 August 2019, Visby, Sweden
Note
QC 20200706
Part of ISBN 9781538669006
Available from: 2020-06-24. Created: 2020-06-24. Last updated: 2024-10-25. Bibliographically approved.