Transfer-Entropy-Regularized Markov Decision Processes
2022 (English) In: IEEE Transactions on Automatic Control, ISSN 0018-9286, E-ISSN 1558-2523, Vol. 67, no. 4, p. 1944-1951. Article in journal (Refereed), Published
Abstract [en]
We consider the framework of transfer-entropy-regularized Markov Decision Processes (TERMDPs), in which the weighted sum of the classical state-dependent cost and the transfer entropy from the state random process to the control input process is minimized. Although TERMDPs are generally formulated as nonconvex optimization problems, an analytical necessary optimality condition can be expressed as a finite set of nonlinear equations; based on these conditions, an iterative forward-backward computational procedure similar to the Arimoto-Blahut algorithm is developed. It is shown that every limit point of the sequence generated by the proposed algorithm is a stationary point of the TERMDP. Applications of TERMDPs are discussed in the context of networked control systems theory and non-equilibrium thermodynamics. The proposed algorithm is applied to an information-constrained maze navigation problem, through which we study how the price of information qualitatively alters the optimal decision policies.
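To illustrate the forward-backward, Arimoto-Blahut-like structure described above, the following is a minimal hypothetical Python sketch for a finite-horizon problem. The names (P, C, p0, beta, T), the use of a time-varying control prior r_t(u) that ignores past controls, and the soft (Boltzmann-tilted) policy update are simplifying assumptions for illustration only; they do not reproduce the paper's exact optimality equations.

```python
import numpy as np
from scipy.special import logsumexp

def termdp_sketch(P, C, p0, beta, T, n_iters=50):
    """Forward-backward iteration for a finite-horizon, transfer-entropy-
    regularized MDP (illustrative sketch, not the paper's exact algorithm).

    P[s, a, s'] : transition probabilities
    C[s, a]     : stage cost
    p0[s]       : initial state distribution
    beta        : inverse "price of information" (larger beta -> cheaper information)
    T           : horizon
    Returns a list of time-varying stochastic policies pi[t][s, a].
    """
    nS, nA, _ = P.shape
    pi = [np.full((nS, nA), 1.0 / nA) for _ in range(T)]  # uniform initialization

    for _ in range(n_iters):
        # Forward pass: state marginals p_t(s) and control priors r_t(a)
        # induced by the current randomized policy.
        p = [p0.copy()]
        for t in range(T - 1):
            joint = p[t][:, None] * pi[t]                  # p_t(s, a)
            p.append(np.einsum('sa,sap->p', joint, P))     # p_{t+1}(s')
        r = [p[t] @ pi[t] for t in range(T)]               # r_t(a)

        # Backward pass: soft value function and prior-tilted policy update.
        V = np.zeros(nS)                                   # terminal value
        for t in reversed(range(T)):
            Q = C + np.einsum('sap,p->sa', P, V)           # expected cost-to-go
            # pi_t(a|s) proportional to r_t(a) * exp(-beta * Q(s, a))
            logits = np.log(r[t] + 1e-300)[None, :] - beta * Q
            logits -= logits.max(axis=1, keepdims=True)
            pi[t] = np.exp(logits)
            pi[t] /= pi[t].sum(axis=1, keepdims=True)
            # Free-energy backup: V(s) = -(1/beta) log sum_a r_t(a) exp(-beta Q(s, a))
            V = -logsumexp(-beta * Q, b=r[t][None, :], axis=1) / beta
    return pi
```

In this sketch, beta plays the role of the price of information discussed in the abstract: a large beta (information is cheap) yields a nearly state-dependent, near-deterministic policy, while a small beta (information is expensive) pushes the policy toward the state-independent prior r_t, so that the control input reveals little about the state.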
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022. Vol. 67, no. 4, p. 1944-1951
Keywords [en]
Entropy, Markov processes, Optimization, Random processes, Rate-distortion, Standards, Thermodynamics, Behavioral research, Computation theory, Iterative methods, Networked control systems, Nonlinear equations, Computational procedures, Markov decision processes, Navigation problem, Necessary optimality condition, Non-equilibrium thermodynamics, Nonconvex optimization problem, Optimal decisions, Stationary points, Process control
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:kth:diva-308832, DOI: 10.1109/TAC.2021.3069347, ISI: 000776167500027, Scopus ID: 2-s2.0-85103769098, OAI: oai:DiVA.org:kth-308832, DiVA id: diva2:1637930
Note
QC 20250512
Available from: 2022-02-15. Created: 2022-02-15. Last updated: 2025-05-12. Bibliographically approved