
Information divergences of Markov chains and their applications

Authors:
Youjia Wang, Michael C. H. Choi
Keywords:
Computer Science, Information Theory, Information Theory (cs.IT), Probability (math.PR), Computation (stat.CO)
Journal:
--
Date:
2023-12-08
Abstract
In this paper, we first introduce and define several new information divergences in the space of transition matrices of finite Markov chains which measure the discrepancy between two Markov chains. These divergences offer natural generalizations of classical information-theoretic divergences, such as the $f$-divergences and the R\'enyi divergence between probability measures, to the context of finite Markov chains. We detail and derive fundamental properties of these divergences, and notably give Markov chain versions of Pinsker's inequality and the Chernoff information. We then utilize these notions in a few applications. First, we investigate the binary hypothesis testing problem for Markov chains, where the newly defined R\'enyi divergence between Markov chains and its geometric interpretation play an important role in the analysis. Second, we propose and analyze information-theoretic (Ces\`aro) mixing times and ergodicity coefficients, along with spectral bounds on these notions in the reversible setting. We highlight the example of the random walk on the hypercube, as well as connections between the critical height of the low-temperature Metropolis-Hastings chain and these proposed ergodicity coefficients.
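For orientation, the baseline special case that these divergences generalize is the classical Kullback-Leibler divergence rate between two finite chains, $D(P \| Q) = \sum_x \pi(x) \sum_y P(x,y) \log \frac{P(x,y)}{Q(x,y)}$, where $\pi$ is the stationary distribution of $P$; the classical Pinsker inequality that the paper's Markov chain version extends bounds total variation by $\sqrt{D_{KL}/2}$. Below is a minimal NumPy sketch of this classical rate. The function names and the eigenvector-based stationary-distribution computation are illustrative choices, not taken from the paper, and the sketch does not implement the paper's new divergences.

import numpy as np

def stationary_distribution(P):
    # Stationary distribution of an irreducible transition matrix P,
    # computed as the left Perron eigenvector, normalized to sum to 1.
    evals, evecs = np.linalg.eig(P.T)
    v = np.real(evecs[:, np.argmax(np.real(evals))])
    v = np.abs(v)
    return v / v.sum()

def kl_divergence_rate(P, Q):
    # Classical Kullback-Leibler divergence rate between two chains:
    #   D(P || Q) = sum_x pi(x) * sum_y P(x,y) * log(P(x,y) / Q(x,y)),
    # with pi the stationary distribution of P. Assumes Q(x,y) > 0
    # wherever P(x,y) > 0; otherwise the rate is infinite.
    pi = stationary_distribution(P)
    ratio = np.divide(P, Q, out=np.ones_like(P), where=P > 0)
    terms = np.where(P > 0, P * np.log(ratio), 0.0)
    return float(pi @ terms.sum(axis=1))

# Example: a sticky two-state chain P versus the i.i.d. fair-coin chain Q.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])
print(kl_divergence_rate(P, Q))  # about 0.31 nats

Weighting each row divergence by the stationary mass pi(x) is what makes this a single number on the space of transition matrices rather than a family of per-state quantities, which is the same spirit in which the paper's divergences compare two chains.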
PDF: Information divergences of Markov chains and their applications.pdf