Entropy and Mutual Information for Markov Channels with General Inputs

T. Holliday, A. Goldsmith, and P. W. Glynn

Proceedings of the 2002 Allerton Conference on Communication, Control, and Computing, pp. 824–833 (2002)

We present new formulas, based on Lyapunov exponents, for the entropy, mutual information, and capacity of finite-state, discrete-time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy rate of a symbol sequence is equal to the largest Lyapunov exponent for a product of random matrices. We then develop a continuous state space Markov chain formulation that allows us to directly compute entropy rates as expectations with respect to the Markov chains' stationary distributions. We also show that the stationary distributions are continuous functions of the input symbol dynamics. This continuity facilitates optimization of the mutual information and allows the channel capacity to be expressed in terms of Lyapunov exponents.
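As a concrete illustration of the random-matrix-product view of entropy rates, the Python sketch below estimates the entropy rate of the noise process of a two-state Gilbert-Elliott channel, a standard finite-state Markov channel. It uses the fact that the probability of an observed symbol sequence factors as a product of per-symbol matrices, so that -(1/n) log p(z_1, ..., z_n) converges almost surely to the entropy rate, i.e., to minus the top Lyapunov exponent of the matrix product. This is a minimal sketch under assumed parameters: the channel values and the names P, err, b, and B are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state Gilbert-Elliott noise process (illustrative values,
# not from the paper): hidden state chain with states {Good, Bad}.
P = np.array([[0.99, 0.01],    # transition probabilities from Good
              [0.10, 0.90]])   # transition probabilities from Bad
err = np.array([0.01, 0.30])   # P(noise bit = 1 | state)

# Stationary distribution of the hidden state chain (left eigenvector of P
# for eigenvalue 1, normalized to sum to one).
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Emission probabilities and per-symbol matrices B_z[i, j] = P[i, j] * b_z[j],
# so that p(z_1 .. z_n) = (pi * b_{z_1}) B_{z_2} ... B_{z_n} 1.
b = {0: 1.0 - err, 1: err}
B = {z: P * b[z][np.newaxis, :] for z in (0, 1)}

# Monte Carlo: simulate the chain, run the normalized forward recursion, and
# accumulate the log-likelihood; -(1/n) log p(z_1 .. z_n) -> entropy rate.
n = 200_000
state = rng.choice(2, p=pi)
z = int(rng.random() < err[state])
alpha = pi * b[z]                    # forward vector p(z_1, S_1 = .)
log_p = np.log(alpha.sum())
alpha /= alpha.sum()
for _ in range(n - 1):
    state = rng.choice(2, p=P[state])
    z = int(rng.random() < err[state])
    alpha = alpha @ B[z]             # one step of the random matrix product
    c = alpha.sum()
    log_p += np.log(c)
    alpha /= c                       # normalize to avoid numerical underflow
H = -log_p / (n * np.log(2))         # entropy rate in bits per symbol
print(f"Estimated entropy rate of the noise process: {H:.4f} bits/symbol")
```

For this channel with an i.i.d. uniform binary input, the mutual information per channel use is 1 - H(Z) bits, which indicates how such an entropy estimate feeds into the mutual information and capacity computations described in the abstract.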