Capacity of Finite State Markov Channels with General Inputs

T. Holliday, A. Goldsmith, and P. W. Glynn

Proceedings of the 2003 IEEE International Symposium on Information Theory, p. 289 (2003)

We study new formulae, based on Lyapunov exponents, for the entropy, mutual information, and capacity of finite-state, discrete-time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. We show that the entropy rate of a symbol sequence equals the largest Lyapunov exponent of a product of random matrices. We then develop a continuous state space Markov chain formulation that allows us to compute entropy rates directly as expectations with respect to the chain's stationary distribution. We also show that this stationary distribution is a continuous function of the input symbol dynamics; this continuity allows the channel capacity to be expressed in terms of Lyapunov exponents.
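To make the connection concrete, the following is a minimal Python sketch (not the authors' code) under illustrative assumptions: a hypothetical two-state hidden Markov source with transition matrix P and emission matrix B stands in for a finite-state channel output process. The prediction filter is the continuous state space Markov chain on the probability simplex, and the Birkhoff average of its log normalizers estimates the largest Lyapunov exponent of the random matrix product, whose negative is the entropy rate of the symbol sequence.

```python
# Minimal sketch (illustrative assumptions, not the paper's code):
# estimate the entropy rate of a hidden-Markov output sequence as the
# negated top Lyapunov exponent of a product of random matrices, computed
# via the prediction filter -- a Markov chain on the probability simplex.
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],        # hidden-state transition matrix (assumed)
              [0.2, 0.8]])
B = np.array([[0.95, 0.05],      # emission probabilities B[x, y] = P(y | x)
              [0.10, 0.90]])

# Random matrices M(y)[x, x'] = P[x, x'] * B[x', y]; the sequence likelihood
# is pi^T M(y_1) ... M(y_n) 1, so the entropy rate is minus the top
# Lyapunov exponent of this random matrix product.
M = [P * B[:, y][None, :] for y in range(2)]

n = 200_000
x = 0
rho = np.array([0.5, 0.5])       # filter state; initialization washes out
log_c = 0.0
for _ in range(n):
    x = rng.choice(2, p=P[x])    # advance the hidden state
    y = rng.choice(2, p=B[x])    # emit an observed symbol
    v = rho @ M[y]               # unnormalized filter update
    c = v.sum()                  # normalizer c_t = P(y_t | y_1..y_{t-1})
    log_c += np.log(c)
    rho = v / c                  # normalized filter: a chain on the simplex

# Birkhoff average of log normalizers -> top Lyapunov exponent; H = -lambda.
print("entropy rate estimate (bits/symbol):", -log_c / n / np.log(2))
```

The recursion is the standard forward filter for a hidden Markov model; what is specific to the Lyapunov-exponent view is that the entropy rate appears as the expectation of the log normalizer under the filter chain's stationary distribution.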