[Math] Markov chain with infinitely many states

markov chains

I understand that a Markov chain involves a system that can be in one of a finite number of discrete states, with a probability of moving from each state to every other state, and of emitting a signal from each state.

Thus, an $N \times N$ transition matrix and an $N \times M$ emission matrix of real numbers adequately describe a Markov chain with $N$ states and $M$ possible emissions.

Is it possible to have a Markov chain with an infinite number of states? For example, if $N=2$ describes an LED that can glow blue or red, $N=\infty$ would describe an LED that can glow any mixture of blue and red.

Can't an infinitely-large matrix be represented by a function of two variables (the two indices)?
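As a sketch of that idea (my own illustration, not part of the original post): instead of storing an infinite matrix, one can define the transition probabilities as a function of the two state indices. Here the states are all the integers, and each step moves one unit left or right with probability 1/2.

```python
def p(i: int, j: int) -> float:
    """Transition probability from state i to state j
    for a simple random walk on the integers."""
    return 0.5 if abs(i - j) == 1 else 0.0

# Each "row" of the infinite matrix still sums to 1,
# just over infinitely many columns (only two are nonzero here):
row_sum = p(0, -1) + p(0, 1)
print(row_sum)  # 1.0
```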

Best Answer

Yes, Markov processes with infinitely many states are indeed studied. Random walks are a common example. The term "Markov chain" is usually reserved for a discrete (finite or countably infinite) state space; if the state space is finite, it is called a "finite Markov chain". And yes, an infinite transition matrix is naturally represented as a function of two state indices. See e.g. http://www.statslab.cam.ac.uk/~james/Markov/
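To make the random-walk example concrete, here is a minimal simulation sketch (my own illustration): the chain lives on the infinite state space of all integers, and each step is sampled according to the transition function rather than a stored matrix.

```python
import random

def p(i: int, j: int) -> float:
    # Transition kernel of the simple random walk on the integers.
    return 0.5 if abs(i - j) == 1 else 0.0

def step(i: int, rng: random.Random) -> int:
    # Sample the next state from "row i" of the infinite transition matrix.
    # Only the two neighbors i-1 and i+1 have nonzero probability.
    return i + 1 if rng.random() < p(i, i + 1) else i - 1

rng = random.Random(0)
state = 0
path = [state]
for _ in range(10):
    state = step(state, rng)
    path.append(state)
print(path)  # a 10-step trajectory; every state in Z is reachable eventually
```

Even though the state space is infinite, each individual step only needs the transition probabilities out of the current state, so nothing infinite ever has to be stored.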