The difference between Markov chains and Markov processes

markov-process, stochastic-processes, terminology

What is the difference between Markov chains and Markov processes?


I'm reading conflicting information: sometimes the distinction is based on whether the state space is discrete or continuous, and sometimes it is based on whether time is discrete or continuous.

Slide 20 of this document:

A Markov process is called a Markov chain if the state space is discrete, i.e., finite or countable.

http://www.win.tue.nl/~iadan/que/h3.pdf :

A Markov process is the continuous-time version of a Markov chain.

Alternatively, one can use "Markov chain" and "Markov process" synonymously, specifying explicitly whether the time parameter is continuous or discrete, as well as whether the state space is continuous or discrete.
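
For context (this is my own remark, not quoted from either source above): all four combinations share the same defining property, and the disagreement is purely about naming. In discrete time the Markov property reads

$$P(X_{n+1} \in A \mid X_0, X_1, \ldots, X_n) = P(X_{n+1} \in A \mid X_n),$$

i.e., the future depends on the past only through the present state; the continuous-time version is the same statement with $n$ replaced by a continuous time parameter $t \geq 0$.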


Update 2017-03-04: the same question was asked on https://www.quora.com/Can-I-use-the-words-Markov-process-and-Markov-chain-interchangeably

Best Answer

From the preface to the first edition of "Markov Chains and Stochastic Stability" by Meyn and Tweedie:

We deal here with Markov Chains. Despite the initial attempts by Doob and Chung [99,71] to reserve this term for systems evolving on countable spaces with both discrete and continuous time parameters, usage seems to have decreed (see for example Revuz [326]) that Markov chains move in discrete time, on whatever space they wish; and such are the systems we describe here.

Edit: the references cited in the quoted passage are, respectively:

99: J. L. Doob. Stochastic Processes. John Wiley & Sons, New York, 1953.

71: K. L. Chung. Markov Chains with Stationary Transition Probabilities. Springer-Verlag, Berlin, second edition, 1967.

326: D. Revuz. Markov Chains. North-Holland, Amsterdam, second edition, 1984.
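
To make the convention concrete, here is a minimal sketch (my own illustration, not from Meyn and Tweedie) contrasting a discrete-time Markov chain with a continuous-time Markov process on the same two-state space. The state names and the transition numbers are invented for the example:

```python
import random

# Two-state model; the names and numbers are invented for illustration.

# Discrete-time transition probabilities: one jump per unit time step.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

# Continuous-time jump rates: exponential holding times between jumps.
Q = {
    "sunny": {"rainy": 0.2},
    "rainy": {"sunny": 0.4},
}

def simulate_chain(start, steps):
    """Discrete-time Markov chain: the 'chain' in Meyn-Tweedie usage."""
    state, path = start, [start]
    for _ in range(steps):
        state = random.choices(list(P[state]),
                               weights=list(P[state].values()))[0]
        path.append(state)
    return path

def simulate_process(start, horizon):
    """Continuous-time Markov process on the same discrete state space."""
    t, state, path = 0.0, start, [(0.0, start)]
    while True:
        total_rate = sum(Q[state].values())
        t += random.expovariate(total_rate)  # exponential holding time
        if t > horizon:
            return path
        state = random.choices(list(Q[state]),
                               weights=list(Q[state].values()))[0]
        path.append((t, state))

print(simulate_chain("sunny", 10))
print(simulate_process("sunny", 20.0))
```

Both objects satisfy the Markov property; the only difference is whether time advances in unit steps or via exponential holding times, which is exactly the distinction the quoted sources disagree about naming.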
