Let's say there are two Markov processes with the same state space but different probability matrices $M_1$ and $M_2$. Would their average: $$M = 0.5\cdot M_1 + 0.5\cdot M_2$$ be a Markov process?
Is the sum of Markov processes a Markov process?
markov-chains, markov-process, stochastic-processes
Related Solutions
In this case, if for example $m_1\neq m_2$ and $\sigma_1\sigma_2\rho\neq 0$, then no, because the drift is not a function of $X_t + Y_t$ alone.
In general, when $X$ and $Y$ are two non-independent Markov chains, $X+Y$ is not a Markov chain.
To see this with a simpler example in a discrete-time setting, let $(X_n)$ be a simple random walk on $\mathbb Z$, and take $Y_n = n\,1_{X_1 = 1}$ (this is a Markov process, because $Y_{n+1} = Y_n + 1_{X_1 = 1}$).
Then $S = X+Y$ is not a Markov process: the law of $S_n$ given $S_{n-1}$ depends on $X_1$, which is determined by $S_1$ but not by $S_{n-1}$ alone.
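A quick simulation (illustrative, not from the original answer) makes the failure of the Markov property visible: conditioning on a value of $S_4$ that is reachable both from $X_1 = 1$ and from $X_1 = -1$, the conditional support of $S_5$ differs in the two cases.

```python
import random

random.seed(0)

def sample_path(n=5):
    # simple random walk X on Z started at 0, Y_k = k * 1{X_1 = 1}, S = X + Y
    x, xs = 0, [0]
    for _ in range(n):
        x += random.choice([-1, 1])
        xs.append(x)
    x1 = xs[1]
    s = [xs[k] + (k if x1 == 1 else 0) for k in range(n + 1)]
    return x1, s

# S_4 = 2 can occur both with X_1 = 1 (then X_4 = -2) and with X_1 = -1
# (then X_4 = 2); collect the values that S_5 takes in each case.
support = {1: set(), -1: set()}
for _ in range(200_000):
    x1, s = sample_path()
    if s[4] == 2:
        support[x1].add(s[5])

# The two conditional supports are disjoint ({2, 4} vs {1, 3}), so the law of
# S_5 given S_4 = 2 still depends on X_1: S is not Markov.
print(sorted(support[1]), sorted(support[-1]))
```

Here the information "which branch $X_1$ took" is recoverable from $S_1$ but not from $S_{n-1}$, which is exactly why the one-step conditional law fails to be a function of the current state alone.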
It might be important to differentiate between the various types of stochastic process based on both the state space and the time variable. (Note: "discrete" space/time can also be called countable.) So there are 4 types:
- Discrete-spacetime: the process moves from state-to-state (each of which can be represented by integers) in discrete steps. For example, imagine a random walk on a graph that takes a step for each $t\in \mathbb{Z}_{\geq 0}$.
- Discrete-time continuous-space: the process moves in discrete turns, but takes continuous values. For instance, the classic (discrete-time) random walk of unit step-size on $\mathbb{R}^n$. (See also here).
- Continuous-time discrete-space: the process moves continuously in time, but in a countable space (e.g. see Continuous-time discrete-space models for animal movement).
- Continuous spacetime: the time variable is continuous, and the process moves in a continuous space (e.g. $\mathbb{R}^n$). This includes Brownian motion and other Itô processes.
The next part is not so clearly agreed upon in the literature. I will simply state the definitions I am used to seeing.
A Markov process is any stochastic process that satisfies the Markov property. It doesn't matter which of the 4 process types it is.
A Markov chain is a Markov process with a discrete state space (i.e. can be type 1 or 3).
A Discrete-time Markov chain (or discrete Markov chain) is a Markov process in discrete time with a discrete state space (i.e. type 1, above).
A Continuous-time Markov chain (or continuous Markov chain) is a Markov process with a discrete state space in continuous time (i.e. of type 3). (E.g. see here).
A Stationary process is a stochastic process with a joint probability distribution that does not change when translated in time (see here).
A Time-homogeneous Markov chain is a stationary Markov chain. This means that the transition probabilities do not change in time. So, the probability of going from one state $s_1$ to another state $s_2$, once you are at $s_1$, is always the same (i.e. it doesn't matter when you get there).
A discrete-time stationary Markov chain is the most classic case (and in fact what most people mean when they say Markov chain).
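To make the time-homogeneity definition concrete, here is a minimal sketch (the 2-state transition matrix is made up for the example) of a discrete-time stationary Markov chain: because the transition probabilities do not change in time, the empirical frequency of a given transition matches the corresponding matrix entry no matter when the transitions happen along the path.

```python
import random

random.seed(1)

# Hypothetical 2-state transition matrix P (rows sum to 1):
# P[i][j] = probability of moving to state j from state i.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def step(state):
    return 0 if random.random() < P[state][0] else 1

# Simulate one long path and estimate P(1 -> 0) empirically; for a
# time-homogeneous chain this estimate converges to P[1][0] = 0.4.
state = 0
visits, to0 = 0, 0
for _ in range(100_000):
    nxt = step(state)
    if state == 1:
        visits += 1
        to0 += (nxt == 0)
    state = nxt

print(round(to0 / visits, 2))  # ≈ 0.4
```

For a non-homogeneous chain the same estimate would mix different transition matrices together, and no single matrix would describe the dynamics.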
Best Answer
For a discrete state space, as we have here since the matrices are finite, any assignment of transition probabilities that preserves total probability 1 defines a valid Markov chain. Therefore, the Markov processes on a finite state space are in one-to-one correspondence with the matrices with entries in $[0, 1]$ of the appropriate size whose column (or row, depending on your definition) sums are all 1. In particular, if $M_1$ and $M_2$ are two such matrices, averaging them elementwise preserves the column (or row) sums, and the average of two probabilities is still a probability, so the resulting transition matrix, and therefore the Markov process, is valid.
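A quick numerical check of this closure argument, using randomly generated row-stochastic matrices as stand-ins for $M_1$ and $M_2$:

```python
import random

random.seed(0)

def random_stochastic(n):
    # random row-stochastic matrix: nonnegative entries, each row sums to 1
    rows = []
    for _ in range(n):
        w = [random.random() for _ in range(n)]
        total = sum(w)
        rows.append([x / total for x in w])
    return rows

n = 4
M1, M2 = random_stochastic(n), random_stochastic(n)
M = [[0.5 * M1[i][j] + 0.5 * M2[i][j] for j in range(n)] for i in range(n)]

# every entry of M is still a probability, and every row still sums to 1,
# so M is again a valid transition matrix
ok_entries = all(0.0 <= M[i][j] <= 1.0 for i in range(n) for j in range(n))
ok_rows = all(abs(sum(row) - 1.0) < 1e-12 for row in M)
print(ok_entries, ok_rows)  # → True True
```

The same check goes through for any convex combination $\lambda M_1 + (1-\lambda) M_2$ with $\lambda \in [0, 1]$, not just $\lambda = 0.5$.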