Solved – a random process with “stationary independent increments”

iid · self-study · stochastic-processes · terminology

I'm looking at a Solved Problem in "Schaum's Outline: Probability, Random Variables, and Random Processes", specifically Problem 5.21. In this problem it states:

Let $\{X(t), t \ge 0\}$ be a random process with stationary independent increments, and assume that $X(0) = 0$. Show that:

$$E[X(t)] = \mu_1 t$$

where:

$$\mu_1 = E[X(1)]$$

However, I don't understand what "stationary independent increments" means, so I have no idea what the problem is asking.

Here's a scan from the book:

[image: Problem 5.21 from the book]

Best Answer

One example of such a process is a random walk. We can define a random walk by the partial sums of an iid sequence: if $X_1, X_2, \dotsc$ is iid, then $W_t = X_1 + \dotsb + X_t$ is a random walk, and its increments $W_t - W_{t-1} = X_t$ are clearly iid, hence stationary and independent. That is a process in discrete time, however, while your time index $t > 0$ is continuous. In the continuous-time case, a process with stationary independent increments is essentially a continuous-time version of a random walk; one example is Brownian motion.
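Here is a minimal numerical sketch of the discrete-time case (the variable names and the choice of step distribution are mine, purely for illustration): simulate many random-walk paths built from iid steps and check that the sample mean of $W_t$ grows like $\mu_1 t$.

```python
import numpy as np

rng = np.random.default_rng(0)

n_paths, n_steps = 100_000, 10

# iid steps; any distribution with finite mean works.
# Here Exponential(1), so mu_1 = E[X_1] = 1.
steps = rng.exponential(scale=1.0, size=(n_paths, n_steps))

# W_t = X_1 + ... + X_t  (partial sums along the time axis)
W = np.cumsum(steps, axis=1)

mu_1 = 1.0
for t in (1, 5, 10):
    sample_mean = W[:, t - 1].mean()
    print(f"t={t}: sample E[W_t] = {sample_mean:.3f}, mu_1 * t = {mu_1 * t:.3f}")
```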

The process is $X_t,\ t > 0$. Let $0 < t_1 < t_2 < \dotsb < t_k$ be some times; the process has independent increments if the increments $$ X_{t_2}-X_{t_1},\ \dotsc,\ X_{t_k}-X_{t_{k-1}} $$ are independent. The increments are stationary if the distribution of $X_{t+h} - X_t$ depends only on the length $h$ of the interval and not on its starting point $t$; in particular, increments over intervals of equal length are identically distributed.
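For what it's worth, this is how the two properties combine to give the identity in Problem 5.21. The sketch below follows the standard argument, and the notation $m(t) := E[X(t)]$ is mine:

Since $X(0) = 0$, for any $s, t \ge 0$,

$$ m(t+s) = E[X(t+s)] = E\big[(X(t+s) - X(s)) + (X(s) - X(0))\big] = E[X(t+s) - X(s)] + E[X(s)]. $$

By stationarity of increments, $X(t+s) - X(s)$ has the same distribution as $X(t) - X(0) = X(t)$, so $m(t+s) = m(t) + m(s)$. Any measurable solution of this Cauchy functional equation is linear, $m(t) = ct$, and setting $t = 1$ gives $c = m(1) = \mu_1$, hence $E[X(t)] = \mu_1 t$.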
