[Math] Biased Random Walk Converging to a Brownian Motion with drift (Donsker’s Theorem)

convergence-divergence, probability-theory, probability-limit-theorems, stochastic-processes, weak-convergence

Fix $N$ and suppose $\{X_k\}_{k=1}^{N}$ are i.i.d. steps that are $\pm 1$ with equal probability. Then $S_n = \sum_{k\leq n} X_k$ is a simple random walk, and (with the right scaling) the path of $S_n$ converges to a Brownian motion as $N \to \infty$ (this is an application of Donsker's Theorem).
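
To be precise, by "the right scaling" I mean the usual Donsker scaling of the path,
$$W^{(N)}(t) = \frac{1}{\sqrt{N}}\, S_{\lfloor N t \rfloor}, \qquad t \in [0,1]$$
(or its linearly interpolated version), for which Donsker's Theorem gives $W^{(N)} \Rightarrow B$ in distribution, where $B$ is a standard Brownian motion.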

I want to consider a related, but slightly different setup. Fix $N$ and suppose $\{X_k\}_{k=1}^{N}$ are i.i.d. steps that are $+1$ with probability $\frac{1}{2} + \frac{\lambda}{2\sqrt{N}}$ and $-1$ with probability $\frac{1}{2} - \frac{\lambda}{2\sqrt{N}}$. Then $S_n = \sum_{k\leq n} X_k$ is a biased random walk. I want to show that (in the same scaling as above) $S_n$ converges to a Brownian motion with drift $\lambda$ as $N \to \infty$.
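
For intuition (not part of the proof), here is a minimal simulation sketch of the rescaled biased walk; the names `N`, `lam`, and `scaled_biased_walk` are just illustrative choices. At time $t = 1$ the limit should be $\mathcal{N}(\lambda, 1)$, and the empirical moments reflect that:

```python
# Sketch: sample paths of the biased walk under the Donsker scaling
# t -> S_{floor(Nt)} / sqrt(N), and check the endpoint against N(lambda, 1).
import numpy as np

def scaled_biased_walk(N, lam, rng):
    """One path of the rescaled biased walk on the time grid k/N, k = 0, ..., N."""
    p_up = 0.5 + lam / (2 * np.sqrt(N))              # P(X_k = +1)
    steps = rng.choice([1.0, -1.0], size=N, p=[p_up, 1 - p_up])
    S = np.concatenate(([0.0], np.cumsum(steps)))    # S_0, S_1, ..., S_N
    return S / np.sqrt(N)                            # Donsker scaling

rng = np.random.default_rng(0)
N, lam = 10_000, 2.0
endpoints = np.array([scaled_biased_walk(N, lam, rng)[-1] for _ in range(2_000)])

print("empirical mean at t = 1:", endpoints.mean())  # approx lam = 2
print("empirical var  at t = 1:", endpoints.var())   # approx 1 - lam^2 / N, close to 1
```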

My attempt:
I think we can do this by some manipulation and then applying Donsker's Theorem. Let $Y_k = \left( X_k - \frac{\lambda}{\sqrt{N}} \right) \Big/ \sqrt{1 - \frac{\lambda^2}{N}}$ so that the $Y_k$'s have mean $0$ and variance $1$. Then applying Donsker's Theorem to the $Y_k$'s gets us very close to the result I want. However, there is a pesky factor of $\sqrt{1-\frac{\lambda^2}{N}}$ in front.
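
Writing out the moments makes the factor explicit:
$$\mathbb{E}[X_k] = \frac{\lambda}{\sqrt{N}}, \qquad \operatorname{Var}(X_k) = 1 - \frac{\lambda^2}{N},$$
so with the $Y_k$ defined above,
$$\frac{1}{\sqrt{N}}\, S_{\lfloor N t \rfloor} \;=\; \sqrt{1 - \frac{\lambda^2}{N}}\;\cdot\;\frac{1}{\sqrt{N}} \sum_{k \leq N t} Y_k \;+\; \frac{\lambda \lfloor N t \rfloor}{N},$$
where the last term converges to $\lambda t$ uniformly on $[0,1]$ and Donsker's Theorem applies to the sum of the $Y_k$'s.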

I think the following is true and can fix the problem:
If $f_n$ and $f$ are random elements of $C[0,1]$ with $f_n \Rightarrow f$, and $c_n \to 1$ are constants, then $c_n f_n \Rightarrow f$ in $C[0,1]$ too. My "proof" of this involves tightness and Prohorov's Theorem, so I'm hoping for an easier alternative.

Best Answer

Yes, $f_n \to f$ in distribution and $c_n \to 1$ imply $c_n \cdot f_n \to f$ in distribution. A short proof of this assertion is found here.
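
For completeness, one standard short argument (a sketch via the converging-together lemma, not necessarily the one behind the link): by the continuous mapping theorem, $\|f_n\|_\infty \Rightarrow \|f\|_\infty$, so the sequence $\|f_n\|_\infty$ is tight and
$$\|c_n f_n - f_n\|_\infty = |c_n - 1|\,\|f_n\|_\infty \longrightarrow 0 \quad \text{in probability}.$$
The converging-together (Slutsky-type) lemma in a metric space (if $f_n \Rightarrow f$ and $d(g_n, f_n) \to 0$ in probability, then $g_n \Rightarrow f$) then gives $c_n f_n \Rightarrow f$.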
