Random Walks – A Random Walk with Uniformly Distributed Steps

Tags: pr.probability, random-walks

The following problem has bothered me for a long time.

Let us imagine a point on the real axis. At the beginning it is located at the point $O$. Then it "walks" on the real axis randomly in the following way: at every step, it chooses a real number $\Delta x$ uniformly from the interval $[-1,1]$ and moves $\Delta x$ units to the right (so a negative $\Delta x$ means moving left). As soon as it ends up to the left of the point $O$, it "dies" immediately.

Our task is to find $P_n$, the probability that the point is still alive after $n$ steps of the walk.
I conjecture that $P_n=\frac{(2n)!}{4^n (n!)^2}$, but I cannot prove it or explain why it is true.
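
For what it is worth, a quick Monte Carlo sketch (purely illustrative; the trial count and range of $n$ are arbitrary choices) agrees with the conjectured values:

```python
# Monte Carlo sketch of the walk (illustrative only; trial count is arbitrary).
import random
from math import comb

def estimate_survival(n, trials=200_000):
    """Estimate the probability that the walk stays to the right of O for n steps."""
    alive = 0
    for _ in range(trials):
        pos = 0.0
        for _ in range(n):
            pos += random.uniform(-1.0, 1.0)
            if pos < 0:        # stepped to the left of O: the point dies
                break
        else:
            alive += 1
    return alive / trials

for n in range(1, 6):
    conjectured = comb(2 * n, n) / 4**n    # (2n)! / (4^n (n!)^2)
    print(n, round(estimate_survival(n), 4), round(conjectured, 4))
```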

Best Answer

Here is an argument that proves the conjecture. More generally, it shows that if $X_1,\dots,X_n$ is a "generic" sequence of positive real numbers, and we form a sum by permuting the terms randomly and putting random signs on them (uniformly over all $2^n\,n!$ possibilities), then the probability that all partial sums are positive is the $P_n$ conjectured in the OP. By "generic" I mean that no nonempty signed subsequence sums to zero.
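
(For concreteness, here is a brute-force sketch of this more general claim for one small, arbitrarily chosen generic sequence; the specific numbers are just for illustration.)

```python
# Brute-force sketch: fraction of valid signed permutations of a small
# "generic" sequence (the numbers below are arbitrary) versus P_n.
from itertools import permutations, product
from math import comb

def all_partial_sums_positive(terms):
    s = 0.0
    for y in terms:
        s += y
        if s <= 0:
            return False
    return True

def valid_fraction(xs):
    """Fraction of the 2^n * n! signed permutations of xs with all partial sums positive."""
    n = len(xs)
    good = total = 0
    for perm in permutations(xs):
        for signs in product((1, -1), repeat=n):
            total += 1
            if all_partial_sums_positive(sg * x for sg, x in zip(signs, perm)):
                good += 1
    return good / total

xs = [0.71, 0.293, 0.1313, 0.0427]            # arbitrary generic positive reals
print(valid_fraction(xs), comb(2 * len(xs), len(xs)) / 4**len(xs))  # both 0.2734375
```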

First consider a special case where the statement clearly holds: Assume that $X_1\gg X_2\gg X_3\gg\dots\gg X_n$ in the sense that the sign of any partial sum depends only on the sign of the dominating term. A valid signed permutation (one where all partial sums are positive) of $n$ terms must then be formed by inserting the term $X_n$ into a valid signed permutation of $X_1,\dots,X_{n-1}$. Conversely, if we insert $X_n$ into a valid signed permutation of $X_1,\dots,X_{n-1}$, the only way to obtain a non-valid signed permutation is to insert $X_n$ first and with a negative sign; since there are $2n$ equally likely insertions ($n$ positions, each with two possible signs), exactly one of which is bad, the new term preserves validity with probability $\frac{2n-1}{2n}$. Therefore the probability that a signed permutation of $n$ terms is valid is $$\frac12\cdot \frac34\cdot\frac56\cdots\frac{2n-1}{2n}=\frac{(2n)!}{4^n (n!)^2},$$ as required.
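
(That the product equals the conjectured closed form is easy to confirm with exact rational arithmetic; the following check is illustrative only.)

```python
# Exact check (rational arithmetic) that the product equals (2n)!/(4^n (n!)^2).
from fractions import Fraction
from math import factorial

for n in range(1, 11):
    prod = Fraction(1)
    for k in range(1, n + 1):
        prod *= Fraction(2 * k - 1, 2 * k)
    assert prod == Fraction(factorial(2 * n), 4**n * factorial(n) ** 2)
print("product formula agrees with the closed form for n = 1..10")
```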

Next, let us move the point $(X_1,\dots,X_n)$ in $\mathbb{R}^n$ and see how the probability of a valid signed permutation might change. We need only consider what happens when we pass through a hyperplane given by a signed subsequence being equal to zero. Instead of introducing double indices let me take a generic example: Suppose we pass through the hyperplane where the sum $X_3+X_5+X_6-X_2-X_9$ changes sign. At the moment the sign change occurs, the only signed permutations that go from valid to invalid or the other way are those where this particular sum occurs as the first five terms (in some order). Now for every signed permutation that goes from valid to invalid, there will be a corresponding one that goes from invalid to valid, obtained by reversing those first five terms and changing their signs. For instance, if $$(X_5+X_6-X_9+X_3-X_2) + \dots$$ goes from valid to invalid, then $$(X_2-X_3+X_9-X_6-X_5) + \dots$$ will go from invalid to valid.

The conclusion is that the probability of a signed permutation being valid (having all partial sums positive) never changes, and is therefore always equal to the product computed in the special case above, that is, the $P_n$ conjectured in the OP.
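
(To illustrate this invariance concretely, here is a small brute-force sketch; the sequence, the chosen hyperplane $X_0 - X_1 - X_2 = 0$, and the perturbation size are all arbitrary choices made so that only this one signed subsequence sum changes sign.)

```python
# Brute-force sketch of the invariance across a single hyperplane crossing:
# n = 4, straddling the hyperplane X_0 - X_1 - X_2 = 0 by a perturbation that
# is tiny compared with every other signed subsequence sum.
from itertools import permutations, product

def is_valid(signed_terms):
    s = 0.0
    for y in signed_terms:
        s += y
        if s <= 0:
            return False
    return True

def valid_signed_permutations(xs):
    """Valid signed permutations, encoded as tuples of (index, sign) pairs."""
    n = len(xs)
    out = set()
    for perm in permutations(range(n)):
        for signs in product((1, -1), repeat=n):
            if is_valid(sg * xs[i] for i, sg in zip(perm, signs)):
                out.add(tuple(zip(perm, signs)))
    return out

x1, x2, x3 = 0.293, 0.1313, 0.71      # arbitrary generic values
delta = 1e-9
before = valid_signed_permutations([x1 + x2 + delta, x1, x2, x3])  # sum = +delta
after  = valid_signed_permutations([x1 + x2 - delta, x1, x2, x3])  # sum = -delta

print(len(before), len(after))   # equal counts on both sides of the crossing...
print(len(before ^ after))       # ...even though some signed permutations flipped
```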

UPDATE: Instead of looking at what happens when we cross one of these hyperplanes, here is another way of seeing that $P_n$ is independent of $X=(X_1,\dots,X_n)$, which has the additional feature of showing that $$P_0P_n + P_1P_{n-1}+\cdots + P_nP_0 = 1.$$

Suppose that $X_1,\dots,X_n$ are generic as described above. Let $P_n(X)$ denote the probability that a random signed permutation will have all partial sums positive, and suppose for induction on $n$ that we have shown that $P_k(X) = P_k$ is independent of $X$ for $k=1,\dots,n-1$.

Now let $Y_1,\dots, Y_n$ be a random signed permutation of $X$, and consider the distribution of the index $k$ (ranging over $0,1,\dots,n$, with the empty sum for $k=0$ equal to $0$) at which the partial sum $Y_1+\dots+Y_k$ is minimized. I claim that the probability that this occurs at a particular $k$ is $P_kP_{n-k}$: it means that all consecutive sums of the form $Y_{k+1}+Y_{k+2}+\dots+Y_m$ for $m>k$ are positive, while all sums of the form $Y_m + Y_{m+1}+\dots+Y_k$ for $m\leq k$ are negative. Conditioned on which of the $X_i$ occupy the first $k$ positions, the signed orderings of the first $k$ terms and of the last $n-k$ terms are independent and uniform, so by the induction hypothesis (reversing and negating the first block for the second event) these two events have probabilities $P_k$ and $P_{n-k}$ respectively.
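
(A quick exhaustive sketch of this claim for one small, arbitrarily chosen generic sequence; the empirical distribution of the minimizing index should agree with $P_kP_{n-k}$ exactly.)

```python
# Exhaustive sketch: distribution of the index where the partial sums are
# minimized, for one small arbitrary generic sequence, versus P_k * P_{n-k}.
from itertools import permutations, product
from fractions import Fraction
from math import comb, factorial

def argmin_partial_sum(terms):
    """Index k in 0..n at which S_k = Y_1 + ... + Y_k (with S_0 = 0) is smallest."""
    best_k, best_s, s = 0, 0.0, 0.0
    for k, y in enumerate(terms, start=1):
        s += y
        if s < best_s:
            best_k, best_s = k, s
    return best_k

xs = [0.71, 0.293, 0.1313, 0.0427, 0.00171]      # arbitrary generic positive reals
n = len(xs)
counts = [0] * (n + 1)
for perm in permutations(xs):
    for signs in product((1, -1), repeat=n):
        counts[argmin_partial_sum(sg * x for sg, x in zip(signs, perm))] += 1

total = 2**n * factorial(n)
P = [Fraction(comb(2 * k, k), 4**k) for k in range(n + 1)]
for k in range(n + 1):
    print(k, Fraction(counts[k], total), P[k] * P[n - k])   # the two columns agree
```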

Since generically the minimum is unique, we have $$P_0P_n(X) + P_1P_{n-1}+\cdots + P_n(X)P_0 = 1,$$ which shows that $P_n(X)$ is independent of $X$.
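
(The identity itself is also easy to verify with exact rational arithmetic; the following sketch checks it for the first few $n$.)

```python
# Exact check of the convolution identity with P_k = (2k)! / (4^k (k!)^2).
from fractions import Fraction
from math import comb

def P(k):
    return Fraction(comb(2 * k, k), 4**k)

for n in range(12):
    assert sum(P(k) * P(n - k) for k in range(n + 1)) == 1
print("P_0*P_n + P_1*P_{n-1} + ... + P_n*P_0 = 1 holds for n = 0..11")
```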

I like this very much, since the equation $P_0P_n + P_1P_{n-1}+\cdots + P_nP_0 = 1$ is crucial for my proof of the Wallis product formula for $\pi$ in the December 2007 issue of the Monthly, http://www.math.chalmers.se/~wastlund/monthly.pdf. See also Don Knuth's 2010 Christmas tree lecture "Why pi?" under "Computer Musings" at http://scpd.stanford.edu/knuth/index.jsp. In the Monthly paper I gave a proof by induction and some algebraic manipulation, and Knuth used power series and the binomial theorem some 20-25 minutes into his talk, but the argument motivated by this question is much nicer!
