Let $L$ and $R$ be, respectively, the left and the right frog.
Let $n$ be the number of possible positions (for example $n=20$ in the picture at the bottom of this text).
Let $D_n$ be the random variable "distance between frogs". We have:
$$\tag{1}E(D_n)=\dfrac{n+1}{3}$$
Let us prove (1) by induction on the number $n$ of positions.
It is true for $n=4$ (see "Addendum 1" at the bottom of this text).
Let us assume (1) to be true at step $n$.
Let us prove that $E(D_{n+1})=\dfrac{n+2}{3}$, i.e., that $E(D_{n+1})=E(D_{n})+\dfrac{1}{3}.$
To each configuration at step $n$ (which one can view as a binary number with $n$ digits, two of them ones, namely the frogs, and the remaining $n-2$ zeros, the vacant positions), one can associate three configurations at step $n+1$ by inserting a new position (a new zero):
either in region 1, to the left of L, or
in region 2, between L and R, or
in region 3, to the right of R.
The width of region 2, measured in insertion slots, is exactly $D_n$; its mean width is therefore $(n+1)/3$ by the induction hypothesis.
The two other regions have, by an elementary symmetry consideration, the same mean width; since the three widths always add up to $n+1$, the mean width of each region is $(n+1)/3$. In other words, each region has an equal probability of receiving the newly inserted place.
But only an insertion in region 2 increases the distance, and it does so by exactly $1$; an insertion in either of the other two regions leaves the distance between L and R unchanged.
Thus the mean distance increases by $1/3$ for each added position, proving (1).
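This counting can be checked by brute force. The sketch below (Python, with the simplifying assumption that all $\binom{n}{2}$ configurations are weighted equally) enumerates the pairs and verifies that each of the three regions has mean width $(n+1)/3$:

```python
from itertools import combinations
from fractions import Fraction

def region_means(n):
    """Mean number of insertion slots in each of the three regions,
    averaged over all C(n,2) frog configurations (i, j) with i < j."""
    pairs = list(combinations(range(1, n + 1), 2))
    m = len(pairs)
    left  = Fraction(sum(i for i, j in pairs), m)          # slots left of L
    mid   = Fraction(sum(j - i for i, j in pairs), m)      # slots between L and R
    right = Fraction(sum(n - j + 1 for i, j in pairs), m)  # slots right of R
    return left, mid, right

# For every n, the three means coincide and equal (n+1)/3,
# while the three widths of any single configuration sum to n+1.
```

Exact rational arithmetic is used so the equality with $(n+1)/3$ is verified exactly, not approximately.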
Addendum:
1) The case $n=4$ is easily treated by enumeration. There are 6 possible pairs of positions:
$$\begin{array}{cccl}
1 & 2 & \to & D_4=1\\
1 & 3 & \to & D_4=2\\
1 & 4 & \to & D_4=3\\
2 & 3 & \to & D_4=1\\
2 & 4 & \to & D_4=2\\
3 & 4 & \to & D_4=1
\end{array}$$
giving $E(D_4)=\tfrac{5}{3}.$
This computes the mean under equal weighting of the six pairs; it remains to examine the stationary behaviour of the chain itself. To that end, consider the diagram:
and the associated transition matrix (rows are current states, columns are next states):
$$\begin{matrix} (a) \\ (b) \\ (c) \\ (d) \\ (e) \\ (f) \end{matrix}
\begin{bmatrix}
0 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 \\
a & 0 & a & 0 & 0 & 0 \\
0 & a & 0 & 0 & 0 & a \\
0 & 0 & 0 & 1 & 0 & 0
\end{bmatrix} \ \ \text{with} \ \ a=1/2.$$
This matrix has a unique eigenvalue $\lambda$ with $|\lambda|=1$, which is precisely $\lambda=1$, all the others satisfying $|\lambda|<1$. Consequently, for any vector $V$, $\lim_{n \to \infty} M^nV$ is a vector with identical components, equal to $\pi\cdot V$ where $\pi$ is the unique stationary distribution. Solving $\pi M=\pi$ gives $\pi=\frac{1}{9}(1,2,1,2,2,1)$ for the states $(a),\dots,(f)$ taken in the order of the six pairs listed above, so the pairs are in fact not exactly equiprobable in the stationary state; nevertheless the stationary mean distance is $\frac{1\cdot 1+2\cdot 2+1\cdot 3+2\cdot 1+2\cdot 2+1\cdot 1}{9}=\frac{15}{9}=\frac{5}{3}$, in agreement with (1).
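As a numerical check, one can simply iterate the matrix. The Python sketch below assumes that the states $(a)$ through $(f)$ correspond to the six pairs $\{1,2\},\{1,3\},\{1,4\},\{2,3\},\{2,4\},\{3,4\}$ in that order (a labelling read off from the transitions encoded in the matrix, since the diagram is not reproduced here):

```python
import numpy as np

a = 0.5
# Transition matrix from the addendum; rows are current states (a)-(f).
M = np.array([[0, 1, 0, 0, 0, 0],
              [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 1, 0, 0],
              [a, 0, a, 0, 0, 0],
              [0, a, 0, 0, 0, a],
              [0, 0, 0, 1, 0, 0]])

P = np.linalg.matrix_power(M, 2000)  # M^n converges to a rank-one matrix
pi = P[0]                            # every row tends to the stationary distribution
# Distances of the pairs {1,2},{1,3},{1,4},{2,3},{2,4},{3,4} (assumed order)
D = np.array([1, 2, 3, 1, 2, 1])
E_D4 = pi @ D                        # stationary mean distance
```

Running this shows the rows of $M^n$ converging to a common vector and the stationary mean distance landing on $5/3$.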
2) This result has been confirmed by the following Matlab program, which gives the exact value of $E(D_n)$ by an exhaustive list of the elementary events:
format rat        % display results as rational numbers
n=6;I=1:n;
U=nchoosek(I,2)   % all C(n,2) pairs of positions
mean(diff(U'))    % mean distance over the pairs: (n+1)/3
3) and, for large values of $n$, by the following simulation program, which also produced the graphical representation given above:
function frogs
close all;hold on;grid on
n=20;                    %number of places
nt=10000;                %number of instants
LR=sort(randperm(n,2));  %two distinct random initial positions
L=LR(1);R=LR(2);
T=zeros(2,nt);
for k=1:nt
T(:,k)=[L;R];
L1=TestL(L,R,n);         %the left frog jumps first...
R1=TestR(L1,R,n);        %...then the right frog reacts
L=L1;R=R1;
end;
a=(nt-100):nt;           %plot the last 100 instants
plot(T(1,a),a,'r','linewidth',1);
plot(T(2,a),a,'b','linewidth',1);
mean(diff(T))            %mean distance, to be compared with (n+1)/3
%________________________
function Ls=TestL(L,R,n)
%new position of the left frog: one place left or right,
%without crossing R and without leaving [1,n]
if L==1
if R>2
Ls=2;
else
Ls=1;
end;
else
if L+1<R
Ls=L+sign(rand-0.5);
else
Ls=L-1;
end
end;
%________________________
function Rs=TestR(L,R,n)
%new position of the right frog, knowing the left frog is now at L
if R==n
if L<n-1
Rs=n-1;
else
Rs=n;
end;
else
if L+1<R
Rs=R+sign(rand-0.5);
else
Rs=R+1;
end
end;
Assumptions: L jumps first (if possible), then R jumps (if possible), it being understood that the jump of R may be influenced by the jump that L has just made.
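For readers without Matlab, the same dynamics can be sketched in Python (a hedged port of the jump rules above, not the author's original program):

```python
import random

def step(L, R, n):
    """One instant: the left frog jumps first, then the right frog reacts."""
    if L == 1:
        L1 = 2 if R > 2 else 1            # stuck against the left wall
    elif L + 1 < R:
        L1 = L + random.choice((-1, 1))   # free to jump either way
    else:
        L1 = L - 1                        # blocked by R: jump left
    if R == n:
        R1 = n - 1 if L1 < n - 1 else n   # stuck against the right wall
    elif L1 + 1 < R:
        R1 = R + random.choice((-1, 1))
    else:
        R1 = R + 1                        # blocked by the left frog: jump right
    return L1, R1

def mean_distance(n, nt, seed=0):
    """Time average of R - L over nt instants; should approach (n+1)/3."""
    random.seed(seed)
    L, R = 1, n
    total = 0
    for _ in range(nt):
        L, R = step(L, R, n)
        total += R - L
    return total / nt
```

For $n=20$ the time average settles near $(20+1)/3=7$, matching the Matlab simulation.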
This problem has traditionally been called the "problem of flights", or of "random flights", and is connected with "radial functions" and "Hankel transforms". (But if you google "problem of flights" you will be swamped by articles about aviation.)
For $u\in\mathbb R^3$ the expectation $\phi(u)=E\exp(i\langle u,X\rangle)$ for $X$ uniformly distributed on the unit sphere is given by $\phi(u)=\sin(|u|)/|u|$ (this follows from the fact that each coordinate of $X$ is uniformly distributed on $[-1,1]$); "all" you need to do is find the inverse Fourier transform of $\phi^n$.
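For completeness, the characteristic function is a one-line computation: by rotational symmetry one may take $u=|u|e_3$, so that

```latex
\phi(u)\;=\;E\,e^{i\langle u,X\rangle}
      \;=\;E\,e^{i|u|X_3}
      \;=\;\frac12\int_{-1}^{1}e^{i|u|t}\,dt
      \;=\;\frac{\sin|u|}{|u|},
```

using that $X_3$, the third coordinate of a uniform point on the unit sphere, is uniform on $[-1,1]$ (Archimedes' hat-box theorem).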
A 1947 paper of Quenouille is one classical reference; it cites a 1943 paper of Chandrasekhar. Some recent course notes are relevant: see equation (22) and what leads up to it. This 1985 survey paper gives a wealth of historical info, including many other references. (Since this work mostly dates from the first half of the last century, before the modern notation and terminology of probability theory stabilized, these references have a somewhat quaint or antique feel to them.)
Best Answer
Let $X_i$ denote the length of jump $i$ of the frog; the $X_i$ are independent and identically distributed with common distribution function $F$. The total distance after jump $n$ is then $D_n=X_1+\cdots+X_n$. Define the stopping time $\tau$ as the first $n$ such that $D_\tau\geq d$; you are interested in $E(\tau)$. The distribution of $D_n$ is the $n$-fold convolution of $F$ with itself: $F_n := F\star F\star\cdots\star F$.
$P(\tau> n) = P(D_n<d)$
Now use $E(\tau)=\sum_{n=0}^\infty P(\tau> n)=\sum_{n=0}^\infty P(D_n<d)$
When $F$ is continuous, we can simplify by noting $P(D_n<d)=P(D_n\leq d)=F_n(d)$. I'm not sure there is any further simplification without an explicit form for $F$.