The fact that this involves "convolutions" and sums of i.i.d. random variables makes me think of trying to deduce the distribution from moment generating functions. Using the independence of the $\xi_i$, we have, for $t<\lambda$,
$$
\begin{align*}
\mathbb{E}\left[e^{t\sum\limits_{n=1}^{\nu}\xi_{n}}\right]&=\sum\limits_{r=1}^{\infty}\int\mathbf{1}_{\left\{z_{1}\leq 1,\,\ldots\,,\,z_{r-1}\leq 1\,,\,z_{r}>1 \right\}}e^{t\left(\sum\limits_{n=1}^{r}z_{n}\right)}f(z_1)\cdots f(z_r)\,dz_1\cdots dz_r\\
&=\sum\limits_{r=1}^{\infty}\left(\dfrac{\lambda}{\lambda-t}\right)^r e^{t-\lambda}\left(1-e^{t-\lambda}\right)^{r-1}\\
&=e^{t-\lambda}\,\dfrac{\lambda}{\lambda-t}\sum\limits_{r=1}^{\infty}\left(\dfrac{\lambda}{\lambda-t}\right)^{r-1}\left(1-e^{t-\lambda}\right)^{r-1}\\
&=\dfrac{e^{t-\lambda}\,\dfrac{\lambda}{\lambda-t}}{1-\dfrac{\lambda}{\lambda-t}\left(1-e^{t-\lambda}\right)}\\
&=\dfrac{1}{1-e^{\lambda-t}\,t/\lambda}\,.
\end{align*}
$$
This looks like $\sum\limits_{n=1}^{\nu}\xi_{n}$ is trying to be exponentially distributed with "rate" $\lambda e^{t-\lambda}$, but since that "rate" itself depends on $t$, it is not a genuine exponential moment generating function, and I do not recognize this functional form. Any ideas?
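For what it is worth, here is a quick Monte Carlo check of the closed form (a sketch, assuming, as in the derivation above, that the $\xi_i$ are i.i.d. $\operatorname{Exp}(\lambda)$ and that $\nu$ is the first index $n$ with $\xi_n>1$; the values of $\lambda$ and $t$ are illustrative):

```python
import math
import random

def mgf_estimate(lam, t, n_samples=200_000, seed=1):
    """Monte Carlo estimate of E[exp(t * sum_{n=1}^{nu} xi_n)], where the
    xi_n are i.i.d. Exp(lam) and nu is the first index with xi_nu > 1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        s = 0.0
        while True:
            xi = rng.expovariate(lam)
            s += xi
            if xi > 1:
                break
        total += math.exp(t * s)
    return total / n_samples

# Need t < lambda, and lam * exp(t - lam) > t so the geometric series converges.
lam, t = 1.0, 0.2
closed_form = 1.0 / (1.0 - math.exp(lam - t) * t / lam)
print(mgf_estimate(lam, t), closed_form)   # both ≈ 1.80
```

The while-loop mirrors the definition of $\nu$: keep drawing until a draw exceeds $1$, and include that final draw in the sum.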
Edit:
Not knowing the explicit inverse of the final generating-function form above, I thought of examining each term of the equivalent series representation: perhaps the individual terms have nicer inverses, in which case a series representation of the law might suffice. Performing the substitution $u=t/\lambda$, so that $u<1$, the moment generating function may be rewritten as
$$
\sum\limits_{r=1}^{\infty}\left(\dfrac{1}{1-u}\right)^r\left(1-e^{\lambda\left(u-1\right)}\right)^{r-1}e^{\lambda\left(u-1\right)}{}={}\sum\limits_{r=1}^{\infty}\sum\limits_{k=0}^{r-1}{r-1\choose k}\left(\dfrac{1}{1-u}\right)^r(-1)^ke^{(k+1)\lambda(u-1)}\,.
$$
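As a numerical sanity check that this series agrees with the closed form $1/(1-u\,e^{\lambda(1-u)})$ obtained earlier (a sketch with illustrative values $\lambda=1$, $u=0.2$, chosen so the series converges):

```python
import math

lam, u = 1.0, 0.2   # illustrative values with u < 1 and a convergent series

# Partial sum of sum_{r>=1} (1/(1-u))^r (1 - e^{lam(u-1)})^{r-1} e^{lam(u-1)}
e_term = math.exp(lam * (u - 1))
series = sum((1 / (1 - u)) ** r * (1 - e_term) ** (r - 1) * e_term
             for r in range(1, 200))

closed_form = 1 / (1 - u * math.exp(lam * (1 - u)))
print(series, closed_form)   # both ≈ 1.802
```

Two hundred terms are far more than enough here: the geometric ratio is about $0.69$, so the truncation error is negligible.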
A series representation may be obtained, therefore, if we can invert the "atomic" moment generating functions
$$
\left(\dfrac{1}{1-u}\right)^r e^{(k+1)\lambda(u-1)}\,.
$$
Heuristically, we wish to evaluate an integral of the kind
$$
\int\limits_{-\infty}^{1}\left(\dfrac{1}{1-u}\right)^r e^{(k+1)\lambda(u-1)-xu}\,\,\mbox{d}u\,.
$$
For $x<\lambda(k+1)$, the integral's solution has the form
$$
(-\lambda)^{r}e^{-x}\left(\dfrac{x^{r-1}}{(r-1)!}\log(\lambda(k+1)-x){}+{}f_{r-1}(k)\right)
$$
where $f_{r-1}(k)$ is a rational function of degree at most $r-1$ in $k$.
(Note: the explicit solution can be obtained by integrating the expression $\dfrac{(-\lambda)^re^{-x}}{\lambda(k+1)-x}$ $r$ times with respect to $k$; this is justified by differentiating the integral we wish to evaluate. Note, also, that the "$u$" substitution above was merely to make this presentation look nicer and puts this solution "off" by a factor of $\lambda$: the actual solution follows from the analogous operations using the $t$ variable instead.)
We first solve the more general problem of computing the distribution of $\sigma=\inf\{k\geqslant1\mid S_k=0\}$, then we show that this answers your question.
Assume that $P(\xi=1)=p$ and $P(\xi=-1)=q$ with $q=1-p$, for some $p$ in $(0,1)$.
Then the Markov property at time $1$ shows that $\sigma=1+\sigma_\downarrow$ with probability $p$ and $\sigma=1+\sigma_\uparrow$ with probability $q$, where $\sigma_\downarrow$ denotes the first hitting time of $0$ starting from $1$ and $\sigma_\uparrow$ denotes the first hitting time of $0$ starting from $-1$.
Another application of the Markov property at time $1$, combined with the homogeneity of the transition probabilities of $(S_k)$, shows that $\sigma_\downarrow=1$ with probability $q$ and $\sigma_\downarrow=1+\sigma_\downarrow'+\sigma_\downarrow''$ with probability $p$, where $\sigma_\downarrow'$ and $\sigma_\downarrow''$ are independent copies of $\sigma_\downarrow$.
Likewise, $\sigma_\uparrow=1$ with probability $p$ and $\sigma_\uparrow=1+\sigma_\uparrow'+\sigma_\uparrow''$ with probability $q$, where $\sigma_\uparrow'$ and $\sigma_\uparrow''$ are independent copies of $\sigma_\uparrow$.
In terms of generating functions, these decompositions translate as follows. Consider, for some fixed $|s|<1$, $$g=E(s^\sigma)\qquad g_\uparrow=E(s^{\sigma_\uparrow})\qquad g_\downarrow=E(s^{\sigma_\downarrow})$$ Then $$g=s(pg_\downarrow+qg_\uparrow)$$ while $$g_\downarrow=qs+ps(g_\downarrow)^2\qquad g_\uparrow=ps+qs(g_\uparrow)^2$$ Solving the latter yields $$g_\downarrow=\frac{1-\sqrt{1-4pqs^2}}{2ps}\qquad g_\uparrow=\frac{1-\sqrt{1-4pqs^2}}{2qs}$$ where one chooses the roots of the quadratics with a minus sign to guarantee that $g_\downarrow\leqslant1$ and $g_\uparrow\leqslant1$.
Finally, these formulas for $g_\downarrow$ and $g_\uparrow$ yield $$g=1-\sqrt{1-4pqs^2}$$ Thus, for every $k\geqslant1$,
$$P(\sigma=2k)=\frac2k\binom{2k-2}{k-1}(pq)^k$$
In terms of Catalan numbers $C_k$, this reads
$$P(\sigma=2k)=2C_{k-1}(pq)^k$$
The random time you consider is $$\sigma_8=\min\{9,\sigma\}$$ hence $P(\sigma_8=2k)=P(\sigma=2k)$ for every $k$ in $\{1,2,3,4\}$, that is, $$P(\sigma_8=2)=2pq\qquad P(\sigma_8=4)=2(pq)^2$$ $$P(\sigma_8=6)=4(pq)^3\qquad P(\sigma_8=8)=10(pq)^4$$ and $$P(\sigma_8=9)=1-\sum\limits_{k=1}^4P(\sigma=2k)$$
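These values can be verified by brute-force enumeration of the $2^{2k}$ sign paths (a sketch; the choice $p=0.3$ is arbitrary):

```python
from itertools import product
from math import comb, isclose

def first_return_prob(k, p):
    """P(sigma = 2k) by enumerating all +/-1 paths of length 2k and summing
    the weights of paths whose first return to 0 occurs exactly at step 2k."""
    q = 1 - p
    total = 0.0
    for steps in product((1, -1), repeat=2 * k):
        s, ok = 0, True
        for i, step in enumerate(steps):
            s += step
            if s == 0 and i < 2 * k - 1:   # returned to 0 too early
                ok = False
                break
        if ok and s == 0:                  # first return exactly at 2k
            ups = sum(1 for x in steps if x == 1)
            total += p ** ups * q ** (2 * k - ups)
    return total

p = 0.3
for k in range(1, 5):
    catalan = comb(2 * k - 2, k - 1) // k          # C_{k-1}
    assert isclose(first_return_prob(k, p), 2 * catalan * (p * (1 - p)) ** k)
print("all checks passed")
```

For $k\leqslant4$ there are at most $2^8=256$ paths, so the enumeration is instantaneous, and it matches $2C_{k-1}(pq)^k$ exactly.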
Best Answer
Convolution of two distributions: if $X$ has density $f_X$ supported on $[t_{x_0}, t_{x_1}]$ and $H$ has density $f_H$ supported on $[t_{h_0}, t_{h_1}]$, then the density of the sum $Y = X + H$ is
$$f_Y(t) = 0, \quad t \le t_{x_0}+t_{h_0} ,$$
$$f_Y(t) = \int_{\max(t_{x_0},\, t-t_{h_1})}^{\min(t_{x_1},\, t-t_{h_0})} f_X(\tau)f_H(t-\tau)\,d\tau, \quad t_{x_0}+t_{h_0} \le t \le t_{x_1}+t_{h_1},$$
$$f_Y(t) = 0, \quad t \ge t_{x_1}+t_{h_1} .$$
Here $t_{x_0} = t_{h_0} = 0$ and $t_{x_1} = t_{h_1} = 1$. These translate to the following solution.
First convolve two uniform distributions, $X \sim U(0,1)$ and $H \sim U(0,1)$, with densities $x(t)$ and $h(t)$:
$$y(t) = (x * h)(t) = \int_{-\infty}^{\infty} x(\tau)h(t-\tau)\,d\tau\,.$$ The above convolution reduces to
$y(t) = 0, t\lt 0$
$y(t) = \int_{\max(0,t-1)}^{\min(1,t)} d\tau , \quad 0\lt t \lt 2$,
$y(t) = 0 , t \gt 2$
The middle one will have to split into two intervals, namely $0\lt t \lt 1$ and $1\lt t \lt 2$
$y(t) = 0, t\lt 0$
$y(t) = \int_{0}^{t} d\tau =t, 0\lt t\lt 1$,
$y(t) = \int_{t-1}^{1} d\tau = 2-t, 1\lt t\lt 2$,
$y(t) = 0 , t \gt 2$
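As a numerical sketch of this first step, one can discretize the densities on a grid and use `numpy.convolve` (the grid size `n` is arbitrary):

```python
import numpy as np

n = 1000
dx = 1.0 / n
f = np.ones(n)                 # U(0,1) density sampled on a grid of step dx
y = np.convolve(f, f) * dx     # density of the sum of two U(0,1) on [0, 2]

t = np.arange(len(y)) * dx     # grid for y: t in [0, 2]
triangular = np.where(t < 1, t, 2 - t)
print(np.max(np.abs(y - triangular)))   # O(dx) discretization error
```

The maximum discrepancy is of order `dx`, confirming the triangular shape derived above.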
Now form $W(t) = (y * s)(t)$, where $S \sim U(0,1)$ has density $s(t)$.
For $0\lt t\lt 1$, the bounds are
$t_{s_0} = 0$
$t_{s_1} = 1$
$t_{y_0} = 0$
$t_{y_1} = 1$
$$W(t) = \int_{max(0,t-1)}^{min(1,t)} \tau d\tau = \int_{0}^{t}\tau d\tau = \frac{t^2}{2}, 0\lt t\lt 1$$,
For $1\lt t\lt 2$, $S(t)$ convolves with $Y(t)$ on two intervals namely $(0,1)$ and $(1,2)$. For the interval $(0,1)$ the bounds are
$t_{s_0} = 0$
$t_{s_1} = 1$
$t_{y_0} = 0$
$t_{y_1} = 1$
and for the interval $(1,2)$ the bounds are
$t_{s_0} = 0$
$t_{s_1} = 1$
$t_{y_0} = 1$
$t_{y_1} = 2$
Thus $$W(t) = \int_{max(0,t-1)}^{min(1,t)} \tau d\tau + \int_{max(1,t-1)}^{min(2,t)} (2-\tau) d\tau$$ $$ = \int_{t-1}^{1}\tau d\tau + \int_{1}^{t}(2-\tau) d\tau$$ $$ = -\frac{1}{2}(2t^2-6t+3), 1\lt t\lt 2$$,
For $2\lt t\lt 3$, $S(t)$ convolves with $Y(t)$ on $(1,2)$. For the interval $(1,2)$ the bounds are
$t_{s_0} = 0$
$t_{s_1} = 1$
$t_{y_0} = 1$
$t_{y_1} = 2$
Thus $$W(t) = \int_{max(1,t-1)}^{min(2,t)} (2-\tau) d\tau $$ $$ = \int_{t-1}^{2} (2- \tau) d\tau$$ $$ = \frac{(t-3)^2}{2}, 2\lt t \lt 3$$
and finally $W(t) = 0, t\gt 3$
Thus $W(t)$ is given by
$W(t) = 0 , t\lt 0$
$W(t) = \frac{t^2}{2}, 0\lt t \lt 1$
$W(t) = -t^2+3t-\frac{3}{2}, 1\lt t \lt 2$
$W(t) = \frac{(t-3)^2}{2}, 2\lt t \lt 3$
$W(t) = 0 , t\gt 3$
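To close, here is a sketch checking the piecewise formula for $W(t)$ against a direct double `numpy.convolve` of three discretized $U(0,1)$ densities (grid size `n` arbitrary):

```python
import numpy as np

def W(t):
    """Piecewise density of the sum of three independent U(0,1) variables."""
    t = np.asarray(t, dtype=float)
    return np.select(
        [(t >= 0) & (t < 1), (t >= 1) & (t < 2), (t >= 2) & (t <= 3)],
        [t**2 / 2, -t**2 + 3*t - 1.5, (t - 3)**2 / 2],
        default=0.0,
    )

n = 1000
dx = 1.0 / n
f = np.ones(n)                                   # discretized U(0,1) density
w = np.convolve(np.convolve(f, f), f) * dx**2    # double convolution on [0, 3]
t = np.arange(len(w)) * dx

print(np.max(np.abs(w - W(t))))   # small discretization error, O(dx)
print(w.sum() * dx)               # total mass ≈ 1, as a density should have
```

The numerical triple convolution matches the three quadratic pieces to within the grid resolution, and the total mass is $1$.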