Let $X$ denote the time (measured in minutes) between the last bus which arrived at the bus station before you left your office and the next one. Let $Y$ denote the time (measured in minutes) between that next bus and the one after it. By hypothesis, $X$ and $Y$ are i.i.d. exponential with known parameter $\lambda=1/15$.
Question (a) asks for $P[X\lt16\mid X\gt10]$. This is not $P[X\lt16]$ but $P[X\lt6]$. Do you see why? Can you compute this value?
Additional hint: If $Z$ is exponentially distributed, then $P[Z\gt x+y\mid Z\gt y]=$ $____$.
Question (b) asks for $P[X+Y\lt16\mid X\gt10]$. The answer is $P[X+Y\lt6]$. Do you see why? Can you compute this value?
Question (c) relies on the same general property of Poisson processes that was used in questions (a) and (b). Do you see this property? Can you deduce the answer?
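If you want to check your answers numerically afterwards, here is a short sketch (using the memoryless property, the two conditional probabilities reduce to $P[X<6]$ and $P[X+Y<6]$ with $\lambda=1/15$; the variable names are my own):

```python
from math import exp

lam = 1 / 15  # rate: one bus per 15 minutes on average

# (a) P[X < 16 | X > 10] = P[X < 6] by memorylessness.
p_a = 1 - exp(-lam * 6)

# (b) P[X + Y < 16 | X > 10] = P[X + Y < 6]; the sum of two i.i.d.
# exponentials has a Gamma(2, lam) distribution, whose CDF at t is
# 1 - e^{-lam * t} * (1 + lam * t).
p_b = 1 - exp(-lam * 6) * (1 + lam * 6)

print(round(p_a, 4))  # 0.3297
print(round(p_b, 4))  # 0.0616
```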
I have to say at the start that bus arrivals do not typically follow
an exponential distribution, so it is really hard to set aside how
*actual* buses work when someone says the interarrival
times are governed by an exponential process.
Maybe it is easier to think about something that really is
exponentially distributed. Suppose you have a very weak radioactive source and you are capturing particles it emits
in a counter. Suppose that the detection rate is one per 10
seconds. If you start keeping time at one particular click of
the counter, then the average wait for the next click is 10 seconds.
However, the no-memory property says if we start keeping time
at some arbitrary point in time (click or not), the average
wait until the next click is also 10 seconds. The decaying
particles are not 'keeping track' of each other, and they don't
'know' when you start counting.
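The no-memory claim can be checked with a quick simulation (a sketch only; the 7-second offset is an arbitrary choice of mine): the wait measured from a click, and the remaining wait given that more than $s$ seconds have already passed, both average 10 seconds.

```python
import random

random.seed(42)
MEAN = 10.0   # average of 10 seconds between clicks
N = 100_000

# Wait measured from a click: a fresh exponential draw.
from_click = [random.expovariate(1 / MEAN) for _ in range(N)]

# Wait measured from s seconds after a click, given no click yet:
# condition on the gap exceeding s and record the remainder.
s = 7.0  # arbitrary offset, chosen for illustration
residual = [z - s
            for z in (random.expovariate(1 / MEAN) for _ in range(3 * N))
            if z > s]

print(round(sum(from_click) / len(from_click), 1))  # ≈ 10
print(round(sum(residual) / len(residual), 1))      # ≈ 10
```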
Now suppose you have a paper tape on which clicks are recorded
along a time line. The tape will look pretty much the same whether you
read forwards or backwards: random marks spaced sometimes close
together, sometimes relatively far apart, but *on average*
10 seconds apart. Maybe it is possible to say this is due
to the no-memory property, but in my experience the usual
terminology for this is 'time-reversibility'.
Both no-memory
and time-reversibility are fundamental properties of exponential
processes, so I suppose it is possible to take a point of view
(for exponential processes) from which your statement in bold type is true. But I'm not sure there is
much intuitive value in drawing this connection between memorylessness and time reversibility when you're just starting
to think about the curious properties of exponential models.
As another example of the no-memory property, suppose a computer unit in a satellite
survives an exponentially distributed length of time with
mean lifetime 10 years. Radiation hits are what cause
such computer units to die. If 8 years have already gone by,
you might think the computer unit is nearing the end of its
life. But if the lifetime really is exponentially distributed,
and it is still alive at 8 years, the expected time of death
from a random radiation hit is still 10 years away. For such
devices, we say "Used is as good as new." This is an appropriate
model for devices that die only because of random radiation
hits. (Of course if you have a census of dead satellite computers
of this type along with their 'death' dates,
you could check back to their 'birth' dates and see that they
were, on average, about 10 years before the death dates, but
that is not much of a profound statement.)
For humans in a certain population we might say that their
average lifetime at birth is 70 years. If such a person is
now 60 years old, it would not be reasonable to say that
he or she has another expected 70 years of life. People
do sometimes die of random accidents, but they also die
by 'wearing out' with age.
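This contrast can be made concrete with a small simulation (a sketch; the Weibull shape of 4 is my own assumed stand-in for a "wear-out" lifetime, with its scale chosen so both models have a mean life of 70 years):

```python
import random
from math import gamma

random.seed(0)
N = 200_000
AGE = 60     # condition on having survived to age 60
MEAN = 70.0  # mean lifetime in both models

# Wear-out model (assumed illustration): Weibull with shape 4,
# scale chosen so the mean lifetime is also 70 years.
SHAPE = 4.0
SCALE = MEAN / gamma(1 + 1 / SHAPE)

def mean_residual(draw):
    """Average remaining life among lifetimes that exceed AGE."""
    residuals = [life - AGE for life in (draw() for _ in range(N)) if life > AGE]
    return sum(residuals) / len(residuals)

exp_resid = mean_residual(lambda: random.expovariate(1 / MEAN))
wear_resid = mean_residual(lambda: random.weibullvariate(SCALE, SHAPE))

print(round(exp_resid, 1))   # ≈ 70: "used is as good as new"
print(round(wear_resid, 1))  # much smaller: this model wears out
```

The memoryless model still expects 70 more years at age 60, while the wear-out model expects far less, which is why the exponential model is so unreasonable for human lifetimes.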
Most things we are familiar with
die from a combination of random accidents and gradual wearing
out: automobiles, light bulbs, pets, T-shirts, and so on.
Other events, like elections, bus arrivals, credit card bills,
and so on tend to happen at rather even intervals--sometimes
without much of a random component.
The reason intuition comes so hard when thinking about
exponentially distributed events is that there are relatively few
events in real life that happen according to an exponential
model.
In science things get modeled according to exponential distributions
for two reasons: (a) Some things really are exponentially
distributed--at least approximately. Service times at banks,
lives of transistors, radioactive decay, and so on. (b) Because
the no-memory rule makes it unnecessary to take past history
into account, exponential models are mathematically very easy
to handle; that makes it tempting to use exponential models
sometimes when they don't really apply very well.
Best Answer
Let the bus arrive at $B$ minutes after $9$ am, let the tram arrive at $T$ minutes after $9$ am.
Then the event $\left\{ T-10\leq B<T\right\} \cup\left\{ B-x<T\leq B\right\} $ represents a 'meeting'. This is a union of two disjoint sets, so the quantity to calculate is $P\left\{ T-10\leq B<T\right\} +P\left\{ B-x<T\leq B\right\} $, based on a uniform distribution for $\left(B,T\right)$ on $\left[0,60\right]^{2}$.
That amounts to computing $\frac{1}{3600}\int_{0}^{60}\int_{0}^{60}\left[t-10\leq b<t\right]\,db\,dt+\frac{1}{3600}\int_{0}^{60}\int_{0}^{60}\left[b-x\leq t<b\right]\,db\,dt$, where $\left[t-10\leq b<t\right]=1$ if $t-10\leq b<t$ and $\left[t-10\leq b<t\right]=0$ otherwise, and likewise for $\left[b-x\leq t<b\right]$. This results in an expression $u\left(x\right)$ in $x$. To be solved is the equation $u\left(x\right)=0.5$.
Can you take it from here?
Addendum:
To shed some light on the integrals above:
$\int_{0}^{60}\int_{0}^{60}\left[t-10\leq b<t\right]dbdt=\int_{0}^{10}\int_{0}^{t}dbdt+\int_{10}^{60}\int_{t-10}^{t}dbdt$
If $x\leq 60$ then:
$\int_{0}^{60}\int_{0}^{60}\left[b-x\leq t<b\right]dtdb=\int_{0}^{x}\int_{0}^{b}dtdb+\int_{x}^{60}\int_{b-x}^{b}dtdb$
And if $x>60$ then:
$\int_{0}^{60}\int_{0}^{60}\left[b-x\leq t<b\right]dtdb=\int_{0}^{60}\int_{0}^{b}dtdb$
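As a numerical check of the setup (my own sketch: a grid approximation of $u(x)$ plus the closed form the integrals yield for $0\leq x\leq60$, namely $u(x)=\frac{550+60x-x^{2}/2}{3600}$, so that $u(x)=0.5$ gives $x=60-\sqrt{1100}$):

```python
from math import sqrt

def u(x, n=600):
    """Grid approximation of the meeting probability on [0,60]^2."""
    step = 60 / n
    hits = 0
    for i in range(n):
        b = (i + 0.5) * step
        for j in range(n):
            t = (j + 0.5) * step
            # bus within 10 min before the tram, or tram within x min before the bus
            if t - 10 <= b < t or b - x <= t < b:
                hits += 1
    return hits / (n * n)

# Closed form for 0 <= x <= 60: u(x) = (550 + 60*x - x**2 / 2) / 3600,
# so u(x) = 0.5 reduces to x**2 - 120*x + 2500 = 0, whose admissible
# root is x = 60 - sqrt(1100).
x_star = 60 - sqrt(1100)
print(round(x_star, 2))  # ≈ 26.83
```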