The scenario you describe provides a good example of conditional independence, though you haven't quite described it as such. As the Wikipedia article puts it,
> $R$ and $B$ are conditionally independent [given $Y$] if and only if, given knowledge of whether $Y$ occurs, knowledge of whether $R$ occurs provides no information on the likelihood of $B$ occurring, and knowledge of whether $B$ occurs provides no information on the likelihood of $R$ occurring.
In this case, $R$ and $B$ are the events of persons A and B getting home in time for dinner, and $Y$ is the event of a snow storm hitting the city. Certainly the probabilities of $R$ and $B$ will depend on whether $Y$ occurs. However, just as it's plausible to assume that if these two people have nothing to do with each other their probabilities of getting home in time are independent, it's also plausible to assume that, while they will both have a lower probability of getting home in time if a snow storm hits, these lower probabilities will nevertheless still be independent of each other.

That is, if you already know that a snow storm is raging and I tell you that person A is getting home late, that gives you no new information about whether person B is getting home late. You're getting information on that from the fact that there's a snow storm, but given that fact, the fact that A is getting home late doesn't make it more or less likely that B is getting home late, too.

So conditional independence is the same as normal independence, but restricted to the case where you know that a certain condition is or isn't fulfilled. Not only can you not find out about A by finding out about B in general (normal independence), but you also can't do so under the condition that there's a snow storm (conditional independence).
An example of events that are independent but not conditionally independent would be: You randomly sample two people A and B from a large population and consider the probabilities that they will get home in time. Without any further knowledge, you might plausibly assume that these probabilities are independent. Now you introduce event $Y$, which occurs if the two people live in the same neighbourhood (however that might be defined). If you know that $Y$ occurred and I tell you that A is getting home late, then that would tend to increase the probability that B is also getting home late, since they live in the same neighbourhood and any traffic-related causes of A getting home late might also delay B. So in this case the probabilities of A and B getting home in time are not conditionally independent given $Y$, since once you know that $Y$ occurred, you are able to gain information about the probability of B getting home in time by finding out whether A is getting home in time.
Strictly speaking, this scenario only works if there's always the same amount of traffic delay in the city overall and it just moves to different neighbourhoods. If that's not the case, then it wouldn't be correct to assume independence between the two probabilities, since the fact that one of the two is getting home late would already make it somewhat likelier that there's heavy traffic in the city in general, even without knowing that they live in the same neighbourhood.
To give a precise example: Say you roll a blue die and a red die. The two results are independent of each other. Now you tell me that the blue result isn't a $6$ and the red result isn't a $1$. You've given me new information, but that hasn't affected the independence of the results. By taking a look at the blue die, I can't gain any knowledge about the red die; after I look at the blue die I will still have a probability of $1/5$ for each number on the red die except $1$. So the probabilities for the results are conditionally independent given the information you've given me.

But if instead you tell me that the sum of the two results is even, this allows me to learn a lot about the red die by looking at the blue die. For instance, if I see a $3$ on the blue die, the red die can only be $1$, $3$ or $5$. So in this case the probabilities for the results are not conditionally independent given this other information that you've given me. This also underscores that conditional independence is always relative to the given condition: in this case, the results of the dice rolls are conditionally independent with respect to the event "the blue result is not $6$ and the red result is not $1$", but they're not conditionally independent with respect to the event "the sum of the results is even".
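The dice example can be checked by brute-force enumeration. Below is a small Python sketch (the helper names are just for illustration) that computes the relevant conditional probabilities over all 36 equally likely outcomes:

```python
from itertools import product

# All 36 equally likely (blue, red) outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

def cond_prob(event, given):
    """P(event | given) under the uniform distribution on outcomes."""
    sub = [o for o in outcomes if given(o)]
    return sum(1 for o in sub if event(o)) / len(sub)

red_is_5 = lambda o: o[1] == 5

# Condition 1: the blue result isn't 6 and the red result isn't 1.
c1 = lambda o: o[0] != 6 and o[1] != 1
print(cond_prob(red_is_5, c1))                               # 1/5
print(cond_prob(red_is_5, lambda o: c1(o) and o[0] == 3))    # still 1/5

# Condition 2: the sum of the two results is even.
c2 = lambda o: (o[0] + o[1]) % 2 == 0
print(cond_prob(red_is_5, c2))                               # 1/6
print(cond_prob(red_is_5, lambda o: c2(o) and o[0] == 3))    # jumps to 1/3
```

Seeing the blue die leaves the red die's probabilities unchanged under the first condition but not under the second, matching the argument above.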
Hint:
Let each symbol denote the corresponding event:

- $As$: Assault
- $R$: Report
- $A$: Arrested
- $C$: Conviction
- $S$: Serving
We know (note that $|$ means "given"):

$P(R|As) = 0.5$

$P(A|R) = 0.6$

$P(C|A) = 0.9$

$P(S|C) = 0.7$

We want to know:

$P(S|As) = ?$
Before computing this mathematically via conditional probability, let us think about it intuitively using a probability tree diagram.
Anything with a superscript $c$ denotes the complement, i.e. the event did not occur; for example, $R^c$ means that the assault was not reported.
It seems that the only way to arrive at $S$ from $As$ is through $R$, $A$ and $C$. This is correct, as unreported assaults do not result in arrests, those not arrested cannot be convicted, and those not convicted do not serve.
When going through the probability tree this way, we multiply the probabilities along the path (e.g. the probability of getting double $6$'s when rolling a pair of fair dice is $(1/6) \cdot (1/6) = 1/36$).
Thus, the answer would be $0.5 \cdot 0.6 \cdot 0.9 \cdot 0.7$, but how do we state this formally? More importantly, why does going through the probability tree in such a way call for multiplication?
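As a sanity check on the tree-multiplication intuition, here is a small Monte Carlo sketch in Python (assuming, as the tree does, that each stage can only occur if the previous one did):

```python
import random

random.seed(0)  # reproducible run

def assault_leads_to_serving():
    """Walk one path down the tree: each stage happens only if the previous one did."""
    reported  = random.random() < 0.5
    arrested  = reported  and random.random() < 0.6
    convicted = arrested  and random.random() < 0.9
    serving   = convicted and random.random() < 0.7
    return serving

n = 200_000
estimate = sum(assault_leads_to_serving() for _ in range(n)) / n
print(estimate)   # lands near 0.5 * 0.6 * 0.9 * 0.7 = 0.189
```

The simulated frequency matches the product of the branch probabilities, which is exactly what the formal argument below justifies.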
Let us note that if we serve a sentence, it follows that we have been convicted.
If we were given that a conviction happened and were asked to compute the probability for a sentence, the answer would be 0.7, and we are done.
Is being convicted given? No. Go up the tree.
Being convicted means we were arrested.
Is being arrested given? No. Go up the tree.
Being arrested means we were reported. Given? No. Up.
Being reported means we committed an assault. This one is given.
Oh also, note that $S = R \cap A \cap C \cap S$; that is, the event $S$ equals all of the events $R$, $A$, $C$ and $S$ happening together, since there is no way someone can serve a sentence without first being reported, arrested and convicted.
Recall what we noted earlier: the only way to arrive at $S$ from $As$ is through $R$, $A$ and $C$, since unreported assaults do not result in arrests, those not arrested cannot be convicted, and those not convicted do not serve.
Thus, $P(S | As) = P(R \cap A \cap C \cap S | As)$.
By definition of conditional probability, we have
$P(R \cap A \cap C \cap S | As) = \frac{P(R \cap A \cap C \cap S \cap As)}{P(As)}$.
If we could say that the events $R$, $A$, $C$, $S$ and $As$ are independent, then we would have $\frac{P(R \cap A \cap C \cap S \cap As)}{P(As)} = \frac{P(R)P(A)P(C)P(S)P(As)}{P(As)}$.
Unfortunately, they are not independent. Instead, there is a rule we can use here, called the chain rule of probability.
We compute $P(R \cap A \cap C \cap S \cap As)$ as follows:
$P(R \cap A \cap C \cap S \cap As) = P(S | C \cap A \cap R \cap As) P(C | A \cap R \cap As) P(A | R \cap As) P(R|As) P(As)$
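The chain-rule expansion can be verified numerically. The sketch below builds the joint distribution of $(R, A, C, S)$ given $As$ from the tree probabilities (consistent with the setup: each later event is impossible without the earlier one) and checks both sides of the identity:

```python
from itertools import product

def stage_prob(occurred, parent_occurred, p):
    """Probability of this stage's outcome: p if the parent event occurred, else impossible."""
    if parent_occurred:
        return p if occurred else 1 - p
    return 0.0 if occurred else 1.0

# Joint distribution of (R, A, C, S) given As, built from the tree.
joint = {}
for r, a, c, s in product([False, True], repeat=4):
    joint[(r, a, c, s)] = (stage_prob(r, True, 0.5) *   # P(R | As) = 0.5
                           stage_prob(a, r, 0.6) *      # P(A | R)  = 0.6
                           stage_prob(c, a, 0.9) *      # P(C | A)  = 0.9
                           stage_prob(s, c, 0.7))       # P(S | C)  = 0.7

def prob(pred):
    return sum(p for o, p in joint.items() if pred(*o))

# Left side: P(R ∩ A ∩ C ∩ S | As), read straight off the joint.
lhs = joint[(True, True, True, True)]

# Right side: the chain-rule product of conditionals (each a ratio of joint probabilities).
p_r    = prob(lambda r, a, c, s: r)
p_ar   = prob(lambda r, a, c, s: a and r)
p_car  = prob(lambda r, a, c, s: c and a and r)
p_scar = prob(lambda r, a, c, s: s and c and a and r)
rhs = (p_scar / p_car) * (p_car / p_ar) * (p_ar / p_r) * p_r

print(lhs, rhs)   # both equal 0.5 * 0.6 * 0.9 * 0.7 = 0.189
```

The product telescopes: every denominator cancels against the next numerator, which is why the chain rule holds for any joint distribution, not just this one.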
Hey, look: here is something we know, $P(R|As) = 0.5$.
What about the others?
We could use the extremely long and tedious process of doing things like:
$P(A | R \cap As) = \frac{P(A \cap R \cap As)}{P(R \cap As)}$
or we could take note of the fact that
$R \cap As = R$
$A \cap R \cap As = A$
$C \cap A \cap R \cap As = C$
This is true by similar reasoning to $S = R \cap A \cap C \cap S$.
Thus, we have
$P(R \cap A \cap C \cap S \cap As) = P(S|C) P(C|A) P(A|R) P(R|As) P(As)$
Hey, wait: we don't know $P(As)$. All we know is that it happened.
That's okay, we don't want to compute $P(R \cap A \cap C \cap S \cap As)$. We instead want to compute
$P(R \cap A \cap C \cap S | As) = \frac{P(R \cap A \cap C \cap S \cap As)}{P(As)} = \frac{P(S|C) P(C|A) P(A|R) P(R|As) P(As)}{P(As)} = P(S|C) P(C|A) P(A|R) P(R|As) = 0.7 \cdot 0.9 \cdot 0.6 \cdot 0.5 = 0.189$.
Alternatively, you can write $P(R \cap A \cap C \cap S \cap As| As)$ and then note that $P(As | As) = 1$.
To understand the chain rule of probability better, you may want to check out the Birthday Problem. It's a popular probability problem.
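For a taste of it, the Birthday Problem's classic answer is itself a chain-rule product: the probability that $n$ people all have distinct birthdays is $\prod_{k=0}^{n-1} \frac{365-k}{365}$, where each factor is the probability that the next person avoids the earlier birthdays, given that those were all distinct. A quick Python sketch (ignoring leap years):

```python
def p_all_distinct(n, days=365):
    """Chain rule: multiply P(person k+1 avoids the first k birthdays | all distinct so far)."""
    p = 1.0
    for k in range(n):
        p *= (days - k) / days
    return p

# With 23 people, a shared birthday is already more likely than not.
print(1 - p_all_distinct(23))   # about 0.507
```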