Calculate the percentage reduction on the variance of the claim payment

actuarial-science, conditional-probability, conditional-expectation, probability, probability-distributions

The amount of a claim that a car insurance company pays out follows an exponential distribution. By imposing a deductible of d, the insurance company reduces the expected claim payment by 10%. Calculate the percentage reduction on the variance of the claim payment.

My attempt

Let $X$ = claim payment before the deductible and $Y$ = claim payment after the deductible.

$X \sim Exp(\frac{1}{\theta}) \Rightarrow f(x)=\frac{1}{\theta}e^{-\frac{x}{\theta}}$ and $F(x)=P(X<x)=1-e^{-\frac{x}{\theta}}$, $x>0,\theta>0$.

$\mu_x=\theta, \sigma^2_x=\theta^2, \mathbb{E}(X^2)=2\theta^2$
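As a quick numerical sanity check of these moments, here is a minimal simulation sketch (assuming numpy; $\theta = 2$ is an arbitrary illustrative choice):

```python
import numpy as np

# Sanity check: E[X] = theta, Var(X) = theta^2, E[X^2] = 2*theta^2.
rng = np.random.default_rng(0)
theta = 2.0                        # arbitrary illustrative mean
x = rng.exponential(scale=theta, size=1_000_000)

print(x.mean())                    # ~ theta      = 2.0
print(x.var())                     # ~ theta**2   = 4.0
print((x ** 2).mean())             # ~ 2*theta**2 = 8.0
```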

$Y=\left\{
\begin{array}{lr}
X-d, & \hspace{2mm} X>d \\
0, & \hspace{2mm} \text{otherwise}\\
\end{array}
\right.$

$G(y|X>d)=P(Y<y|X>d)=\frac{P(X-d<y,\,X>d)}{P(X>d)}=\frac{P(d<X<y+d)}{P(X>d)}=\frac{F(y+d)-F(d)}{1-F(d)}=1-e^{-\frac{y}{\theta}}, \quad y>0.$

Thus $g(y|X>d)=\frac{d}{dy}(1-e^{-\frac{y}{\theta}})=\frac{1}{\theta}e^{-\frac{y}{\theta}}, y>0$. This means $Y|X>d \sim Exp(\frac{1}{\theta})$.
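This memorylessness claim is easy to confirm by simulation; a minimal sketch, assuming numpy and arbitrary illustrative values $\theta = 2$, $d = 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, d = 2.0, 1.0                # arbitrary illustrative values
x = rng.exponential(scale=theta, size=1_000_000)

# Memorylessness: the excess (X - d | X > d) has the same Exp(1/theta) law as X.
excess = x[x > d] - d
print(excess.mean(), excess.var()) # ~ theta, theta**2, matching X itself
```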

Because $\mathbb{E}(Y)=(x-d)P(X>d)=0.9\mathbb{E}(X)=0.9\theta$, we have $P(X>d)=0.9$ since $0 \le P(X>d) \le 1$ and $y=x-d=\mathbb{E}(X)=\theta$?

Thus $\mathbb{E}(Y|X>d)=y \cdot P(X>d)= \theta \cdot 0.9$, which implies that $(\mathbb{E}(Y|X>d))^2=(0.9\theta)^2=0.81\theta^2$.

$\mathbb{E}(Y^2|X>d)=y^2 \cdot P(X>d)=\int_{d}^{\infty} y^2 f(x)dx=\int_{d}^{\infty} (x-d)^2 \frac{1}{\theta}e^{-\frac{x}{\theta}}dx.$

By letting $u=x-d$, I see that this integral equals $e^{-\frac{d}{\theta}}\mathbb{E}(X^2)=(1-F(d))\cdot 2\theta^2=0.9(2\theta^2)=1.8\theta^2$ and thus

$Var(Y|X>d)=1.8\theta^2-0.81\theta^2=0.99\theta^2=0.99\sigma^2_x$.

$\therefore Var(Y|X>d)$ is reduced by $1\%$.

One solution I was looking at uses the law of total expectation as follows (here $\lambda$ denotes the mean, so $\lambda = \theta$):

\begin{align*}
\mathbb{E}[Y^k]
&=
\mathbb{E}[Y^k \, | \, X \geq d]\cdot\mathbb{P}(X \geq d) + \mathbb{E}[Y^k \, | \, X < d]\cdot\mathbb{P}(X < d) \\
&=
\mathbb{E}[Y^k \, | \, X \geq d]\cdot\mathbb{P}(X \geq d) \\
&=
k! \lambda^k e^{-\frac{d}{\lambda}}, k \in \mathbb{N}
\end{align*}

Why is this true?

Best Answer

You wrote

Because $\mathbb E[Y] = (X-d) \Pr[X > d] = 0.9 \mathbb E[X]$

which is not correct. You should write $$\mathbb E[Y] = \mathbb E[X-d \mid X > d] \Pr[X > d] = \mathbb E[X]\Pr[X > d] = 0.9 \,\mathbb E[X];$$ that is to say, you have omitted the expectation operator, and the expectation on the RHS is conditional on $X > d$. Since $X$ is memoryless, $(X - d \mid X > d) \sim X$; this is what allows us to claim $\mathbb E[X - d \mid X > d] = \mathbb E[X]$ and, ultimately, $\Pr[X > d] = 0.9$. It is not necessary to do all the previous work.

If you wish to perform the computation explicitly, then $$\begin{align} \mathbb E[Y] &= \int_{x=0}^\infty \max(x - d, 0) f_X(x) \, dx \\ &= \int_{x=d}^\infty (x-d) \frac{1}{\theta} e^{-x/\theta} \, dx \\ &= \int_{y=0}^\infty y \frac{1}{\theta} e^{-(y+d)/\theta} \, dy \tag{$x = y + d$} \\ &= e^{-d/\theta} \int_{y=0}^\infty \frac{y}{\theta} e^{-y/\theta} \, dy \\ &= \theta e^{-d/\theta} \\ &= \mathbb E[X] \Pr[X > d]. \end{align}$$ The purpose of memorylessness is to avoid this computation, but either way, it is not difficult.
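To put numbers on this: $e^{-d/\theta} = 0.9$ gives $d = -\theta \ln 0.9 \approx 0.105\,\theta$, and a short simulation confirms the $10\%$ reduction in the mean. A minimal sketch, assuming numpy and an arbitrary $\theta$:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 2.0                         # arbitrary illustrative mean
d = -theta * np.log(0.9)            # deductible chosen so P(X > d) = 0.9

x = rng.exponential(scale=theta, size=1_000_000)
y = np.maximum(x - d, 0.0)          # claim payment after the deductible

print(y.mean() / x.mean())          # ~ 0.9: the stated 10% reduction in mean
```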

To calculate the variance of $Y$, we first compute the second moment in the same way as we did the first: $$\mathbb E[Y^2] = \mathbb E[(X-d)^2 \mid X > d]\Pr[X > d] = \mathbb E[X^2] \Pr[X > d].$$ Again, we use the fact that $X$ is memoryless, hence $\left((X - d)^2 \mid X > d\right) \sim X^2$. So $$\mathbb E[Y^2] = 2\theta^2 \Pr[X > d] = 1.8 \theta^2,$$ and $$\operatorname{Var}[Y] = 1.8 \theta^2 - (0.9)^2 \theta^2 = 0.99 \theta^2,$$ a $1\%$ reduction from $\operatorname{Var}[X] = \theta^2$.
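The variance reduction can be checked the same way; under the same illustrative setup, the simulated ratio lands near $0.99$:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = 2.0
d = -theta * np.log(0.9)            # P(X > d) = 0.9, as above

x = rng.exponential(scale=theta, size=1_000_000)
y = np.maximum(x - d, 0.0)

print(y.var() / x.var())            # ~ 0.99, i.e. variance reduced by 1%
```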

It is easy to see in the general case that $$\mathbb E[Y^k] = \mathbb E[(X - d)^k \mid X > d]\Pr[X > d] + \mathbb E[0 \mid X \le d]\Pr[X \le d] = \mathbb E[X^k] \Pr[X > d].$$ This is just a consequence of the memorylessness property.
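A quick simulation check of this identity for several $k$ (assuming numpy; $\theta$ and $d$ are arbitrary illustrative values):

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(4)
theta, d = 2.0, 0.5                   # arbitrary illustrative values
x = rng.exponential(scale=theta, size=2_000_000)
y = np.maximum(x - d, 0.0)

p = np.exp(-d / theta)                # P(X > d)
for k in range(1, 5):
    lhs = (y ** k).mean()                  # E[Y^k] by simulation
    rhs = factorial(k) * theta**k * p      # E[X^k] * P(X > d) = k! theta^k e^{-d/theta}
    print(k, round(lhs, 3), round(rhs, 3))
```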

Then the moments are simply $$\mathbb E[X^k] = \int_{x=0}^\infty x^k \frac{1}{\theta} e^{-x/\theta} \, dx = \theta^{k-1} \int_{x=0}^\infty (x/\theta)^k e^{-x/\theta} \, dx = \theta^k \int_{z=0}^\infty z^k e^{-z} \, dz = \theta^k k!.$$

Alternatively, we can reason that $$M_X(t) = \mathbb E[e^{tX}] = \int_{x=0}^\infty \frac{1}{\theta} e^{tx} e^{-x/\theta} \, dx = \frac{1}{\theta(1/\theta - t)} \int_{x=0}^\infty (1/\theta - t) e^{-(1/\theta - t)x} \, dx = \frac{1}{1 - \theta t},$$ for $t < 1/\theta$. But by series expansion and linearity of expectation, $$\mathbb E[e^{tX}] = \sum_{k=0}^\infty \mathbb E \left[\frac{(tX)^k}{k!}\right] = \sum_{k=0}^\infty \frac{\mathbb E[X^k]}{k!} t^k,$$ hence $$\frac{1}{1 -\theta t} = \sum_{k=0}^\infty (\theta t)^k = \sum_{k=0}^\infty \frac{\mathbb E[X^k]}{k!} t^k,$$ and by comparing coefficients, we obtain $$\mathbb E[X^k] = \theta^k k!.$$
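This closed form can also be checked against scipy's built-in non-central moments (a sketch assuming scipy):

```python
from math import factorial
from scipy.stats import expon

theta = 2.0
for k in range(1, 6):
    # scipy's expon with scale=theta is the Exp(1/theta) distribution above
    print(k, expon(scale=theta).moment(k), factorial(k) * theta**k)
```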
