Solved – How to use this formula to calculate a variance

Tags: time series, variance

I have a function whose variance I would like to calculate using a Taylor expansion.

The formula for the variance then becomes
\begin{align}
\operatorname{Var}(f(X))=[f'(EX)]^2\operatorname{Var}(X)+\frac{[f''(EX)]^2}{4}\operatorname{Var}^2(X)+\tilde{T}_3
\end{align}
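If I have understood the notation correctly, then for a toy case such as $f(X)=e^X$ (not my actual function; just to fix ideas) the formula would read
\begin{align}
\operatorname{Var}(e^X)\approx[e^{EX}]^2\operatorname{Var}(X)+\frac{[e^{EX}]^2}{4}\operatorname{Var}^2(X),
\end{align}
since $f'(X)=f''(X)=e^X$, and $\tilde{T}_3$ collects the higher-order terms that are dropped.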

I got the formula from Variance of a function of one random variable.

I have tried a few times to work through a simple example in order to learn how to use it, but I cannot say I have succeeded.

So I would be grateful if someone could show me how to use the above formula to calculate the variance of this (simple) function:
$f(X)=100\times\exp(X)+100\times\exp(2X)$,

where $X$ is assumed to be normally distributed with expected value $0.05$ and standard deviation $0.1$.

Best Answer

Can you not use $e^X \approx 1 + X + \frac{X^2}{2}$, so that $\operatorname{Var}(e^X) \approx \operatorname{Var}(X) + \frac{\operatorname{Var}(X^2)}{4} + \operatorname{Cov}(X, X^2)$?
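For a normal $X$ these moments are standard, so (this evaluation is my addition, not part of the original answer) the approximation can be written out explicitly, with $\mu=EX$ and $\sigma^2=\operatorname{Var}(X)$:
\begin{align}
\operatorname{Cov}(X,X^2)&=2\mu\sigma^2, \qquad \operatorname{Var}(X^2)=4\mu^2\sigma^2+2\sigma^4,\\
\operatorname{Var}(e^X)&\approx\sigma^2+\frac{4\mu^2\sigma^2+2\sigma^4}{4}+2\mu\sigma^2=\sigma^2(1+\mu)^2+\frac{\sigma^4}{2}.
\end{align}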

It seems to me you want the second-order approximation.
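To make this concrete, here is a minimal numeric sketch (an editorial addition, assuming NumPy) that applies the second-order formula from the question to the asker's $f(X)=100\exp(X)+100\exp(2X)$ with $\mu=0.05$, $\sigma=0.1$, and compares it against a Monte Carlo estimate:

```python
import numpy as np

# Second-order (delta-method) variance approximation for
# f(X) = 100*exp(X) + 100*exp(2X), with X ~ N(mu, sigma^2).
mu, sigma = 0.05, 0.1
var_x = sigma**2

fp  = 100 * np.exp(mu) + 200 * np.exp(2 * mu)   # f'(mu)
fpp = 100 * np.exp(mu) + 400 * np.exp(2 * mu)   # f''(mu)

# Var(f(X)) ~= f'(mu)^2 Var(X) + f''(mu)^2 / 4 * Var(X)^2
approx = fp**2 * var_x + (fpp**2 / 4) * var_x**2
print(f"second-order approximation: {approx:.1f}")

# Monte Carlo check of the exact variance.
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=1_000_000)
f = 100 * np.exp(x) + 100 * np.exp(2 * x)
print(f"Monte Carlo estimate:       {f.var():.1f}")
```

The two numbers will not agree exactly; the discrepancy is the higher-order remainder $\tilde{T}_3$ that the formula truncates.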
