Scalar multiple of a gamma random variable

gamma distribution, probability, probability distributions, probability theory

Let $X \sim \mathrm{Gamma}(\alpha, \beta)$, where $\alpha$ and $\beta$ are the shape and rate parameters respectively. If the $X_i \sim \mathrm{Gamma}(\alpha_i, \beta)$ are independent, then $\sum\limits^k_{i = 1} X_i \sim \mathrm{Gamma}(\alpha_1 + \dots + \alpha_k, \beta)$.

However, according to an answer here, $kX \sim \mathrm{Gamma}\left(\alpha, \frac {\beta} k\right)$. I would have thought that $kX$ follows the "addition law" I stated above, giving $kX \sim \mathrm{Gamma}(k\alpha, \beta)$.
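A quick simulation makes the distinction concrete. The parameter values ($\alpha = 3$, $\beta = 2$, $k = 5$) are illustrative, not from the question; the sketch compares the sample moments of $kX$ against the two candidate distributions:

```python
import numpy as np

# Illustrative parameters (not from the question): shape alpha, rate beta, scale factor k
rng = np.random.default_rng(0)
alpha, beta, k = 3.0, 2.0, 5.0
n = 200_000

# NumPy's gamma sampler is parameterised by scale = 1/rate
x = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)

kx = k * x  # scalar multiple of a single gamma variable

# Theoretical moments under the two competing claims:
# Gamma(alpha, beta/k): mean = alpha/(beta/k) = 7.5,  var = alpha/(beta/k)^2 = 18.75
# Gamma(k*alpha, beta): mean = k*alpha/beta  = 7.5,  var = k*alpha/beta^2   = 3.75
mean_scaled = alpha / (beta / k)
var_scaled = alpha / (beta / k) ** 2
var_added = k * alpha / beta ** 2

print(kx.mean(), kx.var())  # ~7.5 and ~18.75: matches Gamma(alpha, beta/k)
```

Note that the two candidates have the same mean, so it is the variance that separates them: the sample variance of $kX$ lands near $18.75$, not $3.75$.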

For example, let $X_1 \sim \mathrm{Gamma}(\alpha_1, \beta)$ and $X_2 \sim \mathrm{Gamma}(\alpha_2, \beta)$ be independent. From the "addition law" above, we have $X_1 + X_2 \sim \mathrm{Gamma}(\alpha_1 + \alpha_2, \beta)$. Now let $Y_1$ and $Y_2$ be iid $\mathrm{Gamma}(\alpha, \beta)$. Why wouldn't $2Y_1 = Y_1 + Y_2 \sim \mathrm{Gamma}(2\alpha, \beta)$?

I suspect I am wrong and the answer I referenced is correct, since the person who posted the answer has a lot more reputation than I do 😛 so I am just wondering where the flaw in my logic is.

Best Answer

Your mistake is conflating $X_1 + X_2$ with $2X_1$.

In $X_1 + X_2$, the two random variables are independent, but in $2X_1 = X_1 + X_1$ you are summing two copies of the same variable, which can be viewed as "completely dependent."
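The two results can also be reconciled with moment generating functions (a standard derivation, sketched here for completeness; it is not from the original thread). For $X \sim \mathrm{Gamma}(\alpha, \beta)$ with rate $\beta$, the MGF is $M_X(t) = \left(\frac{\beta}{\beta - t}\right)^{\alpha}$ for $t < \beta$. Then

$$M_{kX}(t) = \mathbb{E}\left[e^{tkX}\right] = M_X(kt) = \left(\frac{\beta}{\beta - kt}\right)^{\alpha} = \left(\frac{\beta/k}{\beta/k - t}\right)^{\alpha},$$

which is the MGF of $\mathrm{Gamma}\left(\alpha, \frac{\beta}{k}\right)$: scaling changes the rate, not the shape. By contrast, for independent $X_1, X_2 \sim \mathrm{Gamma}(\alpha, \beta)$,

$$M_{X_1 + X_2}(t) = M_{X_1}(t)\, M_{X_2}(t) = \left(\frac{\beta}{\beta - t}\right)^{2\alpha},$$

the MGF of $\mathrm{Gamma}(2\alpha, \beta)$: adding independent copies changes the shape, not the rate.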

Side note: another simple way to see that the two are different: $\text{Var}(X_1+X_2)=\text{Var}(X_1) + \text{Var}(X_2) = 2 \text{Var}(X_1)$ (by independence), but $\text{Var}(2X_1) = 4\text{Var}(X_1)$.
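The variance contrast is easy to verify by simulation (parameter values are again illustrative; with $\alpha = 3$, $\beta = 2$, $\text{Var}(X_1) = \alpha/\beta^2 = 0.75$):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, n = 3.0, 2.0, 500_000

# Two independent draws vs. two copies of the same draw
x1 = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)
x2 = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)

print(np.var(x1 + x2))  # ~ 2 * Var(X1) = 1.5
print(np.var(2 * x1))   # ~ 4 * Var(X1) = 3.0
```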
