Neural Networks – Expected Value Notation in GAN Loss

expected value, gan, neural networks, notation

I am reading Goodfellow's original paper on GANs.

What I struggle to understand is the subscript notation he uses in expected values.

$$
\mathbb{E}_{\boldsymbol{x} \sim p_{data}(\boldsymbol{x})}\ldots
$$

If I understand it correctly, then $\boldsymbol{x}$ is a realization of some random variable $\mathbf{x}$, but then how can there be an expectation with respect to $\boldsymbol{x}$?

Or is $\boldsymbol{x}$ a random variable?

Thank you.

Edit: I do not think this is a duplicate, as the referenced question does not answer what $\boldsymbol{x}$ means.

Best Answer

$\mathbb{E}_{x\sim p(x)}[f(x)]$ means the expected value of $f(x)$ when $x$ is assumed to be distributed according to $p(x)$. For example, for a continuous distribution we have: $$\mathbb{E}_{x\sim p(x)}[f(x)]=\int f(x)\,p(x)\,dx$$
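A quick way to see this notation at work is a Monte Carlo estimate: draw samples from $p(x)$ and average $f$ over them, which approximates the integral above. Below is a minimal sketch (not from the paper), assuming $p$ is a standard normal and $f(x)=x^2$, so the true value is $1$; the function names are just for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def expectation_mc(f, sampler, n=100_000):
    """Monte Carlo estimate of E_{x ~ p}[f(x)] using samples drawn from p."""
    x = sampler(n)        # n realizations of the random variable x
    return f(x).mean()    # average of f over those samples

# Example: p(x) = N(0, 1) and f(x) = x^2, so E[f(x)] = Var(x) = 1
est = expectation_mc(lambda x: x**2, lambda n: rng.standard_normal(n))
print(est)  # close to 1.0
```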

This notation is used when the distribution of $x$ is subject to change in an optimization problem. Specifically, in the paper the authors work with two distributions (on page 5), $p_g$ and $p_{data}$.
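For concreteness, the minimax value function in the paper (its Eq. 1) combines two such expectations, one over data samples and one over the noise prior:

$$
\min_G \max_D V(D,G) = \mathbb{E}_{\boldsymbol{x} \sim p_{data}(\boldsymbol{x})}\big[\log D(\boldsymbol{x})\big] + \mathbb{E}_{\boldsymbol{z} \sim p_{\boldsymbol{z}}(\boldsymbol{z})}\big[\log\big(1 - D(G(\boldsymbol{z}))\big)\big]
$$

Each subscript simply names the random vector being averaged over and the distribution it follows.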

Edit: Also, the $x$ in the subscript of the expected value notation is not a realization. It is the random variable; or, more specifically, in the paper it is the random vector $\mathbf{x}$ (which is also written in bold on page 5).
