Typically you would write something like the following:
$$ X \sim \textrm{Uniform}(-1,1) \tag{1} $$
meaning that $X$ is a random variable that follows the uniform distribution with parameters $-1$ and $1$. In general, the uniform distribution on $[a, b]$ has density
$$ f_{X}(x) = \begin{cases} \frac{1}{b-a} & \textrm{for } a \leq x \leq b \\ 0 & \textrm{everywhere else} \end{cases} \tag{2}$$
so with parameters $-1$ and $1$ we have
$$ f_{X}(x) = \begin{cases} \frac{1}{2} & \textrm{for } -1 \leq x \leq 1 \\ 0 & \textrm{everywhere else} \end{cases} \tag{3}$$
In other words, $X \sim \textrm{Uniform}(-1,1)$ says that $X$ is a random variable whose distribution is the one named on the right, with the density given above. It acts much like an $=$ sign, except that the right-hand side is a distribution, and typically a whole family of distributions indexed by its parameters.
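To make the notation concrete, here is a minimal sketch (Python with NumPy; the variable names are mine) that draws samples from $X \sim \textrm{Uniform}(-1,1)$ and checks them against the density in $(3)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Uniform(-1, 1): draw samples from the distribution in (1)
samples = rng.uniform(-1.0, 1.0, size=100_000)

# Density from (3): 1/2 on [-1, 1], 0 everywhere else
def f_X(x):
    return np.where((x >= -1.0) & (x <= 1.0), 0.5, 0.0)

# Empirical check: a normalized histogram of the samples should sit near 0.5
hist, _ = np.histogram(samples, bins=20, range=(-1, 1), density=True)
print(hist.round(2))                      # every bin is close to 0.5
print(f_X(np.array([-2.0, 0.0, 0.5])))    # [0.  0.5 0.5]
```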
Edit:
If you read the paper, Section 3 on page 2 defines this explicitly:
> To learn the generator’s distribution $p_{g}$ over data $x$, we define a prior on input noise variables $p_{z}(z)$,
It uses notation like this in Section 4:
> The generator $G$ implicitly defines a probability distribution $p_{g}$ as the distribution of the samples $G(z)$ obtained when $z \sim p_{z}$.
...did you read the paper?
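To illustrate what "the distribution of the samples $G(z)$ obtained when $z \sim p_{z}$" means in code, here is a toy sketch (Python/NumPy; the particular prior and the map `G` are placeholders of my choosing, not the paper's trained network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior on the input noise variables: z ~ p_z(z), here a standard normal in 2D
def sample_prior(n):
    return rng.normal(0.0, 1.0, size=(n, 2))

# A stand-in "generator" G: any deterministic map from noise space to data space.
# In the paper G is a neural network; this fixed map only illustrates the notation.
def G(z):
    return np.tanh(z @ np.array([[1.0, 0.5], [-0.5, 1.0]]))

# p_g is defined implicitly: it is simply the distribution of G(z) when z ~ p_z.
# We never write down its density; we can only draw samples from it.
z = sample_prior(10_000)
samples_from_p_g = G(z)
print(samples_from_p_g.mean(axis=0))  # empirical statistics of p_g
```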
Edit 2:
For the second part, if you read the paper, it is defined on page 2:
> where $p$ is the distribution to learn and $q_{\theta}$ is the distribution defined by the implicit generator. The expectation is minimized over a parametrized class of functions
and note that $\otimes$ denotes the tensor product.
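In case the symbol itself is unfamiliar, here is a small numerical illustration of the tensor (outer) product of two vectors; this only explains the notation, not the paper's particular construction:

```python
import numpy as np

# Tensor (outer) product of two vectors: (u ⊗ v)[i, j] = u[i] * v[j]
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])

uv = np.tensordot(u, v, axes=0)   # for vectors this equals np.outer(u, v)
print(uv)
# [[ 3.  4.  5.]
#  [ 6.  8. 10.]]
print(uv.shape)                   # (2, 3): the dimensions multiply
```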
Best Answer
It usually means: "we are defining what is on the left of $:=$ to be what is on the right". This distinction originates from computer languages, where the bare equality symbol "=" denotes assignment of a value to a variable rather than an assertion of equality. For example, Mathematica uses "==" to test equality and "=" for assignment.
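The same split shows up in Python, for example, where "=" assigns and "==" tests equality:

```python
x = 3          # "=" assigns: define x to be 3 (the role ":=" plays in mathematical writing)
print(x == 3)  # "==" tests equality: True
print(x == 4)  # False
```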