If a random variable $X$ is given, then it induces the pushforward measure defined by
$$ E \mapsto \Bbb{P}( \{ \omega \in \Omega : X(\omega) \in E \}) = \Bbb{P}(X^{-1}(E)). $$
Mathematicians simply abbreviate this by $\Bbb{P}(X \in E)$ whenever no confusion arises. Replacing the particular choice of $E$ by a placeholder $\cdot$, we may symbolically denote this pushforward measure by $\Bbb{P}(X \in \cdot)$.
If $X$ is real-valued, then $\Bbb{P}(X \in \cdot)$ defines a probability measure on $\Bbb{R}$. Now, recall that a measure $\mu$ on $\Bbb{R}$ is often written symbolically as $\mu(dx)$, particularly in the context of integration, where explicitly writing the variable on which integrands depend becomes important. The notation $\Bbb{P}(X \in dx)$ is then a particular case of this practice.
You can think of the symbolic notation $dx$ as intuitively standing for any possible choice of infinitesimally small measurable set. This practice is partially justified by the fact that if $\Bbb{P}(X \in \cdot)$ is a Borel measure on $\Bbb{R}$, then for any $f \in C_b(\Bbb{R})$,
$$ \int_{\Bbb{R}} f(x) \, \Bbb{P}(X \in dx) = \lim_{n\to\infty} \sum_{k=-\infty}^{\infty} f(x_k) \Bbb{P}(X \in [x_k, x_k + \Delta x) ), \quad \Delta x = \frac{1}{n} \text{ and } x_k = k \, \Delta x. $$
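To see this convergence numerically, here is a small sketch (illustrative choices throughout: $X$ is taken to be standard normal, so the interval masses can be computed from the normal CDF via `math.erf`, and $f = \cos$, for which $\mathsf{E}[\cos X] = e^{-1/2}$ is known in closed form):

```python
import math

def Phi(x):
    """Standard normal CDF, written in terms of the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def riemann_pushforward(f, n, K=12):
    """Approximate  ∫ f(x) P(X in dx)  for X ~ N(0,1) by the sum
    Σ_k f(x_k) P(X in [x_k, x_k + Δx)),  with Δx = 1/n and x_k = k Δx.
    The sum is truncated to |x_k| <= K; the tail mass beyond is negligible."""
    dx = 1.0 / n
    total = 0.0
    for k in range(-K * n, K * n):
        xk = k * dx
        # P(X in [x_k, x_k + Δx)) = Φ(x_k + Δx) - Φ(x_k)
        total += f(xk) * (Phi(xk + dx) - Phi(xk))
    return total

approx = riemann_pushforward(math.cos, n=1000)
exact = math.exp(-0.5)   # E[cos X] = e^{-1/2} for X ~ N(0,1)
```

As $n$ grows, `approx` approaches `exact`, exactly as the displayed limit predicts.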
For intuition, suppose the sample space consists of a finite number of equally probable outcomes (this is of course not true for all probability spaces, but many situations can be approximated by something of this form). Then
$$ E(X+Y) = \frac{(x_1+y_1)+(x_2+y_2)+\cdots+(x_n+y_n)}n $$
and
$$ E(X)+E(Y) = \frac{x_1+x_2+\cdots+x_n}n + \frac{y_1+y_2+\cdots+y_n}n $$
which is clearly the same sum, merely regrouped.
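The finite, equally-probable picture above can be checked directly; in this sketch the sample space is a hypothetical set of $n$ outcomes, and $X$ and $Y$ are just tables of values indexed by outcome:

```python
import random

random.seed(0)
n = 10
# Values of X and Y on each of the n equally probable outcomes (arbitrary).
x = [random.uniform(-1, 1) for _ in range(n)]
y = [random.uniform(-1, 1) for _ in range(n)]

def E(values):
    # Expectation under the uniform measure is just the plain average.
    return sum(values) / len(values)

lhs = E([xi + yi for xi, yi in zip(x, y)])  # E(X + Y)
rhs = E(x) + E(y)                           # E(X) + E(Y)
```

The two sides agree up to floating-point roundoff, since they are literally the same sum grouped differently.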
Best Answer
Let $\xi_1,\xi_2:\Omega\to\mathbb R$ be two random variables on the same probability space $(\Omega,\mathscr F,\mathsf P)$. The expectation of either is defined by $$ \mathsf E\xi_i := \int_\Omega \xi_i(\omega)\,\mathsf P(\mathrm d\omega). $$ Linearity of the expectation means that for any constants $\alpha_1,\alpha_2\in\Bbb R$, $$ \mathsf E[\alpha_1\xi_1+\alpha_2\xi_2] = \alpha_1\mathsf E\xi_1+\alpha_2 \mathsf E\xi_2, $$ which follows directly from the linearity of the Lebesgue integral in the definition above. Hence the functional $\mathsf E$, defined on the space of integrable random variables on $(\Omega,\mathscr F,\mathsf P)$, is linear.
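The integral definition can be illustrated concretely. In the sketch below, $\Omega = [0,1]$ with $\mathsf P$ the Lebesgue measure, and the random variables $\xi_1(\omega)=\omega^2$, $\xi_2(\omega)=e^\omega$ are hypothetical, illustrative choices; each expectation $\int_0^1 \xi_i(\omega)\,d\omega$ is approximated by a midpoint rule, and linearity holds term by term in the quadrature sum:

```python
import math

def expectation(xi, m=100_000):
    """Midpoint-rule approximation of  ∫_0^1 ξ(ω) dω,
    i.e. the expectation under Lebesgue measure on Ω = [0, 1]."""
    dw = 1.0 / m
    return sum(xi((j + 0.5) * dw) for j in range(m)) * dw

a1, a2 = 3.0, -2.0                 # arbitrary constants α1, α2
xi1 = lambda w: w * w              # ξ1(ω) = ω²   (illustrative)
xi2 = lambda w: math.exp(w)        # ξ2(ω) = e^ω  (illustrative)

lhs = expectation(lambda w: a1 * xi1(w) + a2 * xi2(w))  # E[α1ξ1 + α2ξ2]
rhs = a1 * expectation(xi1) + a2 * expectation(xi2)     # α1Eξ1 + α2Eξ2
```

Both sides use the same quadrature nodes, so they agree up to floating-point roundoff, mirroring how linearity of $\mathsf E$ is inherited from linearity of the integral.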
As for products of independent random variables: if $\xi_1,\xi_2,\dots,\xi_n$ are random variables on the same probability space as above and they are mutually independent, then $$ \mathsf E\left[ \prod_{i=1}^n\xi_i\right] = \prod_{i=1}^n\mathsf E\xi_i. $$ Unlike linearity, this identity genuinely uses independence and can fail for dependent random variables.
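A minimal finite sketch of the product rule: take a hypothetical product space $\Omega = \Omega_1 \times \Omega_2$ with the uniform product measure, and let $\xi_1$ depend only on the first coordinate and $\xi_2$ only on the second, which makes them independent by construction (the value lists below are arbitrary):

```python
import itertools

vals1 = [1.0, 2.0, 5.0]          # values of ξ1 on Ω1 (hypothetical)
vals2 = [-1.0, 0.5, 2.0, 3.0]    # values of ξ2 on Ω2 (hypothetical)

# Marginal expectations under the uniform measures on Ω1 and Ω2.
E1 = sum(vals1) / len(vals1)
E2 = sum(vals2) / len(vals2)

# E[ξ1 ξ2] computed directly on the product space Ω1 × Ω2.
E_prod = sum(a * b for a, b in itertools.product(vals1, vals2)) \
         / (len(vals1) * len(vals2))
```

Here `E_prod` equals `E1 * E2` exactly (up to roundoff), because the double sum over the product space factors into the product of the two marginal sums.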