I've been given $f(x,y) = 6y$ with boundaries $0 \leq y \leq x \leq 1$.
How do I find the expected value of $x$?
[Math] Finding expected value for random variable $X$ given a joint probability density function $f(x,y)$
probability, statistics
Related Solutions
Since the density function $f(x)$ is nonnegative, the integral formula for the expectation is really the difference of two integrals with nonnegative integrands (and hence nonnegative value): $$E[X] = \int_{-\infty}^{\infty} xf(x)\mathrm dx = \int_0^{\infty} xf(x)\mathrm dx - \int_{-\infty}^0 \vert x\vert f(x)\mathrm dx. $$ When both integrals are finite, their difference is finite too. If one of the integrals diverges but the other is finite, then some people say $E[X]$ exists but is unbounded while others deny the existence of $E[X]$ and say that $E[X]$ is undefined. (Perhaps this is why many theorems in probability avoid ambiguity by restricting themselves to random variables with finite means instead of random variables whose means exist.) If both integrals diverge, then the integral formula for $E[X]$ gives a result of the form $\infty - \infty$ and everybody agrees that $E[X]$ is undefined.
In summary, if $\int \vert x \vert f(x)\,dx$ is finite, then $\int x f(x)\,dx$ is also finite, and the value of the latter integral is called the expectation or expected value or mean of the random variable $X$ and denoted by $E[X]$, that is, $$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx.$$
Added Note: To my mind, the difference between saying "$E[X] = \int xf(x)\,dx$ if the integral is finite" (as Sami wants to) and "$E[X] = \int xf(x)\,dx$ if $\int |x|f(x)\,\mathrm dx$ is finite" is that the second statement reminds the casual reader to check something instead of jumping to unwarranted conclusions. Many students have mistakenly calculated that a Cauchy random variable with density $[\pi(1+x^2)]^{-1}$ has expected value $0$ on the grounds that the integrand $x\cdot[\pi(1+x^2)]^{-1}$ in the integral for $E[X]$ is an odd function, and the integral is over an interval symmetric about the origin. But they would have discovered the error of their ways if they had carefully checked whether $$\int_{-\infty}^{\infty} \vert x \vert \frac{1}{\pi(1+x^2)}\,dx = 2 \int_0^{\infty} x\frac{1}{\pi(1+x^2)}\,dx $$ is finite.
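The divergence is easy to see numerically: substituting $u = 1+x^2$ gives the closed form $2\int_0^T x/[\pi(1+x^2)]\,dx = \ln(1+T^2)/\pi$ for the truncated integral, which grows without bound as $T \to \infty$. A minimal sketch:

```python
import math

# Truncated integral 2 * ∫_0^T x / (pi * (1 + x^2)) dx for the standard
# Cauchy density; closed form via the substitution u = 1 + x^2.
# It grows like (2/pi) * log(T), so E|X| = infinity.
def truncated_abs_mean(T):
    return math.log(1 + T**2) / math.pi

for T in (1e2, 1e4, 1e6):
    print(T, truncated_abs_mean(T))
```

Each factor-of-100 increase in the truncation point $T$ adds roughly $\ln(10^4)/\pi \approx 2.93$ to the running total, so the "mean" never settles down.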
The random couple $(X,Y)$ has no PDF since its support is included in the diagonal $\{(x,x)\mid x\in\mathbb R\}$, which has Lebesgue measure zero.
Likewise, the conditional distribution of $X$ conditionally on $Y$ (or vice versa) has no density. For every $y$, the conditional distribution of $X$ conditionally on $Y=y$ is $\delta_y$, the Dirac mass at $y$. This measure has no density with respect to the Lebesgue measure.
Edit: One might add that the only reason why distributions are useful is to compute means. That is, the distribution of a random element $Z$ allows one to compute $E[u(Z)]$ for every measurable real-valued function $u$ such that the expectation exists. Here, the distribution of $(X,Y)$ is crystal clear since, considering the distribution $P_X$ of $X$, one has $$ E[u(X,Y)]=\int u(x,x)\,\mathrm dP_X(x). $$ For example, if $P_X$ has a density $f_X$, $$ E[u(X,Y)]=\int u(x,x)f_X(x)\,\mathrm dx. $$
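The one-dimensional formula above can be checked against simulation. In the sketch below the setup is an assumption chosen for concreteness: $X \sim \mathrm{Exp}(1)$ with density $f_X(x)=e^{-x}$, $Y = X$, and $u(x,y)=xy$, so that $E[u(X,Y)] = E[X^2] = 2$.

```python
import math
import random

# Assumed example: X ~ Exp(1), Y = X, u(x, y) = x * y, so E[u(X,Y)] = E[X^2] = 2.
random.seed(0)
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]
mc = sum(x * x for x in samples) / n  # Monte Carlo estimate of E[u(X,Y)]

# The same expectation via the one-dimensional formula ∫ u(x,x) f_X(x) dx,
# approximated by a plain Riemann sum on [0, 50] (the tail beyond 50 is negligible).
h = 0.001
riemann = sum((i * h) ** 2 * math.exp(-i * h) * h for i in range(1, 50_000))

print(mc, riemann)  # both close to 2
```

Even though $(X,Y)$ has no joint density, the expectation is computed entirely from the one-dimensional density $f_X$, which is the point of the formula.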
Best Answer
Calculate $$\iint_D (x)(6y)\,dy\,dx,$$ where $D$ is the part of the plane such that $0\le y\le x\le 1$.
So $D$ is the interior and boundary of the triangle with vertices $(0,0)$, $(1,0)$, and $(1,1)$.
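Working the integral out: the inner integral is $\int_0^x 6y\,dy = 3x^2$, so $E[X] = \int_0^1 3x^3\,dx = 3/4$. A minimal numeric sanity check of both this value and the fact that $f$ is a valid density:

```python
# After integrating out y, the inner integral ∫_0^x 6y dy equals 3x^2, so:
#   total mass = ∫_0^1 3x^2 dx = 1   (f is a valid density)
#   E[X]       = ∫_0^1 3x^3 dx = 3/4
# Midpoint-rule approximations of both one-dimensional integrals:
n = 100_000
h = 1.0 / n
total_mass = sum(3 * ((i + 0.5) * h) ** 2 * h for i in range(n))
EX = sum(3 * ((i + 0.5) * h) ** 3 * h for i in range(n))

print(total_mass, EX)  # ≈ 1.0 and ≈ 0.75
```

The midpoint rule on these smooth integrands converges fast, so even a modest grid reproduces $1$ and $3/4$ to many digits.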