Convergence in mean under absolutely continuous measure

convergence-divergence, measure-theory, probability theory

I was wondering which kinds of convergence are "preserved" when we have two measures, one of which is absolutely continuous with respect to the other.

Let's say we have two $\sigma$-finite measures $\mu$ and $\nu$ defined on a measurable space $(\Omega, \Lambda)$ with $\mu \ll \nu$, i.e. $\mu$ is absolutely continuous with respect to $\nu$.
Let $(f_n)_{n \in \mathbb{N}}$ and $f$ be real-valued measurable functions on that space.
If the sequence $(f_n)$ converges to $f$ as $n \to \infty$ $\nu$-a.e., it's pretty obvious that it does so under $\mu$, too. A similar result holds for convergence in measure.
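(To spell out the a.e. case: if $N := \{\omega \in \Omega : f_n(\omega) \not\to f(\omega)\}$ denotes the exceptional set, then

$$\nu(N) = 0 \quad\Longrightarrow\quad \mu(N) = 0$$

by absolute continuity, so $f_n \to f$ $\mu$-a.e. as well.)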

Now assume they are all integrable (with respect to each measure) and converge in mean under $\nu$:

$\lim_{n \to \infty} \int |f_n-f| \, d\nu = 0$

Does it follow that $\lim_{n \to \infty} \int |f_n-f| \, d\mu = 0$?
What about convergence in the $r$-th mean? Does anything change if the measures are finite?

I tried proving this using the Radon-Nikodym theorem but without success. Any hints or counterexamples will be appreciated. Thank you!
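What I tried, in case it helps: with $g := \frac{d\mu}{d\nu}$ (which exists since both measures are $\sigma$-finite),

$$\int |f_n - f| \, d\mu = \int |f_n - f| \, g \, d\nu \le \|g\|_{L^\infty(\nu)} \int |f_n - f| \, d\nu,$$

so the implication certainly holds when the density $g$ is $\nu$-essentially bounded, but I don't see how to get rid of that extra assumption.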

Best Answer

Let $\nu$ be Lebesgue measure on $(0,1)$ and $\mu(A)=\int_A \frac{1}{\sqrt{x}} \, dx$. Then $\mu \ll \nu$. If $f_n = \sqrt{n} \, I_{(0,\frac{1}{n})}$, then $\int |f_n-0| \, d\nu \to 0$ but $\int |f_n-0| \, d\mu \to 2$.
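Spelling out the computation: $\nu\big((0,\tfrac{1}{n})\big) = \tfrac{1}{n}$ and $\int_0^{1/n} \frac{dx}{\sqrt{x}} = \frac{2}{\sqrt{n}}$, so

$$\int |f_n - 0| \, d\nu = \sqrt{n} \cdot \frac{1}{n} = \frac{1}{\sqrt{n}} \to 0, \qquad \int |f_n - 0| \, d\mu = \sqrt{n} \cdot \frac{2}{\sqrt{n}} = 2 \quad \text{for every } n.$$

Note that both measures are finite here ($\mu((0,1)) = 2$), so finiteness of the measures does not save the implication, and a quick check shows that the same idea with $f_n = n^{1/(2r)} \, I_{(0,\frac{1}{n})}$ gives a counterexample for convergence in the $r$-th mean as well.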
