Solved – Optimal orthogonal polynomial chaos basis functions for log-normally distributed random variables

polynomial, stochastic-processes

I hope this is the appropriate venue for this type of question. If not, please feel free to migrate! 🙂

I'm trying to solve a stochastic partial differential equation of the form $$\alpha(\omega)\nabla^2u=f$$
where $\alpha(\omega)$ represents a random field that is log-normally distributed, i.e. it has a probability density function $$f(x)=\frac{1}{x\sqrt{2\pi\sigma^2}}e^{-\frac{(\log(x)-\mu)^2}{2\sigma^2}}.$$
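As a quick sanity check (not part of the original question), this density can be compared against `scipy.stats.lognorm`, which parametrizes the log-normal with shape `s = sigma` and `scale = exp(mu)`; the parameter values below are arbitrary:

```python
import numpy as np
from scipy.stats import lognorm

# Arbitrary illustrative parameters
mu, sigma = 0.5, 0.8
x = np.linspace(0.1, 5.0, 50)

# The pdf written exactly as in the formula above
pdf_manual = (np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
              / (x * np.sqrt(2 * np.pi * sigma**2)))
pdf_scipy = lognorm.pdf(x, s=sigma, scale=np.exp(mu))
assert np.allclose(pdf_manual, pdf_scipy)
```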

I want to represent the solution of this problem as a polynomial chaos expansion $$u=\sum_{i=0}^p u_i(x)\Psi_i(\xi)$$ where $u_i(x)$ is a deterministic coefficient and $\Psi_i(\xi)$ are orthogonal polynomials in terms of a random variable $\xi$ with the same log-normal probability density function.

According to Xiu & Karniadakis (2002), certain orthogonal polynomial bases give optimal (exponential) convergence of finite expansions to the true solution $u$. For instance, Hermite polynomials are optimal for Gaussian distributions, Legendre polynomials for uniform distributions, Laguerre polynomials for gamma distributions, etc. (see the above paper, bottom of page 8).

What is the corresponding optimal polynomial basis for log-normal distributions?

Best Answer

There's no optimal polynomial basis for the log-normal distribution, because it is not determinate in the Hamburger sense. The approach indicated by Xiu and Karniadakis (Generalized Polynomial Chaos) doesn't always work. I'll be a bit informal here; for a rigorous treatment, see here.

Let $X$ denote a continuous random variable and $f_X(x)$ its pdf. Also, let's assume that $X$ has finite absolute moments of all orders, i.e.,

$$ \int_\mathbb{R}|x|^nf_X(x)\,dx < \infty \quad \forall n \in \mathbb{N}.$$

This implies that $X$ has finite moments of all orders, i.e.,

$$ m_n=\int_\mathbb{R}x^nf_X(x)\,dx < \infty \quad \forall n \in \mathbb{N}.$$
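For the lognormal distribution specifically, these moments have the closed form $m_n = e^{n\mu + n^2\sigma^2/2}$. A quick numerical check of this (with arbitrary illustrative parameters, not part of the original argument):

```python
import numpy as np
from scipy import integrate

# Assumed (illustrative) parameters
mu, sigma = 0.0, 0.5

def lognormal_pdf(x):
    return (np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
            / (x * sigma * np.sqrt(2 * np.pi)))

def moment_closed_form(n):
    # m_n = exp(n*mu + n^2*sigma^2/2)
    return np.exp(n * mu + 0.5 * n**2 * sigma**2)

# Compare quadrature moments with the closed form
vals = []
for n in range(5):
    v, _ = integrate.quad(lambda x, n=n: x**n * lognormal_pdf(x), 0, np.inf)
    vals.append(v)
expected = [moment_closed_form(n) for n in range(5)]
assert np.allclose(vals, expected, rtol=1e-6)
```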

With these conditions, you can determine the family of orthogonal polynomials associated with $f_X(x)$, for example by applying the Gram-Schmidt procedure to the monomial basis, or by using the more stable Stieltjes procedure. The lognormal distribution satisfies these assumptions, and thus there exists a set of orthonormal polynomials associated with it, the Stieltjes–Wigert polynomials.
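A minimal sketch of the Gram-Schmidt route, using the closed-form lognormal moments and the moment (Hankel) inner product $\langle x^i, x^j\rangle = m_{i+j}$. The parameters and low degree are illustrative assumptions; for higher degrees the Stieltjes procedure mentioned above is preferable, since lognormal Hankel matrices become ill-conditioned quickly:

```python
import numpy as np

# Assumed (illustrative) parameters; kept to low degree on purpose
mu, sigma = 0.0, 0.5
N = 3  # highest polynomial degree

# Closed-form lognormal moments m_n = E[X^n] = exp(n*mu + n^2*sigma^2/2)
m = np.array([np.exp(n * mu + 0.5 * n**2 * sigma**2) for n in range(2 * N + 1)])

def inner(p, q):
    # <p, q> = E[p(X) q(X)], expressed through the moments;
    # coefficient vectors are indexed lowest degree first
    return sum(p[i] * q[j] * m[i + j]
               for i in range(len(p)) for j in range(len(q)))

# Gram-Schmidt on the monomials 1, x, ..., x^N
polys = []
for n in range(N + 1):
    p = np.zeros(N + 1)
    p[n] = 1.0
    for q in polys:
        p = p - inner(p, q) * q
    p = p / np.sqrt(inner(p, p))
    polys.append(p)

# The resulting polynomials are orthonormal w.r.t. the lognormal density
G = np.array([[inner(p, q) for q in polys] for p in polys])
assert np.allclose(G, np.eye(N + 1), atol=1e-6)
```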

However, this is not enough to be able to expand every mean-square integrable random variable $Y=g(X)$ in a series of orthogonal polynomials of $X$ that converges in mean square to $Y$. The reason is that, under the conditions stated so far, we cannot guarantee that the orthogonal polynomials are dense in $A=L^2(\Omega,\sigma(X),P)$, where $\Omega$ is the sample space of the probability space on which $X$ is defined, $P$ is its probability measure, and $\sigma(X)$ is the $\sigma$-algebra generated by $X$. If this sounds too complicated, think of it like this: the space of all mean-square integrable random variables $Y$ that are a measurable function $g(X)$ of $X$ is a very large space. Unless you are sure that the polynomials orthonormal with respect to $f_X(x)$ are dense in that space, there may be some $Y$ which cannot be expressed as a series of said polynomials.

Under our assumptions, a necessary and sufficient condition for the orthonormal polynomials to be dense in $A$ is that $f_X(x)$ is determinate in the Hamburger sense, i.e., that it is uniquely determined by the sequence of its moments of all orders. The lognormal distribution is not determinate in the Hamburger sense, and thus there exist mean-square integrable random variables $Y=g(X)$ such that the expansion of $Y$ in a series of Stieltjes–Wigert polynomials either doesn't converge, or converges to a limit other than $Y$ (recall that convergence here is always understood in the mean-square sense). For some examples, again see the paper I linked above.
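One way to see the trouble is Carleman's criterion: $\sum_n m_{2n}^{-1/(2n)} = \infty$ is sufficient for Hamburger determinacy. For the lognormal, $m_{2n}^{-1/(2n)} = e^{-\mu - n\sigma^2}$ decays geometrically, so the series converges and the criterion fails. (This alone doesn't prove indeterminacy, since the criterion is only sufficient, but the lognormal is in fact known to be indeterminate.) A quick numerical illustration with assumed parameters:

```python
import numpy as np

# Assumed (illustrative) parameters
mu, sigma = 0.0, 1.0

def moment(n):
    # Closed-form lognormal moment E[X^n]
    return np.exp(n * mu + 0.5 * n**2 * sigma**2)

ns = np.arange(1, 15)
# Carleman terms m_{2n}^{-1/(2n)} reduce to exp(-mu - n*sigma^2)
terms = np.array([moment(2 * n)**(-1.0 / (2 * n)) for n in ns])
closed = np.exp(-mu - ns * sigma**2)
assert np.allclose(terms, closed)

# The terms decay geometrically with ratio q = exp(-sigma^2), so the
# partial sums converge: Carleman's divergence criterion fails.
q = np.exp(-sigma**2)
assert np.isclose(terms.sum(), np.exp(-mu) * q * (1 - q**14) / (1 - q), rtol=1e-10)
```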

End of story: for your random-coefficient Poisson PDE it's probably better to represent $u$ as a series of Hermite polynomials in an underlying Gaussian variable. The convergence rate will be quite slow, though.
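To see why a Hermite expansion works at all here: writing $\xi = e^{\mu + \sigma Z}$ with $Z$ standard normal, the generating function of the probabilists' Hermite polynomials gives $e^{\mu+\sigma z} = e^{\mu + \sigma^2/2}\sum_{n\ge 0} \frac{\sigma^n}{n!}\mathrm{He}_n(z)$, so the lognormal variable itself has closed-form Hermite coefficients. A sketch with arbitrary illustrative parameters:

```python
import numpy as np
from numpy.polynomial import hermite_e as He  # probabilists' Hermite He_n
from math import factorial

# Assumed (illustrative) parameters, not from the original answer
mu, sigma = 0.0, 0.5
z = np.linspace(-3.0, 3.0, 101)

# Generating-function coefficients:
#   exp(mu + sigma*z) = exp(mu + sigma^2/2) * sum_n (sigma^n / n!) * He_n(z)
P = 14  # truncation order
coef = np.array([np.exp(mu + 0.5 * sigma**2) * sigma**n / factorial(n)
                 for n in range(P + 1)])

approx = He.hermeval(z, coef)   # truncated Hermite series
exact = np.exp(mu + sigma * z)  # the lognormal variable as a function of z
assert np.max(np.abs(approx - exact)) < 1e-8
```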
