
Consider a scalar random variable $X$ with an arbitrary probability measure. I'm after a basis of polynomial functions $\{p_k\}_{k=0}^\infty$ which is orthonormal with respect to $X$ in the sense that

\begin{equation} \mathbb{E}_X [p_k(X)p_{k'}(X)] = \delta_{kk'}. \end{equation}

When discussing orthogonal polynomial bases, the measure of integration is usually taken as given. For example, if $X \sim \mathcal{N}(0,1)$, then $\{p_k\}_{k=0}^\infty$ are the (normalized) Hermite polynomials. However, it seems there ought to exist generic expressions for such orthogonal polynomials, with the coefficients given in terms of the moments of $X$. For example, applying the usual Gram-Schmidt procedure, one quickly finds that

\begin{align} p_0(X) &= 1, \\ p_1(X) &= \frac{X - \mathbb{E}[X]}{\sqrt{\text{Var}[X]}}, \\ p_2(X) &= \ \dots \end{align}

Are there known expressions for the rest of this polynomial basis (or even just the next few elements)? In light of the expression for $p_1$, perhaps central moments or cumulants are involved.
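(For concreteness, here is a minimal numerical sketch of this Gram-Schmidt route in numpy; the helper name `orthonormal_poly_coeffs` is just illustrative, and the raw moments are assumed to be known exactly. The key point is that the inner product of monomials is $\langle x^i, x^j \rangle = \mathbb{E}[X^{i+j}] = m_{i+j}$, so everything reduces to the Gram matrix of moments.)

    import numpy as np

    def orthonormal_poly_coeffs(m, n):
        """Coefficient rows (lowest degree first) of p_0, ..., p_n,
        orthonormal w.r.t. a measure with raw moments m[0], ..., m[2n]
        (so m[0] = 1). Uses <x^i, x^j> = E[X^(i+j)] = m[i+j]."""
        # Gram matrix of the monomial basis under the moment inner product
        G = np.array([[m[i + j] for j in range(n + 1)]
                      for i in range(n + 1)], dtype=float)
        basis = []
        for k in range(n + 1):
            v = np.zeros(n + 1)
            v[k] = 1.0                            # start from the monomial x^k
            for p in basis:
                v -= (p @ G @ v) * p              # remove projection onto earlier p_j
            basis.append(v / np.sqrt(v @ G @ v))  # normalize so E[p_k(X)^2] = 1
        return np.array(basis)

    # Sanity check with X ~ N(0,1): raw moments 1, 0, 1, 0, 3, 0, 15.
    # Rows should match the normalized probabilists' Hermite polynomials,
    # e.g. p_2 = (x^2 - 1)/sqrt(2), p_3 = (x^3 - 3x)/sqrt(6).
    print(orthonormal_poly_coeffs([1, 0, 1, 0, 3, 0, 15], 3))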

  • For an arbitrary measure, I see no reason to think this is any simpler than the general Gram-Schmidt procedure: en.wikipedia.org/wiki/Gram–Schmidt_process – Aug 17, 2022 at 0:20
  • Well, there's a three-term relation (written out after these comments), so it's not an entirely generic Gram-Schmidt situation. But I still wouldn't expect a general formula that's much more useful than generic Gram-Schmidt. – Aug 17, 2022 at 0:31
  • I see! Do you know of anywhere the next few terms might be written out? – Aug 17, 2022 at 0:43
  • Actually, you don't even need a measure. One can develop the theory of orthogonal polynomials starting from a scalar product on polynomials of the form $L(pq)$ for a linear form $L$ on polynomials. Check e.g. Akhiezer's book quoted in the answer below. – Aug 17, 2022 at 6:20
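For reference, the three-term relation mentioned in the second comment takes the following standard form for an orthonormal family (with the convention $p_{-1} \equiv 0$):

\begin{equation} x\,p_k(x) = a_{k+1}\,p_{k+1}(x) + b_k\,p_k(x) + a_k\,p_{k-1}(x), \qquad a_k = \mathbb{E}\!\left[X\,p_{k-1}(X)\,p_k(X)\right], \quad b_k = \mathbb{E}\!\left[X\,p_k(X)^2\right], \end{equation}

so each $p_{k+1}$ is determined by the two preceding elements together with a single normalization.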

1 Answer

For example, you can write the orthogonal polynomials as determinants of moment matrices:

$p_n(x) = c_n \, \det \begin{bmatrix} m_0 & m_1 & m_2 &\cdots & m_n \\ m_1 & m_2 & m_3 &\cdots & m_{n+1} \\ \vdots&\vdots&\vdots&\ddots& \vdots \\ m_{n-1} &m_n& m_{n+1} &\cdots &m_{2n-1}\\ 1 & x & x^2 & \cdots & x^n \end{bmatrix}$,

where $c_n$ is a normalization constant and $m_k = \mathbb{E}[X^k]$ is the $k$-th moment.
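For instance, at $n = 2$ (with $m_0 = 1$) the determinant expands along the bottom row to

\begin{equation} p_2(x) = c_2 \left[ (m_2 - m_1^2)\,x^2 - (m_3 - m_1 m_2)\,x + (m_1 m_3 - m_2^2) \right], \end{equation}

and one can check directly that $\mathbb{E}[p_2(X)] = \mathbb{E}[X\,p_2(X)] = 0$. For $X \sim \mathcal{N}(0,1)$, where $m_1 = m_3 = 0$ and $m_2 = 1$, this is proportional to $x^2 - 1$, i.e. the Hermite polynomial $\mathrm{He}_2$, matching the pattern in the question.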

A good book on orthogonal polynomials is Akhiezer, The Classical Moment Problem.

  • Amazing, this is what I was looking for! Is there any easy way to see why these polynomials are orthogonal (e.g. using some rule for the product of two determinants)? – Aug 25, 2022 at 22:53
  • Ah, never mind, I see it: $p_n$ is orthogonal to every monomial in $\{1, \dots, x^{n-1}\}$, because taking the expectation turns the bottom row into a row of moments equal to one of the rows above it, so the determinant vanishes (spelled out below), and that's enough. Sweet trick, thanks! – Aug 26, 2022 at 0:22
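Spelled out: for $0 \le j \le n-1$, the expectation acts only on the bottom row (the determinant is linear in each row), giving

\begin{equation} \mathbb{E}\!\left[X^j p_n(X)\right] = c_n \det \begin{bmatrix} m_0 & m_1 & \cdots & m_n \\ \vdots & \vdots & \ddots & \vdots \\ m_{n-1} & m_n & \cdots & m_{2n-1} \\ m_j & m_{j+1} & \cdots & m_{j+n} \end{bmatrix} = 0, \end{equation}

since the new bottom row $(m_j, m_{j+1}, \dots, m_{j+n})$ coincides with row $j+1$ of the matrix.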
