
It is well known that convergence in distribution does not necessarily imply convergence of expectations, but it does imply convergence of the expectations of bounded continuous functions.

Let $\{X_n\}$ be a sequence of random variables that converges in distribution to $X$. I would like to ask for two examples:

  1. An example in which $\mathbb{E}[X_n]$ does not converge to $\mathbb{E}[X]$.
  2. An example in which $\mathbb{E}[X_n^k]$ does not converge to $\mathbb{E}[X^k]$ for every $k = 1, 2,\ldots$. That is, the sequence fails to converge in all moments, not just in one or a few fixed moments. Assume that all the moments exist.

1 Answer


Let $P(X_{n} = n) = \frac{1}{n}$ and $P(X_{n} = 0) = 1 - \frac{1}{n}$. Then $X_{n}$ converges in distribution to $X=0$, but $\mathbb{E}(X_{n}^{k}) = \frac{1}{n} n^{k} + 0 = n^{k-1} \not\xrightarrow{n\to\infty} 0$ for each $k \in \mathbb{N}$.
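If it helps to see this numerically, here is a minimal NumPy sketch (my own illustration; the helper name `sample_discrete_Xn` is purely for exposition). The empirical $k$-th moments should track $n^{k-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_discrete_Xn(n, size):
    """Draw samples of X_n with P(X_n = n) = 1/n and P(X_n = 0) = 1 - 1/n."""
    return np.where(rng.random(size) < 1.0 / n, float(n), 0.0)

for n in (10, 100, 1000):
    xs = sample_discrete_Xn(n, size=1_000_000)
    # Theory: E[X_n^k] = n^(k-1), so only k = 1 stays bounded (at 1 != 0).
    print(n, [float(np.mean(xs ** k)) for k in (1, 2, 3)])
```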

UPDATE: If you prefer continuous random variables on $\mathbb{R}$, you can "smooth out" the previous example:

Let $X \sim N(0,1)$ and $X_{n}$ have the probability density $$ \rho_{X_{n}}(x) = \frac{1}{n} \cdot \frac{1}{\sqrt{2\pi}}\, \exp(-(x-n)^2/2) + \left(1-\frac{1}{n}\right) \cdot \frac{1}{\sqrt{2\pi}}\, \exp(-x^2/2). $$ Then $X_{n} \stackrel{\mathrm{d}}{\to} X$.

Now denote the moments of $X$ by $A_{k} := \mathbb{E}(X^{k})$ and, for $m \geq 0$, let $A_{k,m} := \mathbb{E}((X+m)^{k})$. Expanding $(X+m)^{k}$ binomially and using that the odd moments of $X$ vanish while the even ones are nonnegative gives $A_{k,m} \geq A_{k} + m^{k}$. It follows for each $k\in\mathbb{N}$: \begin{align*} \mathbb{E}(X_{n}^{k}) &= \int_{\mathbb{R}} x^{k} \rho_{X_{n}}(x)\, \mathrm dx \\ &= \frac{1}{n} A_{k,n} + \left(1-\frac{1}{n}\right) A_{k} \\ &\geq \frac{1}{n} (A_{k}+n^{k}) + \left(1-\frac{1}{n}\right) A_{k} \\ &= n^{k-1} + A_{k} \\ &\not\xrightarrow{n\to\infty} A_{k}. \end{align*}
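The same kind of numerical check works for the mixture example (again a sketch of my own, with the hypothetical helper `sample_mixture_Xn`); the empirical moments grow with $n$ as the lower bound $n^{k-1} + A_k$ predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mixture_Xn(n, size):
    """Draw samples from the mixture (1/n) * N(n, 1) + (1 - 1/n) * N(0, 1)."""
    shifted = rng.random(size) < 1.0 / n   # pick the N(n, 1) component with probability 1/n
    return rng.standard_normal(size) + np.where(shifted, float(n), 0.0)

for n in (10, 100, 1000):
    xs = sample_mixture_Xn(n, size=1_000_000)
    # Theory: E[X_n^k] >= n^(k-1) + A_k, so the moments blow up with n
    # (except k = 1, which stays near 1 and still misses A_1 = 0).
    print(n, [float(np.mean(xs ** k)) for k in (1, 2, 3)])
```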

  • I have edited the question. I would like to ask for an example that fails to converge in all the moments, not just in one or a few fixed moments.
    – null
    Feb 28 at 8:47
  • I modified my answer accordingly. Does this work for you? All the moments exist, but they are no longer bounded as $n\to\infty$. (Previously, for each $k$ separately, the moments converged to $1 \neq 0$; now they diverge except for $k=1$.) Feb 28 at 8:51
  • Yes, the example works, thank you. May I additionally ask: what if the distributions are continuous rather than discrete? Is there an example for this?
    – null
    Feb 28 at 8:52
  • I made an update that covers continuous random variables. Feb 28 at 10:23
