Let $\mu, \mu_1, \mu_2, \dots$ be random measures on a Polish space (separable completely metrizable topological space) $(S, {\mathcal S})$. Suppose I know that

$$\int f d \mu_n \to \int f d\mu$$

in probability for each bounded continuous real-valued function. This would be the definition of weak convergence $\mu_n \Rightarrow \mu$, if I dropped "in probability" and the measures $\mu_n$ were deterministic.

Is there any standard way to extract a weakly convergent subsequence from $(\mu_n)$ consisting of almost all members of $(\mu_n)$? Possibly with additional assumptions? Where can I learn about such things?

Sorry if this is a trivial question.

  • I am not really sure what you mean by "consisting of almost all members", can you be more explicit? You can certainly say that there is a subsequence $(\mu_{n_k})$ which converges weakly almost surely; this is a standard fact for real-valued random variables and the proof works for random variables taking values in any metrizable topological space, such as the weak topology on a bounded set of measures on a Polish space. Apr 30, 2015 at 15:38
  • By "consisting of almost all members" I mean: is there, almost surely, a random set $A$ such that $n^{-1} |A \cap \{1, \ldots, n\}| \to 0$ and $\mu_n \Rightarrow \mu$ along $n \notin A$ (or something in this direction)? Also I am interested in subsequences that converge to the $\mu$ given in the assumption above.
    – Valentas
    Apr 30, 2015 at 15:54
  • Is it even true for real-valued random variables $X_n$ that if $X_n \to X$ in probability then you can find an a.s. convergent subsequence which consists of almost all members in your sense? Is it true for the standard "typewriter sequence" counterexample? Apr 30, 2015 at 15:58
  • @NateEldredge, what is the "standard typewriter sequence"?
    – Michael
    May 31, 2015 at 0:01
  • @Michael: See Example 4 here. May 31, 2015 at 0:02

1 Answer


The question becomes interesting if you let the random index set depend on the realizations. For simplicity, restrict attention to random sequences $\{X_1, X_2, X_3, \ldots\}$ that converge to 0 in probability, but not with probability 1. Say that a set $B$ contains almost all positive integers if: $$ \lim_{n\rightarrow\infty} \frac{1}{n}\left| \{1, \ldots, n\} \cap B\right| = 1 $$
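As a quick sanity check on this density condition, here is a small Python sketch (the helper name `density` is mine, not from the thread):

```python
# Check whether a set B contains "almost all" positive integers in the
# sense above, by computing |{1,...,n} ∩ B| / n at a finite cutoff n.

def density(B, n):
    """Fraction of {1, ..., n} that lies in the set B."""
    return sum(1 for i in range(1, n + 1) if i in B) / n

# Non-multiples of 100: density 0.99 at every cutoff divisible by 100,
# so this set contains "almost all" positive integers.
not_mult_100 = {i for i in range(1, 100001) if i % 100 != 0}
print(density(not_mult_100, 100000))  # 0.99

# Perfect squares: density roughly 1/sqrt(n), tending to 0.
squares = {i * i for i in range(1, 1001)}
print(density(squares, 100000))  # 0.00316
```

Of course, a finite cutoff can only suggest the limiting behavior; the definition itself is about the limit $n \rightarrow \infty$.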

Claim 1: If $\{X_i\}_{i=1}^{\infty}$ are mutually independent, then with probability 1 there exists a random set $B$ (possibly dependent on $\{X_i\}$) that contains almost all positive integers and such that $X_i$ converges to $0$ over $i \in B$.

Claim 2: There exist examples where $\{X_i\}_{i=1}^{\infty}$ are mutually independent, but for which there exists no deterministic set $B$ that contains almost all positive integers and is such that $X_i$ converges to 0 with probability 1 over $i \in B$.

Claim 3: There are examples where no (possibly random) set $B$ with the desired properties exists. (In view of Claim 1, all such examples must have dependencies between the $X_i$ variables.)


Proof of Claim 1: Suppose $\{X_i\}_{i=1}^{\infty}$ are mutually independent and converge to 0 in probability. Then for all $\epsilon>0$ we have $Pr[|X_i|>\epsilon]\rightarrow 0$. It follows that there is a deterministic sequence of positive numbers $\{\epsilon_1, \epsilon_2, \epsilon_3, \ldots\}$ such that the following two things hold: \begin{align} &\lim_{i\rightarrow\infty} \epsilon_i = 0 \\ &\lim_{i\rightarrow\infty} Pr[|X_i| > \epsilon_i] = 0 \end{align} (As a quick explanation of why: Start with $\epsilon_i=1$ for all $i$. Find an index $n_2>1$ such that $Pr[|X_i|>1/2] \leq 1/2$ for all $i \geq n_2$, and redefine $\epsilon_i=1/2$ for all $i \geq n_2$. Then find an index $n_3>n_2$ such that $Pr[|X_i|>1/3] \leq 1/3$ for all $i \geq n_3$, and redefine $\epsilon_i = 1/3$ for all $i \geq n_3$, and so on. The resulting sequence equals $1/m$ on $\{n_m, \ldots, n_{m+1}-1\}$, so both limits hold.)

Now define the random set $B$ as follows: include a positive integer $i$ in $B$ if and only if $|X_i|\leq \epsilon_i$. If $B$ contains infinitely many positive integers, then $X_i$ converges to $0$ over $i \in B$, since $|X_i| \leq \epsilon_i \rightarrow 0$ along $B$. It remains to show that $B$ contains almost all positive integers (which in particular forces $B$ to be infinite).
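To make the construction concrete, here is a simulation sketch for the specific example used in Claim 2 below ($X_i = 1$ with probability $1/i$, independent); the choice $\epsilon_i = i^{-1/4}$ is mine, and any sequence in $(0,1)$ tending to 0 would do here since $Pr[X_i > \epsilon] = 1/i$ for every $\epsilon \in (0,1)$:

```python
import random

random.seed(0)  # reproducible sketch

# Claim 1's construction on a concrete example: independent X_i with
# X_i = 1 w.p. 1/i, else 0 (converges to 0 in probability, not a.s.).
# Pick eps_i -> 0 with Pr[|X_i| > eps_i] -> 0; the (hypothetical)
# choice eps_i = i**(-1/4) works for this example.
N = 100_000
X = [1.0 if random.random() < 1 / i else 0.0 for i in range(1, N + 1)]
eps = [i ** -0.25 for i in range(1, N + 1)]

# B = {i : |X_i| <= eps_i}; along B we have |X_i| <= eps_i -> 0.
B = [i for i in range(1, N + 1) if abs(X[i - 1]) <= eps[i - 1]]

# B should contain almost all integers: only about ln(N) ≈ 11.5 indices
# are excluded on average, so the empirical density is very close to 1.
print(len(B) / N)
```

The proof below shows that this density actually converges to 1 with probability 1, not just on average.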

Define $I_i$ as an indicator function that is $1$ if $\{|X_i|>\epsilon_i\}$, and $0$ else. Define $N(k) = \sum_{i=1}^k I_i$ as the number of integers in $\{1, \ldots, k\}$ that are not in the set $B$. Notice that $|I_i-E[I_i]|\leq 1$ for all $i$, so: \begin{align} &E[(I_i-E[I_i])^2] \leq 1\\ &E[(I_i-E[I_i])^4]\leq 1 \end{align}

Define $S(k) = \sum_{i=1}^k E[I_i]$. Then for all $\delta>0$:
\begin{align} Pr\left[\left|\frac{N(k)-S(k)}{k}\right| \geq \delta\right] &=Pr\left[ |N(k)-S(k)| \geq \delta k \right]\\ &= Pr[(N(k)-S(k))^4 \geq \delta^4 k^4] \\ &= Pr\left[ \left(\sum_{i=1}^k(I_i-E[I_i])\right)^4 \geq \delta^4 k^4 \right]\\ &\leq \frac{E\left[\left( \sum_{i=1}^k(I_i-E[I_i]) \right)^4 \right]}{\delta^4 k^4} \end{align} where the final inequality is the Markov inequality. Because $\{I_i-E[I_i]\}_{i=1}^{\infty}$ are mutually independent and zero mean with second and fourth moments bounded by 1, there is a number $D>0$ such that: $$ E\left[\left( \sum_{i=1}^k(I_i-E[I_i]) \right)^4\right] \leq Dk + Dk^2 $$ Hence: $$ Pr\left[\left|\frac{N(k)-S(k)}{k}\right| \geq \delta\right] \leq \frac{Dk + Dk^2}{\delta^4 k^4} $$ The right-hand side is summable over $k$, so by the Borel-Cantelli lemma, $|N(k)-S(k)|/k \geq \delta$ for only finitely many $k$ (with probability 1). Applying this with $\delta = 1/m$ for each positive integer $m$ gives, with probability 1: $$ \lim_{k\rightarrow\infty} \frac{N(k)-S(k)}{k} = 0 $$ However, $\frac{S(k)}{k} = \frac{\sum_{i=1}^k Pr[|X_i|>\epsilon_i]}{k} \rightarrow 0$, since it is the Cesàro average of terms that converge to 0. It follows that with probability 1: $$ \lim_{k\rightarrow\infty} \frac{N(k)}{k} = 0 $$ and so (with probability 1) the random set $B$ contains almost all positive integers. $\Box$
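The constant $D$ in the fourth-moment bound can be taken to be 3: expanding $\left(\sum_i Z_i\right)^4$ for independent zero-mean $Z_i$, every term with an unmatched index vanishes, leaving $\sum_i E[Z_i^4] + 3\sum_{i\neq j} E[Z_i^2]E[Z_j^2] \leq k + 3k^2$. A sketch that checks this numerically for Bernoulli indicators (the probabilities $p_i = 1/i$ are my choice, for illustration):

```python
# Fourth moment of a sum of independent, zero-mean Z_i = I_i - E[I_i],
# where I_i is Bernoulli(p_i):
#   E[(sum Z_i)^4] = sum_i E[Z_i^4] + 3 * sum_{i != j} E[Z_i^2] E[Z_j^2]
# Each moment is at most 1, so the total is at most k + 3k^2 (D = 3).

def fourth_moment_of_sum(ps):
    m2 = [p * (1 - p) for p in ps]                           # E[Z_i^2]
    m4 = [p * (1 - p) ** 4 + (1 - p) * p ** 4 for p in ps]   # E[Z_i^4]
    s2 = sum(m2)
    # sum over ordered pairs i != j of m2_i * m2_j:
    cross = s2 ** 2 - sum(x * x for x in m2)
    return sum(m4) + 3 * cross

k = 1000
val = fourth_moment_of_sum([1 / i for i in range(1, k + 1)])
print(val, val <= k + 3 * k ** 2)  # the bound D*k + D*k^2 holds with D = 3
```

The exact expansion is used only to establish growth of order $k^2$; any finite $D$ suffices for the summability argument above.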


Example for Claim 2: Define $\{X_1, X_2, X_3, \ldots\}$ to be mutually independent with: $$ X_i =\left\{ \begin{array}{ll} 1 &\mbox{ with probability $1/i$} \\ 0 & \mbox{ otherwise} \end{array}\right. $$ This sequence is well known to converge to 0 in probability, but not with probability 1. Suppose there is a deterministic set $B$ that contains almost all positive integers and for which $X_i$ converges to $0$ over $i \in B$ (we will derive a contradiction).

For each positive integer $i$, define $g(i)$ as the number of elements in $\{i, i+1, \ldots, 2i\}$ that are not in $B$. Since $B$ contains almost all positive integers, it is not difficult to show that: $$ \lim_{i\rightarrow\infty} \frac{g(i)}{i} = 0 $$ Now for each positive integer $i$, define $\theta_i$ as the probability that there is at least one index $j \in \{i, i+1, \ldots, 2i\} \cap B$ for which $X_j=1$. Since each of the $g(i)$ excluded factors satisfies $1-1/j \geq 1-1/i$, and since $\prod_{j=i}^{2i}\left(1-\frac{1}{j}\right) = \prod_{j=i}^{2i}\frac{j-1}{j} = \frac{i-1}{2i}$ by telescoping: \begin{align} \theta_i &= 1 - \prod_{j \in \{i, \ldots, 2i\} \cap B}\left(1-\frac{1}{j}\right)\\ &\geq 1 - \frac{\prod_{j=i}^{2i}\left(1-\frac{1}{j}\right)}{(1-1/i)^{g(i)}}\\ &= 1 - \frac{\left(\frac{i-1}{2i}\right)}{(1-1/i)^{g(i)}} \end{align} However, since $g(i)/i\rightarrow 0$ we have: $$ (1-1/i)^{g(i)} = \left((1-1/i)^{i}\right)^{g(i)/i} \rightarrow (1/e)^{0} = 1 $$ and so: $$ \liminf_{i\rightarrow\infty} \theta_i \geq 1/2 $$ It follows that $\theta_i \geq 1/4$ for all sufficiently large positive integers $i$. Hence, every sufficiently large positive integer $i$ has the property that, with probability at least $1/4$, there is an index $j \geq i$ such that $j\in B$ and $X_j=1$. But if $X_i$ converged to 0 with probability 1 over $i \in B$, this probability would have to vanish as $i\rightarrow\infty$, a contradiction. $\Box$
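The telescoping identity and the limit $\theta_i \rightarrow 1/2$ (in the case where no indices are excluded, i.e. $g(i)=0$) can be checked numerically; a small sketch:

```python
# theta_i = 1 - prod_{j=i}^{2i} (1 - 1/j): the probability that X_j = 1
# for some j in {i, ..., 2i}, when no indices are excluded (g(i) = 0).
# By telescoping, prod_{j=i}^{2i} (j-1)/j = (i-1)/(2i), so theta_i -> 1/2.

def theta(i):
    prod = 1.0
    for j in range(i, 2 * i + 1):
        prod *= 1 - 1 / j
    return 1 - prod

for i in (10, 100, 1000):
    closed_form = 1 - (i - 1) / (2 * i)  # telescoping product
    print(i, theta(i), closed_form)
```

So each block $\{i, \ldots, 2i\}$ contains a 1 with probability tending to $1/2$, which is what rules out almost-sure convergence along any deterministic density-1 set.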


Example for Claim 3: Consider the same $\{X_1, X_2, X_3, \ldots\}$ sequence from Claim 2, but now form a new (dependent) sequence $\{Y_1, Y_2, Y_3, \ldots\}$ by: $$ \{X_1, X_1, \: \: X_2, X_2, X_2, X_2, \: \: X_3, X_3, X_3, X_3, X_3, X_3, X_3, X_3, \ldots\} $$ Specifically, the $Y_i$ elements are filled in over frames, where each frame $k \in \{1, 2, 3, \ldots\}$ consists of $2^k$ consecutive copies of the value $X_k$. It is clear that $Y_i$ converges to $0$ in probability (since $X_k$ converges to $0$ in probability and the frame index $k$ tends to infinity with $i$).

Now take any (potentially random) set $B$ that contains almost all positive integers (the set $B$ is allowed to depend on the $\{X_i\}$ realizations). Frame $k$ has length $2^k$ and ends at index $2^{k+1}-2$, so it occupies more than half of $\{1, \ldots, 2^{k+1}-2\}$; hence, for all sufficiently large $k$, the set $B$ must contain at least half of the indices in frame $k$ (otherwise the density of $B$ up to the end of frame $k$ would fall below $3/4$ infinitely often). But, with probability 1, $X_k=1$ for infinitely many $k$ (by the second Borel-Cantelli lemma, since the $X_k$ are independent and $\sum_k 1/k = \infty$). It follows that, with probability 1, $Y_i=1$ for infinitely many $i \in B$. So, with probability 1, $Y_i$ does not converge to 0 over $i \in B$.
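The geometric fact driving this example is that each frame takes up more than half of all indices seen so far; a sketch of the frame arithmetic (the 1-based indexing convention is mine):

```python
# Frame k of the Y-sequence holds 2^k copies of X_k. The frames tile the
# positive integers: frame k spans indices 2^k - 1 through 2^(k+1) - 2
# (1-based), so its length 2^k exceeds half of 2^(k+1) - 2. A set that
# contains almost all positive integers must therefore eventually cover
# most of every frame.

def frame_bounds(k):
    """First and last (1-based) index of frame k, which has length 2**k."""
    return 2 ** k - 1, 2 ** (k + 1) - 2

for k in range(1, 11):
    start, end = frame_bounds(k)
    frac = (end - start + 1) / end  # fraction of {1,...,end} inside frame k
    print(k, start, end, frac)      # frac = 2^(k-1) / (2^k - 1) > 1/2
```

Since a single $X_k = 1$ fills an entire frame of $Y$ with 1's, any density-1 set picks up those 1's infinitely often.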

