People always talk about the CLT for dependent random variables, but among probability graduate students, probably not many know the exact research frontier of conditions under which the CLT is guaranteed to hold. Well, I am not here to push that frontier, nor to pretend I know the exact current status of research. But I would like to illustrate the power of simple reasoning with a simple scenario, well understood sixty-some years ago, but nonetheless hardly taught in classrooms.

Consider a sequence of $m$-dependent identically distributed random variables $X_1, X_2, \ldots$, where $m$ is some nonnegative integer. This simply means $X_i$ and $X_j$ are independent unless $|i - j| \le m$. We want to show that the sum of the $X_i$'s, properly normalized, converges to the normal distribution. When $m = 0$, the proposition specializes to the classical CLT.
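To make the setup concrete, here is a small simulation (my own illustration, not from the post): a standard way to manufacture an $m$-dependent sequence is a moving average of i.i.d. noise, $X_i = Z_i + Z_{i+1} + \cdots + Z_{i+m}$, so that $X_i$ and $X_j$ share no $Z$'s, and hence are independent, whenever $|i - j| > m$.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 2          # dependence range
n = 100_000    # sequence length

# X_i = Z_i + Z_{i+1} + ... + Z_{i+m}: a moving-average construction
# that produces an m-dependent, identically distributed sequence.
Z = rng.standard_normal(n + m)
X = np.convolve(Z, np.ones(m + 1), mode="valid")  # length n

# Nearby terms are correlated ...
r_near = np.corrcoef(X[:-1], X[1:])[0, 1]
# ... but terms more than m apart share no Z's, so they are independent.
r_far = np.corrcoef(X[:-(m + 1)], X[m + 1:])[0, 1]
print(f"corr at lag 1: {r_near:.3f}, corr at lag {m + 1}: {r_far:.3f}")
```

For equal weights the lag-1 correlation is $2/3$ in theory, while the lag-$(m+1)$ correlation is zero; the empirical values reflect this.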

I will present the sketch of the proof as a recipe, or a series of instructions.

The idea is to first show it for $m = 1$. Let $b = b(n)$ be a function that grows with $n$. Lump the $X_i$'s into adjacent groups of $b$ terms, each followed by skipping one term. So to be precise, let $Y_1 = X_1 + \cdots + X_b$, $Y_2 = X_{b+2} + \cdots + X_{2b+1}$, …, $Y_j = X_{(j-1)(b+1)+1} + \cdots + X_{j(b+1)-1}$. Doing this has the advantage that the $Y_j$'s are now independent and identically distributed, because we skip one term between two consecutive $Y_j$'s. If we choose $b$ so that the number of groups, roughly $n/(b+1)$, also goes to infinity, such as $b = \sqrt{n}$, then the Lindeberg CLT works on the $Y_j$'s, viewed as a triangular array; this is one reason why the Lindeberg CLT is sometimes more useful than the classical CLT statement. Now we have to worry about the $X_i$'s that are skipped. Well, since $b$ grows to infinity, the contribution of the skipped terms will be negligible: their proportion is less than $1$ out of every $b + 1$, so their sum gets killed by the usual $\sqrt{n}$ normalization constant in the CLT. Thus the naive choice of a constant $b$ doesn't quite work; $b$ must grow to infinity, while $n/b$ grows to infinity as well. This completes the sketch for $m = 1$. To do the proof for general $m$, one simply lumps the $X_i$'s together into groups of $m$ consecutive terms, which makes them $1$-dependent.
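The grouping step above can be sketched in code as follows; `block_sums` is a hypothetical helper of my own for illustration, not anything from the post. It splits the sequence into blocks of $b$ consecutive terms with one skipped term between blocks, so that for a $1$-dependent sequence the block sums are independent.

```python
import numpy as np

def block_sums(X, b):
    """Split X into blocks of b consecutive terms, skipping one term
    between blocks (the blocking scheme for m = 1).
    Returns (Y, skipped): the block sums and the skipped terms."""
    n = len(X)
    Y, skipped = [], []
    i = 0
    while i + b <= n:
        Y.append(X[i:i + b].sum())    # Y_j = X_{i+1} + ... + X_{i+b}
        if i + b < n:
            skipped.append(X[i + b])  # one gap term separates blocks
        i += b + 1                    # jump past the block and the gap
    return np.array(Y), np.array(skipped)

rng = np.random.default_rng(1)
n = 10_000
b = int(np.sqrt(n))                  # b -> infinity and n/b -> infinity
X = rng.standard_normal(n)           # stand-in for the dependent sequence
Y, skipped = block_sums(X, b)

# Roughly n/(b+1) blocks, with at most one skipped term per block,
# so the skipped terms are a vanishing fraction of the whole sequence.
print(len(Y), len(skipped), len(skipped) / n)
```

With $n = 10{,}000$ and $b = 100$ the skipped terms make up under $1\%$ of the sequence, which is why the $\sqrt{n}$ normalization wipes out their contribution.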

I hope you are convinced at this point that the $m$-dependent CLT works. It essentially still relies on the classical CLT, unlike the martingale CLT, which has to go through the original machinery used in proving the CLT all over again, i.e., characteristic functions etc.


## About aquazorcarson

math PhD at Stanford, studying probability