Wednesday, March 27, 2013

Moment Generating Function.



As promised, the moment generating function. In statistics, the moment generating function saves us from some very complex and tedious calculations, namely expectations like these:

$$E[X],~E[X^2],~E[X^3],...$$

And these:

 $$E[(X-\mu)],~E[(X-\mu)^2],~E[(X-\mu)^3],...$$

The first sequence of functions is known as the moments. $E[X]$ is called the "first" moment, $E[X^2]$ the "second" moment, and so on. As for the second sequence, these are known as the moments about the mean, named the same way: the first moment about the mean, the second, and so forth. Now, how can we solve for these? Well, the first is pretty easy: if $X$ is raised to just the first power, then $E[X]$ is simply $\mu$, the mean of $X$. As for $E[(X-\mu)]$, we know it's zero, since the expectation operator is a linear operator: $E[(X-\mu)]=E[X]-E[\mu]=\mu-\mu=0$. But what about higher moments? Well, for the second we could use the fact that:

  $$\sigma^2=E[X^2]-\mu^2$$

And solve for $E[X^2]$, but that assumes we already know $\sigma^2$ as well as the mean. So how can we do better? This is where the moment generating function comes in. I'll give the definition first, and then try to explain the reasoning. With $f(x)$ denoting the probability density function of $X$, and without much further ado:

$$\mathrm{MGF}\equiv\int_{-\infty}^{\infty}e^{tx}f(x)dx,~t\in\mathbb{R}$$
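To make this definition concrete, here's a quick symbolic check in Python with sympy. The standard normal density is purely my choice of example; nothing about the method depends on it:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

# Standard normal density, chosen here purely as a concrete example
f = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)

# The moment generating function: integrate e^(t*x) * f(x) over the real line
M = sp.simplify(sp.integrate(sp.exp(t * x) * f, (x, -sp.oo, sp.oo)))
print(M)  # the known closed form for the standard normal is exp(t**2/2)
```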

Now, the cool thing about the function $e^{tx}$ is that it has some useful properties that just so happen to be exactly what we need. For instance, its Taylor series:
 $$e^{tx}=1+\frac{tx}{1!}+\frac{t^{2}x^{2}}{2!}+\frac{t^{3}x^{3}}{3!}+...$$

Notice all those beautiful moments? That's exactly what we're looking for. If we can take advantage of that to find the value of whatever moments we need, then we're set. Let's start by plugging it in:

$$\int_{-\infty}^{\infty}\left(1+\frac{tx}{1!}+\frac{t^{2}x^{2}}{2!}+\frac{t^{3}x^{3}}{3!}+...\right)f(x)dx$$

Now, this looks like a bit of a mess as is, but let's start by distributing the function $f(x)$ across the sum:

$$\int_{-\infty}^{\infty}\left(f(x)+\frac{txf(x)}{1!}+\frac{t^{2}x^{2}f(x)}{2!}+...\right)dx$$

Now, remember the integral is a linear operator in the sense that we can do this:

$$\int_{-\infty}^{\infty}f(x)dx+\int_{-\infty}^{\infty}\frac{txf(x)}{1!}dx+\int_{-\infty}^{\infty}\frac{t^{2}x^{2}f(x)}{2!}dx+...$$

The integrals are all taken with respect to $x$, so we can pull any factor that doesn't depend on $x$ out in front. So this becomes:

 $$1+\frac{t}{1!}\int_{-\infty}^{\infty}xf(x)dx+\frac{t^2}{2!}\int_{-\infty}^{\infty}x^{2}f(x)dx+\frac{t^3}{3!}\int_{-\infty}^{\infty}x^{3}f(x)dx+...$$

The first term became one because it was simply the area under the probability density function, which, by definition, equals one. Now, here's the interesting thing: all of those integrals are the different moments. See how this works. $E[X]$ is the first moment, and it happens to sit in the second term. $E[X^2]$, the second moment, sits in the third term. And so on. What about the first term? Well, it's the same as $E[X^0]$, the expected value of the constant $1$, which is obviously one. This is technically the zeroth moment. Alright, now we have all the moments in one long formula. What now? Now we can differentiate with respect to $t$ to pick out the one we want. Calling the full expression above $\phi$ and taking the first derivative gives us:

$$ \frac{d\phi}{dt}=\int_{-\infty}^{\infty}xf(x)dx+\frac{t}{1!}\int_{-\infty}^{\infty}x^{2}f(x)dx+\frac{t^2}{2!}\int_{-\infty}^{\infty}x^{3}f(x)dx+...$$

Well, that just got rid of the constant and shifted the moments down. However, notice the first term: it's the first moment, but there's no $t$ attached to it. What can we do? Since $\phi$ is a function of $t$, we can evaluate it at $t=0$. Every term after the first is then eliminated, and all that's left is the first moment. Here's the incredibly useful part: taking the derivative again gets rid of the first moment. Now the second moment stands alone, unattached to $t$, with all the other moments still multiplied by powers of $t$, and as we keep differentiating, the factorials in the denominators cancel away. What can we conclude? Whatever moment you're looking for is equal to:

$$E[X^n]=\left.\frac{d^{n}\phi}{dt^{n}}\right|_{t=0}$$

So, let's say you want the $n$th moment. Take the $n$th derivative of the moment generating function with respect to $t$, and then evaluate it at $t=0$. This gives us any moment we like, out to infinity.
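As a sanity check on that recipe, here's a small sketch in sympy, again using the standard normal MGF $e^{t^2/2}$ as a stand-in (any MGF would do):

```python
import sympy as sp

t = sp.Symbol('t', real=True)
M = sp.exp(t**2 / 2)  # MGF of the standard normal, used as an example

def nth_moment(n):
    # E[X^n]: differentiate the MGF n times, then evaluate at t = 0
    return sp.diff(M, t, n).subs(t, 0)

print([nth_moment(n) for n in range(5)])  # [1, 0, 1, 0, 3]
```

The pattern $1, 0, 1, 0, 3$ matches the known standard normal moments: the zeroth moment is one, the odd moments vanish by symmetry, $E[X^2]=\sigma^2=1$, and $E[X^4]=3$.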

Next post I'll go into moments about a constant, most notably the mean.


