Alright, a quick post about MGFs and the Bernoulli and Binomial distributions. Since the Bernoulli distribution is simply the Binomial distribution for the special case of $n=1$, the proof for the Binomial distribution automatically includes a proof for the Bernoulli distribution. I already used the MGF to find $E[x^2]$, but I'll restate it here. So, the MGF for a discrete random variable is:
$$M_{x}(t)=E[e^{tx}]=\sum_{x=0}^{n}e^{tx}P_{x}(x)$$
Which for the Binomial distribution would be:
$$\sum_{x=0}^{n}e^{tx}\binom{n}{x}P^{x}(1-P)^{n-x}$$
Collecting $e^{tx}$ and $P^{x}$ into a single power of $x$ gives:
$$\sum_{x=0}^{n}\binom{n}{x}(Pe^{t})^{x}(1-P)^{n-x}$$
Now, because of the Binomial theorem we know that:
$$\sum_{x=0}^{n}\binom{n}{x}(Pe^{t})^{x}(1-P)^{n-x}=(Pe^{t}+1-P)^{n}$$
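As a quick numerical sanity check of that identity, we can compare the term-by-term sum against the closed form for some arbitrary values of $n$, $P$, and $t$ (the values below are just hypothetical examples):

```python
from math import comb, exp, isclose

def mgf_sum(n, p, t):
    """Binomial MGF computed term by term from the definition."""
    return sum(comb(n, x) * (p * exp(t)) ** x * (1 - p) ** (n - x)
               for x in range(n + 1))

def mgf_closed(n, p, t):
    """Closed form given by the Binomial theorem."""
    return (p * exp(t) + 1 - p) ** n

# The two agree for any choice of n, p, t (hypothetical example values):
print(isclose(mgf_sum(10, 0.3, 0.5), mgf_closed(10, 0.3, 0.5)))  # True
```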
This is a much more manageable form. Now we can take derivatives of this and get whatever moment we want. Let's use the general case of $n$, but then check it for $n=1$ (i.e. the Bernoulli distribution).
Well, (setting $(Pe^{t}+1-P)^{n}=M_{x}(t)$) the first derivative is:
$$\frac{d(M_{x})}{dt}=e^{t}Pn(e^{t}P+1-P)^{n-1}$$
By using the chain rule. Evaluating this derivative at $t=0$ gives the first moment, i.e. the mean. Let's check it:
$$e^{0}Pn(e^{0}P+1-P)^{n-1}=nP$$
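We can check this derivative numerically by approximating $M_{x}'(0)$ with a central finite difference on the closed-form MGF; it should come out to $nP$ (the example values of $n$ and $P$ below are hypothetical):

```python
from math import exp, isclose

def mgf(n, p, t):
    # Closed-form Binomial MGF derived above
    return (p * exp(t) + 1 - p) ** n

def mean_from_mgf(n, p, h=1e-6):
    # First derivative of the MGF at t=0, via a central difference
    return (mgf(n, p, h) - mgf(n, p, -h)) / (2 * h)

# Hypothetical example: n=10, P=0.3, so the mean should be nP = 3
print(isclose(mean_from_mgf(10, 0.3), 3.0, rel_tol=1e-6))  # True
```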
Well, for the Bernoulli distribution we showed that the mean is $P$, and for the Binomial that it is $nP$, which is perfect: we have the Binomial mean, and setting $n=1$ recovers the Bernoulli mean. Now for the second moment and the variance. To get the variance we can use the fact that $\sigma^{2}=E[x^2]-\mu^2$. We know $\mu$, but we still need $E[x^2]$, the second moment. Taking the second derivative gives:
$$\frac{d^{2}(M_{x})}{dt^{2}}=e^{t}Pn(e^{t}P+1-P)^{n-1}+e^{t}Pn\left(e^{t}P(n-1)(e^{t}P+1-P)^{n-2}\right)$$
By use of the product rule. Setting $t=0$, taking note of the fact that $(P+1-P)=1$, and then simplifying gives us:
$$Pn+(P)^{2}n(n-1)=Pn+(Pn)^{2}-(P)^{2}n$$
Now, plugging this back into the variance formula, as well as the mean, gives us:
$$\sigma^{2}=Pn+(Pn)^{2}-(P)^{2}n-(Pn)^{2}=Pn-(P)^{2}n=Pn(1-P)$$
Which is the variance of the Binomial distribution. Setting $n=1$ gives us $P(1-P)$, which is exactly the variance for the Bernoulli distribution.
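As a final sanity check, we can compute the mean and variance directly from the Binomial pmf and confirm they match $nP$ and $nP(1-P)$, including the Bernoulli case $n=1$ (the parameter values below are hypothetical examples):

```python
from math import comb, isclose

def pmf(n, p, x):
    """Binomial probability mass function."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

def variance(n, p):
    """Var = E[x^2] - mu^2, computed directly from the pmf."""
    mean = sum(x * pmf(n, p, x) for x in range(n + 1))
    ex2 = sum(x**2 * pmf(n, p, x) for x in range(n + 1))
    return ex2 - mean**2

# Hypothetical example values:
print(isclose(variance(10, 0.3), 10 * 0.3 * 0.7))  # True: nP(1-P)
print(isclose(variance(1, 0.4), 0.4 * 0.6))        # True: Bernoulli P(1-P)
```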