Alright, now we take the derivative we got last time and differentiate it again. Last time we had:
$$\frac{\partial M_{x}(t)}{\partial t}=r\left( \frac{Pe^{t}}{1-(1-P)e^{t}}\right)^{r-1}\left(\frac{Pe^{t}}{1-(1-P)e^{t}}+\frac{P(1-P)e^{2t}}{(1-(1-P)e^{t})^{2}}\right)$$
The second derivative of this is a mess (yeah, I'm not typing that up at all; of course I used Wolfram).
Setting $t=0$ in the second derivative gives $E[X^2]$:
$$\frac{-r(P-1)+r^{2}}{P^2}$$
Now, using the fact that $\sigma^2=E[X^2]-\mu^2$, with $\mu=r/P$ from last time, we have:
$$\sigma^2=\frac{-r(P-1)+r^{2}}{P^2}-\frac{r^2}{P^2}=\frac{-r(P-1)}{P^2}=\frac{r(1-P)}{P^2}$$
Now that we have the variance of the negative binomial, we can use it to find the variance of the geometric, which is just the special case $r=1$. That is simply:
$$\frac{1-P}{P^2}$$
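Both formulas are easy to check by simulation: draw a bunch of negative binomial variates (count Bernoulli trials until the $r$-th success) and compare the sample variance to $r(1-P)/P^2$. A minimal sketch, with my own helper name and arbitrary test values $r=3$, $P=0.4$:

```python
import random
import statistics

def sample_negbin(r, P, rng):
    # count the Bernoulli(P) trials needed to collect r successes
    trials, successes = 0, 0
    while successes < r:
        trials += 1
        if rng.random() < P:
            successes += 1
    return trials

rng = random.Random(42)
r, P = 3, 0.4
draws = [sample_negbin(r, P, rng) for _ in range(200_000)]
print(statistics.pvariance(draws))   # should be close to r(1-P)/P^2 = 11.25

# geometric = negative binomial with r = 1; variance (1-P)/P^2 = 3.75
geo = [sample_negbin(1, P, rng) for _ in range(200_000)]
print(statistics.pvariance(geo))
```

With 200,000 draws the sample variances land within a few percent of the closed forms; note they would come out negative-impossible if the sign in the formula were flipped, which is a nice way to catch that kind of algebra slip.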
Work is finallyyyyy done. Now moving on to the next discrete distribution: the hypergeometric. After that, only two of the most useful discrete distributions remain: the Poisson, and the uniform (trivial, but I'll make a post about it anyway).