Alright, I realize I still have to do that MGF post, but I also realized most of my stuff has dealt with continuous distributions. A lot of the proofs are pretty easy (just switch to summation notation), but I figured I'd do a post or two covering the main discrete distributions, starting with the Bernoulli distribution. Its PMF is:
$$
P_{X}(x)=\left\{
\begin{array}{ll}
P^{x}(1-P)^{1-x} & x=0,1\\
0 & \text{otherwise}
\end{array}
\right.
$$
This can be considered a success/failure distribution. That is, there are two mutually exclusive outcomes, success = $1$ and failure = $0$, with a parameter $P$ such that $0\leq P\leq 1$: the probability of success is $P$, and of failure $(1-P)$. To see that, let's say it succeeds:
$$P^{1}(1-P)^{1-1}=P^{1}(1-P)^{0}=P(1)=P$$
Here we use the fact that $(1-P)$ raised to $0$ is $1$. For failure:
$$P^{0}(1-P)^{1-0}=(1)(1-P)=(1-P)$$
This time the fact that $P^{0}$ equals $1$ does the work. So, we've verified that the probability of success is $P$, and failure $(1-P)$. But this is really just the beginning. How do we know that it's a probability function? Well, there are two conditions it must satisfy: each probability has to be nonnegative, and the probabilities have to sum to $1$. Since $0\leq P$, we know that the probability of success is greater than or equal to zero. Furthermore, since $P\leq1$ we know that $(1-P)$ must be greater than or equal to $0$, so both outcomes have a probability greater than or equal to $0$. Now, since the outcomes are mutually exclusive and discrete, their total probability is equal to:
$$\sum_{x}P_{X}(x)$$
Which, in this case is:
$$\sum_{x=0}^{1}P_{X}(x)=P_{X}(0)+P_{X}(1)=(1-P)+P=1$$
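Just to make that concrete, here's a minimal Python sketch of the PMF and both conditions. The function name bernoulli_pmf and the value $P=0.3$ are my own arbitrary choices, not anything standard:
```python
def bernoulli_pmf(x, p):
    """Bernoulli(p) PMF: p^x * (1 - p)^(1 - x) for x in {0, 1}, and 0 otherwise."""
    if x in (0, 1):
        return p**x * (1 - p)**(1 - x)
    return 0.0  # zero everywhere else, per the piecewise definition

P = 0.3  # any parameter with 0 <= P <= 1 would do

# Condition 1: both outcomes have nonnegative probability.
assert bernoulli_pmf(0, P) >= 0 and bernoulli_pmf(1, P) >= 0

# Condition 2: the probabilities sum to 1.
assert abs(bernoulli_pmf(0, P) + bernoulli_pmf(1, P) - 1) < 1e-12
```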
So the distribution checks out as a PMF. What about its CDF? That's defined as:
$$F(x)=P[X\leq x]=\sum_{x_{i}\leq x}P_{X}(x_{i})$$
Where:
$$\lim_{x\rightarrow-\infty}F(x)=0$$
And,
$$\lim_{x\rightarrow\infty}F(x)=1$$
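For the Bernoulli distribution specifically, summing the PMF piece by piece makes $F(x)$ a step function, and it's easy to check that both limits hold:
$$
F(x)=\left\{
\begin{array}{ll}
0 & x<0\\
1-P & 0\leq x<1\\
1 & x\geq 1
\end{array}
\right.
$$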
Also worth covering are the expectation and the variance. For discrete random variables the expected value is defined as:
$$\sum_{i=1}^{\infty}x_{i}P_{i}$$
For this case, it's:
$$\sum_{x=0}^{1}x P^{x}(1-P)^{1-x}=(0) P^{0}(1-P)^{1}+(1)P^{1}(1-P)^{0}=P$$
And the variance, defined as:
$$\sum_{i=1}^{\infty}(x_{i}-\mu)^{2}P_{i}$$
Which for the Bernoulli distribution is:
$$\sum_{x=0}^{1}(x-P)^{2}P^{x}(1-P)^{1-x}=(0-P)^{2}P^{0}(1-P)^{1}+(1-P)^{2}P^{1}(1-P)^{0}=P^{2}(1-P)+(1-P)^{2}P$$
Where we can pull out a $P$ and $(1-P)$ to get:
$$P[P+(1-P)](1-P)=P(1)(1-P)=P(1-P)$$
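If you want to sanity-check both results numerically, here's a quick simulation sketch. The parameter value and sample size are arbitrary picks of mine, and the sample statistics will wobble a bit around the exact values:
```python
import random

P = 0.3       # arbitrary choice of parameter
N = 100_000   # arbitrary sample size

# Draw N Bernoulli(P) samples: 1 with probability P, 0 otherwise.
samples = [1 if random.random() < P else 0 for _ in range(N)]

mean = sum(samples) / N
variance = sum((x - mean) ** 2 for x in samples) / N

print(f"sample mean     = {mean:.4f}   (exact P       = {P:.4f})")
print(f"sample variance = {variance:.4f}   (exact P(1-P) = {P * (1 - P):.4f})")
```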
That was a lot of the basic stuff for the Bernoulli distribution. I'll cover the binomial one next since it's so closely related.