I believe this will be the last part of my MGF series, but we'll see. This one is about the MGF of a joint probability distribution (multiple variables), or a vector-valued MGF. Originally I was going to use a proof that I put together myself, but it was long and cumbersome compared to a much shorter, more comprehensible one I recently found. First, I'll have to show a very useful result: if $X$ and $Y$ are independent random variables, then $E[XY]=E[X]E[Y]$. To show this, I'll explain a bit about independent events and then move on to that result.
Now, from an earlier post we know about conditional probability, which is written as:
$$P(A|B)=\frac{P(A\cap B)}{P(B)}$$
Rearranged it becomes:
$$P(A|B)P(B)=P(A\cap B)$$
In other words, the probability that $A$ and $B$ both happen is equal to the probability of $A$ given that $B$ has happened, multiplied by the probability of $B$. Now here's where we introduce the idea of independence. Let's say that we know $B$ has happened. What if $B$ happening doesn't affect $A$ at all? In other words, knowing that $B$ happened doesn't change what $A$ could be. Say we flip a coin and get heads, then flip it again. Does the fact that we got heads on the first flip affect what we get on the second? Absolutely not. In mathematical notation, that's saying $P(A|B)=P(A)$. Plugging that into our earlier equation gives:
$$P(A)P(B)=P(A\cap B)$$
So the probability of $A$ and $B$ is the product of their individual probabilities. This is known as the multiplication rule, and it should make a lot of intuitive sense. What are the odds of getting two heads in a row? $(\frac{1}{2})(\frac{1}{2})=\frac{1}{4}$.
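If you want to convince yourself numerically, here's a quick simulation sketch (assuming Python with NumPy is available; the variable names are just mine) that flips two independent coins a million times and counts how often both land heads:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Two independent fair coin flips per trial: 1 = heads, 0 = tails
first = rng.integers(0, 2, size=n)
second = rng.integers(0, 2, size=n)

# Fraction of trials where both flips are heads; should be close to 1/4
print(np.mean((first == 1) & (second == 1)))
```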
We can take the discussion further by talking about PDFs of independent variables. Since a PDF describes how probability is spread over a variable's values, independence carries over naturally. So if we have two independent variables $X_{1}$ and $X_{2}$, the conditional PDF is defined the same way as conditional probability:
$$f_{X_{2}|X_{1}}(X_{2}|X_{1})=\frac{f_{X_{2},X_{1}}(X_{2},X_{1})}{f_{X_{1}}(X_{1})}$$
Which, again, becomes:
$$f_{X_{2}|X_{1}}(X_{2}|X_{1})f_{X_{1}}(X_{1})=f_{X_{2},X_{1}}(X_{2},X_{1})$$
Now, if as before $X_{2}$ does not depend on $X_{1}$, then:
$$f_{X_{2}|X_{1}}(X_{2}|X_{1})=f_{X_2}(X_{2})$$
Which makes our previous equation:
$$f_{X_{2}}(X_{2})f_{X_{1}}(X_{1})=f_{X_{2},X_{1}}(X_{2},X_{1})$$
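For a concrete illustration (a standard example, not anything specific to this series): if $X_{1}$ and $X_{2}$ are independent standard normal variables, the factorization says their joint density is just the product of two standard normal PDFs:
$$f_{X_{2},X_{1}}(x_{2},x_{1})=\frac{1}{2\pi}e^{-(x_{1}^{2}+x_{2}^{2})/2}=\left[\frac{1}{\sqrt{2\pi}}e^{-x_{2}^{2}/2}\right]\left[\frac{1}{\sqrt{2\pi}}e^{-x_{1}^{2}/2}\right]=f_{X_{2}}(x_{2})f_{X_{1}}(x_{1})$$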
In general, then, the joint PDF of two independent variables is equal to the product of the individual PDFs. Now we have enough information to find the result we were originally looking for. Let's start with $E[XY]$. Writing it out with the expectation operator gives:
$$\int\int xy\,f_{X,Y}(x,y)\,dx\,dy$$
But because the variables are independent, the joint PDF factors:
$$\int\int xy\,f_{X}(x)f_{Y}(y)\,dx\,dy$$
Now we can separate the integrals. Since $y$ and $f_{Y}(y)$ don't depend on $x$, we can pull them outside the inner integral over $x$. Written out mathematically:
$$\int\int xy\,f_{X}(x)f_{Y}(y)\,dx\,dy=\int yf_{Y}(y)\left[\int xf_{X}(x)\,dx\right]dy$$
The inner integral is just the mean of $X$, which I'll call $\mu_{X}$. Continuing:
$$\int y\mu_{X}f_{Y}(y)\,dy=\mu_{X}\int yf_{Y}(y)\,dy=\mu_{X}\mu_{Y}=E[X]E[Y]$$
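So for independent variables, $E[XY]=E[X]E[Y]$, which was the result we were after. As a quick sanity check, here's a minimal simulation sketch (assuming Python with NumPy; the exponential and uniform distributions are just an arbitrary illustrative choice) that estimates both sides:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent variables: X ~ Exponential(1) and Y ~ Uniform(0, 1)
x = rng.exponential(scale=1.0, size=n)
y = rng.uniform(0.0, 1.0, size=n)

print(np.mean(x * y))           # estimate of E[XY]
print(np.mean(x) * np.mean(y))  # estimate of E[X]E[Y]
# Both should come out near 1 * 0.5 = 0.5
```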
With that result in hand, we can talk about the MGF. The multiple-variable MGF of $n$ random variables, $X_{1},X_{2},...,X_{n}$, is defined as:
$$E[e^{t_{1}X_{1}+t_{2}X_{2}+...+t_{n}X_{n}}]=E[e^{\sum_{i=1}^{n}t_{i}X_{i}}]$$
Now, we know that $e^{x+y}=e^{x}e^{y}$. Applying that to the sum in the exponent gives us:
$$E[\prod_{i=1}^{n}e^{t_{i}X_{i}}]$$
Now we can finally use the result we proved earlier. Assuming the $X_{i}$ are independent, the terms $e^{t_{i}X_{i}}$ are independent as well, so the expectation of the product is the product of the expectations (the two-variable result extends to any number of independent factors). This gives us:
$$\prod_{i=1}^{n}E[e^{t_{i}X_{i}}]=\prod_{i=1}^{n}M_{X_{i}}(t_{i})$$
where $M_{X_{i}}(t_{i})$ is the MGF of the $i$th variable. Hence, the vector-valued MGF of independent random variables is simply the product of all the individual MGFs. From here we can use what we know about the individual MGFs to find their respective moments.
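To make that concrete with a standard example: if each $X_{i}$ is normal with mean $\mu_{i}$ and variance $\sigma_{i}^{2}$, its individual MGF is $M_{X_{i}}(t_{i})=e^{\mu_{i}t_{i}+\frac{1}{2}\sigma_{i}^{2}t_{i}^{2}}$, so for independent normals the joint MGF is just
$$\prod_{i=1}^{n}M_{X_{i}}(t_{i})=\prod_{i=1}^{n}e^{\mu_{i}t_{i}+\frac{1}{2}\sigma_{i}^{2}t_{i}^{2}}=e^{\sum_{i=1}^{n}\left(\mu_{i}t_{i}+\frac{1}{2}\sigma_{i}^{2}t_{i}^{2}\right)}$$
Differentiating with respect to $t_{i}$ and then setting every $t$ to zero pulls out $\mu_{i}$, exactly like the single-variable case.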