Chapter 9 Bivariate Random Variables
Before examining the joint distribution of several random variables, we focus first on the bivariate case. The aim is to investigate how two random variables vary together by analysing their joint distribution.
We can derive the marginal distribution of each individual random variable from their joint distribution. Suppose two random variables X and Y are defined on the same sample space, with respective support sets A and B. In the discrete case, the joint probability mass function is \(p(x,y)=\mathbb{P}(X=x,\,Y=y)\); in the continuous case, the joint distribution is described by a joint density function \(f(x,y)\).
In either case, the joint distribution must account for all of the probability:
\[ \int_{A}\int_{B}f(x,y)\,dy\,dx=1 \qquad \text{and} \qquad \sum_{x \in A}\sum_{y \in B}p(x,y)=1. \]
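As a quick numerical illustration of the discrete case, consider a small joint pmf stored as a table (the particular values below are an assumption for illustration, not taken from the text). The entries sum to 1, and summing out one variable yields the marginal pmf of the other:

```python
# Hypothetical joint pmf p(x, y) for discrete RVs X and Y with supports
# A = {0, 1} and B = {0, 1}.  The values are illustrative only.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Normalization: the probabilities over the whole support sum to 1
# (up to floating-point rounding).
total = sum(joint.values())

# Marginal pmfs, obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p   # sum over y in B gives p_X(x)
    p_y[y] = p_y.get(y, 0.0) + p   # sum over x in A gives p_Y(y)

print(total)  # approximately 1
print(p_x)    # marginal pmf of X
print(p_y)    # marginal pmf of Y
```

The same bookkeeping applies to any finite joint pmf; for a continuous joint density, the sums become the double integral shown above.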
9.1 Independence and Correlation of Two RVs
When studying the relationship between two random variables, there are two primary characteristics of interest: independence and correlation. Independence concerns whether the value of X carries any information about the value of Y (and vice versa), whereas correlation measures only the strength of a linear relationship between X and Y.
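The distinction matters because uncorrelated does not imply independent. A standard illustrative sketch (an assumed example, not from the text): let X be uniform on \(\{-1, 0, 1\}\) and set \(Y = X^2\). Then Cov(X, Y) = 0, yet X and Y are clearly dependent, since knowing Y pins down |X|:

```python
# X uniform on {-1, 0, 1}, Y = X^2: uncorrelated but dependent.
support = [-1, 0, 1]
p = 1.0 / 3.0  # uniform pmf

e_x  = sum(x * p for x in support)          # E[X]  = 0
e_y  = sum(x**2 * p for x in support)       # E[Y]  = E[X^2] = 2/3
e_xy = sum(x * x**2 * p for x in support)   # E[XY] = E[X^3] = 0

cov = e_xy - e_x * e_y
print(cov)  # 0: X and Y are uncorrelated

# Dependence: P(X = 0 | Y = 0) = 1, but unconditionally P(X = 0) = 1/3,
# so conditioning on Y changes the distribution of X.
```

Because correlation detects only linear association, it misses the (perfectly deterministic) quadratic relationship here, which is exactly why independence and correlation are treated as separate characteristics.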