In many probabilistic scenarios, we are not interested in just a single random variable but in the interplay and relationships between several random variables defined on the same sample space. This section delves into the realm of multiple random variables, exploring concepts such as independence, joint distributions, and operations on sets of random variables. Understanding multiple random variables is crucial for modeling complex systems and analyzing dependencies between different aspects of a random experiment.
Independence of Random Variables (2.6.1)
Just as events can be independent, so too can random variables. Intuitively, two random variables are independent if knowing the value of one variable provides no information about the value of the other. Formally, independence of random variables is defined in terms of their joint probability distribution.
Definition 2.52: Random variables $X_1, \dots, X_n$ are said to be independent if and only if for all values $x_1 \in W_{X_1}, \dots, x_n \in W_{X_n}$ in their respective ranges, the joint probability mass function factorizes into the product of the marginal probability mass functions:
$$\Pr[X_1 = x_1, \dots, X_n = x_n] = \Pr[X_1 = x_1] \cdot \ldots \cdot \Pr[X_n = x_n].$$
This definition extends the concept of independence from events to random variables. It requires that the joint probability of any combination of values is simply the product of the probabilities of each variable taking its respective value.
Alternative Definition using Densities: Equivalently, independence can be defined in terms of density functions: random variables $X_1, \dots, X_n$ are independent if and only if for all $(x_1, \dots, x_n)$:
$$f_{X_1, \dots, X_n}(x_1, \dots, x_n) = f_{X_1}(x_1) \cdot \ldots \cdot f_{X_n}(x_n).$$
This factorization property greatly simplifies calculations and analysis when dealing with independent random variables.
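To make the factorization in Definition 2.52 concrete, here is a minimal sketch (assuming two fair four-sided dice on the uniform product sample space, an illustrative choice not taken from the text) that checks independence exhaustively, and shows that the first die and the sum of both dice fail the same test:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 5), range(1, 5)))   # sample space Omega: 16 equally likely pairs
p = Fraction(1, len(outcomes))                        # uniform probability of each outcome

def pr(event):
    """Probability of an event given as a predicate on outcomes."""
    return sum(p for omega in outcomes if event(omega))

X = lambda omega: omega[0]               # value of the first die
Y = lambda omega: omega[1]               # value of the second die
S = lambda omega: omega[0] + omega[1]    # sum of both dice (depends on X and Y)

# X and Y are independent: the joint pmf factorizes for every pair (x, y).
assert all(
    pr(lambda o: X(o) == x and Y(o) == y) == pr(lambda o: X(o) == x) * pr(lambda o: Y(o) == y)
    for x, y in product(range(1, 5), repeat=2)
)

# X and S are not independent: at least one pair (x, s) violates the factorization.
assert any(
    pr(lambda o: X(o) == x and S(o) == s) != pr(lambda o: X(o) == x) * pr(lambda o: S(o) == s)
    for x in range(1, 5) for s in range(2, 9)
)
```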
Lemma 2.53: If $X_1, \dots, X_n$ are independent random variables and $S_1, \dots, S_n \subseteq \mathbb{R}$ are arbitrary sets of real numbers, then the events $\{X_1 \in S_1\}, \dots, \{X_n \in S_n\}$ are independent. That is:
$$\Pr[X_1 \in S_1, \dots, X_n \in S_n] = \Pr[X_1 \in S_1] \cdot \ldots \cdot \Pr[X_n \in S_n].$$
This lemma is a crucial extension, showing that independence of random variables implies independence of events defined in terms of these variables. It allows us to work with probabilities of random variables falling within arbitrary sets, not just taking specific values, while preserving the independence property.
Corollary 2.54: Subsets of independent random variables are also independent. If $X_1, \dots, X_n$ are independent and $\{i_1, \dots, i_k\} \subseteq \{1, \dots, n\}$ with $i_1 < \dots < i_k$, then $X_{i_1}, \dots, X_{i_k}$ are also independent.
Satz 2.55: If $f_1, \dots, f_n$ are real-valued functions ($f_i: \mathbb{R} \to \mathbb{R}$ for $i = 1, \dots, n$) and $X_1, \dots, X_n$ are independent random variables, then $f_1(X_1), \dots, f_n(X_n)$ are also independent.
This theorem is important because it allows us to apply functions to independent random variables without destroying their independence, which is frequently needed when constructing new random variables from existing independent ones.
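As a quick illustration of Satz 2.55, the following sketch (assuming two fair six-sided dice and the illustrative functions $f_1(x) = x \bmod 2$ and $f_2(y) = y^2$, chosen here for demonstration) reuses the exhaustive check from the sketch above to verify that the transformed variables still factorize:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), range(1, 7)))   # two fair dice, 36 equally likely pairs
p = Fraction(1, len(outcomes))

def pr(event):
    return sum(p for omega in outcomes if event(omega))

U = lambda omega: omega[0] % 2       # f1(X): parity of the first die
V = lambda omega: omega[1] ** 2      # f2(Y): square of the second die

# U = f1(X) and V = f2(Y) inherit independence from X and Y.
assert all(
    pr(lambda o: U(o) == u and V(o) == v) == pr(lambda o: U(o) == u) * pr(lambda o: V(o) == v)
    for u in {0, 1} for v in {y * y for y in range(1, 7)}
)
```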
Composed Random Variables (2.6.2)
We can create new random variables by combining existing ones through functions. If we have random variables $X_1, \dots, X_n$ defined on a sample space $\Omega$ and a function $g: \mathbb{R}^n \to \mathbb{R}$, then $Y := g(X_1, \dots, X_n)$ is also a random variable. The probability distribution of $Y$ is determined by the joint distribution of $X_1, \dots, X_n$ and the function $g$.
For example, if $Z := X + Y$, where $X$ and $Y$ are random variables, then $Z$ is a composed random variable. The probability of $Z$ taking a specific value $z$ can be calculated by summing over all pairs $(x, y)$ such that $x + y = z$:
$$\Pr[Z = z] = \sum_{(x, y):\, x + y = z} \Pr[X = x, Y = y].$$
If $X$ and $Y$ are independent, this simplifies to:
$$\Pr[Z = z] = \sum_{x \in W_X} \Pr[X = x] \cdot \Pr[Y = z - x].$$
This formula is known as the convolution of the density functions of $X$ and $Y$.
Satz 2.58: If $X$ and $Y$ are independent random variables and $Z := X + Y$, then the density function of $Z$ is given by the convolution of the density functions of $X$ and $Y$:
$$f_Z(z) = \sum_{x \in W_X} f_X(x) \cdot f_Y(z - x).$$
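Here is a minimal sketch of this convolution (assuming two independent fair six-sided dice, an illustrative choice): it builds the density of $Z = X + Y$ from the densities of $X$ and $Y$ and checks that the result is a valid distribution.

```python
from fractions import Fraction

f_X = {x: Fraction(1, 6) for x in range(1, 7)}   # density of X (fair die)
f_Y = {y: Fraction(1, 6) for y in range(1, 7)}   # density of Y (fair die)

# Convolution: f_Z(z) = sum over x of f_X(x) * f_Y(z - x)
f_Z = {}
for x, px in f_X.items():
    for y, py in f_Y.items():
        f_Z[x + y] = f_Z.get(x + y, 0) + px * py

print(f_Z[7])                        # 1/6, the most likely sum of two dice
assert sum(f_Z.values()) == 1        # f_Z is a valid density
```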
Moments of Composed Random Variables (2.6.3)
Linearity of Expectation (Satz 2.60)
Linearity of expectation extends to multiple random variables and remains one of the most powerful and versatile tools in probability.
Satz 2.60 (Linearity of Expectation): For random variables $X_1, \dots, X_n$ and constants $a_1, \dots, a_n \in \mathbb{R}$, the expectation of a linear combination is the linear combination of expectations:
$$\mathbb{E}\left[\sum_{i=1}^{n} a_i X_i\right] = \sum_{i=1}^{n} a_i \, \mathbb{E}[X_i].$$
Crucially, linearity of expectation holds regardless of whether the random variables are independent or dependent. This makes it a widely applicable tool.
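A minimal sketch of this point (assuming the classic fixed-points-of-a-random-permutation example, chosen here for illustration): the indicator variables $X_i$ of the events "position $i$ is a fixed point" are dependent, yet the expected number of fixed points is still $\sum_i \mathbb{E}[X_i] = n \cdot \frac{1}{n} = 1$.

```python
from itertools import permutations
from fractions import Fraction

n = 5
perms = list(permutations(range(n)))     # uniform random permutation of {0, ..., 4}
p = Fraction(1, len(perms))

# E[X_i], where X_i = 1 if position i is a fixed point (each equals 1/n)
E_Xi = [sum(p for pi in perms if pi[i] == i) for i in range(n)]

# E[X_1 + ... + X_n] computed directly from the joint distribution
E_sum = sum(p * sum(1 for i in range(n) if pi[i] == i) for pi in perms)

assert E_sum == sum(E_Xi) == 1           # linearity holds despite dependence
```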
Multiplicativity of Expectation for Independent Variables (Satz 2.61)
For products of random variables, a similar simplification holds, but only under the assumption of independence.
Satz 2.61 (Multiplicativity of Expectation): If $X_1, \dots, X_n$ are independent random variables, then the expectation of their product is the product of their expectations:
$$\mathbb{E}\left[\prod_{i=1}^{n} X_i\right] = \prod_{i=1}^{n} \mathbb{E}[X_i].$$
This multiplicativity property is a direct consequence of independence and significantly simplifies calculations involving products of independent random variables.
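The following sketch (again assuming fair six-sided dice as an illustrative distribution) verifies the multiplicativity for two independent dice and shows that it fails without independence, by comparing $\mathbb{E}[X \cdot X] = \mathbb{E}[X^2]$ with $\mathbb{E}[X]^2$.

```python
from fractions import Fraction

values = range(1, 7)
p = Fraction(1, 6)

E_X = sum(p * x for x in values)                            # E[X] = 7/2
E_XY = sum(p * p * x * y for x in values for y in values)   # E[X * Y] for independent X, Y
assert E_XY == E_X * E_X                                    # multiplicativity holds

E_XX = sum(p * x * x for x in values)                       # E[X^2] = 91/6
assert E_XX != E_X * E_X                                    # X is not independent of itself
```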
Variance of Sums of Independent Variables (Satz 2.62)
For the variance of sums of random variables, a similar additive property holds, again under the condition of independence.
Satz 2.62: If $X_1, \dots, X_n$ are independent random variables and $X := X_1 + \dots + X_n$, then the variance of the sum is the sum of the variances:
$$\mathrm{Var}[X] = \mathrm{Var}[X_1] + \dots + \mathrm{Var}[X_n].$$
This additivity of variance for independent random variables is a powerful tool for analyzing the variability of sums of independent components.
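A minimal sketch of this additivity (assuming $n$ independent fair coin flips, i.e. a $\mathrm{Bin}(n, \frac{1}{2})$ sum of indicators, an illustrative choice): the variance of the binomial sum equals $n$ times the variance $p(1-p)$ of a single indicator.

```python
from fractions import Fraction
from math import comb

n = 10
p_flip = Fraction(1, 2)

# Variance of a single indicator X_i: p * (1 - p) = 1/4
var_single = p_flip * (1 - p_flip)

# Variance of the sum, computed directly from the binomial distribution
pmf = {k: comb(n, k) * p_flip**k * (1 - p_flip)**(n - k) for k in range(n + 1)}
E = sum(k * q for k, q in pmf.items())
Var = sum((k - E)**2 * q for k, q in pmf.items())

assert Var == n * var_single             # 10 * 1/4 = 5/2
```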
Wald’s Identity (2.6.4)
Wald’s Identity (Satz 2.65) provides a formula for the expectation of a sum of a random number of random variables. It is particularly useful in analyzing processes that proceed in stages, where the number of stages is itself a random variable.
Satz 2.65 (Wald’s Identity): Let $N$ and $X$ be independent random variables, where $N$ takes values in the natural numbers ($W_N \subseteq \mathbb{N}$). Let $X_1, X_2, \dots$ be independent copies of $X$. Define $Z := \sum_{i=1}^{N} X_i$. Then the expectation of $Z$ is given by:
$$\mathbb{E}[Z] = \mathbb{E}[N] \cdot \mathbb{E}[X].$$
Wald’s Identity is a powerful and elegant result that connects the expectation of a sum with the expectations of the number of terms and the individual terms, under the crucial assumption of independence between $N$ and the $X_i$'s. It has applications in various areas, including queuing theory, branching processes, and the analysis of randomized algorithms.
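To see the identity in action, here is a Monte Carlo sketch (assuming, purely for illustration, $N \sim \mathrm{Geometric}(\frac{1}{3})$ on $\{1, 2, \dots\}$ and $X$ uniform on $\{1, \dots, 6\}$): Wald's Identity predicts $\mathbb{E}[Z] = \mathbb{E}[N] \cdot \mathbb{E}[X] = 3 \cdot 3.5 = 10.5$, and the simulated average should come out close to that value.

```python
import random

def sample_Z(rng):
    # N ~ Geometric(1/3): number of trials up to and including the first success
    n = 1
    while rng.random() >= 1/3:
        n += 1
    # Z = X_1 + ... + X_N, with each X_i uniform on {1, ..., 6}, drawn independently of N
    return sum(rng.randint(1, 6) for _ in range(n))

rng = random.Random(0)
trials = 100_000
estimate = sum(sample_Z(rng) for _ in range(trials)) / trials

print(estimate)                          # close to E[N] * E[X] = 10.5
```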
In summary, this section has expanded our toolkit for working with random variables to encompass multiple random variables and their interactions. We have explored the crucial concept of independence, its implications for joint distributions and moments, and powerful tools like linearity of expectation and Wald’s Identity for analyzing composed random variables. These concepts will be essential for understanding and designing randomized algorithms, where randomness often arises from multiple sources and interacts in complex ways.
Prev: 04 Important Discrete Distributions | Next: 06 Estimating Probabilities