ABSTRACT
Linearity of Expectation is a powerful property of expected values which states that the expectation of a sum of random variables is equal to the sum of their individual expectations. Crucially, this property holds true even if the random variables are dependent.
The Theorem
For any finite collection of random variables $X_1, X_2, \dots, X_n$, the expected value of their sum is:

$$E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n]$$
General Form
In its most general linear form, including constants $a$ and $b$:

$$E[aX + bY] = aE[X] + bE[Y]$$
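A quick exact check of the general form, sketched in Python with `fractions.Fraction` to avoid rounding. The setup is illustrative and my own: $X$ is a fair die roll and $Y = X^2$, so the two variables are not even independent.

```python
from fractions import Fraction

# X is a fair die roll; Y = X**2 is a (dependent) function of X.
die = [Fraction(v) for v in range(1, 7)]
a, b = 2, -3  # arbitrary constants

E_X = sum(die) / 6                                   # E[X]   = 7/2
E_Y = sum(v * v for v in die) / 6                    # E[X^2] = 91/6
E_lin = sum(a * v + b * v * v for v in die) / 6      # E[aX + bY] computed directly

# Linearity: the two sides agree exactly.
assert E_lin == a * E_X + b * E_Y
```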
Key Property: No Independence Required
The most significant aspect of Linearity of Expectation is that it does not require the variables to be independent. While the probability of a joint event changes based on dependence, the average of their sum remains additive.
- To find the joint probability $P(X = x, Y = y)$: you must know whether $X$ and $Y$ are independent.
- To find $E[X + Y]$: you only need to know the individual expectations $E[X]$ and $E[Y]$.
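This contrast can be seen with fully dependent variables. A minimal Python sketch (the setup is my own): let $X$ be a fair die roll and $Y = 7 - X$, so $Y$ is completely determined by $X$, yet additivity of expectation still holds.

```python
from fractions import Fraction

# Exact distribution of one fair die roll X.
die = {v: Fraction(1, 6) for v in range(1, 7)}

# Y = 7 - X is completely determined by X (maximal dependence).
E_X = sum(v * p for v, p in die.items())                # 7/2
E_Y = sum((7 - v) * p for v, p in die.items())          # 7/2
E_sum = sum((v + (7 - v)) * p for v, p in die.items())  # E[X + Y] directly

# Additivity holds despite the dependence.
assert E_sum == E_X + E_Y == Fraction(7)
```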
Examples
1. Sum of Two Dice
In the Expected Value note, we calculated the expected value of the sum of two dice by summing over all 36 possible outcomes, resulting in $E[S] = 7$. Using Linearity of Expectation, we can simplify this significantly:
Let $X_1$ be the result of the first die and $X_2$ be the result of the second die.
- We know $E[X_1] = 3.5$ and $E[X_2] = 3.5$.
- By linearity: $E[X_1 + X_2] = E[X_1] + E[X_2] = 3.5 + 3.5 = 7$
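Both computations can be checked exactly in a few lines of Python (variable names are my own):

```python
from fractions import Fraction

# Brute force: average the sum over all 36 equally likely outcomes.
brute = Fraction(sum(a + b for a in range(1, 7) for b in range(1, 7)), 36)

# Linearity: E[X1] + E[X2], where each die has expectation 21/6 = 3.5.
single = Fraction(sum(range(1, 7)), 6)
linear = single + single

assert brute == linear == 7
```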
2. Indicator Variables (Binomial Expectation)
Suppose you flip a biased coin $n$ times, where the probability of heads is $p$. Let $X$ be the total number of heads. We can define an indicator variable $I_k$ for each flip:
- $I_k = 1$ if the $k$-th flip is heads.
- $I_k = 0$ if the $k$-th flip is tails.
The expectation of one indicator variable is $E[I_k] = 1 \cdot p + 0 \cdot (1 - p) = p$.
The total number of heads is $X = I_1 + I_2 + \cdots + I_n$.
By linearity: $E[X] = E[I_1] + E[I_2] + \cdots + E[I_n] = np$
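A small exact enumeration confirms $E[X] = np$ for a tiny case. This Python sketch picks the arbitrary values $n = 4$ and $p = 1/3$ and sums over all $2^n$ head/tail sequences:

```python
from fractions import Fraction
from itertools import product

n, p = 4, Fraction(1, 3)  # small n so we can enumerate every outcome exactly

# Exact E[X]: weight each sequence by its probability, multiply by its head count.
E_X = Fraction(0)
for flips in product([0, 1], repeat=n):  # 1 = heads, 0 = tails
    prob = Fraction(1)
    for f in flips:
        prob *= p if f == 1 else (1 - p)
    E_X += prob * sum(flips)

# Linearity predicts E[X] = np = 4/3, with no binomial coefficients needed.
assert E_X == n * p
```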
Related Notes
- Expected Value: The fundamental definition of the weighted average.
- Random Variables: The numerical functions upon which expectation is calculated.
- Independent Events: While not required for linearity, independence is required for the variance of a sum to be additive.
- Binomial Distribution: Where the formula $E[X] = np$ is standard.