17 Sums of Random Variables

17.1 Linear Transformation of an RV

Suppose that \(X\) is a random variable and that \(a\) and \(b\) are constants. Then:

\[{\operatorname{E}}\left[a + bX \right] = a + b {\operatorname{E}}[X]\]

\[{\operatorname{Var}}\left(a + bX \right) = b^2 {\operatorname{Var}}(X)\]
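A quick Monte Carlo check of these two identities (a sketch assuming NumPy; the exponential distribution and the values of \(a\) and \(b\) are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exponential(1), so E[X] = 1 and Var(X) = 1.
a, b = 2.0, 3.0
x = rng.exponential(scale=1.0, size=1_000_000)
y = a + b * x

# E[a + bX] = a + b*E[X] = 2 + 3*1 = 5
# Var(a + bX) = b^2 * Var(X) = 9 * 1 = 9
print(y.mean())  # close to 5
print(y.var())   # close to 9
```

The sample moments agree with the theoretical values up to simulation noise.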

17.2 Sums of Independent RVs

If \(X_1, X_2, \ldots, X_n\) are independent random variables, then:

\[{\operatorname{E}}\left[ \sum_{i=1}^n X_i \right] = \sum_{i=1}^n {\operatorname{E}}[X_i]\]

\[{\operatorname{Var}}\left( \sum_{i=1}^n X_i \right) = \sum_{i=1}^n {\operatorname{Var}}(X_i)\]
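The same kind of simulation illustrates that both the means and, by independence, the variances add (a sketch assuming NumPy; the three distributions are arbitrary choices with known moments):

```python
import numpy as np

rng = np.random.default_rng(1)

# Three independent RVs with different distributions.
reps = 1_000_000
x1 = rng.normal(0.0, 1.0, reps)   # E = 0, Var = 1
x2 = rng.uniform(0.0, 6.0, reps)  # E = 3, Var = 36/12 = 3
x3 = rng.poisson(2.0, reps)       # E = 2, Var = 2
s = x1 + x2 + x3

# E[sum] = 0 + 3 + 2 = 5 and Var(sum) = 1 + 3 + 2 = 6.
print(s.mean())  # close to 5
print(s.var())   # close to 6
```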

17.3 Sums of Dependent RVs

If \(X_1, X_2, \ldots, X_n\) are possibly dependent random variables, then:

\[{\operatorname{E}}\left[ \sum_{i=1}^n X_i \right] = \sum_{i=1}^n {\operatorname{E}}[X_i]\]

\[{\operatorname{Var}}\left( \sum_{i=1}^n X_i \right) = \sum_{i=1}^n {\operatorname{Var}}(X_i) + \sum_{i \not= j} {\operatorname{Cov}}(X_i, X_j)\]

Note that the double sum runs over all ordered pairs with \(i \not= j\), so each unordered pair is counted twice; equivalently, the covariance term can be written \(2 \sum_{i < j} {\operatorname{Cov}}(X_i, X_j)\). When \(X_i\) and \(X_j\) are independent, \({\operatorname{Cov}}(X_i, X_j) = 0\), and the formula reduces to the independent case above.
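To see the covariance term matter, we can build two dependent RVs from a shared component (a sketch assuming NumPy; the construction is a hypothetical example chosen so the covariance is known):

```python
import numpy as np

rng = np.random.default_rng(2)

# X1 = Z + E1 and X2 = Z + E2 share the component Z, so they are dependent:
# Var(X1) = Var(X2) = 2 and Cov(X1, X2) = Var(Z) = 1.
reps = 1_000_000
z = rng.normal(0.0, 1.0, reps)
e1 = rng.normal(0.0, 1.0, reps)
e2 = rng.normal(0.0, 1.0, reps)
x1 = z + e1
x2 = z + e2
s = x1 + x2

# Var(X1 + X2) = Var(X1) + Var(X2) + 2*Cov(X1, X2) = 2 + 2 + 2 = 6,
# not the 4 that the independence formula would predict.
print(s.var())  # close to 6
```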

17.4 Means of Random Variables

Suppose \(X_1, X_2, \ldots, X_n\) are independent and identically distributed (iid) random variables, so that every \(X_i\) has the same mean \({\operatorname{E}}[X_i]\) and variance \({\operatorname{Var}}(X_i)\). Let \(\overline{X}_n = \frac{1}{n} \sum_{i=1}^n X_i\) be their sample mean. Then, combining the rules for linear transformations and sums of independent RVs above:

\[{\operatorname{E}}\left[\overline{X}_n \right] = {\operatorname{E}}[X_i]\]

\[{\operatorname{Var}}\left(\overline{X}_n \right) = \frac{1}{n}{\operatorname{Var}}(X_i)\]
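A simulation of repeated samples makes the \(1/n\) shrinkage of the variance visible (a sketch assuming NumPy; the sample size and exponential distribution are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)

# 200,000 independent samples of size n = 25 from Exponential(scale=2),
# which has E[X_i] = 2 and Var(X_i) = 4. Each row mean is one draw of X̄_n.
n, reps = 25, 200_000
x = rng.exponential(scale=2.0, size=(reps, n))
xbar = x.mean(axis=1)

# E[X̄_n] = E[X_i] = 2 and Var(X̄_n) = Var(X_i)/n = 4/25 = 0.16.
print(xbar.mean())  # close to 2
print(xbar.var())   # close to 0.16
```

The means of \(\overline{X}_n\) center on \({\operatorname{E}}[X_i]\), while its spread is \(n\) times smaller than that of a single observation.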