6.1 Random Variables

A random variable \(X\) is a variable that can take different values, with a probability associated with each value (or range of values). We differentiate between discrete and continuous random variables. The important aspect to keep in mind is that the probabilities over all possible values must sum to one; for a continuous random variable, the density must integrate to one!
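As a quick illustration, a discrete random variable can be represented by a table of values and probabilities and checked against this requirement. The sketch below is in Python, with made-up values and probabilities.

```python
# Minimal sketch of a discrete random variable: a probability mass function
# stored as a dictionary mapping each value to its probability.
# The values and probabilities are made up for illustration.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

# The probabilities over all possible values must sum to one.
assert abs(sum(pmf.values()) - 1.0) < 1e-12
```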

6.1.1 Expected Value and Variance

Think of the expected value as a weighted average. Let \(X\) be a discrete random variable taking the values \(x_1,x_2,\dots\) and with probability mass function \(p\). Then the expected value of \(X\), \(E(X)\), is defined to be \[E(X)=\sum_i x_i \cdot P(X=x_i) = \sum_i x_i \cdot p(x_i)\]

Let \(X\) be a continuous random variable with probability density function \(f\). Then the expected value of \(X\), \(E(X)\), is defined to be \[E(X)=\int_{-\infty}^{\infty} x \cdot f(x)\, dx\]

The variance can be calculated with two different but equivalent equations: \[Var(X) = E\left[(X-E(X))^2\right] = E(X^2)-E(X)^2 \]
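To see the continuous-case definitions in action, the sketch below evaluates \(E(X)\) and \(Var(X)\) numerically in Python. The uniform density \(f(x)=1/2\) on \([0,2]\) and the use of scipy.integrate.quad are assumptions made for this illustration, not part of the definitions above.

```python
from scipy.integrate import quad

# Assumed density for this sketch: uniform on [0, 2], i.e. f(x) = 1/2 there.
def f(x):
    return 0.5 if 0 <= x <= 2 else 0.0

# E(X): integral of x * f(x) over the support
EX, _ = quad(lambda x: x * f(x), 0, 2)

# E(X^2), needed for the shortcut Var(X) = E(X^2) - E(X)^2
EX2, _ = quad(lambda x: x ** 2 * f(x), 0, 2)

print(EX, EX2 - EX ** 2)  # approximately 1.0 and 0.3333
```

For the uniform density on \([0,2]\) the exact values are \(E(X)=1\) and \(Var(X)=1/3\), so the printed numbers can be checked against the closed-form answers.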

Both equations for the variance give the same result; sometimes one of them is more convenient to use than the other. Note, however, that in general \(E(X^2) \neq E(X)^2\).
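As a small check of this point, suppose \(X\) takes the values \(0\) and \(2\), each with probability \(1/2\). Then \(E(X)=1\), so \(E(X)^2=1\), while \[E(X^2)=0^2\cdot \tfrac{1}{2} + 2^2\cdot \tfrac{1}{2} = 2.\] Both variance formulas nevertheless give the same answer: \(E\left[(X-E(X))^2\right]=(0-1)^2\cdot\tfrac{1}{2}+(2-1)^2\cdot\tfrac{1}{2}=1\) and \(E(X^2)-E(X)^2=2-1=1\).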

Suppose you roll a die and observe the number that comes up. The probability mass or frequency function is given by \[p(x_i)=P(X=x_i)= \frac{1}{6} \quad \text{for } i=1,\dots,6\] Thus, the expected value can be calculated as follows: \[E(X) = \sum_{i=1}^6 x_i \cdot \left(\frac{1}{6}\right) = \frac{21}{6} = 3.5\]
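Continuing the example, the variance follows from the shortcut formula \(Var(X)=E(X^2)-E(X)^2\). First, \[E(X^2)=\sum_{i=1}^6 x_i^2 \cdot \left(\frac{1}{6}\right) = \frac{1+4+9+16+25+36}{6} = \frac{91}{6},\] and therefore \[Var(X) = \frac{91}{6} - (3.5)^2 = \frac{91}{6}-\frac{49}{4}=\frac{35}{12}\approx 2.92.\]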