| 127 |
probability mass function
If $X$ can assume at most a countable number of values
$x_1,x_2,\ldots$ then it is a discrete random variable,
and the probability mass function of $X$ is defined as follows.
|
\[ p\left(x\right)=\Pr{\left\{X=x\right\}} \]
|
| 128 |
consequences for the probability mass function
of a discrete random variable $X$ which can assume at most the countable number of values $x_1,x_2,\ldots$
|
\[
\begin{array}{ll}
p\left( x_i \right) \geq 0 & i = 1,2,3,\ldots\\
p\left(x\right)=0 & \mathrm{all\ other\ values\ of\ } x\\
\sum\limits_{ i = 1 }^{ \infty } p \left( x_i \right) = 1
\end{array}
\]
|
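As a quick numeric sanity check of the two pmf conditions, here is a sketch in Python using a fair six-sided die as an illustrative example (the die is an assumption, not from the text):

```python
# Illustrative pmf: a fair die, p(x_i) = 1/6 for x_i = 1, ..., 6.
pmf = {x: 1/6 for x in range(1, 7)}

# Condition 1: every mass is nonnegative.
assert all(p >= 0 for p in pmf.values())

# Condition 2: the masses sum to 1 (up to float rounding).
print(abs(sum(pmf.values()) - 1.0) < 1e-12)   # True
```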
| 129 |
cumulative distribution function
|
\[
F\left(a\right)=\sum_{x:x\le a} p\left(x\right)
\]
|
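The definition $F(a)=\sum_{x \le a} p(x)$ can be sketched directly as a partial sum; exact rational arithmetic (via `fractions`) avoids float noise. The fair-die pmf is again an illustrative assumption:

```python
from fractions import Fraction

# Illustrative pmf: a fair die with exact masses 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def cdf(a):
    """F(a) = sum of p(x) over all x <= a."""
    return sum((p for x, p in pmf.items() if x <= a), Fraction(0))

print(cdf(3))   # 1/2
print(cdf(6))   # 1
```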
| 129 |
step cumulative distribution function.
If $X$ assumes the values $x_1 \lt x_2 \lt x_3 \lt \cdots,$
then $F$ is a step function: constant on each interval $\left[x_{i-1},x_i\right)$ and jumping by $p\left(x_i\right)$ at $x_i,$ so
|
\[
p\left(x_i\right)=F\left(x_i\right)-F\left(x_{i-1}\right)
\]
|
| 130 137 |
expectation of a discrete random variable,
also called the
mean of $X,$ the
first moment of $X,$ and the
center of mass.
This number is a constant.
|
\[
\mu=E\left[X\right]=\sum_{x:p\left(x\right)>0} x p\left(x\right)
\]
|
| 134 |
expectation of a function of a discrete random variable
|
\[
E \left[ g \left( X \right) \right]
= \sum_{ x : p \left( x \right) \gt 0} g \left( x \right) p \left( x \right)
\]
|
| 137 |
sum rule for expectation,
$a$ and $b$ constants.
|
\[
E \left[ aX + b \right]
= aE \left[ X \right] + b
\]
|
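The linearity rule $E[aX+b]=aE[X]+b$ can be verified term by term on any small pmf; the two-point pmf and the constants below are illustrative assumptions:

```python
from fractions import Fraction

# Illustrative two-point pmf on the values 0 and 2.
pmf = {0: Fraction(1, 4), 2: Fraction(3, 4)}
a, b = 5, 7   # arbitrary constants

ex  = sum(x * p for x, p in pmf.items())             # E[X]
lhs = sum((a * x + b) * p for x, p in pmf.items())   # E[aX + b], directly
print(lhs == a * ex + b)   # True
```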
| 137 |
$n$th moment of $X$,
power rule for expectation
|
\[
E\left[X^n\right]=\sum_{x:p\left(x\right)>0}{x^np\left(x\right)}
\]
|
| 139 |
variance for discrete random variable
It is the
second moment
of $X-\mu$ or the
moment of inertia
|
\[
\begin{array}{rl}
\Var\left(X\right)
&= E\left[\left( X - \mu \right)^{2}\right]\\
&= E\left[X^2\right] - \mu^2
\end{array}
\]
|
| 139 |
standard deviation
|
\[
SD\left(X\right)=\sqrt{\Var\left(X\right)}
\]
|
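The two variance expressions $E[(X-\mu)^2]$ and $E[X^2]-\mu^2$ agree, and $SD(X)$ is the square root of either; a sketch with exact arithmetic, again using the illustrative fair-die pmf:

```python
import math
from fractions import Fraction

# Illustrative pmf: a fair die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())                       # E[X] = 7/2

var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())      # E[(X - mu)^2]
var_alt = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2   # E[X^2] - mu^2
print(var_def == var_alt == Fraction(35, 12))                 # True
print(math.sqrt(var_def))                                     # SD(X), about 1.708
```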
| 139 |
Bernoulli random variable
$X$ can only assume two values, $0$ or $1.$
The parameter $p\in\left(0,1\right).$
The value $0$ is used for failure, $1$ for success.
|
\[
p \left( i \right) =
\left\{
\begin{array}{cl}
1-p & i=0\\
p & i=1
\end{array}
\right.
\]
|
| 140 |
binomial random variable,
probability mass function
|
\[
p\left(i\right)=\binom{n}{i}p^i\left(1-p\right)^{n-i},\quad i=0,1,\ldots,n
\]
|
| 144 145 |
binomial random variable, expectation
|
\[
E \left[ X \right] = np
\]
|
| 145 |
binomial random variable, variance, where $q=1-p$
|
\[
\begin{array}{rl}
\Var\left(X\right) &= np \left(1 - p \right)\\
&= npq
\end{array}
\]
|
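A numeric sketch tying the binomial cards together: the pmf sums to $1,$ the mean is $np,$ and the variance is $np(1-p).$ The values of $n$ and $p$ below are illustrative choices, not from the text:

```python
from math import comb

n, p = 10, 0.3   # illustrative parameters
pmf = {i: comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)}

mean = sum(i * q for i, q in pmf.items())
var  = sum(i**2 * q for i, q in pmf.items()) - mean**2

print(abs(sum(pmf.values()) - 1) < 1e-9)    # True: pmf sums to 1
print(abs(mean - n * p) < 1e-9)             # True: E[X] = np
print(abs(var - n * p * (1 - p)) < 1e-9)    # True: Var(X) = np(1-p)
```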
| 147 |
Stirling's approximation
approximation of $n!$ for large $n.$
|
\[
\begin{array}{rl}
n! &\approx n^{n+1/2} e^{-n} \sqrt{2 \pi}\\
&= n^n e^{-n} \sqrt{2 \pi n}
\end{array}
\]
|
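The quality of Stirling's approximation can be checked directly: the relative error shrinks as $n$ grows (roughly like $1/(12n)$). The sample values of $n$ are illustrative:

```python
import math

# Relative error of n^n e^{-n} sqrt(2 pi n) against the exact n!.
errs = []
for n in (5, 10, 20):
    approx = n**n * math.exp(-n) * math.sqrt(2 * math.pi * n)
    errs.append(abs(approx - math.factorial(n)) / math.factorial(n))
print(errs)   # decreasing, all under 2%
```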
| 149 |
Poisson random variable, probability mass function
If $X$ takes values $0, 1, 2, \ldots$ such that
$\Pr{\left\{X=k\right\}}=\frac{e^{-\lambda}\lambda^k}{k!}, \lambda>0,$
then $X$ is a Poisson random variable with parameter $\lambda.$
|
\[
p\left(k\right)=\frac{e^{-\lambda}\lambda^k}{k!},\ \lambda>0
\]
|
|
Poisson random variable, expectation
|
\[
E\left[X\right]=\lambda
\]
|
|
Poisson random variable, variance
|
\[
\Var \left( X \right) = \lambda
\]
|
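Both Poisson facts, $E[X]=\lambda$ and $\Var(X)=\lambda,$ can be checked by truncating the series at a large cutoff; $\lambda=3$ and the cutoff are illustrative choices:

```python
import math

lam = 3.0   # illustrative parameter
# Truncate the Poisson pmf at k = 99; the tail beyond is negligible.
pmf = {k: math.exp(-lam) * lam**k / math.factorial(k) for k in range(100)}

mean = sum(k * q for k, q in pmf.items())
var  = sum(k**2 * q for k, q in pmf.items()) - mean**2
print(abs(mean - lam) < 1e-9, abs(var - lam) < 1e-9)   # True True
```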
| 4.7 |
Poisson approximation of a binomial random variable
Holds when $n$ is large and $p$ is small, with $np=\lambda$ held constant as $n\rightarrow\infty.$
|
\[
p\left(k\right)\approx\frac{e^{-\lambda}\lambda^k}{k!},\ \lambda \gt 0
\]
|
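A sketch of the approximation at work: with $n$ large and $p$ small, the binomial pmf at each $k$ is close to $e^{-\lambda}\lambda^k/k!.$ The values $n=1000,$ $p=0.002$ (so $\lambda=2$) are illustrative assumptions:

```python
import math
from math import comb

n, p = 1000, 0.002   # illustrative: large n, small p, lambda = np = 2
lam = n * p
diffs = []
for k in range(5):
    binom   = comb(n, k) * p**k * (1 - p) ** (n - k)
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    diffs.append(abs(binom - poisson))
print(max(diffs) < 1e-3)   # True: the two pmfs agree to three decimals
```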
|
geometric random variable, probability mass function
|
\[
p\left(k\right)=\left(1-p\right)^{k-1}p,\quad k=1,2,3,\ldots
\]
|
|
geometric random variable, expectation
|
\[
E\left[X\right]=\frac{1}{p}
\]
|
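The geometric mean $E[X]=1/p$ can be checked by truncating the infinite sum $\sum_k k(1-p)^{k-1}p$ at a large cutoff; $p=0.25$ and the cutoff are illustrative:

```python
p = 0.25   # illustrative success probability
# Truncate the series at k = 1999; the geometric tail is negligible there.
mean = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 2000))
print(abs(mean - 1 / p) < 1e-9)   # True: E[X] = 1/p = 4
```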
|
negative binomial random variable, probability mass function
|
\[
p\left(k\right)=\binom{k-1}{r-1}p^r\left(1-p\right)^{k-r},\quad k\geq r
\]
|
|
negative binomial random variable, expectation
|
\[
E\left[X\right]=\frac{r}{p}
\]
|
|
negative binomial random variable, variance
|
\[
\Var \left( X \right) = \frac{ r \left(1-p \right)} { p^2 }
\]
|
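A numeric check of the negative binomial mean $r/p$ and variance $r(1-p)/p^2,$ truncating the support at a large cutoff; $r,$ $p,$ and the cutoff are illustrative assumptions:

```python
from math import comb

r, p = 3, 0.4   # illustrative: trials until the 3rd success
# pmf on k = r, r+1, ..., truncated where the tail is negligible.
pmf = {k: comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)
       for k in range(r, 400)}

mean = sum(k * q for k, q in pmf.items())
var  = sum(k**2 * q for k, q in pmf.items()) - mean**2
print(abs(mean - r / p) < 1e-6)               # True: E[X] = r/p
print(abs(var - r * (1 - p) / p**2) < 1e-6)   # True: Var = r(1-p)/p^2
```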
|
hypergeometric random variable.
A sample of $n$ balls is drawn without replacement from an urn of $N$ balls, $m$ of them white, and $X$ counts the white balls drawn.
|
\[
p\left(i\right)=\frac{\binom{m}{i}\binom{N-m}{n-i}}{\binom{N}{n}},\quad i=0,1,\ldots,n
\]
|
The Zeta (or Zipf) distribution,
with parameter $\alpha \gt 0$ and normalizing constant $C.$
|
\[
p\left(k\right)=\frac{C}{k^{\alpha+1}},\quad k=1,2,\ldots
\]
|
| 4.9 |
properties of the cumulative distribution function
$F$ is the distribution function of the random variable $X$ if the following conditions hold.
|
-
$F \left( x \right) = \Pr{\left\{ X \le x \right\}}$
-
$F$ is non-decreasing
-
$0 \le F \left( x \right) \le 1$
-
\(
\begin{array}{l}
\lim\limits_{ x \rightarrow - \infty} F \left( x \right) = 0\\
\lim\limits_{ x \rightarrow \infty} F \left( x \right) = 1\\
\lim\limits_{ x \rightarrow a^{+} } F \left( x \right) = F \left( a \right) \quad \left(\text{right-continuity}\right)\\
\lim\limits_{ x \rightarrow a^{-} } F \left( x \right) = \Pr{\left\{ X \lt a \right\}}
\end{array}
\)
|