239 |
joint cumulative (probability) distribution function of $X$ and $Y.$
|
\[
F\left(x,y\right)
=\Pr{\left\{X\le x\ \mathrm{and}\ Y\le y\right\}},\\
\mathrm{where}\ -\infty\lt x,\ y\lt \infty
\]
|
247 |
joint cumulative (probability) distribution function of $X_1,\ldots,X_n.$
|
\[
F\left( x_1, \ldots, x_n \right)
=\Pr{ \left\{ X_1 \le x_1\ \mathrm{and}\ \ldots\ \mathrm{and}\ X_n \le x_n \right\} },\\
\mathrm{where}\ -\infty \lt x_1, \ldots, x_n \lt \infty
\]
|
239 |
cumulative distribution functions of $X$ and $Y$
given that $F$
is the joint cumulative probability distribution function of $X$ and $Y.$
$F_X$ and $F_Y$ are sometimes called the
marginal distributions of $X$ and $Y$.
|
\[
F_X\left(x\right)
=\lim\limits_{y\rightarrow\infty}{F(x,y)}
=F\left(x,\infty\right)
\\
F_Y\left(y\right)
=\lim\limits_{x\rightarrow\infty}{F(x,y)}
=F\left(\infty,y\right)
\]
|
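A quick numerical illustration of the limit definition. The joint CDF below is an assumption for illustration (two independent $\mathrm{Exp}(1)$ variables), not a formula from the text:

```python
import math

# Marginal CDF as a limit: F_X(x) = lim_{y -> infinity} F(x, y) = F(x, infinity).
# Assumed joint CDF for illustration: two independent Exp(1) variables,
# F(x, y) = (1 - e^{-x}) * (1 - e^{-y}) for x, y >= 0.
def F(x, y):
    return (1.0 - math.exp(-x)) * (1.0 - math.exp(-y))

def F_X(x, big_y=50.0):
    # A very large y stands in for the limit y -> infinity.
    return F(x, big_y)

# For this F, the marginal should be the Exp(1) CDF, 1 - e^{-x}.
```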
240 |
joint probability mass function of $X$ and $Y$
|
\[
p\left(x,y\right)=\Pr{\left\{X=x\ \mathrm{and}\ Y=y\right\}}
\]
|
240 |
probability mass functions of discrete random variables $X$ and $Y$
given that $p$ is their joint probability mass function.
The individual probability mass functions $p_X(x)$ and $p_Y(y)$
are sometimes called marginal probability mass functions of $X$ and $Y.$
|
\[
p_X\left(x\right)
=\Pr{\left\{X=x\right\}}
=\sum_{y:p\left(x,y\right)\gt 0}{p(x,y)}
\\
p_Y\left(y\right)
=\Pr{\left\{Y=y\right\}}
=\sum_{x:p\left(x,y\right)\gt 0}{p(x,y)}
\]
|
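The summation formulas translate directly into code. The joint pmf values below are a made-up example:

```python
from collections import defaultdict

# Marginal pmfs by summing the joint pmf over the other variable:
# p_X(x) = sum over y with p(x, y) > 0 of p(x, y), and symmetrically for p_Y(y).
# The joint pmf values are assumptions for illustration.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def marginals(joint):
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), prob in joint.items():
        p_x[x] += prob  # accumulate over y
        p_y[y] += prob  # accumulate over x
    return dict(p_x), dict(p_y)

p_X, p_Y = marginals(p)
```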
242 |
Let $A$ and $B$ be sets of real numbers and
$C=\left\{\left(x,y\right):x\in A,\ y\in B\right\}.$
$X$ and $Y$ are
jointly continuous
if there exists
a function $f\left(x,y\right)$ called the
joint probability density function of $X$ and $Y$
defined for all real $x$ and $y$ having the property that
for every set $C$ of pairs of real numbers (that is,
$C$ is a set in the two-dimensional plane),
the given equation holds.
|
\[
\eqalign{
\Pr{\left\{\left(X,Y\right)\in C\right\}}
&=\Pr{\left\{X\in A,Y\in B\right\}}\\
&=\iint\limits_{\left(x,y\right)\in C} f\left(x,y\right)dx\,dy\\
&=\int\limits_{B}\int\limits_{A} f\left(x,y\right)dx\,dy
}
\]
|
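The double integral can be approximated numerically. The density here is an assumed example (independent $\mathrm{Exp}(1)$ variables, $f(x,y)=e^{-x-y}$ for $x,y\ge 0$), with $A=B=[0,1]$:

```python
import math

# Midpoint Riemann sum for Pr{X in A, Y in B} = int_B int_A f(x, y) dx dy
# over the rectangle A x B = [a0, a1] x [b0, b1].
def f(x, y):
    # Assumed joint density: independent Exp(1) variables.
    return math.exp(-x - y)

def prob_rect(f, a0, a1, b0, b1, n=400):
    hx, hy = (a1 - a0) / n, (b1 - b0) / n
    total = 0.0
    for i in range(n):
        x = a0 + (i + 0.5) * hx
        for j in range(n):
            y = b0 + (j + 0.5) * hy
            total += f(x, y) * hx * hy
    return total

# For this f, independence gives the exact value (1 - e^{-1})^2.
```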
247 |
More generally, let $A_1,\ldots,A_n$
be $n$ sets of real numbers and
$C=\left\{\left(x_1,\ldots,x_n\right):x_1\in A_1,\ldots,x_n\in A_n\right\}.$
Then the $n$ random variables $X_1,\ldots,X_n$ are
jointly continuous
if there exists a function $f\left(x_1,\ldots,x_n\right),$ called the
joint probability density function,
such that for any set $C$ in $n\textrm{-space},$
the given equation holds.
|
\[
\eqalign{
\Pr{\left\{\left(X_1,\ldots,X_n\right)\in C\right\}}
&= \Pr{\left\{X_1\in A_1,\ldots,\ X_n\in A_n\right\}}\\
&= \idotsint\limits_{\left(x_1, \ldots, x_n\right) \in C} f\left(x_1, \ldots, x_n\right) dx_1\cdots dx_n\\
&= \int\limits_{A_n}\cdots\int\limits_{A_1}{f\left(x_1,\ldots,x_n\right)dx_1\cdots dx_n}
}
\]
|
242 |
|
\[
\eqalign{
F\left(x,y\right)
&=\Pr{\left\{X\le x\land Y\le y\right\}} \\
&=\int\limits_{-\infty}^{y}\int\limits_{-\infty}^{x}f\left(u,v\right)du\,dv
}
\]
|
242 |
|
\[
f\left(x,y\right)=\frac{\partial^2}{\partial x\partial y}F\left(x,y\right)
\]
|
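A finite-difference check of this relation, using an assumed joint CDF $F(x,y)=(1-e^{-x})(1-e^{-y})$ whose mixed partial is $e^{-(x+y)}$:

```python
import math

def F(x, y):
    # Assumed joint CDF: independent Exp(1) variables, x, y >= 0.
    return (1.0 - math.exp(-x)) * (1.0 - math.exp(-y))

def f_numeric(x, y, h=1e-4):
    # Central-difference approximation of the mixed partial d^2 F / (dx dy).
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4.0 * h * h)

# Should approximate the density f(x, y) = e^{-(x + y)} for x, y > 0.
```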
243 |
If $X$ and $Y$ are jointly continuous with joint probability density function $f,$
then they are
individually continuous.
Let $F$ denote their joint cumulative probability distribution function, and
$f_X\left(x\right)$ and $f_Y\left(y\right)$
their
individual probability density functions.
Then the given equations hold.
Functions $f_X\left(x\right)$ and $f_Y\left(y\right)$
are sometimes called
marginal probability density functions.
|
\[
f_X\left(x\right)=\int\limits_{-\infty}^{\infty}f\left(x,y\right)dy
\\
f_Y\left(y\right)=\int\limits_{-\infty}^{\infty}f\left(x,y\right)dx
\]
|
248 |
Random variables $X$ and $Y$ are
independent
if for any two sets of real numbers $A$ and $B$ the first equation holds.
Equivalently, $X$ and $Y$ are independent if for all $x$ and $y$ the second equation holds.
Equivalently, $X$ and $Y$ are independent if the events
$E_A=\left\{X\in A\right\}$ and
$E_B=\left\{Y\in B\right\}$
are independent.
If random variables are not independent, they are
dependent.
|
\[
\eqalign{
\Pr{\left\{ X \in A, Y \in B \right\}}
&= \Pr{\left\{ X \in A \right\}} \Pr{\left\{ Y \in B \right\}} \\
F\left( x, y \right)
&= F_X\left( x \right) F_Y\left( y \right)
}
\]
|
254 |
More generally, the $n$ random variables $X_1,\ldots,X_n$
are
independent
if, for all sets of real numbers $A_1,\ldots,A_n$
the given equation holds.
An infinite collection of random variables is independent
if every finite subcollection
of them is independent.
|
\[
\eqalign{
\Pr{\left\{X_1\in A_1,\ldots,X_n\in A_n\right\}}
&=\prod_{i=1}^{n}\Pr{\left\{X_i\in A_i\right\}} \\
\Pr{\left\{X_1\le x_1,\ldots,X_n\le x_n\right\}}
&=\prod_{i=1}^{n}\Pr{\left\{X_i\le x_i\right\}}
}
\]
|
248 |
If $X$ and $Y$ are discrete random variables,
then $X$ and $Y$ are
independent
if for all $x$ and $y$ the given equation holds.
|
\[
p\left(x,y\right)=p_X\left(x\right)p_Y\left(y\right)
\]
|
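The factorization condition can be checked exhaustively for a finite joint pmf. The pmf below is an assumed example built as a product of marginals, so the check passes:

```python
# Independence check for discrete X and Y: p(x, y) = p_X(x) p_Y(y) for all x, y.
# The joint pmf is an assumed example constructed as a product.
p = {(x, y): px * py
     for x, px in [(0, 0.4), (1, 0.6)]
     for y, py in [(0, 0.3), (1, 0.7)]}

def is_independent(joint, tol=1e-12):
    xs = sorted({x for x, _ in joint})
    ys = sorted({y for _, y in joint})
    p_x = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0.0) - p_x[x] * p_y[y]) <= tol
               for x in xs for y in ys)
```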
248 |
If $X$ and $Y$ are jointly continuous,
then $X$ and $Y$ are
independent
if for all $x$ and $y$ the given equation holds.
|
\[
f\left(x,y\right)=f_X\left(x\right)f_Y\left(y\right)
\]
|
253 |
The continuous (discrete) random variables $X$ and $Y$ are
independent
if and only if their
joint probability density (mass) function
can be expressed by the given equation.
(Note. The author uses $f(x,y)$ and
$f_{X,Y}\left(x,y\right)$
interchangeably, the latter stressing the random variables
for which $f$ is a density function.)
|
\[
f_{X,Y}\left(x,y\right)
= h\left(x\right)g\left(y\right), \\
\mathrm{where}\ 
-\infty\lt x,y\lt \infty
\]
|
260 |
|
\[
\eqalign{
F_{X+Y}\left(a\right)
&= \Pr{\left\{X+Y\le a\right\}} \\
&= \iint\limits_{x+y\le a}{f_X\left(x\right)f_Y\left(y\right)dx\,dy}
}
\]
|
261 |
|
\[
f_{X+Y}\left(a\right)=\frac{d}{da}F_{X+Y}\left(a\right)
\]
|
260 |
sums of independent random variables
|
|
261 |
sum of two independent uniform random variables
|
|
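For two independent $U(0,1)$ variables, the convolution $f_{X+Y}(a)=\int f_X(a-y)f_Y(y)\,dy$ yields the triangular density on $[0,2]$; a midpoint-sum sketch:

```python
# Density of X + Y for independent Uniform(0,1) variables, via the convolution
# f_{X+Y}(a) = int_0^1 f_X(a - y) f_Y(y) dy, approximated by a midpoint sum.
def f_unif(t):
    # Uniform(0,1) density.
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def f_sum(a, n=2000):
    h = 1.0 / n
    return sum(f_unif(a - (j + 0.5) * h) * h for j in range(n))

# Triangular density: a on [0, 1], and 2 - a on [1, 2].
```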
263 |
chi-squared with $n$ degrees of freedom
|
\[
f\left(x\right)
=\frac{\frac{1}{2}e^{-x/2}\left(\frac{x}{2}\right)^{n/2-1}}{\Gamma\left(n/2\right)},\\
\mathrm{where}\ x\gt 0
\]
|
267 |
sums of independent Poisson random variables
|
|
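The closure of Poisson variables under independent sums can be verified by discrete convolution on a truncated range; the rates below are arbitrary choices:

```python
import math

def pois(k, lam):
    # Poisson(lam) pmf at k.
    return math.exp(-lam) * lam ** k / math.factorial(k)

l1, l2 = 1.5, 2.5  # arbitrary example rates

def conv(n):
    # pmf of X + Y at n by convolving Poisson(l1) and Poisson(l2).
    return sum(pois(k, l1) * pois(n - k, l2) for k in range(n + 1))

# conv(n) should match pois(n, l1 + l2) for every n.
```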
267 |
sums of independent binomial random variables
|
|
268 |
If $X$ and $Y$ are discrete random variables, then
the conditional probability mass function
of $X$ given that $Y=y$ is defined
for all $y:p_Y\left(y\right)\gt 0$ as given.
|
\[
\eqalign{
p_{X \mid Y}\left(x \mid y\right)
&= \Pr{\left\{X=x \mid Y=y\right\}} \\
&= \frac{p\left(x,y\right)}{p_Y\left(y\right)}
}
\]
|
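In code, with a made-up joint pmf:

```python
# Conditional pmf p_{X|Y}(x | y) = p(x, y) / p_Y(y), defined when p_Y(y) > 0.
# The joint pmf values are assumptions for illustration.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def cond_pmf_X_given_Y(joint, y):
    # p_Y(y) by marginalizing, then normalize the slice at Y = y.
    p_y = sum(prob for (x, yy), prob in joint.items() if yy == y)
    if p_y <= 0.0:
        raise ValueError("p_Y(y) must be positive")
    return {x: prob / p_y for (x, yy), prob in joint.items() if yy == y}
```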
269 |
If $X$ and $Y$ are discrete random variables, then
the conditional probability distribution function
of $X$ given that $Y=y$ is defined
for all $y:p_Y\left(y\right)\gt 0$ as given.
|
\[
\eqalign{
F_{X \mid Y}\left(x \mid y\right)
&= \Pr{\left\{X\le x \mid Y=y\right\}} \\
&= \sum_{a\le x}{p_{X \mid Y}\left(a \mid y\right)} \\
&= \sum_{a\le x}\frac{p\left(a,y\right)}{p_Y\left(y\right)}
}
\]
|
269 |
If $X$ is
independent
of $Y,$ then the equations hold.
|
\[
F_{X \mid Y}\left(x \mid y\right)=F_X(x)
\\
p_{X \mid Y}\left(x \mid y\right)=p_X\left(x\right)
\]
|
270 |
If $X$ and $Y$ are jointly continuous, then
the conditional probability density function
of $X$ given that $Y=y$ is defined
for all $y:f_Y\left(y\right)\gt 0$ as given.
|
\[
f_{X \mid Y}\left(x \mid y\right)=\frac{f\left(x,y\right)}{f_Y\left(y\right)}
\]
|
271 |
If $X$ and $Y$ are jointly continuous, then
the conditional cumulative distribution function
of $X$ given that $Y=y$ is defined for all $y:f_Y\left(y\right)\gt 0$ as given.
|
\[
\eqalign{
F_{X \mid Y}\left(x \mid y\right)
&=\Pr{\left\{X\le x \mid Y=y\right\}}\\
&=\int_{-\infty}^{x}{f_{X \mid Y}\left(u \mid y\right)du}
}
\]
|