
Definitions

Sample Space
The set of all possible outcomes of an experiment is called the sample space and is denoted by Ω.
Sigma-field
A collection F of subsets of Ω is called a σ-field if it satisfies the following conditions:
 
1. ∅ ∈ F
 
2. If A1, A2, ... ∈ F then ∪_{i=1}^∞ Ai ∈ F
 
3. If A ∈ F then Aᶜ ∈ F
Probability
A probability measure P on (Ω, F) is a function P : F → [0, 1] which satisfies:
 
1. P(Ω) = 1 and P(∅) = 0
2. If A1, A2, ... ∈ F are pairwise disjoint (Ai ∩ Aj = ∅ for i ≠ j) then P(∪_{i=1}^∞ Ai) = ∑_{i=1}^∞ P(Ai)
Conditional Probability
Consider probability space (Ω, F, P) and let A, B ∈ F with P(B) > 0. Then the conditional probability that A occurs given B occurs is defined to be: P(A|B) = P(A ∩ B) / P(B)
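A minimal sketch of this definition in code (my own example, not from the sheet): P(A|B) computed by direct counting on the two-dice sample space.

```python
# P(A|B) = P(A ∩ B)/P(B) on Ω = {1,...,6}² (assumed example).
from fractions import Fraction

omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]   # sample space
A = {w for w in omega if w[0] + w[1] == 8}                   # "sum is 8"
B = {w for w in omega if w[0] % 2 == 0}                      # "first die even"

def P(event):
    """Uniform probability measure: |event| / |Ω|."""
    return Fraction(len(event), len(omega))

print(P(A & B) / P(B))   # (3/36)/(18/36) = 1/6
```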
Total Probability
A family of sets B1, ..., Bn is called a partition of Ω if: Bi ∩ Bj = ∅ for all i ≠ j, and ∪_{i=1}^n Bi = Ω
P(A) = ∑_{i=1}^n P(A|Bi)P(Bi)
P(A) = ∑_{i=1}^n P(A ∩ Bi)
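A minimal sketch verifying the law on the same assumed two-dice space, partitioning by the value of the first die:

```python
# P(A) = Σ P(A|Bi)P(Bi) over the partition Bi = "first die shows i".
from fractions import Fraction

omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
P = lambda e: Fraction(len(e), len(omega))
A = {w for w in omega if w[0] + w[1] == 8}
parts = [{w for w in omega if w[0] == i} for i in range(1, 7)]  # partition of Ω

total = sum(P(A & Bi) / P(Bi) * P(Bi) for Bi in parts)  # Σ P(A|Bi)P(Bi)
assert total == P(A) == Fraction(5, 36)
```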
Independence
Consider probability space (Ω, F, P) and let A, B ∈ F. A and B are independent if P(A ∩ B) = P(A)P(B)
 
More generally, a family of F-sets A1, ..., An (∞ > n ≥ 2) is independent if P(∩_{i∈S} Ai) = ∏_{i∈S} P(Ai) for every subset S ⊆ {1, ..., n}
Random Variable (RV)
A RV is a function X : Ω → R such that for each x ∈ R, {ω ∈ Ω : X(ω) ≤ x} ∈ F. Such a function is said to be F-measurable.
Distribution Function
The distribution function of a random variable X is the function F : R → [0, 1] given by F(x) = P(X ≤ x), x ∈ R.
Discrete RV
A RV is said to be discrete if it takes values in some countable subset X = {x1, x2, ...} of R
PMF
The PMF of a discrete RV X is the function f : X → [0, 1] defined by f(x) = P(X = x). It satisfies:
 
1. The set of x s.t. f(x) ≠ 0 is countable
 
2. ∑_{x∈X} f(x) = 1
 
3. f(x) ≥ 0
PDF
A function f is called the probability density function (PDF) of the continuous random variable X if
 
F(x) = ∫_{-∞}^x f(u) du
 
f(x) = F'(x) wherever the derivative exists
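A minimal numerical sketch with assumed example distributions (not from the sheet): a Geometric(p) PMF should sum to 1, and for a N(0,1) PDF the CDF can be recovered by a Riemann sum.

```python
import math

p = 0.3
pmf = lambda k: (1 - p) ** (k - 1) * p        # Geometric(p) on {1, 2, ...}
print(sum(pmf(k) for k in range(1, 200)))     # ≈ 1.0   (condition 2)

f = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)   # N(0,1) density
dx = 1e-3
F = lambda x: sum(f(-10 + i * dx) for i in range(int((x + 10) / dx))) * dx
print(F(10.0), F(0.0))   # ≈ 1.0 and ≈ 0.5, i.e. F(x) = ∫_{-∞}^x f(u) du
```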
Independence
Discrete RVs X and Y are independent if the events {X = x} and {Y = y} are independent for each (x, y) ∈ X × Y
The RVs X and Y are independent if {X ≤ x} and {Y ≤ y} are independent events for each x, y ∈ R
 
P(X = x, Y = y) = P(X = x)P(Y = y)
 
f(x, y) = f(x)f(y)
F(x, y) = F(x)F(y)
 
If X and Y are independent then E[XY] = E[X]E[Y]
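A minimal sketch checking the factorisation f(x, y) = f_X(x) f_Y(y) on a finite joint table (assumed example: the two coordinates of a fair two-dice roll):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))
joint = lambda x, y: Fraction(sum(1 for w in omega if w == (x, y)), len(omega))
fX = lambda x: sum(joint(x, y) for y in range(1, 7))    # marginal of X
fY = lambda y: sum(joint(x, y) for x in range(1, 7))    # marginal of Y

assert all(joint(x, y) == fX(x) * fY(y)
           for x, y in product(range(1, 7), repeat=2))   # independence holds
```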
Expectation
The expected value of a discrete RV X on X with PMF f is given by
The expectation of a continuous random variable X with PDF f is given by
 
E[X] = ∑_{x∈X} x f(x)
E[X] = ∫_{-∞}^∞ x f(x) dx
 
E[g(X)] = ∑_{x∈X} g(x) f(x)
E[g(X)] = ∫_{-∞}^∞ g(x) f(x) dx
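A minimal sketch: E[X] and E[g(X)] for a fair die (an assumed example), computed directly from the PMF as Σ x f(x) and Σ g(x) f(x); the last line previews the Variance entry below.

```python
from fractions import Fraction

support = range(1, 7)
f = lambda x: Fraction(1, 6)                      # uniform PMF on {1,...,6}
E = lambda g: sum(g(x) * f(x) for x in support)   # E[g(X)] = Σ g(x)f(x)

print(E(lambda x: x))                             # E[X]   = 7/2
print(E(lambda x: x * x))                         # E[X²]  = 91/6
print(E(lambda x: x * x) - E(lambda x: x) ** 2)   # Var[X] = 35/12
```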
Variance
Measures the spread of a RV:
Var[X] = E[(X − E[X])²]
Var[X] = E[X²] − (E[X])²
MGF (uniquely characterises distribution)
M(t) = E[e^{tX}] = ∑_{x∈X} e^{tx} f(x)
defined for t ∈ T s.t. ∑_{x∈X} e^{tx} f(x) < ∞
 
M(t) = E[e^{tX}] = ∫_{-∞}^∞ e^{tx} f(x) dx
defined for t ∈ T s.t. ∫_{-∞}^∞ e^{tx} f(x) dx < ∞
 
M(t1, t2) = E[e^{t1X + t2Y}] = ∫∫ e^{t1x + t2y} f(x, y) dx dy, (t1, t2) ∈ T
E[X] = ∂/∂t1 M(t1, t2) |_{t1=t2=0}
E[XY] = ∂²/∂t1∂t2 M(t1, t2) |_{t1=t2=0}
 
E[X^k] = M⁽ᵏ⁾(0), the kth derivative of M at 0
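A minimal symbolic sketch (assumed example): moments of X ~ Exponential(λ) read off from its known MGF M(t) = λ/(λ − t) for t < λ, using E[X^k] = M⁽ᵏ⁾(0).

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                       # closed-form MGF of Exp(λ), t < λ

EX = sp.diff(M, t).subs(t, 0)             # E[X]  = 1/λ
EX2 = sp.diff(M, t, 2).subs(t, 0)         # E[X²] = 2/λ²
print(EX, sp.simplify(EX2 - EX ** 2))     # mean 1/λ, variance 1/λ²
```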
Moment
Given a discrete RV X on X, with PMF f and k ∈ Z⁺, the kth moment of X is
E[X^k]
Central Moment
The kth central moment of X is
E[(X − E[X])^k]
Dependence
The joint distribution function F : R² → [0, 1] of discrete RVs X and Y is given by F(x, y) = P({X ≤ x} ∩ {Y ≤ y})
The joint distribution function of X and Y is the function F : R² → [0, 1] given by F(x, y) = P(X ≤ x, Y ≤ y)
 
The joint mass function f : R² → [0, 1] is given by f(x, y) = P(X = x, Y = y)
The RVs are jointly continuous with joint PDF f : R² → [0, ∞) if F(x, y) = ∫_{-∞}^y ∫_{-∞}^x f(u, v) du dv
 
f(x, y) = ∂²/∂x∂y F(x, y)
Marginal
f(x) = ∑_{y∈Y} f(x, y)
f(x) = ∫_{-∞}^∞ f(x, y) dy
F(x) = lim_{y→∞} F(x, y) = ∫_{-∞}^x ∫_{-∞}^∞ f(u, y) dy du
 
E[g(X, Y)] = ∑_{(x,y)∈X×Y} g(x, y) f(x, y)
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy
Covariance
Independence ⇒ E[XY] = E[X]E[Y] ⇒ Cov[X, Y] = 0 ⇒ ρ = 0 (the converse does not hold in general)
ρ = 0 ⇔ E[XY] = E[X]E[Y] (uncorrelated)
 
Cov[X, Y] = E[(X − E[X])(Y − E[Y])]
Cov[X, Y] = E[XY] − E[X]E[Y]
Correlation
Gives the direction (+/−) of a linear relationship. |ρ| close to 1 is strong, close to 0 is weak
Special case: for the bivariate normal, independent ⇔ uncorrelated
 
ρ(X, Y) = Cov[X, Y] / √(Var[X]Var[Y])
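A minimal sketch: Cov[X, Y] and ρ(X, Y) from their definitions on a small assumed joint PMF over {0, 1}² with positive dependence.

```python
from fractions import Fraction
import math

f = {(0, 0): Fraction(3, 8), (0, 1): Fraction(1, 8),
     (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8)}   # joint PMF

E = lambda g: sum(g(x, y) * p for (x, y), p in f.items())
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - EX * EY                   # E[XY] - E[X]E[Y]
varX = E(lambda x, y: x * x) - EX ** 2
varY = E(lambda x, y: y * y) - EY ** 2
rho = cov / math.sqrt(varX * varY)                      # correlation

print(cov, rho)   # 1/8 and 0.5: positively correlated, hence not independent
```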
Conditional distribution
The conditional distribution function of Y given X = x, written F_{Y|X}(·|x), is defined by
F_{Y|X}(y|x) = ∫_{-∞}^y f(x, v)/f(x) dv
f(y|x) = f(x, y)/f(x), where f(x) = ∫_{-∞}^∞ f(x, y) dy is the marginal of X
 
F_{Y|X}(y|x) = P(Y ≤ y | X = x), for any x with P(X = x) > 0
 
The conditional PMF of Y given X = x is defined, for x s.t. P(X = x) > 0, by
f(y|x) = P(Y = y | X = x)
 
f(x, y) = f(x|y)f(y) = f(y|x)f(x)
Conditional expectation
The conditional expectation of a RV Y given X = x is E[Y|X = x] = ∑_{y∈Y} y f(y|x), provided the conditional PMF is well-defined
E[h(X)g(Y)] = E[E[g(Y)|X]h(X)] = ∫(∫ g(y)f(y|x) dy) h(x)f(x) dx
 
E[Y|X = x] = ∑_{y∈Y} y f(y|x)
E[E[Y|X]] = E[Y]
E[E[Y|X]g(X)] = E[Yg(X)]
 
E[(aX + bY)|Z] = aE[X|Z] + bE[Y|Z]
 
If X and Y are independent, E[X|Y] = E[X]
 
Conditional variance: Var[X|Y] = E[X²|Y] − (E[X|Y])²
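A minimal sketch of the tower property E[E[Y|X]] = E[Y], using the same assumed joint PMF over {0, 1}² as in the covariance example.

```python
from fractions import Fraction

f = {(0, 0): Fraction(3, 8), (0, 1): Fraction(1, 8),
     (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8)}

fX = lambda x: sum(p for (a, _), p in f.items() if a == x)   # marginal of X
f_cond = lambda y, x: f[(x, y)] / fX(x)                      # f(y|x)
E_Y_given = lambda x: sum(y * f_cond(y, x) for y in (0, 1))  # E[Y|X = x]

lhs = sum(E_Y_given(x) * fX(x) for x in (0, 1))              # E[E[Y|X]]
rhs = sum(y * p for (_, y), p in f.items())                  # E[Y]
assert lhs == rhs == Fraction(1, 2)
```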

Theorems

Bayes' Theorem
Consider probability space (Ω, F, P) and let A, B ∈ F with P(A), P(B) > 0. Then we have:
 
P(B|A) = P(A|B)P(B) / P(A)
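A minimal sketch of Bayes' theorem on the classic (assumed) screening-test numbers: B = "has condition", A = "test positive".

```python
p_B = 0.01             # prior P(B)
p_A_given_B = 0.99     # P(A|B), sensitivity
p_A_given_notB = 0.05  # P(A|Bᶜ), false-positive rate

# denominator via total probability: P(A) = P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ)
p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)
p_B_given_A = p_A_given_B * p_B / p_A    # P(B|A) = P(A|B)P(B) / P(A)
print(round(p_B_given_A, 4))             # ≈ 0.1667, despite a 99% sensitive test
```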
Independence
If X and Y are independent RVs and g : X → R, h : Y → R, then the RVs g(X) and h(Y) are also independent
Expectations
1. If X ≥ 0, then E[X] ≥ 0
 
2. If a, b ∈ R then E[aX + bY] = aE[X] + bE[Y]
 
3. If X = c ∈ R always, then E[X] = c
Variance
1. For a ∈ R, Var[aX] = a²Var[X]
 
2. If X and Y are uncorrelated, Var[X + Y] = Var[X] + Var[Y]
Conditional Expectation
Conditional expectation satisfies E[E[Y|X]] = E[Y], assuming all the expectations exist
 
For any g : R → R, E[E[Y|X]g(X)] = E[Yg(X)], assuming all expectations exist
Change of variable
If (X1, X2) have joint density f on Z, then for (Y1, Y2) = T(X1, X2), with T an invertible map with inverse components T1⁻¹, T2⁻¹ and J the Jacobian of the inverse, the joint density of (Y1, Y2), denoted g, is: g(y1, y2) = f(T1⁻¹(y1, y2), T2⁻¹(y1, y2)) |J(y1, y2)|, for (y1, y2) ∈ T(Z)
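A short worked instance (my own example, assuming the linear map T below; not from the sheet):

```latex
% assume (Y_1, Y_2) = T(X_1, X_2) = (X_1 + X_2,\; X_1 - X_2)
T^{-1}(y_1, y_2) = \Bigl(\tfrac{y_1 + y_2}{2},\; \tfrac{y_1 - y_2}{2}\Bigr),
\qquad
J(y_1, y_2) = \det\begin{pmatrix} \tfrac12 & \tfrac12 \\ \tfrac12 & -\tfrac12 \end{pmatrix} = -\tfrac12
\\[4pt]
g(y_1, y_2) = f\Bigl(\tfrac{y_1 + y_2}{2},\; \tfrac{y_1 - y_2}{2}\Bigr)\cdot\tfrac12
```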
 
