P(A) |
Probability of event A occurring: (number of ways event A can occur) / (number of total possible outcomes) |
Ω |
Sample Space/Universe. P(Ω)=1 |
∅ |
Empty/Null set |
P(A∩B) |
Probability of A Intersection B |
Disjoint/Mutually Exclusive |
If A∩B = ∅ then A and B are disjoint (mutually exclusive): they cannot both occur. Note that disjoint is not the same as independent; disjoint events with nonzero probabilities are in fact dependent, since one occurring rules the other out. |
P(A∪B) |
If disjoint, P(A∪B) = P(A) + P(B)
In general (disjoint or not), P(A∪B) = P(A) + P(B) - P(A∩B) |
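The union rule above can be checked on a small worked example. A minimal sketch, assuming one roll of a fair six-sided die with hypothetical events A = "roll is even" and B = "roll is at least 4":

```python
from fractions import Fraction

# Assumed scenario: one roll of a fair six-sided die.
omega = set(range(1, 7))
A = {2, 4, 6}   # "roll is even"
B = {4, 5, 6}   # "roll is >= 4"

p = lambda event: Fraction(len(event), len(omega))

# A and B are not disjoint (A∩B = {4, 6}), so use inclusion-exclusion:
p_union = p(A) + p(B) - p(A & B)
print(p_union)  # prints 2/3

# Sanity check against counting the union directly:
assert p_union == p(A | B)
```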
Ac |
A complement. Everything outside A. P(Ac) = 1 - P(A) |
A∈B / A∉B |
A is an element of B / A is not an element of B |
A: A ∈ B |
A such that A is an element of B |
n! aka Permutations |
Counting method where ORDER matters. The number of orderings of all n items is n! = n(n-1)(n-2)...(1). The number of ordered arrangements of k items chosen from n is n!/(n-k)! = n(n-1)(n-2)...(n-k+1), where k = sample size |
(n choose k) aka Combinations |
Counting method where order does not matter. (n choose k) = n!/(k!(n-k)!) |
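Python's standard library implements both counting formulas directly, which makes the cards above easy to verify (values below use n = 5, k = 3, chosen just for illustration):

```python
import math

n, k = 5, 3

# k-permutations of n (order matters): n!/(n-k)! = 5*4*3 = 60
assert math.perm(n, k) == math.factorial(n) // math.factorial(n - k) == 60

# Combinations (order does not matter): n!/(k!(n-k)!) = 10
assert math.comb(n, k) == math.factorial(n) // (math.factorial(k) * math.factorial(n - k)) == 10
```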
P(A|B) aka Conditional Probability |
The Probability of A happening, given that B occurs.
If A and B are independent then P(A|B) = P(A), as B has no effect on A. (If A and B are disjoint, P(A|B) = 0, since B occurring rules A out.)
In general, whenever P(B) > 0, P(A|B) = P(A∩B)/P(B)
P(A|B)+P(Ac|B)=1 |
P(Bi|A)P(A) = P(A|Bi)P(Bi) |
Both sides equal P(A∩Bi), so this follows from the definition of conditional probability. Rearranging gives Bayes' rule: P(Bi|A) = P(A|Bi)P(Bi)/P(A), where the law of total probability expands P(A) = Σj P(A|Bj)P(Bj) |
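Bayes' rule combined with the law of total probability can be sketched numerically. The scenario and numbers below are hypothetical, chosen only to illustrate the computation (a test for a condition, with B1 = "has condition", B2 = "does not", A = "test positive"):

```python
from fractions import Fraction

# Hypothetical prior probabilities and conditional probabilities:
p_B = {"B1": Fraction(1, 100), "B2": Fraction(99, 100)}
p_A_given = {"B1": Fraction(9, 10), "B2": Fraction(5, 100)}

# Law of total probability: P(A) = Σj P(A|Bj) P(Bj)
p_A = sum(p_A_given[b] * p_B[b] for b in p_B)

# Bayes' rule: P(B1|A) = P(A|B1) P(B1) / P(A)
p_B1_given_A = p_A_given["B1"] * p_B["B1"] / p_A
print(p_B1_given_A)  # prints 2/13

# The identity on the card: P(B1|A)P(A) = P(A|B1)P(B1)
assert p_B1_given_A * p_A == p_A_given["B1"] * p_B["B1"]
```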
Independence of more than 2 events |
Events A1, A2, ..., Am are (mutually) independent if for every sub-collection of them, the probability of the intersection equals the product of the individual probabilities; in particular P(A1∩A2∩...∩Am) = P(A1)P(A2)...P(Am) (ie. they are independent if the probability of their intersection equals the product of their individual probabilities)
A and B are independent, and B and C are independent. This does not mean that A and C are independent, nor does it mean they must be dependent. |
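The gap between pairwise and mutual independence can be shown concretely. The classic example below (not from the cards, added for illustration) uses two fair coin flips with A = first flip heads, B = second flip heads, C = the two flips match:

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, 4 equally likely outcomes.
omega = list(product("HT", repeat=2))
A = {w for w in omega if w[0] == "H"}   # first flip heads
B = {w for w in omega if w[1] == "H"}   # second flip heads
C = {w for w in omega if w[0] == w[1]}  # flips match

p = lambda e: Fraction(len(e), len(omega))

# Every pair is independent:
assert p(A & B) == p(A) * p(B)
assert p(A & C) == p(A) * p(C)
assert p(B & C) == p(B) * p(C)

# ...but the three events are NOT mutually independent:
assert p(A & B & C) != p(A) * p(B) * p(C)  # 1/4 vs 1/8
```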
Random variable aka. rv |
Any variable whose value is not known prior to the experiment and is subject to chance, i.e. variability. Each possible value has an associated probability, aka mass.
An rv is a type of mapping function over the whole sample space and is associated with measure theory, i.e. an rv can transform the sample space. |
Discrete |
There is a set number of outcomes |
Discrete Random Variable |
Any function X: Ω→ℝ that takes on some value. eg. X could be S=sum or M=max ran on a sample space, getting the sum/max of each experiment outcome and constructing a new sample space out of it. |
Probability Mass Function aka pmf |
For a discrete rv X, the function p(x) = P(X = x) giving the probability (mass) of each possible value. Satisfies p(x) >= 0 for all x and Σ p(x) = 1 over all possible values |
Cumulative Distribution Function aka cdf |
F(x) = P(X <= x), the probability that X takes a value at most x. Non-decreasing from 0 to 1; defined for both discrete and continuous rvs |
Continuous |
An infinite number of possible values. |
Continuous Random Variables |
A function X: Ω→ℝ that can take on any value a∈ℝ. Mass/associated probability is no longer considered for each possible value of X (any single value has probability 0); instead consider the likelihood that X∈(a,b) for a<b |
Probability Density Function aka pdf |
The pdf f(x) of a continuous rv X is an integrable function such that P(a <= X <= b) = ∫_a^b f(x)dx, i.e. it is the area under the curve between points a and b, and therefore the probability of a range of values occurring. Conditions on f: f(x) >= 0 ∀x∈Ω, and ∫_{-∞}^{∞} f(x)dx = 1, i.e. the complete area under the curve contains all outcomes. The cdf is then given by F(x) = ∫_{-∞}^{x} f(u)du = P(X <= x) |
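The "probability as area under the curve" idea can be sketched numerically. A minimal example, assuming an exponential pdf f(x) = λe^(-λx) on x >= 0 (a standard continuous distribution, chosen here only for illustration):

```python
import math

lam = 2.0
f = lambda x: lam * math.exp(-lam * x)  # assumed exponential pdf

def integrate(f, a, b, n=100_000):
    """Midpoint Riemann sum approximating the area under f on [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# P(a <= X <= b) as area under the curve, checked against the
# closed form F(b) - F(a) where F(x) = 1 - exp(-lam * x):
a, b = 0.5, 1.5
area = integrate(f, a, b)
exact = (1 - math.exp(-lam * b)) - (1 - math.exp(-lam * a))
assert abs(area - exact) < 1e-6

# Total area under the pdf is 1 (approximated on a finite range):
assert abs(integrate(f, 0.0, 50.0) - 1.0) < 1e-4
```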
Expectation aka. E(x) |
The expected value of a random variable. When the rv is discrete, E(X) = Σ_{xi∈Ω} xi p(xi); when the rv is continuous, E(X) = ∫_Ω x f(x)dx. To make this easier to understand: the expected value is simply the mean, calculated as the sum of each possible value multiplied by its individual probability, i.e. the probability-weighted sum of values |
Variance aka Var(X) |
A method of measuring how far the actual value of a rv may be from the expected value. Given a discrete rv X: Var(X) = Σ_{xi∈Ω} xi² p(xi) - (Σ_{xi∈Ω} xi p(xi))². Given a continuous rv: Var(X) = ∫_Ω x² f(x)dx - (∫_Ω x f(x)dx)². In other words, Var(X) = E(X²) - (E(X))²: sum the squared values multiplied by their individual probabilities, then deduct the squared expected value |
Standard Deviation |
Another measure, similar to variance, of how far a distribution spreads from the mean, i.e. the actual value vs the expected value. Simply calculated as sqrt(Var(X)). Its benefit is that it is expressed in the same units as X, rather than the squared units of variance |
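The variance and standard-deviation formulas can be worked through on the same fair-die example (values assumed for illustration):

```python
from fractions import Fraction
import math

# pmf of a fair six-sided die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

E_X = sum(x * p for x, p in pmf.items())        # Σ xi p(xi)   = 7/2
E_X2 = sum(x**2 * p for x, p in pmf.items())    # Σ xi² p(xi)  = 91/6

# Var(X) = E(X²) - (E(X))²
var = E_X2 - E_X**2
print(var)  # prints 35/12

# Standard deviation: sqrt(Var(X)), in the same units as X.
sd = math.sqrt(var)
print(round(sd, 3))  # prints 1.708
```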