Sample Space (Ω) |
Set of all possible outcomes of a random experiment. |
Event |
A subset of Ω, i.e. a set of outcomes of the random experiment. A single outcome is an element of Ω; an event occurs when the observed outcome belongs to it |
σ-field |
The allowable events constitute a family F of sets, usually referred to as a σ-field. Each set in F is a subset of the sample space Ω; F contains Ω and is closed under complementation and countable unions |
Probability measure (P) |
A probability measure on (Ω, F) is a function P : F → [0, 1] that satisfies the following two properties: 1. P[Ω] = 1 2. The probability of the union of a countable collection of pairwise disjoint events equals the sum of their probabilities (countable additivity) |
Probability space |
(Ω, F, P) |
Basic properties of probability measures |
P[∅] = 0; P[Ā] = 1 − P[A]; if A ⊆ B, then P[B] = P[A] + P[B \ A] ≥ P[A]; P[A ∪ B] = P[A] + P[B] − P[A ∩ B] |
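These properties can be checked numerically on a small finite probability space. A minimal sketch, assuming a fair six-sided die with the uniform measure (the die and the specific events A, B, C are hypothetical choices for illustration):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}  # hypothetical sample space: a fair die

def P(event):
    """Uniform probability measure: |event| / |omega|."""
    return Fraction(len(event), len(omega))

A = {2, 4, 6}      # "roll is even"
B = {1, 2, 3, 4}   # "roll is at most 4"
C = {2, 4}         # C is a subset of A

# P[empty set] = 0 and P[complement of A] = 1 - P[A]
assert P(set()) == 0
assert P(omega - A) == 1 - P(A)

# If C ⊆ A, then P[A] = P[C] + P[A \ C], so P[A] >= P[C]
assert P(A) == P(C) + P(A - C)

# P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
assert P(A | B) == P(A) + P(B) - P(A & B)
```

Using exact `Fraction` arithmetic avoids floating-point noise when comparing probabilities for equality.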
Union bound (follows from the last basic property of probability measures, i.e. inclusion–exclusion) |
Given events A1, A2, ...: P[union(Ai)] ≤ sum(P[Ai]) When the events are pairwise disjoint, the inequality becomes an equality, since disjoint events share no common outcomes: P[A ∩ B] = 0 |
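Both cases of the bound can be verified on the same hypothetical die space (the events below are illustrative choices, not from the source):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}  # hypothetical sample space: a fair die
P = lambda e: Fraction(len(e), len(omega))

# Overlapping events: the union bound is strict (they share the outcome 3).
A1, A2 = {1, 2, 3}, {3, 4}
assert P(A1 | A2) <= P(A1) + P(A2)
assert P(A1 | A2) < P(A1) + P(A2)

# Disjoint events: the bound holds with equality.
D1, D2 = {1, 2}, {5, 6}
assert P(D1 & D2) == 0
assert P(D1 | D2) == P(D1) + P(D2)
```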
Sampling strategy |
Repeatedly choose a random element of Ω |
Sampling with replacement |
Select random elements of Ω without keeping track of which ones have already been tested. As a result, some elements may be tested multiple times |
Sampling without replacement |
Select random elements of Ω while keeping track of which ones have already been used. As a result, no element is tested more than once |
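The two strategies map directly onto the standard library. A minimal sketch, assuming a small Ω of ten integers and a fixed seed (both hypothetical choices made only so the example is reproducible):

```python
import random

omega = list(range(10))   # hypothetical Ω with 10 elements
rng = random.Random(0)    # fixed seed for reproducibility

# With replacement: the same element can be drawn repeatedly.
with_repl = [rng.choice(omega) for _ in range(15)]
# 15 draws from 10 values must contain a repeat (pigeonhole principle).
assert len(with_repl) > len(set(with_repl))

# Without replacement: each element is used at most once.
without_repl = rng.sample(omega, k=10)
assert sorted(without_repl) == omega  # a permutation of Ω, no repeats
```

`random.choice` (or `random.choices`) models sampling with replacement; `random.sample` models sampling without replacement.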
Independent (events or family) |
Two events A and B are independent if: P[A ∩ B] = P[A] P[B] The definition extends to families {Ai, i ∈ I} |
Pairwise |
To form all possible pairs (two items at a time) from a set |
Pairwise independent (family or events) |
A family of events is pairwise independent if: P[Ai ∩ Aj] = P[Ai] P[Aj] for all i != j In plain English, a family of events is pairwise independent if every pair of distinct events in it is independent. For example, for {A, B, C}: P(A∩B)=P(A)P(B) P(A∩C)=P(A)P(C) P(B∩C)=P(B)P(C) |
Mutually independent (events) |
More than two events (e.g. A, B, C) are mutually independent if: 1. They are pairwise independent 2. They meet the condition: P(A ∩ B ∩ C) = P(A) × P(B) × P(C) In plain English, events are mutually independent if every subcollection factorizes: the probability of any intersection equals the product of the individual probabilities. Mutual independence implies pairwise independence, but not the other way around |
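The classic counterexample separating the two notions can be checked by enumeration. A sketch assuming Ω is two fair coin flips with the uniform measure (a standard illustrative setup, not taken from the source):

```python
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))  # Ω: two fair coin flips
P = lambda e: Fraction(len(e), len(omega))

A = {w for w in omega if w[0] == "H"}   # first flip is heads
B = {w for w in omega if w[1] == "H"}   # second flip is heads
C = {w for w in omega if w[0] == w[1]}  # both flips agree

# Pairwise independent: every pair factorizes.
for X, Y in [(A, B), (A, C), (B, C)]:
    assert P(X & Y) == P(X) * P(Y)

# But NOT mutually independent: the triple intersection does not factorize.
assert P(A & B & C) != P(A) * P(B) * P(C)
```

Here any two of A, B, C are independent, yet knowing both A and B determines C, so the family is not mutually independent.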
Conditional Probability |
If P[B] > 0, the conditional probability that A occurs given that B occurs is: P[A|B]=P[A∩B]/P[B] |
Conditional Probability (independent events) |
If A and B are independent events (and P[B] > 0), then: P[A|B] = P[A ∩ B]/P[B] = (P[A]*P[B])/P[B] = P[A] In other words, knowing that B occurred tells us nothing about A |
Law of Total Probability |
Let E1...En be a partition of Ω (a collection of pairwise disjoint events whose union is Ω), and assume P[Ei] > 0 for all i. The probability of A can be written as: P[A] = sum(i=1,n)(P[A|Ei]*P[Ei]) In plain English, it is the sum over all the mutually exclusive scenarios in which A can occur |
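A worked numeric check, assuming a hypothetical two-urn experiment (all the numbers below are invented for illustration): urn 1 is chosen with probability 1/3 and holds 2 red and 3 blue balls; urn 2 is chosen with probability 2/3 and holds 4 red and 1 blue ball.

```python
from fractions import Fraction

# The events E1 = "urn 1 chosen", E2 = "urn 2 chosen" partition Ω.
P_E = [Fraction(1, 3), Fraction(2, 3)]            # P[Ei]
P_red_given_E = [Fraction(2, 5), Fraction(4, 5)]  # P[red | Ei]

# Law of total probability: P[red] = sum over i of P[red | Ei] * P[Ei]
P_red = sum(pc * pe for pc, pe in zip(P_red_given_E, P_E))
assert P_red == Fraction(2, 15) + Fraction(8, 15) == Fraction(2, 3)
```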
Bayes Theorem |
Assuming E1...En is a partition of Ω: P[Ej|B] = P[Ej ∩ B]/P[B] = (P[B|Ej]P[Ej])/(sum(i=1,n)(P[B|Ei]P[Ei])) It follows from applying the definition of conditional probability, then applying it again in the numerator, and the law of total probability in the denominator |
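A worked numeric check, assuming hypothetical numbers for a diagnostic test (all figures are invented for illustration): E1 = "has the condition" with prior 1/100, E2 = "does not", and B = "test is positive" with P[B|E1] = 9/10 and P[B|E2] = 5/100.

```python
from fractions import Fraction

P_E1, P_E2 = Fraction(1, 100), Fraction(99, 100)           # priors (partition)
P_B_given_E1, P_B_given_E2 = Fraction(9, 10), Fraction(5, 100)

# Denominator via the law of total probability:
P_B = P_B_given_E1 * P_E1 + P_B_given_E2 * P_E2

# Bayes: P[E1 | B] = P[B | E1] * P[E1] / P[B]
P_E1_given_B = P_B_given_E1 * P_E1 / P_B
assert P_E1_given_B == Fraction(2, 13)  # ~0.154, despite the 90% hit rate
```

The low posterior despite an accurate test is the usual base-rate effect: the prior P[E1] = 1/100 dominates the calculation.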