# Probability axioms


The probability ${\displaystyle P}$ of some event ${\displaystyle E}$ (denoted ${\displaystyle P(E)}$) is defined with respect to a "universe" or sample space ${\displaystyle \Omega }$ of all possible elementary events in such a way that ${\displaystyle P}$ must satisfy the Kolmogorov axioms.

Alternatively, a probability can be interpreted as a measure on a σ-algebra of subsets of the sample space, those subsets being the events, such that the measure of the whole set equals 1. This property is important, since it gives rise to the natural concept of conditional probability. Every set ${\displaystyle A}$ with non-zero probability defines another probability

${\displaystyle P(B\vert A)={P(B\cap A) \over P(A)}}$

on the space. This is usually read as "probability of ${\displaystyle B}$ given ${\displaystyle A}$". If the conditional probability of ${\displaystyle B}$ given ${\displaystyle A}$ is the same as the probability of ${\displaystyle B}$, then ${\displaystyle B}$ and ${\displaystyle A}$ are said to be independent.
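The definition above can be checked numerically. The following sketch uses a hypothetical example, a fair six-sided die under the uniform measure, with `A` = "roll is even" and `B` = "roll is at least 4" (both events are assumptions for illustration):

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die with the uniform measure.
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # roll is even
B = {4, 5, 6}   # roll is at least 4

def prob(event):
    """P(E) under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

# P(B | A) = P(B ∩ A) / P(A)
p_b_given_a = prob(B & A) / prob(A)
print(p_b_given_a)  # 2/3: two of the three even faces are >= 4
```

Using exact `Fraction` arithmetic avoids floating-point rounding, so equalities between probabilities can be tested exactly.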

In the case that the sample space is finite or countably infinite, a probability function can also be defined by its values on the elementary events ${\displaystyle \{e_{1}\},\{e_{2}\},\ldots }$ where ${\displaystyle \Omega =\{e_{1},e_{2},\ldots \}}$.
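As a minimal sketch of this construction, assuming a small finite sample space with hypothetical elementary probabilities, one can define ${\displaystyle P}$ on every event by summing the weights of its outcomes:

```python
from fractions import Fraction

# Sketch: define P by assigning a weight to each elementary event
# (the weights here are an illustrative assumption), then sum over subsets.
weights = {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 4)}

def prob(event):
    """P(E) = sum of the elementary probabilities of the outcomes in E."""
    return sum(weights[e] for e in event)

# Second axiom: the weights over the whole sample space sum to 1.
assert prob(weights.keys()) == 1
print(prob({1, 3}))  # 3/4
```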

## Kolmogorov axioms

The following three axioms are known as the Kolmogorov axioms, after Andrey Kolmogorov, who developed them.

### First axiom

For any set ${\displaystyle E}$, ${\displaystyle 0\leq P(E)\leq 1}$.

That is, the probability of an event set is represented by a real number between 0 and 1.

### Second axiom

${\displaystyle P(\Omega )=1}$

That is, the probability that some elementary event in the entire sample set will occur is 1. More specifically, there are no elementary events outside the sample set.

This is often overlooked in some mistaken probability calculations; if you cannot precisely define the whole sample set, then the probability of any subset cannot be defined either.

### Third axiom

Any countable sequence of mutually disjoint events ${\displaystyle E_{1},E_{2},...}$ satisfies ${\displaystyle P(E_{1}\cup E_{2}\cup \cdots )=\sum P(E_{i})}$.

That is, the probability of an event set which is the union of other disjoint subsets is the sum of the probabilities of those subsets. This is called σ-additivity. If the subsets overlap, this relation need not hold.
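The three axioms can be verified directly for a small discrete measure. This is a minimal check, assuming a hypothetical four-outcome sample space with the uniform measure; since the space is finite, σ-additivity reduces to finite additivity:

```python
from fractions import Fraction
from itertools import combinations

# Hypothetical four-outcome sample space with the uniform measure.
omega = frozenset({1, 2, 3, 4})

def prob(event):
    return Fraction(len(event), len(omega))

def subsets(s):
    """All events, i.e. all subsets of the sample space."""
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# First axiom: 0 <= P(E) <= 1 for every event E.
assert all(0 <= prob(e) <= 1 for e in subsets(omega))
# Second axiom: P(Omega) = 1.
assert prob(omega) == 1
# Third axiom (finite form): disjoint events add.
e1, e2 = frozenset({1}), frozenset({2, 3})
assert prob(e1 | e2) == prob(e1) + prob(e2)
print("all three axioms hold for this measure")
```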

For an algebraic alternative to Kolmogorov's approach, see algebra of random variables.

## Lemmas in probability

From the Kolmogorov axioms one can deduce other useful rules for calculating probabilities:

${\displaystyle P(A\cup B)=P(A)+P(B)-P(A\cap B)}$

That is, the probability that A or B will happen is the sum of the probabilities that A will happen and that B will happen, minus the probability that A and B will happen. This can be extended to the inclusion-exclusion principle.
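The addition rule can be checked numerically. The sketch below uses a hypothetical fair-die example with overlapping events `A` and `B`, chosen so the intersection term actually matters:

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die, uniform measure.
omega = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3, 4}   # roll <= 4
B = {4, 5, 6}      # roll >= 4

def prob(event):
    return Fraction(len(event), len(omega))

# P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = prob(A | B)
rhs = prob(A) + prob(B) - prob(A & B)
assert lhs == rhs == 1  # here A ∪ B = Ω, so both sides equal 1
```

Dropping the ${\displaystyle P(A\cap B)}$ term would double-count the outcome 4, giving ${\displaystyle 4/6+3/6=7/6>1}$ and violating the first axiom.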

${\displaystyle P(\Omega -E)=1-P(E)}$

That is, the probability that any event will not happen is 1 minus the probability that it will.
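The complement rule follows from the second and third axioms, since ${\displaystyle E}$ and ${\displaystyle \Omega -E}$ are disjoint and their union is ${\displaystyle \Omega }$. A quick numeric check, again on a hypothetical fair die:

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die, uniform measure.
omega = {1, 2, 3, 4, 5, 6}
E = {1, 2}

def prob(event):
    return Fraction(len(event), len(omega))

# P(Ω − E) = 1 − P(E)
assert prob(omega - E) == 1 - prob(E)  # 4/6 == 1 - 2/6
```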

Using conditional probability as defined above, it also follows immediately that

${\displaystyle P(A\cap B)=P(A)\cdot P(B\vert A)}$

That is, the probability that A and B will happen is the probability that A will happen, times the probability that B will happen given that A happened; rearranging this relationship yields Bayes' theorem. It then follows that A and B are independent if and only if

${\displaystyle P(A\cap B)=P(A)\cdot P(B)}$.
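Both the multiplication rule and the independence criterion can be verified on a hypothetical fair-die example; the events chosen here (an assumption for illustration) happen to be independent:

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die, uniform measure.
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # roll is even
B = {1, 2, 3, 4}     # roll is at most 4

def prob(event):
    return Fraction(len(event), len(omega))

def cond(b, a):
    """P(B | A) = P(B ∩ A) / P(A), defined when P(A) > 0."""
    return prob(b & a) / prob(a)

# Multiplication rule: always holds when P(A) > 0.
assert prob(A & B) == prob(A) * cond(B, A)
# Independence: P(A ∩ B) = 2/6 = (1/2)·(2/3) = P(A)·P(B).
assert prob(A & B) == prob(A) * prob(B)
```

Knowing that the roll is even does not change the probability that it is at most 4 (two of the three even faces qualify, matching the unconditional 2/3), which is exactly what independence means.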