Probability, loosely speaking, concerns the study of uncertainty. Probability can be thought of as the fraction of times an event occurs, or as a degree of belief about an event. We would then like to use these probabilities to measure the chance of something occurring in an experiment.
We often quantify uncertainty in the data, uncertainty in the machine learning model, and uncertainty in the predictions produced by the model. Quantifying uncertainty requires the idea of a random variable, which is a function that maps outcomes of random experiments to a set of properties that we are interested in. Associated with the random variable is a function that measures the probability that a particular outcome (or set of outcomes) will occur; this is called the probability distribution.
Probability and Random Variables
Modern probability is based on a set of axioms proposed by Kolmogorov (Grinstead and Snell, 1997; Jaynes, 2003) that introduce the three concepts of sample space, event space, and probability measure. Together they form the probability space, which models a real-world process (referred to as an experiment) with random outcomes.
The sample space $\Omega$
The sample space is the set of all possible outcomes of the experiment, usually denoted by $\Omega$. For example, two successive coin tosses have a sample space of $\{hh, tt, ht, th\}$, where $h$ denotes “heads” and $t$ denotes “tails”.
The event space $\mathbb{A}$
The event space is the space of potential results of the experiment. A subset $A$ of the sample space $\Omega$ is in the event space $\mathbb{A}$ if at the end of the experiment we can observe whether the outcome $\omega \in \Omega$ that occurred is in $A$. The event space $\mathbb{A}$ is obtained by considering a collection of subsets of $\Omega$, and for discrete probability distributions $\mathbb{A}$ is often the power set of $\Omega$.
The probability $P$
With each event $A \in \mathbb{A}$, we associate a number $P(A)$ that measures the probability or degree of belief that the event will occur. $P(A)$ is called the probability of $A$.
The probability of a single event must lie in the interval $[0, 1]$, and the total probability over all outcomes in the sample space must be 1, i.e., $P(\Omega) = 1$.
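The following is a minimal Python sketch of these three components for the two-coin example above (the fair-coin assumption and all names in the code are ours, not part of the definitions): it enumerates the sample space, takes the power set as the event space, and defines a probability measure satisfying the two properties just stated.
```python
from itertools import chain, combinations, product

# Sample space Omega: all outcomes of two successive coin tosses.
omega = ["".join(t) for t in product("ht", repeat=2)]  # ['hh', 'ht', 'th', 'tt']

# Event space A: for a discrete experiment we can take the power set of Omega.
def power_set(s):
    return [frozenset(c) for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

events = power_set(omega)  # 2**4 = 16 events

# Probability measure P: with a fair coin every outcome is equally likely,
# so the probability of an event is the fraction of outcomes it contains.
def P(event):
    return len(event) / len(omega)

# The two properties stated above: each probability lies in [0, 1] and P(Omega) = 1.
assert all(0.0 <= P(A) <= 1.0 for A in events)
assert P(frozenset(omega)) == 1.0
print(P(frozenset({"hh"})), P(frozenset({"ht", "th"})))  # 0.25 0.5
```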
Given a probability space $(\Omega, \mathbb{A}, P)$, we want to use it to model some real-world phenomenon. In machine learning, we often avoid explicitly referring to the probability space, but instead refer to probabilities on quantities of interest, which we denote by $T$. We refer to $T$ as the target space and to elements of $T$ as states.
We introduce a target space function $X: \Omega \to T$ that takes an element of $\Omega$ (an outcome) and returns a particular quantity of interest $x$, a value in $T$. This association/mapping from $\Omega$ to $T$ is called a random variable.
For example, in the case of tossing two coins and counting the number of heads, the random variable $X$ maps each outcome to one of three possible values: $X(hh) = 2$, $X(ht) = 1$, $X(th) = 1$, and $X(tt) = 0$. In this particular case, $T = \{0, 1, 2\}$, and it is the probabilities on elements of $T$ that we are interested in. For any subset $S \subseteq T$, we associate a probability $P_X(S) \in [0, 1]$ with the event that the random variable $X$ takes a value in $S$.
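Written out (this identity is standard, although not stated explicitly above), $P_X$ is obtained from $P$ by
$P_X(S) = P(X \in S) = P(X^{-1}(S)) = P(\{\omega \in \Omega : X(\omega) \in S\})$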
Assuming the two tosses are independent and the coin is fair, so that $P(h) = P(t) = \frac{1}{2}$, we obtain
$P(X=2)=P(hh)=P(h)\cdot P(h)=\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{4}$
$P(X=1)=P(ht)+P(th)=\frac{1}{2}\cdot\frac{1}{2}+\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{4}+\frac{1}{4}=\frac{1}{2}$
$P(X=0)=P(tt)=P(t)\cdot P(t)=\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{4}$
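The same numbers can be reproduced numerically. The sketch below is only an illustration under the same fair-coin assumption (the helper names are ours): it assigns probability $\frac{1}{4}$ to each outcome and accumulates the pushforward distribution $P_X$ by summing over the outcomes that map to each value of $X$.
```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Outcomes of two coin tosses, each with probability 1/4 under the fair-coin assumption.
outcomes = ["".join(t) for t in product("ht", repeat=2)]  # ['hh', 'ht', 'th', 'tt']
P_outcome = {w: Fraction(1, 4) for w in outcomes}

# Random variable X: the number of heads in an outcome.
def X(w):
    return w.count("h")

# Pushforward distribution: P_X(x) is the total probability of all outcomes w with X(w) = x.
P_X = Counter()
for w, p in P_outcome.items():
    P_X[X(w)] += p

print(dict(P_X))  # {2: Fraction(1, 4), 1: Fraction(1, 2), 0: Fraction(1, 4)}
```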
Remark. The target space, that is, the range $T$ of the random variable $X$, is used to indicate the kind of probability space, i.e., a $T$ random variable. When $T$ is finite or countably infinite, this is called a discrete random variable. For continuous random variables, we only consider $T = \mathbb{R}$ or $T = \mathbb{R}^D$.
In machine learning systems, we are interested in the generalization error. This means that we are actually interested in the performance of our system on instances that we will observe in the future, which are not identical to the instances that we have seen so far. This analysis of future performance relies on probability and statistics.