
Showing posts from September, 2021

4.0 Basics of Probability

Sample Spaces, Events, and Their Probabilities

Sample Spaces and Events

Rolling an ordinary six-sided die is a familiar example of a random experiment: an action for which all possible outcomes can be listed, but for which the actual outcome on any given trial cannot be predicted with certainty. In such a situation we wish to assign to each outcome, such as rolling a two, a number, called the probability of the outcome, that indicates how likely it is that the outcome will occur. Similarly, we would like to assign a probability to any event, or collection of outcomes, such as rolling an even number, which indicates how likely it is that the event will occur if the experiment is performed.

Definition. A random experiment is a mechanism that produces a definite outcome that cannot be predicted with certainty. The sample space associated with a random experiment is the set of all possible outcomes. An event is a subset of the sample space. An event $E$ is said to occur on a particular trial of the experiment if the outcome observed is an element of the set $E$.
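These definitions translate directly into code. A short sketch (not from the post) models the sample space of a die roll as a set, an event as a subset, and, for equally likely outcomes, computes $P(E) = |E| / |S|$; a Monte Carlo run then estimates the same probability empirically:

```python
import random
from fractions import Fraction

# Sample space for one roll of a fair six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}

# The event "roll an even number" is a subset of the sample space.
even = {outcome for outcome in sample_space if outcome % 2 == 0}

# With equally likely outcomes, P(E) = |E| / |S|.
p_even = Fraction(len(even), len(sample_space))
print(p_even)  # 1/2

# Repeated trials of the experiment should approach the same value.
trials = 100_000
faces = list(sample_space)
hits = sum(1 for _ in range(trials) if random.choice(faces) % 2 == 0)
print(hits / trials)  # close to 0.5
```

The `Fraction` value is the exact theoretical probability; the simulated frequency merely approximates it, which mirrors the distinction between an assigned probability and an observed relative frequency.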

4.1b Probability Distributions for Continuous Random Variables

For a discrete random variable $X$, the probability that $X$ assumes one of its possible values on a single trial of the experiment makes good sense. This is not the case for a continuous random variable. For example, suppose $X$ denotes the length of time a commuter just arriving at a bus stop has to wait for the next bus. If buses run every 30 minutes without fail, then the set of possible values of $X$ is the interval $[0,30]$, the set of all decimal numbers between 0 and 30. But although the number 7.211916 is a possible value of $X$, there is little or no meaning to the concept of the probability that the commuter will wait precisely 7.211916 minutes for the next bus. If anything, the probability should be zero, since if we could meaningfully measure the waiting time to the nearest millionth of a minute, it is practically inconceivable that we would ever get exactly 7.211916 minutes. More meaningful questions are those of the form: What is the probability that the commuter's waiting time falls in a given interval, say between 5 and 10 minutes?
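A minimal sketch of this idea, assuming the waiting time is uniform on $[0, 30]$ (every arrival instant equally likely): the probability assigned to an interval is its share of the total length, so any single point, having zero width, gets probability zero. The helper name `p_wait_between` is ours, not the post's:

```python
def p_wait_between(a: float, b: float) -> float:
    """P(a <= X <= b) for X uniform on [0, 30] minutes.

    The probability of an interval is its length divided by the
    length of the whole range of possible waiting times.
    """
    a = max(a, 0.0)   # clip to the support [0, 30]
    b = min(b, 30.0)
    return max(b - a, 0.0) / 30.0

print(p_wait_between(0, 10))               # 1/3: wait under 10 minutes
print(p_wait_between(5, 10))               # 1/6: wait between 5 and 10 minutes
print(p_wait_between(7.211916, 7.211916))  # 0.0: a single point has zero width
```

The last call makes the point of the paragraph concrete: $P(X = 7.211916) = 0$, even though 7.211916 is a possible value, because a single number is an interval of length zero.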

4.1a Probability Distributions for Discrete Random Variables

Probability Distributions

Associated to each possible value $x$ of a discrete random variable $X$ is the probability $P(x)$ that $X$ will take the value $x$ in one trial of the experiment.

Definition. The probability distribution of a discrete random variable $X$ (also called its probability mass function) is a list of each possible value of $X$ together with the probability that $X$ takes that value in one trial of the experiment. The probabilities in the probability distribution of a random variable $X$ must satisfy the following two conditions: Each probability $P(x)$ must be between 0 and 1: $0 \le P(x) \le 1$. The sum of all the probabilities is 1: $\sum P(x) = 1$.

Example: A fair coin is tossed twice. Let $X$ be the number of heads that are observed. a. Construct the probability distribution of $X$. b. Find the probability that at least one head is observed. The possible values that $X$ can take are 0, 1, and 2. Each of these numbers corresponds to an event in the sample space $S = \{hh, ht, th, tt\}$ of equally likely outcomes.
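The coin example can be worked out by enumeration (a sketch, not from the post): list the four equally likely outcomes, tally the number of heads in each to build the probability distribution of $X$, check both defining conditions, and compute $P(X \ge 1)$ via the complement $1 - P(0)$:

```python
from fractions import Fraction
from itertools import product

# Sample space for two tosses of a fair coin: hh, ht, th, tt (equally likely).
outcomes = list(product("ht", repeat=2))

# X = number of heads; accumulate P(x) for each possible value of X.
pmf = {}
for outcome in outcomes:
    x = outcome.count("h")
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# The two defining conditions of a probability distribution hold.
assert all(0 <= p <= 1 for p in pmf.values())
assert sum(pmf.values()) == 1

print(sorted(pmf.items()))  # [(0, 1/4), (1, 1/2), (2, 1/4)]

# b. P(at least one head) = P(1) + P(2) = 1 - P(0) = 3/4.
print(1 - pmf[0])  # 3/4
```

Working with `Fraction` keeps every probability exact, so the two conditions ($0 \le P(x) \le 1$ and $\sum P(x) = 1$) can be verified with equality rather than a floating-point tolerance.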