BASIC CONCEPT OF PROBABILITY
 (i) Definition
          Probability may be defined as the study of random experiments. In any random experiment, there is always uncertainty as to whether a particular event will occur or not. As a measure of the probability of occurrence of an event, a number between 0 and 1 is assigned. If it is certain that an event will occur, its probability is 100% or 1. If it is certain that an event will not occur, its probability is 0% or 0. If it is uncertain whether the event will occur or not, its probability lies between 0 and 1.
Example
          Let us consider an example. The probability of occurrence of 28th February in a year is 1 since it is certain to occur every year. On the other hand, the probability of occurrence of 30th February in a year is 0 since it never occurs. Again, the probability of occurrence of 29th February in a year is neither 0 nor 1. It is always between 0 and 1. In fact, it is 1/4 since it occurs every leap year, i.e., once in four years.
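This 1/4 figure can be checked by counting leap years over one four-year cycle. The short Python sketch below uses only the standard calendar module; the particular year range is an arbitrary choice for illustration.

```python
import calendar

# One four-year cycle, chosen arbitrarily for illustration.
years = range(2021, 2025)
leap_years = sum(calendar.isleap(y) for y in years)

# P(29th February occurs in a year) = favourable years / total years
print(leap_years / len(years))   # 0.25, i.e. 1/4
```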
Therefore, from the above discussion, we can write a mathematical expression for probability. The probability of an event A is
P(A) = (Number of outcomes favourable to A) / (Total number of equally likely outcomes)
For example, let us consider the probability of getting an even number in the tossing of a die. In the tossing of a die, an even number can occur in 3 ways out of 6 equally likely ways.
Therefore,
P(even number) = 3/6 = 1/2
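The same counting argument can be reproduced in a few lines of Python. The sketch below is purely illustrative; the names sample_space and A are our own.

```python
# Classical probability by counting: P(A) = favourable / total.
sample_space = {1, 2, 3, 4, 5, 6}                 # equally likely faces
A = {n for n in sample_space if n % 2 == 0}       # event: even number

P_A = len(A) / len(sample_space)
print(P_A)                                        # 0.5, i.e. 3/6 = 1/2
```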
2.4     PROPERTIES OF PROBABILITY
As discussed earlier, the sample space S contains all possible outcomes of an experiment, and an event is a subset of the sample space. Two events are said to be mutually exclusive if the occurrence of one of them precludes the occurrence of the other. For example, in the tossing of a coin, the events Head and Tail are mutually exclusive. In the throw of a die, the occurrence of the number ‘4’ automatically excludes the occurrence of the numbers 1, 2, 3, 5 and 6. If an event contains all the outcomes, it is called a certain event. The probability of a certain event is unity, i.e.,
P(A) = P(S) = 1
The properties of probability may be listed as under:
Property 1: The probability of a certain event is unity i.e.,
P(A) = 1                                              …(2.1)
Property 2: The probability of any event is always less than or equal to 1 and non-negative. Mathematically,
0 ≤ P(A) ≤ 1                                          …(2.2)
Property 3:  If A and B are two mutually exclusive events, then
P(A + B) = P(A) + P(B)                                             …(2.3)
Property 4: If A is any event, then the probability of not happening of A is
P(Ā) = 1 − P(A)                                       …(2.4)
where Ā represents the complement of event A.
Property 5: If A and B are any two events (not necessarily mutually exclusive), then
P(A + B) = P(A) + P(B) − P(AB)                        …(2.5)
where P (AB) is called the probability of events A and B both occurring simultaneously. Such an event is called joint event of A and B, and the probability P (AB) is called the joint probability.
Now, if events A and B are mutually exclusive, then the joint probability, P(AB) = 0.
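These properties can be verified by direct enumeration on a fair die. The sketch below is illustrative; the events A, B and C are arbitrary choices, and exact fractions are used so that the equalities hold without rounding error.

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}                 # fair die

def P(event):
    """Classical probability as an exact fraction."""
    return Fraction(len(event), len(sample_space))

A = {2, 4, 6}        # even number
B = {1, 2, 3}        # number less than 4
C = {5}              # mutually exclusive with A

assert P(sample_space) == 1                       # Property 1
assert 0 <= P(A) <= 1                             # Property 2
assert P(A | C) == P(A) + P(C)                    # Property 3 (A, C disjoint)
assert P(sample_space - A) == 1 - P(A)            # Property 4
assert P(A | B) == P(A) + P(B) - P(A & B)         # Property 5
```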
2.5     CONDITIONAL PROBABILITY
The concept of conditional probability is used for conditional occurrences of events. Let us consider an experiment which involves two events A and B. The probability of event B, given that event A has occurred, is represented by P(B/A). Similarly, P(A/B) represents the probability of event A given that event B has already occurred. Therefore, P(B/A) and P(A/B) are called conditional probabilities. These conditional probabilities may be defined in terms of the individual and joint probabilities as under:
P(B/A) = P(AB)/P(A)
P(A/B) = P(AB)/P(B)
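As an illustration, these definitions can be checked by counting outcomes on a fair die; the events A and B below are arbitrary choices.

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}                 # fair die
A = {2, 4, 6}                                     # even number
B = {4, 5, 6}                                     # number greater than 3

def P(event):
    return Fraction(len(event), len(sample_space))

P_B_given_A = P(A & B) / P(A)    # P(B/A) = P(AB)/P(A) = (2/6)/(3/6) = 2/3
P_A_given_B = P(A & B) / P(B)    # P(A/B) = P(AB)/P(B) = (2/6)/(3/6) = 2/3
print(P_B_given_A, P_A_given_B)
```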
2.7 RANDOM VARIABLES
As discussed earlier, an event is a possible outcome of an experiment. The set of all possible outcomes of an experiment is called the sample space ‘S’. When a trial or experiment is performed, any one sample point in the sample space is the outcome of the trial. This means that a sample point always lies in the sample space ‘S’. On the other hand, an event may correspond to a single sample point or to a set of sample points.
As an example, in an experiment of tossing a die, the sample space contains six sample points. Every time the trial is performed, the outcome is any one sample point (a number from 1 to 6) in the sample space. In every trial, the outcome (sample point) occurs randomly; there is no fixed outcome. Hence, the outcome of a trial or experiment is a variable which can take values over the set of sample points.
Therefore, from the above discussion, we can define a random variable as under:

DO YOU KNOW?
Virtually all serious probabilistic computations are performed in terms of random variables.

A function whose domain is the sample space of the experiment and whose range is some set of real numbers is called a random variable of the experiment. Random variables are denoted by upper-case letters such as X, Y, etc., and the values taken by them are denoted by lower-case letters with subscripts such as x1, x2, y1, y2, etc.
Random variables may be classified as under:

  1. Discrete random variables
  2. Continuous random variables.

2.7.1. Discrete Random Variables
          A discrete random variable may be defined as a random variable which can take on only a finite number of values in a finite observation interval. This means that a discrete random variable has a countable number of distinct values.
For example, let us consider an experiment of tossing three coins simultaneously. In this case, there are eight possible outcomes, which constitute the sample space S. Let the number of heads be the random variable X. The sample space S and the random variable X may be written as
S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}
X = { x1,  x2,  x3,  x4,  x5,  x6,  x7,  x8 }
  = { 3,   2,   2,   2,   1,   1,   1,   0  }
Hence, from the above example, the concept of a random variable is quite clear. It is also clear that the random variable X takes on only a finite number of distinct values (0, 1, 2 and 3) over the eight sample points. Therefore, it is a discrete random variable.
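The mapping above can also be generated programmatically. The following sketch enumerates the eight outcomes and applies X (the number of heads) to each sample point.

```python
from itertools import product

# Enumerate the eight outcomes of tossing three coins and apply
# X = number of heads to each sample point.
sample_space = [''.join(s) for s in product('HT', repeat=3)]
X = {outcome: outcome.count('H') for outcome in sample_space}

print(X)
# {'HHH': 3, 'HHT': 2, 'HTH': 2, 'HTT': 1,
#  'THH': 2, 'THT': 1, 'TTH': 1, 'TTT': 0}
print(sorted(set(X.values())))   # distinct values of X: [0, 1, 2, 3]
```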
2.7.2. Continuous Random Variables
          A random variable that takes on an infinite number of values is called a continuous random variable. Actually, there are several physical systems (experiments) that generate continuous outputs or outcomes. Such systems generate an infinite number of outputs or outcomes within a finite period. Continuous random variables may be used to define the outputs of such systems.
As an example, the noise voltage generated by an electronic amplifier has a continuous amplitude. This means that sample space S of the noise voltage amplitude is continuous. Therefore, in this case, the random variable X has a continuous range of values.
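To make this concrete, one can simulate such an amplitude. The sketch below models the noise voltage as a Gaussian random variable; the Gaussian model and its parameters are assumptions made here purely for illustration, not something stated in the text.

```python
import random

# Each draw models one observed noise-voltage amplitude. The Gaussian
# model (mean 0, unit variance) is an illustrative assumption; the
# point is that every sample is a real number, so X ranges over a
# continuous (uncountable) set of values.
random.seed(0)                                    # reproducible draws
samples = [random.gauss(0.0, 1.0) for _ in range(5)]
print(samples)
```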
