What is Probability
Probability is a numerical description of how likely an event is to occur or how likely it is that a proposition is true. Probability is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility and 1 indicates certainty.
A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes Heads and Tails are both equally probable; the probability of "heads" equals the probability of "tails"; and since no other outcomes are possible, the probability of either Heads or Tails is 1/2 (which could also be written as 0.5 or 50%).
The probability of happening of an event E, denoted by P(E), is defined as
Formula
P(E) = Number of favourable outcomes / Total number of outcomes
Thus, if an event can happen in m ways and fail to occur in n ways, and these m + n ways are all equally likely to occur, then the probability of the event E happening is given by
Formula
P(E) = m / (m + n)
And the probability of E not happening (the complement of E, written E′) is
Formula
P(E′) = n / (m + n)
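As a concrete illustration of these two formulas, here is a minimal Python sketch; the counts m and n are made-up values for a hypothetical draw of a ball from a bag:

```python
from fractions import Fraction

# Hypothetical bag: 3 red balls and 7 blue balls.
m = 3                          # ways the event "draw a red ball" can happen
n = 7                          # ways the event can fail to happen

p_e = Fraction(m, m + n)       # P(E)  = m / (m + n)
p_not_e = Fraction(n, m + n)   # P(E') = n / (m + n)

print(p_e)                     # 3/10
print(p_not_e)                 # 7/10
print(p_e + p_not_e)           # 1 -- the two probabilities always sum to 1
```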
Many events cannot be predicted with total certainty. The best we can say is how likely they are to happen, using the idea of probability.
Tossing a Coin
When a coin is tossed, there are two possible outcomes:
- heads (H) or
- tails (T)
We say that the probability of the coin landing H is 1/2 and the probability of the coin landing T is 1/2
Throwing Dice
When a single die is thrown, there are six possible outcomes: 1, 2, 3, 4, 5, 6. The probability of any one of them is 1/6.
Formula
Probability of an event happening = Number of ways it can happen / Total number of outcomes
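The same counting formula can be checked directly in Python. This is only a sketch; the event "roll an even number" is an assumed example:

```python
# One roll of a fair die: six equally likely outcomes.
sample_space = {1, 2, 3, 4, 5, 6}
event = {x for x in sample_space if x % 2 == 0}   # the event "roll an even number"

# Probability = number of ways it can happen / total number of outcomes.
probability = len(event) / len(sample_space)
print(event)         # {2, 4, 6}
print(probability)   # 0.5
```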
Note:
- The probability of an event which is certain to occur is one.
- The probability of an event which is impossible is zero.
- If the probability of an event E happening is P(E) and that of it not happening is P(E′), then P(E) + P(E′) = 1, where 0 ≤ P(E) ≤ 1 and 0 ≤ P(E′) ≤ 1.
Important Terms related to Probability:
1. Trial and Event: The performance of an experiment is called a trial, and the set of its outcomes is termed an event.
Example: Tossing a coin is a trial, and getting a head is an event. If two coins are tossed, the event "getting at least one head" is the set {HT, TH, HH}.
2. Random Experiment: It is an experiment in which all the possible outcomes are known in advance, but the exact outcome of any specific performance is not known in advance.
Example
- Tossing a coin
- Rolling a die
- Drawing a card from a pack of 52 cards
- Drawing a ball from a bag
3. Outcome: The result of a random experiment is called an outcome.
Example
Tossing a coin is an experiment, and getting a head is an outcome. Rolling a die and getting a 6 is an outcome.
4. Sample Space: The set of all possible outcomes of an experiment is called sample space and is denoted by S.
Example
When a die is thrown, the sample space is S = {1, 2, 3, 4, 5, 6}. It consists of six outcomes: 1, 2, 3, 4, 5, 6.
Note 1: If a die is rolled n times, the total number of possible outcomes will be 6^n.
Note 2: Rolling one die n times gives the same number of outcomes as rolling n dice once, namely 6^n.
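A short sketch that enumerates this sample space for n rolls of a die with itertools.product; n = 2 is just an illustrative value, and the count confirms the 6^n rule:

```python
from itertools import product

n = 2                                                # number of rolls (illustrative value)
sample_space = list(product(range(1, 7), repeat=n))  # every ordered sequence of n rolls

print(len(sample_space))   # 36, i.e. 6**n for n = 2
print(sample_space[:3])    # [(1, 1), (1, 2), (1, 3)]
```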
5. Complement of an Event: The set of all outcomes which are in the sample space but not in the event is called the complement of the event.
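A small sketch of the complement as a set difference, reusing the die-rolling sample space; the event "roll a 5 or 6" is an assumed example:

```python
sample_space = {1, 2, 3, 4, 5, 6}
event = {5, 6}                        # E: "roll a 5 or 6"
complement = sample_space - event     # E': every outcome not in E

p_event = len(event) / len(sample_space)
p_complement = len(complement) / len(sample_space)

print(complement)                # {1, 2, 3, 4}
print(p_event + p_complement)    # 1.0, consistent with P(E) + P(E') = 1
```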
6. Impossible Events: An event which can never happen is called an impossible event.
Example 1: Tossing a double-headed coin and getting tails is an impossible event.
Example 2: Rolling a die and getting a number greater than 10 is an impossible event. P(impossible event) = 0
7. Sure Outcome/Certain Outcome: An outcome which is certain to happen is called a sure or certain outcome. P(sure outcome) = 1
Example 1: Tossing a double-headed coin and getting a head is a sure outcome.
Example 2: Rolling a die and getting a number less than 7 is a sure outcome.
8. Possible Outcome: An outcome which can occur is called a possible outcome.
Example: Tossing a coin and getting a head is a possible outcome; rolling a die and getting a 4 is a possible outcome.
9. Equally Likely Events: Events are said to be equally likely if one of them cannot be expected to occur in preference to others. In other words, it means each outcome is as likely to occur as any other outcome.
Example: When a fair die is rolled, each of the outcomes 1, 2, 3, 4, 5, 6 is equally likely.
10. Mutually Exclusive or Disjoint Events: Events are called mutually exclusive if they cannot occur simultaneously.
Example: When a single coin is tossed, getting a head and getting a tail are mutually exclusive events, since they cannot both occur on the same toss.
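Disjointness is easy to check with sets. A minimal sketch, with assumed events A ("roll a 1 or 2") and B ("roll a 5 or 6") on a single die:

```python
sample_space = {1, 2, 3, 4, 5, 6}
a = {1, 2}   # A: "roll a 1 or a 2"
b = {5, 6}   # B: "roll a 5 or a 6"

# Mutually exclusive events share no outcomes, so their intersection is empty.
print(a & b)             # set()
print(a.isdisjoint(b))   # True -> A and B are mutually exclusive
```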
11. Exhaustive Events: A set of events is called exhaustive if, taken together, they account for all possible outcomes of the experiment.
Example: When a coin is tossed, the events "getting a head" and "getting a tail" are exhaustive, since every toss results in one of them.
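A quick set-based check of exhaustiveness, using assumed events that split the die-rolling sample space into low, middle, and high rolls:

```python
sample_space = {1, 2, 3, 4, 5, 6}
events = [{1, 2}, {3, 4}, {5, 6}]   # assumed events: low, middle, high rolls

# The events are exhaustive if their union covers the whole sample space.
union = set().union(*events)
print(union == sample_space)   # True -> these events are exhaustive
```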
12. Independent Events: Events A and B are said to be independent if the occurrence of one does not affect the occurrence of the other.
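One standard way to verify independence is to check that P(A and B) = P(A) × P(B). A minimal sketch for tossing two fair coins, where A and B are assumed events about the first and second toss:

```python
from fractions import Fraction
from itertools import product

# Sample space for tossing two fair coins: HH, HT, TH, TT.
sample_space = set(product("HT", repeat=2))

a = {s for s in sample_space if s[0] == "H"}   # A: first coin shows heads
b = {s for s in sample_space if s[1] == "H"}   # B: second coin shows heads


def prob(event):
    """Probability of an event as favourable outcomes over total outcomes."""
    return Fraction(len(event), len(sample_space))


# Independent events satisfy P(A and B) = P(A) * P(B).
print(prob(a & b))                        # 1/4
print(prob(a) * prob(b))                  # 1/4
print(prob(a & b) == prob(a) * prob(b))   # True -> A and B are independent
```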