PROBABILITY

DEFINITIONS

Random Experiment: An experiment in which all the possible outcomes are known in advance, but the exact outcome of any particular performance of the experiment is not known in advance.

Sample Space: The set of all possible outcomes of an experiment, provided no two or more of these outcomes can occur simultaneously and exactly one of them must occur whenever the experiment is conducted. It is usually denoted by S.
For example: Tossing a coin is a random experiment, and the sample space associated with it is the set {H, T}. Similarly, throwing a die is a random experiment, and the sample space associated with it is the set {1, 2, 3, 4, 5, 6}.

EVENT: The individual outcomes of an experiment are known as simple events, and any subset of the sample space is called an event.
For example: Throwing a die is an experiment, S = {1, 2, 3, 4, 5, 6} is the sample space, {1}, {2}, ..., {6} are simple events, and {1, 2}, etc. are events. The empty set is also an event, since ∅ ⊆ S; it is called the impossible event. The sample space S is itself a subset of S and so is also an event: S represents the sure event.

Types of Events
a. Equally likely events: A set of events is said to be equally likely if, taking into consideration all the relevant factors, there is no reason to expect one of them in preference to the others.
For example: When an unbiased coin is tossed, the occurrence of a tail or a head is equally likely.
b. Exhaustive events: A set of events is said to be exhaustive if the performance of the experiment always results in the occurrence of at least one of them.
For example: In throwing a die, the events A1 = {1, 2} and A2 = {2, 3, 4} are not exhaustive, as 5 is a possible outcome of the experiment that is not a member of either A1 or A2. But if E1 = {1, 2, 3} and E2 = {2, 4, 5, 6}, then the set {E1, E2} is exhaustive. In general, a set of events Ei is exhaustive iff S = ∪Ei.
c. Mutually Exclusive Events: A set of events is said to be mutually exclusive if they have no point in common. Thus E1, E2, E3, ... are mutually exclusive iff Ei ∩ Ej = ∅ for i ≠ j.
For example: In throwing two dice, let E1 = a sum of 5 = {(1, 4), (2, 3), (3, 2), (4, 1)} and E2 = a sum of 9 = {(3, 6), (4, 5), (5, 4), (6, 3)}. Clearly E1 ∩ E2 = ∅, so E1 and E2 are mutually exclusive.

Example 1: Only three students S1, S2 and S3 appear at a competitive examination. The probability of S1 coming first is 3 times that of S2, and the probability of S2 coming first is 3 times that of S3. Find the probability of each coming first. Also find the probability that S1 or S2 comes first.

Solution: Let P1, P2 and P3 be the probabilities of S1, S2 and S3 coming first respectively. Then P1 = 3P2 and P2 = 3P3. Also, P1 + P2 + P3 = 1 (as one of them has to come first), so 9P3 + 3P3 + P3 = 1, giving P3 = 1/13, P2 = 3/13 and P1 = 9/13. Finally, the probability of S1 or S2 coming first = P1 + P2 = 12/13.

PROBABILITY: If there are n exhaustive, mutually exclusive and equally likely outcomes of an experiment and m of them are favourable to an event A, the probability of the happening of A is defined as the ratio m/n:
P(A) = m/n = (Number of outcomes favourable to the event)/(Total number of outcomes).
Here p = m/n is a non-negative number not greater than unity, so that 0 ≤ p ≤ 1. The number of cases in which the event A will not happen is n − m, so the probability q that A will not happen is given by
q = (n − m)/n = 1 − m/n = 1 − p.
Hence p + q = 1, where 0 ≤ p, q ≤ 1.
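Example 1 and the classical m/n definition above can be checked with exact rational arithmetic; a minimal sketch (the variable names are ours, not from the notes):

```python
from fractions import Fraction

# Example 1: P1 = 3*P2 and P2 = 3*P3, with P1 + P2 + P3 = 1.
# Substituting gives 9*P3 + 3*P3 + P3 = 1, so P3 = 1/13.
P3 = Fraction(1, 9 + 3 + 1)
P2 = 3 * P3
P1 = 3 * P2

print(P1, P2, P3)      # 9/13 3/13 1/13
print(P1 + P2)         # probability that S1 or S2 comes first: 12/13

# Classical definition P(A) = favourable/total, e.g. an even number on a die
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
print(Fraction(len(A), len(S)))   # 1/2
```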
3. If E1, E2 and E3 be three events, then
a) P(at least two of E1, E2, E3 occur) = P(E1 ∩ E2) + P(E2 ∩ E3) + P(E3 ∩ E1) − 2P(E1 ∩ E2 ∩ E3)
b) P(exactly two of E1, E2, E3 occur) = P(E1 ∩ E2) + P(E2 ∩ E3) + P(E3 ∩ E1) − 3P(E1 ∩ E2 ∩ E3)
c) P(exactly one of E1, E2, E3 occurs) = P(E1) + P(E2) + P(E3) − 2P(E1 ∩ E2) − 2P(E2 ∩ E3) − 2P(E3 ∩ E1) + 3P(E1 ∩ E2 ∩ E3)

CONDITIONAL PROBABILITY
The probability of occurrence of an event A, given that B has already occurred, is called the conditional probability of occurrence of A. It is denoted by P(A/B). Once the event B has occurred, the sample space reduces to B. The outcomes favourable to the occurrence of A (given that B has already occurred) are those common to both A and B, i.e. those belonging to A ∩ B. Thus
P(A/B) = N(A ∩ B)/N(B), where N(A ∩ B) = number of elements in A ∩ B and N(B) = number of elements in B (N(B) ≠ 0).
Dividing numerator and denominator by N(S) gives P(A/B) = P(A ∩ B)/P(B). Hence
P(A ∩ B) = P(B) P(A/B) if P(B) ≠ 0
P(A ∩ B) = P(A) P(B/A) if P(A) ≠ 0.

Example 3: There is a 30% chance that it rains on any particular day. What is the probability that there is at least one rainy day within a period of 7 days? Given that there is at least one rainy day, what is the probability that there are at least two rainy days?

Solution: Let A be the event that there is at least one rainy day and B the event that there are at least two rainy days. Now P(A') = (0.7)^7, so P(A) = 1 − (0.7)^7. Also, P(B/A) = P(A ∩ B)/P(A) = P(B)/P(A) (since B ⊆ A), and P(B) = 1 − (0.7)^7 − 7(0.3)(0.7)^6. Therefore
P(B/A) = [1 − (0.7)^7 − 7(0.3)(0.7)^6] / [1 − (0.7)^7].

INDEPENDENT EVENTS
Two events A and B are said to be independent if the occurrence of A does not depend on the occurrence or non-occurrence of B. Thus A and B are independent if P(A/B) = P(A) and P(B/A) = P(B). Since P(A ∩ B) = P(B) P(A/B), for independent events P(A ∩ B) = P(A) P(B). Similarly, if there are n independent events, then P(E1 ∩ E2 ∩ ... ∩ En) = P(E1) P(E2) ... P(En).

Pairwise Independent Events
Three events E1, E2 and E3 are said to be pairwise independent if P(E1 ∩ E2) = P(E1) P(E2), P(E2 ∩ E3) = P(E2) P(E3) and P(E3 ∩ E1) = P(E3) P(E1).
In general, events E1, E2, ..., En are pairwise independent if P(Ei ∩ Ej) = P(Ei) P(Ej) for all i ≠ j.
Three events are said to be mutually independent if P(E1 ∩ E2) = P(E1) P(E2), P(E2 ∩ E3) = P(E2) P(E3), P(E3 ∩ E1) = P(E3) P(E1), and in addition P(E1 ∩ E2 ∩ E3) = P(E1) P(E2) P(E3).
If A and B are two mutually exclusive events, then P(A ∩ B) = 0 but P(A) P(B) ≠ 0 (in general).
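Example 3 above can be reproduced with exact rational arithmetic; a minimal sketch (variable names are ours):

```python
from fractions import Fraction

p = Fraction(3, 10)          # chance of rain on any particular day
q = 1 - p                    # 7/10
n = 7                        # number of days

P_A = 1 - q**n                          # at least one rainy day
P_B = 1 - q**n - n * p * q**(n - 1)     # at least two rainy days
P_B_given_A = P_B / P_A                 # P(B/A) = P(B)/P(A), since B is a subset of A

print(float(P_A))            # about 0.9176
print(float(P_B_given_A))    # about 0.7308
```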
*Mutually exclusive events (of non-zero probability) can never be independent.

Difference between mutual exclusivity and independence: Mutual exclusivity concerns events taken from the same experiment, whereas independence concerns events taken from different experiments.
For example: Let two dice be thrown, and consider the events "the first die shows an odd number" and "the second die shows an even number". These events are independent, because the result of the first die does not depend on the result of the second. But they are not mutually exclusive, since both may occur simultaneously.

Example 4: A person draws a card from a pack of 52, replaces it and shuffles the pack. He continues doing this until he draws a spade. What is the chance that he has to make (i) at least 3 trials, (ii) exactly 3 trials?

Solution: (i) For at least 3 trials, he has to fail at the first 2 attempts; after that it makes no difference whether he fails or succeeds at the 3rd or any subsequent attempt. The chance of success at any attempt = 1/4, so the chance of failure = 3/4. Chance of failing in the first 2 attempts = (3/4)² = 9/16.
(ii) For exactly 3 trials, he has to fail in the first two attempts and succeed at the 3rd attempt. Required probability = (3/4)² × (1/4) = 9/64.

TOTAL PROBABILITY THEOREM
Let A1, A2, ..., An be a set of mutually exclusive events, i.e. Ai ∩ Aj = ∅ for i ≠ j, and exhaustive events, i.e. ∪Ai = S (the sample space), and let E be an event related to the Ai. Then the probability that E occurs is given by
P(E) = Σ P(Ai) P(E/Ai),
which is known as the total probability theorem.

Example 5: Find the probability that a year chosen at random has 53 Sundays.

Solution: Let P(L) be the probability that a leap year is chosen at random. Since one year in every four is a leap year, P(L) = 1/4, and the probability that a leap year is not chosen is P(L') = 3/4. Now let P(S) be the probability that a year chosen at random has 53 Sundays.
Then P(S) is given by the probability of 53 Sundays occurring in a leap year together with that in a non-leap year: P(S) = P(L) P(S/L) + P(L') P(S/L'). In a leap year there are 366 days, i.e. 52 weeks and 2 days, and these 2 extra days can be
Sun & Mon, Mon & Tue, Tue & Wed, Wed & Thu, Thu & Fri, Fri & Sat, or Sat & Sun. Out of these 7 equally likely possibilities, Sunday appears in 2, so P(S/L) = 2/7. In a non-leap year there are 365 days, i.e. 52 weeks and 1 day, and this 1 extra day can be any of the 7 days, so P(S/L') = 1/7. Therefore
P(S) = (1/4)(2/7) + (3/4)(1/7) = 2/28 + 3/28 = 5/28.

BAYES' THEOREM
Suppose A1, A2, ..., An are a mutually exclusive and exhaustive set of events; they divide the sample space into n parts. Suppose an event B occurs. Then the conditional probability that Ai happens, given that B has happened, is given by Bayes' theorem:
P(Ai/B) = P(Ai ∩ B)/P(B) = P(Ai) P(B/Ai) / Σ P(Aj) P(B/Aj).

Example 6: A bag contains 5 balls and it is equally likely that 0, 1, 2, 3, 4 or 5 of them are white. A ball is drawn and is found to be white. What is the probability that it is the only white ball?

Solution: The drawn ball is white, i.e. this event, call it B, has already occurred; we want the probability that it is the only white ball, so a conditional probability arises and we use Bayes' theorem. Let kW denote the event that the bag contains exactly k white balls. We require
P(1W/B) = P(1W) P(B/1W) / Σ P(kW) P(B/kW),
where P(B/kW) is the probability that B occurs when exactly k white balls are in the bag. Here P(B/0W) = 0/5, P(B/1W) = 1/5, P(B/2W) = 2/5, ..., P(B/5W) = 5/5, and P(0W) = P(1W) = ... = P(5W) = 1/6.
Required probability = (1/6)(1/5) / [(1/6)(0/5 + 1/5 + 2/5 + 3/5 + 4/5 + 5/5)] = (1/5)/3 = 1/15.

BINOMIAL DISTRIBUTION FOR SUCCESSIVE EVENTS
Suppose p and q are the probabilities of the happening and failing of an event at a single trial (where p + q = 1). Then the chance of it happening exactly r times in n trials is nCr p^r q^(n−r): the chance of happening r times and failing (n − r) times in one given order is p^r q^(n−r), and the r successful trials can be chosen in nCr ways; these nCr orders are mutually exclusive, since the happening of one order rules out every other. Hence
Required probability = nCr · p^r · q^(n−r).
Since the probabilities P(X = r) are the terms of the binomial expansion of (p + q)^n, this is called the binomial distribution. Consider the following example.

Example 7: If a coin is tossed n times, what is the probability that a head appears an odd number of times?

Solution: Here the number of trials = n, p = probability of success in a trial (i.e. the probability that a head appears) = 1/2, and q = 1 − p = 1/2. Therefore
Required probability = P(X = 1) + P(X = 3) + P(X = 5) + ...
= nC1 (1/2)(1/2)^(n−1) + nC3 (1/2)^3 (1/2)^(n−3) + nC5 (1/2)^5 (1/2)^(n−5) + ...
= (nC1 + nC3 + nC5 + ...)/2^n = 2^(n−1)/2^n = 1/2.

PROBABILITY DISTRIBUTION
Random Variable: A random variable is a real-valued function defined over the sample space of an experiment; in other words, a real-valued function whose domain is the sample space of a random experiment. It is usually denoted by X, Y, Z, etc.

Example 8: Consider the random experiment of tossing three coins, and let X be the number of heads.

Solution: S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. X is a real-valued function on S, and X is a random variable such that X(HHH) = 3; X(HHT) = X(HTH) = X(THH) = 2; X(HTT) = X(THT) = X(TTH) = 1; and X(TTT) = 0.

Discrete random variable: A random variable which can take only finitely many or countably infinitely many values is called a discrete random variable.
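The Bayes'-theorem calculation in Example 6 can be checked directly; a small sketch with our own names (`prior`, `likelihood`, `evidence`):

```python
from fractions import Fraction

# Example 6: a bag of 5 balls; 0..5 white balls are equally likely a priori.
prior = Fraction(1, 6)
likelihood = {k: Fraction(k, 5) for k in range(6)}  # P(drawn ball is white | k white)

# Total probability of drawing a white ball
evidence = sum(prior * likelihood[k] for k in range(6))

# Posterior probability that the bag held exactly one white ball
posterior = prior * likelihood[1] / evidence
print(posterior)   # 1/15
```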
Continuous random variable: A random variable which can take any value between two given limits is called a continuous random variable.

Probability Distribution: If the values of a random variable together with the corresponding probabilities are given, then this is called the probability distribution of the random variable.
For example, consider the probability distribution when three coins are tossed, and let X denote the number of heads that occur. Then
P(X = 0) = probability of getting no head = P(TTT) = 1/8
P(X = 1) = probability of getting one head = P(HTT or THT or TTH) = 3/8
P(X = 2) = probability of getting two heads = P(HHT or THH or HTH) = 3/8
P(X = 3) = probability of getting three heads = P(HHH) = 1/8
Thus the probability distribution when three coins are tossed is as given below.

x          0      1      2      3
P(X = x)   1/8    3/8    3/8    1/8

MEAN AND VARIANCE OF A RANDOM VARIABLE
Mean: If X is a discrete random variable which assumes the values x1, x2, ..., xn with respective probabilities p1, p2, ..., pn, then the mean of X is defined as
X̄ = p1x1 + p2x2 + ... + pnxn = Σ pixi.
It is usually denoted by μ.

Variance: If X is a discrete random variable which assumes the values x1, x2, ..., xn with respective probabilities p1, p2, ..., pn, then the variance of X is defined as
Var(X) = p1(x1 − X̄)² + p2(x2 − X̄)² + ... + pn(xn − X̄)² = Σ pi(xi − X̄)²,
where X̄ = Σ pixi is the mean of X. It is denoted by σ².

Mean of the Binomial Distribution: Mean μ = Σ r · nCr p^r q^(n−r) [here xr = r and pr = nCr p^r q^(n−r)], which simplifies to μ = np.

Variance of the Binomial Distribution: Variance σ² = Σ (r − np)² nCr p^r q^(n−r), which simplifies to σ² = npq.

Maximum Probability: Let X be a binomial variable with parameters n and p. Then P(X = r) = nCr p^r q^(n−r), r = 0, 1, 2, ..., n, and
P(X = r)/P(X = r − 1) = [(n − r + 1)/r] · (p/q) = 1 + [(n + 1)p − r]/(rq).

Case I: When (n + 1)p is not an integer.
Let m be the integer part and f the fractional part of (n + 1)p. Then
P(X = r)/P(X = r − 1) = 1 + (m + f − r)/(rq),
which is > 1 for r = 1, 2, ..., m and < 1 for r = m + 1, m + 2, ..., n. Thus
P(X = r) > P(X = r − 1) for r = 1, 2, ..., m, and
P(X = r) < P(X = r − 1) for r = m + 1, m + 2, ..., n,
so P(X = m) is the greatest among P(X = 0), P(X = 1), ..., P(X = n).

Case II: When (n + 1)p is an integer, say m = (n + 1)p. Then
P(X = r)/P(X = r − 1) > 1 for r = 1, 2, ..., m − 1,
P(X = r)/P(X = r − 1) = 1 for r = m,
P(X = r)/P(X = r − 1) < 1 for r = m + 1, ..., n.
Hence P(X = 0) < P(X = 1) < ... < P(X = m − 1) = P(X = m) > P(X = m + 1) > ... > P(X = n), i.e. P(X = m − 1) = P(X = m) is the greatest.

POISSON DISTRIBUTION
The Poisson distribution is a limiting case of the binomial distribution under the conditions that the number of trials n is infinitely large, the probability of success p is very small, and lim np = λ.
Definition: A random variable X is said to follow a Poisson distribution if it assumes only non-negative values and its probability mass function is given by
P(X = r) = e^(−λ) λ^r / r!, r = 0, 1, 2, ...
Here λ is the parameter of the distribution. In the Poisson distribution, Mean = Variance = λ.

NORMAL DISTRIBUTION
A random variable X is said to have a normal distribution with parameters μ (called the 'mean') and σ² if its density function is given by
f(x; μ, σ) = [1/(σ√(2π))] e^(−(x − μ)²/(2σ²)), −∞ < x < ∞, −∞ < μ < ∞, σ > 0.
The normal distribution is a continuous distribution. It is the limiting form of the binomial distribution as n → ∞ (with neither p nor q very small), and the limiting form of the Poisson distribution as λ → ∞.

Example 9: The mean and variance of a binomial variate X are 2 and 1 respectively. Find the probability that X takes a value greater than 1.

Solution: np = 2 and npq = 1, so q = 1/2, p = 1/2 and n = 4. Hence
P(X > 1) = 1 − P(X = 0) − P(X = 1) = (4C2 + 4C3 + 4C4)(1/2)^4 = 11/16.

Example 10: The mean of a binomial distribution is 15 and q = 1/4. Find the value of the S.D.
Solution: Here np = 15 and q = 1/4, so p = 3/4 and n = 15 × (4/3) = 20. Hence the variance npq = 20 × (3/4) × (1/4) = 15/4, and S.D. = √(15)/2.
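Examples 9 and 10 can be verified numerically; a short sketch (the helper `binom_pmf` is ours):

```python
from math import comb
from fractions import Fraction

# Example 9: np = 2 and npq = 1 give q = 1/2, p = 1/2, n = 4.
n, p = 4, Fraction(1, 2)
q = 1 - p

def binom_pmf(r):
    # P(X = r) for a binomial variate with parameters n, p
    return comb(n, r) * p**r * q**(n - r)

P_gt_1 = 1 - binom_pmf(0) - binom_pmf(1)
print(P_gt_1)          # 11/16

# Example 10: np = 15 and q = 1/4 give p = 3/4, n = 20.
variance = 20 * Fraction(3, 4) * Fraction(1, 4)
print(variance)        # 15/4, so S.D. = sqrt(15)/2
```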

