Statistics - Probability Multiplicative Theorem
For Independent Events
The theorem states that the probability of the simultaneous occurrence of two independent events is given by the product of their individual probabilities.
${P(A\ and\ B) = P(A) \times P(B) \\[7pt] P(AB) = P(A) \times P(B)}$

The theorem can be extended to three or more independent events:

${P(A \cap B \cap C) = P(A) \times P(B) \times P(C) \\[7pt] P(A, B\ and\ C) = P(A) \times P(B) \times P(C)}$

Example
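As a quick numerical check of the product rule, the sketch below simulates two independent events; the probabilities 0.3 and 0.5 are illustrative values, not taken from the text:

```python
import random

random.seed(0)

# Hypothetical probabilities of two independent events A and B
p_a, p_b = 0.3, 0.5

n = 100_000
both = 0
for _ in range(n):
    a = random.random() < p_a  # does A occur on this trial?
    b = random.random() < p_b  # does B occur? (separate, independent draw)
    if a and b:
        both += 1

estimate = both / n
print(estimate)  # close to p_a * p_b = 0.15
```

Because the two draws are independent, the empirical frequency of "A and B" converges to the product of the individual probabilities.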
Problem Statement:
A college has to appoint a lecturer who must be a B.Com., an MBA, and a Ph.D., the probabilities of which are ${\frac{1}{20}}$, ${\frac{1}{25}}$, and ${\frac{1}{40}}$ respectively. Find the probability that the college finds such a person to appoint.
Solution:
Probability of a person being a B.Com., P(A) = ${\frac{1}{20}}$

Probability of a person being an MBA, P(B) = ${\frac{1}{25}}$

Probability of a person being a Ph.D., P(C) = ${\frac{1}{40}}$

Using the multiplicative theorem for independent events:
${P(A, B\ and\ C) = P(A) \times P(B) \times P(C) \\[7pt] = \frac{1}{20} \times \frac{1}{25} \times \frac{1}{40} \\[7pt] = 0.05 \times 0.04 \times 0.025 \\[7pt] = 0.00005}$

For Dependent Events (Conditional Probability)
As defined earlier, dependent events are those where the occurrence or non-occurrence of one event affects the outcome of the next event. For such events the multiplicative theorem stated above is not applicable. The probability associated with such events is called conditional probability and is given by
P(A/B) = ${\frac{P(AB)}{P(B)}}$ or ${\frac{P(A \cap B)}{P(B)}}$
Read P(A/B) as the probability of occurrence of event A when event B has already occurred.

Similarly, the conditional probability of B given A is
P(B/A) = ${\frac{P(AB)}{P(A)}}$ or ${\frac{P(A \cap B)}{P(A)}}$
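The definition translates directly into code. A minimal helper (the function name is my own) that computes P(A|B) from the joint probability and P(B):

```python
def conditional(p_a_and_b, p_b):
    """P(A|B) = P(A and B) / P(B); requires P(B) > 0."""
    if p_b <= 0:
        raise ValueError("P(B) must be positive")
    return p_a_and_b / p_b

# Example: P(A and B) = 1/4 and P(B) = 1/2 give P(A|B) = 0.5
print(conditional(0.25, 0.5))  # 0.5
```

The guard on P(B) matters: conditioning on an event of probability zero is undefined.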
Example
Problem Statement:
A coin is tossed twice, and the toss resulted in one head and one tail. What is the probability that the first throw resulted in a tail?
Solution:
The sample space of a coin tossed two times is S = {HH, HT, TH, TT}.

Let Event A be that the first throw results in a tail.

Let Event B be that one tail and one head occurred, so B = {HT, TH}.

${P(B) = \frac{P(HT, TH)}{P(HH, HT, TH, TT)} = \frac{2}{4} = \frac{1}{2} \\[7pt] P(A \cap B) = \frac{P(TH)}{P(HH, HT, TH, TT)} = \frac{1}{4} \\[7pt] So\ P(A/B) = \frac{P(A \cap B)}{P(B)} \\[7pt] = \frac{\frac{1}{4}}{\frac{1}{2}} \\[7pt] = \frac{1}{2} = 0.5}$
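The coin-toss result can be verified by enumerating the sample space directly; a sketch using exact fractions:

```python
from fractions import Fraction
from itertools import product

# Sample space of two coin tosses: HH, HT, TH, TT
space = list(product("HT", repeat=2))

# Event A: first toss is a tail; Event B: exactly one head and one tail
A = {s for s in space if s[0] == "T"}
B = {s for s in space if set(s) == {"H", "T"}}

p_b = Fraction(len(B), len(space))            # |{HT, TH}| / 4 = 1/2
p_a_and_b = Fraction(len(A & B), len(space))  # |{TH}| / 4 = 1/4
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)  # 1/2
```

Counting outcomes in the intersection and dividing by the count in B is exactly the conditional-probability formula applied to an equally likely sample space.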