MATH 340, Study Guide for the Final Exam
5/10/2014
Test coverage
- Combinatorial Analysis
- Introduction: the Basic Notions of Probability; Birthday Problem (2.5, [5i])
- The Basic Principle of Counting (1.2)
- Permutations, Combinations, Multinomial Coefficients (1.3-5)
- The number of Integer Solutions of Equations (1.6)
- Axioms of Probability
- Sample Space and Events (2.2)
- Axioms of Probability (2.3)
- Properties of Probability (2.4)
- "Classical Definition of Probability"; Sample Spaces with Equally Likely Outcomes 2.5
- Probability as a Measure of Belief (2.7: read this section)
- Skip Section 2.6
- Conditional Probability and Independence
- Conditional probability: definition and basic properties (3.1-2)
- Formula of compound probabilities. Bayes' formula (3.3)
- Independent events (3.4)
- Further Properties of Conditional Probability (3.5)
- Independent Trials (3.4)
- Random Variables
- Random variables: examples (4.1)
- Discrete random variables: probability mass function, cumulative distribution function (4.2)
- Expectation of a random variable (4.3)
- Expectation of a function of a random variable (4.4)
- Variance (4.5)
- The Bernoulli and Binomial random variables (4.6)
- The Poisson random variable (4.7)
- Other discrete probability distributions: Geometric, Negative Binomial (4.8.1, 4.8.2)
- Properties of the Cumulative Distribution Function (4.9)
- End-of-chapter Summary (to be used for review)
- Continuous Random Variables
- Introduction: concept of probability density (5.1)
- Expectation and Variance (5.2)
- Uniform random variables (5.3)
- Normal random variables (5.4)
- The normal approximation of the binomial distribution; "0.5 correction" (5.4.1)
- Exponential random variables (5.5); skip 5.5.1 and 5.6
- Distribution of a function of a random variable (5.7: Example 7a, problem 5.40)
- Jointly distributed random variables
- Joint distribution functions (6.1)
- Independent random variables (6.2)
- Properties of expectation
- Covariance, variance of sums (7.4)
- Limit theorems
- Weak law of large numbers. Chebyshev's and Markov's inequalities (8.2)
- The central limit theorem. Approximation of sample means by normal random variables (8.3)
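As a concrete illustration of the material in 5.4.1 and 8.3, here is a short Python sketch comparing an exact binomial probability with its De Moivre-Laplace normal approximation using the "0.5 correction." The function names (`binom_cdf`, `phi`, `normal_approx`) are mine, not the textbook's.

```python
from math import comb, erf, sqrt

def binom_cdf(n, p, k):
    """Exact P(X <= k) for X ~ Binomial(n, p), by summing the pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def phi(z):
    """Standard normal CDF, written in terms of the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def normal_approx(n, p, k):
    """Normal approximation of P(X <= k) with the 0.5 continuity correction."""
    mu = n * p
    sigma = sqrt(n * p * (1 - p))
    return phi((k + 0.5 - mu) / sigma)

# Example: X ~ Binomial(100, 0.5); compare P(X <= 55) exactly vs. approximately.
exact = binom_cdf(100, 0.5, 55)
approx = normal_approx(100, 0.5, 55)
print(exact, approx)  # the two values agree to about two decimal places
```

Dropping the `+ 0.5` and rerunning shows why the correction matters: the uncorrected approximation is noticeably worse for moderate n.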
Key concepts
- Basic principle of counting, permutations, combinations, binomial theorem
- Sample space, outcomes, events. Probability as a function defined on events
- Classical definition of probability
- Conditional probability, independence
- Bayes' formula
- Discrete random variables, probability mass functions, cumulative distribution functions
- Expectation, variance in the discrete case
- Formula for the expectation of a function (w/o proof)
- Continuous random variables, probability densities, cumulative distribution functions
- Expectation, variance; expectation of a function of a random variable
- Continuous probability distributions: uniform, normal, exponential, gamma
- Discrete probability distributions: Bernoulli, binomial, Poisson
- The de Moivre-Laplace limit theorem and its use
- Joint distribution functions, densities. Marginal densities
- Independence of random variables. Equivalent definitions:
product rules for probabilities, density functions and CDFs
- Convolution of density functions and how to compute the density of a sum
- Computing densities of other combinations of random variables (XY, X/Y, ...) via the CDF
- Expectations, variances applied to the case of several random variables
- Covariance, variance of the sum formula
- Chebyshev and Markov inequalities
- Weak law of large numbers
- The central limit theorem
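To make the total-probability and Bayes items above concrete, here is a minimal numerical sketch. The screening-test numbers are hypothetical, chosen only for illustration, and the variable names are mine.

```python
# Hypothetical prior and test characteristics (illustrative values only).
p_disease = 0.01                # P(D)
p_pos_given_disease = 0.95      # P(+|D)
p_pos_given_healthy = 0.05      # P(+|D^c), the false-positive rate

# Formula of total probability: P(+) = P(+|D)P(D) + P(+|D^c)P(D^c)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' formula: P(D|+) = P(+|D)P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # prints 0.161
```

Note the typical exam point: even with a fairly accurate test, the posterior P(D|+) is small when the prior P(D) is small.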
Basic types of random variables
(You need to know the expressions for the probability mass functions/densities, be able to derive
basic properties (expectations/variances), and know how to use them in the context of problems.)
- Bernoulli
- Binomial
- Poisson
- Geometric
- Negative binomial
- Uniform
- Exponential
- Normal
- Gamma
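One way to check your derived expectations and variances is to confirm them numerically against a truncated sum of the pmf. Below is a sketch for the geometric distribution (in the "number of trials until the first success" convention used in 4.8.1); the function names are mine.

```python
def geometric_pmf(p, k):
    """P(X = k) for X ~ Geometric(p), k = 1, 2, ... (trials until first success)."""
    return p * (1 - p) ** (k - 1)

def truncated_moments(p, kmax=10_000):
    """Approximate E[X] and Var(X) by summing k*p(k) and k^2*p(k) up to kmax."""
    mean = sum(k * geometric_pmf(p, k) for k in range(1, kmax + 1))
    second = sum(k * k * geometric_pmf(p, k) for k in range(1, kmax + 1))
    return mean, second - mean ** 2

p = 0.2
mean, var = truncated_moments(p)
# Closed forms you should be able to derive: E[X] = 1/p, Var(X) = (1-p)/p^2
print(mean, 1 / p)          # both ~5.0
print(var, (1 - p) / p**2)  # both ~20.0
```

The same truncation trick works for the Poisson and negative binomial; for the continuous families you would integrate the density instead.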
List of theory questions
- Give the formulations of the three axioms of probability. Derive other properties of probability
based on the axioms
- Derive formulas for the probability of a union of several events; prove Bonferroni's inequality
for the intersection
- Compute expectations or variances for the random variables from the list above
- State the formal definition of conditional probability. Derive the formula of total probability
(the rule of conditioning) and Bayes' rule from the definition
- Derive a probability identity using conditional probability (see problems TH 1, 5, 25; ST 17,
21, 25 in Chapter 3)
- Verify a property of discrete random variables (see problems TH 10, 19, 27; ST 2, 5, 20 in Chapter 4)
- Verify a property of expectation, variance (see problems TH 7, 8, 18 in Chapter 5)
- Derive an expression for the density of a function of X, where X has known distribution
(see problems TH 29, 30; ST 16 in Chapter 5)
- Show that the sum of two gamma random variables has a gamma distribution (Section 6.3, Proposition 3.1)
- Show that the sum of two normal random variables has a normal distribution (Section 6.3, Proposition 3.2)
- Derive a formula for the density of the chi-square random variable (Section 6.3, Example 3b)
- Derive Markov's and Chebyshev's inequalities from basic principles (Section 8.2)
- Formulate the weak law of large numbers and the central limit theorem (Sections 8.2, 8.3)
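For the Markov/Chebyshev item, the derivations are short enough to memorize. A sketch, for a nonnegative random variable X and a > 0 (with mu = E[X] and sigma^2 = Var(X) in the second line):

```latex
% Markov: split E[X] over the event {X >= a} and its complement; both pieces
% are nonnegative, and X >= a on the first, so E[X] >= a\,P(X \ge a), i.e.
P(X \ge a) \le \frac{E[X]}{a}.

% Chebyshev: apply Markov to the nonnegative variable (X-\mu)^2 with threshold a^2:
P(|X - \mu| \ge a) = P\!\left((X-\mu)^2 \ge a^2\right)
  \le \frac{E[(X-\mu)^2]}{a^2} = \frac{\sigma^2}{a^2}.
```

The weak law of large numbers then follows by applying Chebyshev to the sample mean, whose variance sigma^2/n tends to 0.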
See homework assignments and quizzes for the possible types of problems.