MATH 340, Study Guide for the Final Exam
12/05/2015
Test coverage
- Combinatorial Analysis
- The Basic Principle of Counting (1.2)
- Permutations, Combinations, Multinomial Coefficients (1.3-5)
- The number of Integer Solutions of Equations (1.6: Propositions 6.1, 6.2)
- Axioms of Probability
- Sample Space; Outcomes; Events. Operations on Events (2.2)
- Axioms of Probability (2.3)
- Properties of Probability (2.4)
- "Classical Definition of Probability": Sample Spaces with Equally Likely Outcomes (2.5: Examples 5a-j,m,n; skip the rest)
- Probability as a Measure of Belief (2.7: read this section)
- Skip Section 2.6
- Conditional Probability and Independence
- Conditional probability: definition and basic properties (3.1-2)
- Formula of compound probabilities. Bayes' formula (3.3: Examples 3acdi; Prop. 3.1, Examples 3klmn; rest can be skipped)
- Independent events (3.4: Examples 4abce; Prop. 4.1)
- Further Properties of Conditional Probability (3.5; Formula 5.1)
- Independent Trials (3.4: Definition of independent trials; Examples 4f,h,j; rest can be skipped)
- Random Variables
- Random variables: examples (4.1)
- Discrete random variables: probability mass function, cumulative distribution function (4.2)
- Expectation of a random variable (4.3)
- Expectation of a function of a random variable (4.4)
- Variance (4.5)
- The Bernoulli and Binomial random variables (4.6)
- The Poisson random variable (4.7)
- Geometric random variable (4.8.1)
- Properties of the Cumulative Distribution Function (4.9)
- Use the end-of-chapter summary for review
- Continuous Random Variables
- Introduction: concept of probability density (5.1)
- Expectation and Variance (5.2)
- Uniform random variables (5.3)
- Normal random variables (5.4)
- The normal approximation of the binomial distribution; "0.5 correction" (5.4.1)
- Exponential random variables (5.5); skip 5.5.1 and 5.6
- Distribution of a function of a random variable (5.7: Example 7a, problem 5.40)
- Jointly distributed random variables
- Joint distribution functions (6.1)
- Independent random variables (6.2)
- Distributions of sums of independent random variables (6.3)
- Properties of expectation
- Expectations of sums (7.2)
- Covariance, variance of sums (7.4)
- Limit theorems
- The central limit theorem. Approximation of sums of independent identically distributed random variables using the normal curve (8.3)
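The normal approximation of the binomial distribution (5.4.1) with the "0.5 correction" can be checked numerically. The sketch below uses only the Python standard library; the function names are my own, not from the textbook:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF expressed through the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def binomial_cdf(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def binomial_cdf_approx(k, n, p):
    """De Moivre-Laplace approximation with the 0.5 continuity correction:
    P(X <= k) is approximated by the normal CDF evaluated at k + 0.5."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return normal_cdf(k + 0.5, mu, sigma)

# Example: X ~ Binomial(100, 0.5); compare P(X <= 55) exactly and approximately.
exact = binomial_cdf(55, 100, 0.5)
approx = binomial_cdf_approx(55, 100, 0.5)
print(exact, approx)
```

With n = 100 and p = 0.5 the two values agree to about two decimal places; dropping the 0.5 correction makes the approximation noticeably worse for small n.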
Key concepts
- Basic principle of counting, permutations, combinations, binomial theorem
- Sample space, outcomes, events. Probability as a function on events
- Classical definition of probability
- Conditional probability, independence
- Bayes' formula
- Discrete random variables, probability mass functions, cumulative distribution functions
- Expectation, variance in the discrete case
- Formula for the expectation of a function (w/o proof)
- Continuous random variables, probability densities, cumulative distribution functions
- Expectation, variance; expectation of a function of a random variable
- Continuous probability distributions: uniform, normal, exponential, gamma
- Discrete probability distributions: Bernoulli, binomial, Poisson
- The DeMoivre-Laplace limit theorem and its use
- Joint distribution functions, densities. Marginal densities
- Independence of random variables. Equivalent definitions:
product rules for probabilities, density functions and CDFs
- Convolution of density functions and how to compute the density of a sum
- Computing the density of other combinations of random variables (XY, X/Y, ...) using the CDF
- Expectations, variances applied to the case of several random variables
- Covariance, variance of the sum formula
- The central limit theorem
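The convolution formula f_Z(z) = ∫ f_X(x) f_Y(z − x) dx for the density of a sum Z = X + Y of independent random variables can be illustrated with two Uniform(0, 1) variables, whose sum has the triangular density f_Z(z) = z on [0, 1] and 2 − z on [1, 2]. A minimal numerical sketch (not from the textbook):

```python
def uniform_pdf(x):
    """Density of Uniform(0, 1)."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def sum_pdf(z, steps=10_000):
    """Numerical convolution (Riemann sum) over the support of X:
    f_Z(z) = integral of f_X(x) * f_Y(z - x) dx."""
    dx = 1.0 / steps
    return sum(uniform_pdf(i * dx) * uniform_pdf(z - i * dx)
               for i in range(steps)) * dx

# The triangular density predicts f_Z(0.5) = 0.5, f_Z(1.0) = 1.0, f_Z(1.5) = 0.5.
print(sum_pdf(0.5), sum_pdf(1.0), sum_pdf(1.5))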
Basic types of random variables
(you need to know the expressions for probability mass functions/densities, be able to derive basic
properties (expectations/variances), and know how to use them in the context of problems)
- Bernoulli
- Binomial
- Poisson
- Geometric
- Uniform
- Exponential
- Normal
- Gamma
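One way to self-check the expectation/variance formulas for the distributions above is by simulation. A sketch using the standard library (the inversion formula for the geometric variable and the tolerance choices are my own):

```python
import math
import random
import statistics

random.seed(2015)  # fixed seed so the check is reproducible
N = 200_000

# Exponential(lam): theory says E[X] = 1/lam, Var(X) = 1/lam^2.
lam = 2.0
expo = [random.expovariate(lam) for _ in range(N)]
expo_mean = statistics.fmean(expo)
expo_var = statistics.pvariance(expo)

# Geometric(p) on {1, 2, ...} by inversion of the CDF:
# floor(ln U / ln(1 - p)) + 1 with U ~ Uniform(0, 1].
# Theory says E[X] = 1/p, Var(X) = (1 - p)/p^2.
p = 0.3
geom = [math.floor(math.log(1.0 - random.random()) / math.log(1.0 - p)) + 1
        for _ in range(N)]
geom_mean = statistics.fmean(geom)
geom_var = statistics.pvariance(geom)

print(expo_mean, expo_var)   # should be near 0.5 and 0.25
print(geom_mean, geom_var)   # should be near 1/0.3 and 0.7/0.09
```

Simulation only confirms the formulas to sampling error; for the exam you should be able to derive them from the pmf/density directly.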
List of theory questions
- Give the formulations of the three axioms of probability. Derive other properties of probability
based on the axioms
- Derive formulas for the probability of a union of several events; prove Bonferroni's inequality
for the intersection
- Compute expectations or variances for the random variables from the list above
- State the formal definition of conditional probability. Derive formulas of total probability
(the rule of conditioning) and Bayes' rule from the definitions
- Derive a probability identity using conditional probability (see problems TH 1, 5, 25; ST 17,
21, 25 in Chapter 3)
- Verify a property of discrete random variables (see problems TH 10, 19, 27; ST 2, 5, 20 in Chapter 4)
- Verify a property of expectation, variance (see problems TH 7, 8, 18 in Chapter 5)
- Derive an expression for the density of a function of X, where X has known distribution
(see problems TH 29, 30; ST 16 in Chapter 5)
- Derive the formula for the density of a sum of two independent random variables (Section 6.3)
- Show that the sum of two normal random variables has a normal distribution (Section 6.3, Proposition 3.2)
- Show that the sum of two gamma random variables has a gamma distribution (Section 6.3, Proposition 3.1)
- Show that the sum of two Poisson random variables has a Poisson distribution (Section 6.3)
- Show that the sum of two binomial random variables has a binomial distribution (Section 6.3)
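For the Poisson question above, the key step is the convolution of the two mass functions followed by the binomial theorem; a sketch of the derivation (independent X ~ Poisson(λ), Y ~ Poisson(μ), Z = X + Y):

```latex
P(Z = n) = \sum_{k=0}^{n} P(X = k)\,P(Y = n - k)
         = \sum_{k=0}^{n} \frac{e^{-\lambda}\lambda^{k}}{k!}
           \cdot \frac{e^{-\mu}\mu^{\,n-k}}{(n-k)!}
         = \frac{e^{-(\lambda+\mu)}}{n!}
           \sum_{k=0}^{n} \binom{n}{k}\lambda^{k}\mu^{\,n-k}
         = \frac{e^{-(\lambda+\mu)}(\lambda+\mu)^{n}}{n!},
```

so Z ~ Poisson(λ + μ). The binomial case follows the same pattern, with Vandermonde's identity playing the role of the binomial theorem.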
Refer to homework assignments and quizzes for possible types of problems.