Introduction to Probability Theory
Purpose of Course
This course will introduce you to the fundamentals of probability theory and random processes. The theory of probability was originally developed in the 17th century by two great French mathematicians, Blaise Pascal and Pierre de Fermat, to understand gambling. Today, the theory of probability has found many applications in science and engineering. Engineers use data from manufacturing processes to sample characteristics of product quality in order to improve the products being produced. Pharmaceutical companies perform experiments to determine the effect of a drug on humans and use the results to make decisions about treatment of illnesses, while economists observe the state of the economy over periods of time and use the information to forecast the economic future.
In this course, you will learn the basic terminology and concepts of probability theory, including random experiments, sample spaces, discrete distributions, probability density functions, expected values, and conditional probability. You will also learn about the fundamental properties of several special distributions, including the binomial, geometric, normal, exponential, and Poisson distributions, as well as how to use them to model real-life situations and solve applied problems.
Course Information
Primary Resources: This course is comprised of a range of different free online materials. However, the course makes primary use of the following materials:
 Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics Lecture Notes (HTML)
 YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences Lecture Series (YouTube)
 Khan Academy: Salman Khan’s Probability Lecture Series (YouTube)
 Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability (PDF)
In order to pass this course, you will need to complete the final exam and earn a score of 70% or higher. Your score on the exam will be calculated as soon as you finish it. If you do not pass the exam, you may take it again.
Note that you will only receive an official grade on your final exam. However, in order to adequately prepare for it, you will need to work through all the readings, lectures, and practice assignments in the course.
Time Commitment: This course should take you a total of 115.25 hours. Each unit includes a time advisory that lists the amount of time you are expected to spend on each subunit. These advisories should help you plan your time accordingly. It may be useful to take a look at the time advisories, determine how much time you have over the next few weeks to complete each unit, and then set goals for yourself. For example, Unit 1 should take you approximately 18.75 hours. Perhaps you can sit down with your calendar and decide to complete Subunit 1.1 (a total of 7 hours) on Monday and Tuesday nights, Subunit 1.2 (a total of 10.5 hours) on Wednesday and Thursday nights, etc.
Tips/Suggestions: Make sure to review the learning outcomes and time estimates for the course and those set out for each unit. Keep these in mind as you work through the course materials. Also, be sure to take notes on each of the resources in the course. These notes will be a useful review as you study for your final exam.
This course features a number of Khan Academy™ videos. Khan Academy™ has a library of over 3,000 videos covering a range of topics (math, physics, chemistry, finance, history and more), plus over 300 practice exercises. All Khan Academy™ materials are available for free at www.khanacademy.org.

Learning Outcomes
 define probability, sample space, events, and probability functions;
 use combinations to evaluate the probability of outcomes in coin-flipping experiments;
 calculate the probability of union and intersection of events and conditional probability;
 apply Bayes’ theorem to simple situations;
 calculate the expected values of discrete and continuous random variables;
 determine the distribution of the sums of random variables;
 calculate cumulative distributions and marginal distributions;
 use random processes to model and predict phenomena governed by binomial, multinomial, geometric, exponential, normal, and Poisson distributions; and
 explain and use the law of large numbers and the central limit theorem.
Course Requirements
√ have access to a computer;
√ have continuous broadband Internet access;
√ have the ability/permission to install plugins or software (e.g., Adobe Reader or Flash);
√ have the ability to download and save files and documents to a computer;
√ have the ability to open Microsoft files and documents (.doc, .ppt, .xls, etc.);
√ have competency in the English language;
√ have read the Saylor Student Handbook; and
√ have completed MA101, MA102, MA103, MA211, and MA221, or their equivalents.
Unit Outline

Unit 1: Introduction to Probability
This unit will introduce you to fundamental concepts of probability theory. You will learn the definitions of probability, random variables, outcome space, events, and probability function. These concepts will be explored through simple chance experiments with discrete outcomes, such as tossing a coin or rolling a die; random variables represent the outcomes of such chance experiments. You will also learn about four basic set operations: union, intersection, difference, and complement. This unit will also introduce you to the concept of conditional probability (i.e., the probability of event A, given the occurrence of some other event B) and to Bayes’ Theorem, which is one of the most celebrated theorems in the theory of probability.
Unit 1 Time Advisory
Unit 1 Learning Outcomes

1.1 Probability and Set Operations
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Jeremy Orloff and Professor Jonathan Bloom’s Math 18.05: Introduction to Probability and Statistics: “Lecture 1: Counting and Sets” and “Lecture 2: Probability: Terminology and Examples”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Jeremy Orloff and Professor Jonathan Bloom’s Math 18.05: Introduction to Probability and Statistics: “Lecture 1: Counting and Sets” (PDF) and “Lecture 2: Probability: Terminology and Examples” (PDF)
Instructions: Read both sets of lecture notes for an introduction to the basic concepts of probability. If one performs an experiment that has random outcomes, the set of all possible outcomes, S, is called the sample space, and the elements of S are called sample points. Subsets of S are called events and are usually denoted by letters such as A, B, E, and F.
When all outcomes are equally likely, the probability of an event A, usually denoted by P(A), is defined by k/n, where n is the total number of possible outcomes of the experiment and k is the number of outcomes that result in the occurrence of A. Since events are sets, one needs to know about set operations. These lectures will give you more details about sets and how to compute probabilities of events.
Reading these lecture notes should take approximately 1 hour.
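The counting rule above can be made concrete with a short sketch. The die-rolling example below is an assumed illustration of the k/n definition, not an example taken from the lecture notes:

```python
from fractions import Fraction

# Illustration of P(A) = k/n for an experiment with equally likely
# outcomes: one roll of a fair six-sided die (assumed example).
S = {1, 2, 3, 4, 5, 6}            # sample space
A = {x for x in S if x % 2 == 0}  # event: "roll an even number"

p_A = Fraction(len(A), len(S))    # k = |A|, n = |S|
print(p_A)  # 1/2
```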
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Jeremy Orloff, Jonathan Bloom, and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 1: Introduction: Probability and Counting”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 1: Introduction: Probability and Counting” (YouTube)
Instructions: Watch this video. It will introduce you to the basic concepts of probability, including outcome space, events, and probability functions. Several examples will be given to show you how to compute probabilities of events.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Lecture: Khan Academy: Salman Khan’s Probability Lecture Series: “Probability (1)” and “Probability (2)”
Link: Khan Academy: Salman Khan’s Probability Lecture Series: “Probability (1)” (YouTube) and “Probability (2)” (YouTube)
Instructions: Watch these videos. The first video defines probability and the second explores the outcomes from coin flips.
Watching these videos and taking notes should take approximately 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License. It is attributed to the Khan Academy.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 1, Section 1.2: Discrete Probability Distributions”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 1, Section 1.2: Discrete Probability Distributions” (PDF)
Instructions: Read pages 18-29 of Section 1.2 “Discrete Probability Distributions” in Chapter 1. This reading will introduce you to random variables and probability distributions. You will also learn about several important properties of probability distributions. Lastly, you will be shown how to use distribution functions to compute probabilities of events, their unions, and their intersections.
Reading this chapter and taking notes should take approximately 2 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 1, Section 1.2: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 1, Section 1.2: Exercises” (PDF)
Instructions: Go to pages 35-40 and complete exercises 2, 4, 5, 6, 7, 9, 10, 19, and 21. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 2 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

1.2 Properties of Probability and Counting Techniques
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 2: Properties of Probability”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 2: Properties of Probability” (PDF)
Instructions: This lecture will introduce you to the basic properties of probability. It will also teach you the basic principles of counting, such as the multiplication rule, permutations, and combinations.
A permutation of n objects taken r at a time is an ordered arrangement of r objects chosen from the n objects. The number of such permutations is usually denoted by nPr.
A combination of n objects taken r at a time is a selection of r objects from the set of n objects without regard to order. The number of such combinations is usually denoted by nCr.
Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.
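As a quick check on these definitions, Python’s standard library exposes both counts directly. The numbers below are an assumed worked example, not taken from the lecture:

```python
import math

n, r = 5, 3
# Permutations: ordered selections of r objects from n,
# nPr = n! / (n - r)!
nPr = math.perm(n, r)
# Combinations: unordered selections of r objects from n,
# nCr = n! / (r! * (n - r)!)
nCr = math.comb(n, r)
print(nPr, nCr)  # 60 10
```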
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: Khan Academy: Salman Khan’s Probability Lecture Series: “Probability Using Combinations”
Link: Khan Academy: Salman Khan’s Probability Lecture Series: “Probability Using Combinations” (YouTube)
Instructions: Watch this video. It will help you learn how to calculate probability from coin flips by using combinations.
Watching this video and taking notes should take approximately 15 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License. It is attributed to the Khan Academy.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 2: Probability Functions” and “Lecture 3: Permutations”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 2: Probability Functions” (YouTube) and “Lecture 3: Permutations” (YouTube)
Instructions: Watch these videos. They will introduce you to probability functions and counting techniques, including permutations and combinations.
Watching these videos and taking notes should take approximately 3 hours.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 3, Section 3.1: Permutations” and “Chapter 3, Section 3.2: Combinations”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 3, Section 3.1: Permutations” (PDF) and “Chapter 3, Section 3.2: Combinations” (PDF)
Instructions: Read pages 75-84 of Section 3.1 “Permutations” and pages 92-101 of Section 3.2 “Combinations” in Chapter 3. This reading will introduce you to the two important counting techniques involving permutations and combinations.
Reading these textbook sections and taking notes should take approximately 2 hours and 45 minutes.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 3, Section 3.1: Exercises” and “Chapter 3, Section 3.2: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 3, Section 3.1: Exercises” (PDF) and “Chapter 3, Section 3.2: Exercises” (PDF)
Instructions: Go to the exercises for Section 3.1 on pages 88-89 and complete exercises 1, 2, 3, 5, 6, 10, 12, and 13. Then go to the exercises for Section 3.2 on pages 113-116 and complete exercises 2, 10, 11, 12, 19, and 20. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 3 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

1.3 Union of Events
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 3: Probabilities of Unions of Events”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 3: Probabilities of Unions of Events” (PDF)
Instructions: Read this lecture, which will help you learn how to calculate the probability of unions of events. It will also review some of the counting formulas involving permutations and combinations and will discuss multinomial coefficients.
Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 4: Probability Functions (Continued)”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 4: Probability Functions (Continued)” (YouTube)
Instructions: Watch this video for an introduction to more properties of probability functions and how to use them to solve problems.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

1.4 Conditional Probability
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 4: Conditional Probability”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 4: Conditional Probability” (PDF)
Instructions: Read this lecture to learn about the definition of conditional probability. Calculations of conditional probability are illustrated via a simple example. This lecture will remind you about unions of events and how to compute their probabilities. It will define the conditional probability of an event A given another event B, denoted by P(A|B), which is the probability that A occurs given that B has already occurred. It will also show how conditional probability is used to compute the probability of intersections of events.
Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.
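The definition P(A|B) = P(A and B) / P(B) can be verified by counting. The single-die example below is assumed for illustration and is not taken from the lecture:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # one roll of a fair die (assumed example)
A = {2, 4, 6}            # event A: "roll is even"
B = {4, 5, 6}            # event B: "roll is greater than 3"

# P(A|B) = P(A and B) / P(B); with equally likely outcomes this
# reduces to counting the outcomes of A that lie inside B.
p_A_given_B = Fraction(len(A & B), len(B))
print(p_A_given_B)  # 2/3
```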
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: Khan Academy: Salman Khan’s Probability Lecture Series: “Conditional Probability and Combinations”
Link: Khan Academy: Salman Khan’s Probability Lecture Series: “Conditional Probability and Combinations” (YouTube)
Instructions: Watch this video. It will help you learn how to calculate the conditional probability of randomly picking a fair coin out of a mixture of fair and biased coins, given that the coin came up heads 4 times in 6 flips.
Watching this video and taking notes should take approximately 30 minutes.
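The style of computation in the video can be sketched as follows. The specific mixture below (one fair coin plus one biased coin with P(heads) = 0.8, each equally likely to be drawn) is an assumed setup for illustration; the video’s exact numbers may differ:

```python
from math import comb

def binom(p, k, n):
    """Probability of exactly k heads in n flips when P(heads) = p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

prior_fair = 0.5               # assumed: fair and biased coins equally likely
prior_biased = 0.5
lik_fair = binom(0.5, 4, 6)    # P(4 heads in 6 flips | fair coin)
lik_biased = binom(0.8, 4, 6)  # P(4 heads in 6 flips | biased coin)

# Bayes' theorem: P(fair | 4 heads in 6 flips)
posterior_fair = (lik_fair * prior_fair) / (
    lik_fair * prior_fair + lik_biased * prior_biased
)
print(round(posterior_fair, 3))  # 0.488
```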
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License. It is attributed to the Khan Academy.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 5: Conditional Probability” and “Lecture 6: Conditional Probability (Continued)”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 5: Conditional Probability” (YouTube) and “Lecture 6: Conditional Probability (Continued)” (YouTube)
Instructions: Watch these videos. They will introduce you to conditional probability, its properties, and how to use it to solve problems.
Watching these videos and taking notes should take approximately 3 hours.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

1.5 Independent Events
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 5: Independence of Events, Bayes’ Theorem”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 5: Independence of Events, Bayes’ Theorem” (PDF)
Instructions: Read this lecture for an introduction to one of the most important formulas in probability theory: Bayes’ Theorem. This reading will demonstrate the power of Bayes’ theorem through several real-world examples. You will first learn that two events A and B are independent if P(A and B) = P(A)P(B), which means that the occurrence of one of the two events, for instance A, does not affect the occurrence of the other event, B. Another interpretation using conditional probability will be given as well, and, finally, an important law used in Bayes’ formula, called the law of total probability, will be presented.
Reading this lecture and taking notes should take approximately 1 hour.
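The independence criterion P(A and B) = P(A)P(B) can be checked by brute force. The two-dice events below are an assumed example, not taken from the lecture:

```python
from fractions import Fraction
from itertools import product

# Check of the independence criterion on two fair dice:
# A = "first die is even", B = "second die shows a 6".
S = list(product(range(1, 7), repeat=2))   # all 36 equally likely outcomes
A = {s for s in S if s[0] % 2 == 0}
B = {s for s in S if s[1] == 6}

def P(E):
    """Probability of event E under the uniform distribution on S."""
    return Fraction(len(E), len(S))

print(P(A & B) == P(A) * P(B))  # True
```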
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 7: Independent Events”
Link: YouTube: UCLA: Professor David Welsbart’s Math 3C: Math and Probability for Life Sciences: “Lecture 7: Independent Events” (YouTube)
Instructions: Watch this video for an introduction to independent events.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. David Welsbart, and the original version can be found here.

1.6 Bayes’ Theorem
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 7: Bayes’ Formula”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 7: Bayes’ Formula” (PDF)
Instructions: Read this lecture, which continues the material from the previous subunit. The lecture starts with the law of total probability and then shows how Bayes’ formula is just a consequence of that law combined with the definition of conditional probability. Bayes’ formula allows us to find the probability of the first stage of an experiment given that we know the outcome of the second stage.
Reading this lecture and taking notes should take approximately 1 hour.
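A hypothetical two-stage experiment can mirror this setup: stage one picks an urn, stage two draws a ball, and Bayes’ formula recovers the probability of the first stage from the second-stage outcome. The urn contents below are assumed for illustration and do not come from the lecture:

```python
from fractions import Fraction

# Assumed setup: pick urn I or urn II with equal probability, then draw
# one ball. Urn I holds 3 red and 1 blue; urn II holds 1 red and 3 blue.
p_urn = {"I": Fraction(1, 2), "II": Fraction(1, 2)}
p_red_given = {"I": Fraction(3, 4), "II": Fraction(1, 4)}

# Law of total probability: P(red) = sum over urns of P(red|urn)P(urn)
p_red = sum(p_red_given[u] * p_urn[u] for u in p_urn)

# Bayes' formula: probability of the first stage (urn I) given the
# second-stage outcome (a red ball was drawn).
p_urnI_given_red = p_red_given["I"] * p_urn["I"] / p_red
print(p_urnI_given_red)  # 3/4
```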
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: Khan Academy: Salman Khan’s Probability Lecture Series: “Probability (8)”
Link: Khan Academy: Salman Khan’s Probability Lecture Series: “Probability (8)” (YouTube)
Instructions: Watch this video, which gives another overview of Bayes’ Theorem.
Watching this video and taking notes should take approximately 15 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License. It is attributed to the Khan Academy.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 4, Section 4.1: Discrete Conditional Probability”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 4, Section 4.1: Discrete Conditional Probability” (PDF)
Instructions: Read pages 133-147 of Section 4.1 “Discrete Conditional Probability” in Chapter 4. This reading will revisit conditional probability, which you saw in Subunit 1.4, then explain Bayes’ Rule and discuss the independence of events.
Reading this chapter and taking notes should take approximately 2 hours and 30 minutes.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 4, Section 4.1: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 4, Section 4.1: Exercises” (PDF)
Instructions: Go to the exercises for Section 4.1 on pages 150-155 and complete exercises 2, 3, 4, 5, 9, 18, and 29. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 2 hours and 30 minutes.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

Unit 2: Introduction to Probability Distributions
This unit will introduce you to probability distributions and several of their most important properties. You will learn how to identify discrete and continuous probability distributions as well as calculate their expected values and variances. You will also learn how to calculate the cumulative probability distribution for a given probability distribution and vice versa.
Unit 2 Time Advisory
Unit 2 Learning Outcomes

2.1 Random Variables and Distributions
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 8: Random Variables and Distributions”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 8: Random Variables and Distributions” (PDF)
Instructions: Read this lecture to gain an understanding of random variables, which map the outcomes of an experiment to real numbers. This reading will also introduce you to the concepts of discrete and continuous distributions and offer several examples of each.
Reading this lecture and taking notes should take approximately 2 hours.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 8: Random Variables”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 8: Random Variables” (YouTube)
Instructions: Watch this video. It will introduce you to discrete random variables, probability mass functions, cumulative distributions, and expected values. This video will reinforce the earlier readings you did in this subunit and show you more examples.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 2, Section 2.2: Continuous Density Functions”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 2, Section 2.2: Continuous Density Functions” (PDF)
Instructions: Read pages 55-68 of Section 2.2 “Continuous Density Functions” in Chapter 2. This reading will introduce you to continuous random variables and their density functions.
Reading this chapter and taking notes should take approximately 2 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 2, Section 2.2: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 2, Section 2.2: Exercises” (PDF)
Instructions: Go to pages 71-73 and complete exercises 1, 2, 3, 4, and 5. You can then check your answers to odd-numbered questions here.
Completing this assessment should take you approximately 2 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

2.2 Expected Values, Variance, and Standard Deviation
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 9: Expected Values”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 9: Expected Values” (YouTube)
Instructions: Watch this video to learn how to calculate expected values, variance, and standard deviation of a probability distribution. In this video, you will see how the expected value is a measure of central tendency of a random variable, while the variance and standard deviation are measures of variation. The random variables are discrete, and several nice examples are illustrated.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
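These three quantities can be computed in a few lines. The fair-die example below is assumed for illustration and is not necessarily one of the video’s examples:

```python
from math import sqrt

# Expected value, variance, and standard deviation of one roll of a
# fair die (assumed example).
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # each outcome is equally likely

mean = sum(x * p for x in outcomes)               # E[X] = 3.5
var = sum((x - mean) ** 2 * p for x in outcomes)  # Var(X) = 35/12
sd = sqrt(var)                                    # standard deviation
print(mean, round(var, 3), round(sd, 3))
```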
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 6, Section 6.1: Expected Value of Discrete Random Variables,” “Chapter 6, Section 6.2: Variance of Discrete Random Variables,” and “Chapter 6, Section 6.3: Continuous Random Variables”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 6, Section 6.1: Expected Value of Discrete Random Variables” (PDF), “Chapter 6, Section 6.2: Variance of Discrete Random Variables” (PDF), and “Chapter 6, Section 6.3: Continuous Random Variables” (PDF)
Instructions: Read the following pages in Sections 6.1, 6.2, and 6.3 of “Chapter 6: Expected Value and Variance”:
Section 6.1: pages 225-240;
Section 6.2: pages 257-263; and
Section 6.3: pages 268-275.
In this reading, you will learn how to compute the expected value and variance of discrete and continuous random variables. You will also learn how to compute the expected value and variance of sums of random variables. Several examples will be given, including special random variables that will be studied in the coming units.
Reading these textbook sections and taking notes should take approximately 4 hours.
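One property covered in this reading, linearity of expectation for sums of random variables, can be checked by brute-force enumeration. The two-dice example below is assumed for illustration:

```python
from fractions import Fraction
from itertools import product

# Check that E[X + Y] = E[X] + E[Y] for two independent fair dice.
S = list(product(range(1, 7), repeat=2))  # all 36 equally likely outcomes

def E(f):
    """Expected value of f over the uniform distribution on S."""
    return Fraction(sum(f(s) for s in S), len(S))

EX = E(lambda s: s[0])          # 7/2
EY = E(lambda s: s[1])          # 7/2
EXY = E(lambda s: s[0] + s[1])  # 7
print(EXY == EX + EY)  # True
```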
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 16: Expectation, Chebyshev’s Inequality” and “Lecture 17: Properties of Expectation, Variance, Standard Deviation”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 16: Expectation, Chebyshev’s Inequality” (PDF) and “Lecture 17: Properties of Expectation, Variance, Standard Deviation” (PDF)
Instructions: Read these lectures to get a good review of expected values, variance, and standard deviation. You will see more examples that illustrate the concepts.
Reading these lectures and taking notes should take approximately 3 hours.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 6: Expected Value and Variance”: “Section 6.1: Exercises,” “Section 6.2: Exercises,” and “Section 6.3: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 6: Expected Value and Variance”: “Section 6.1: Exercises” (PDF), “Section 6.2: Exercises” (PDF), and “Section 6.3: Exercises” (PDF)
Instructions: Go to the Section 6.1 exercises on pages 247-250 and complete exercises 1, 3, 5, 8, 13, and 23. Then go to the Section 6.2 exercises on pages 263-264 and complete exercises 1, 2, 7, 8, and 14. Finally, go to the Section 6.3 exercises on page 378 and complete exercises 1 and 3. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 3 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

2.3 Cumulative Distribution
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 9: Cumulative Distribution Function”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 9: Cumulative Distribution Function” (PDF)
Instructions: Read this lecture to learn how to calculate cumulative distributions for discrete and continuous probability distributions. Given a probability mass function of a discrete random variable X or a density function of a continuous random variable X, you should be able to find the cumulative distribution function F(x) = P(X is less than or equal to x). You will also learn how to use the cumulative distribution function to compute probabilities. Finally, you will be introduced to joint distributions, which will be studied further in the next subunit.
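To make the construction concrete, here is a short Python sketch (our own illustration, with invented names) that builds F(x) = P(X is less than or equal to x) from a probability mass function:

```python
# Illustrative sketch (names are ours): the cumulative distribution function
# of a discrete random variable is a running sum of its probability mass.

def cdf_from_pmf(pmf):
    """Given {x: P(X = x)}, return F with F(x) = P(X <= x)."""
    def F(x):
        return sum(p for v, p in pmf.items() if v <= x)
    return F

# X = number of heads in two fair coin tosses.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
F = cdf_from_pmf(pmf)
print(F(0), F(1), F(2))  # 0.25 0.75 1.0
print(F(1.5))            # 0.75; F is a step function, flat between outcomes
```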
Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Assessment: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Practice Test 1”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Practice Test 1” (PDF)
Instructions: Solve problem 4 in this practice test. Read the problem carefully and then try to solve it yourself before looking up the solution, which you can find here (PDF).
Completing this assessment should take approximately 30 minutes.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

2.4 Joint Probability Distributions
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 10: Marginal Distributions”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 10: Marginal Distributions” (PDF)
Instructions: Read this lecture for an introduction to marginal distributions. You will first learn about joint distributions, f(x,y), of two discrete or continuous random variables X and Y. Several examples will be given, and marginal distributions will be defined.
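The marginal distributions described here can be sketched in a few lines of Python (our own example; the joint distribution and all names are invented):

```python
# Hedged sketch: summing a joint distribution f(x, y) over one variable
# yields the marginal distribution of the other.

def marginals(joint):
    """Given {(x, y): P(X = x, Y = y)}, return the marginals (P_X, P_Y)."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py

# Joint distribution of two independent fair coin flips (0 = tails, 1 = heads).
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
px, py = marginals(joint)
print(px, py)  # each marginal is a fair coin: {0: 0.5, 1: 0.5}
```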
Reading this lecture and taking notes should take approximately 2 hours.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 11: Conditional Distributions, Multivariate Distributions”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 11: Conditional Distributions, Multivariate Distributions” (PDF)
Instructions: Read this lecture for an introduction to conditional distributions.
Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 19: Covariance and Correlation, Cauchy-Schwartz Inequality”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 19: Covariance and Correlation, Cauchy-Schwartz Inequality” (PDF)
Instructions: Read this lecture for an introduction to covariance and correlation.
Reading this lecture and taking notes should take approximately 1 hour.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Assessment: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Practice Test 1”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Practice Test 1” (PDF)
Instructions: Solve problem 3 in this practice test. Read the problem carefully and then try to solve it yourself before looking up the solution, which you can find here (PDF).
Completing this assessment should take approximately 30 minutes.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

Unit 3: Discrete Distributions
In this unit, you will learn about four basic discrete probability distributions that have widespread applications in engineering and science: binomial distributions, multinomial distributions, geometric distributions, and Poisson distributions. One of the most common applications of the binomial distribution is to test items for defects as they come off an assembly line. A common application of the geometric distribution involves a drug known to be effective 70% of the time: we might ask for the probability that the first person on whom the drug is effective this week is the tenth to take it, or that the third person on whom it is effective is the sixth to take it. For the Poisson distribution, typical applications include counting the number of phone calls received by an office in one hour or the number of days school is closed due to snow during a winter.
Unit 3 Time Advisory show close
Unit 3 Learning Outcomes show close

3.1 Binomial Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 10: Binomial Distributions”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 10: Binomial Distributions” (YouTube)
Instructions: Watch this video for an introduction to one of the most common discrete probability distributions: binomial distributions. You will learn how to decide if a distribution is binomial and how to find the mean, variance, and standard deviation of a binomial random variable.
If you perform a sequence of n independent, identical Bernoulli trials (that is, trials with only two possible outcomes, a success and a failure), then the random variable that counts the number of successes is called a binomial random variable. The probability of success is denoted by p, and the probability of failure, which is 1 - p, is denoted by q.
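The count of successes described above has a probability mass function you can compute directly; here is a hedged Python sketch (our own code, with example parameters we chose for illustration):

```python
# Sketch (ours): the binomial pmf for n Bernoulli trials with success
# probability p, plus the standard formulas for its mean and variance.
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
mean = n * p             # E[X] = np
var = n * p * (1 - p)    # Var(X) = npq, where q = 1 - p
print(binomial_pmf(3, n, p))  # probability of exactly 3 successes
print(mean, var)
```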
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

3.2 Multinomial Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 12: Multinomial Distributions”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 12: Multinomial Distributions” (YouTube)
Instructions: Watch this video for an introduction to multinomial distributions, which generalize the binomial distribution to trials with more than two possible outcomes.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

3.3 Geometric Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 13: Geometric Distributions”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 13: Geometric Distributions” (YouTube)
Instructions: Watch this video, which completes the discussion of multinomial distributions and introduces geometric distributions. The geometric distribution also involves Bernoulli trials, but it differs from the binomial distribution: this time you perform a sequence of independent, identical Bernoulli trials until you get the first success. The random variable that counts the number of trials needed is called a geometric random variable.
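A short Python sketch of the geometric pmf (our own illustration; the 70%-effective drug example echoes the unit introduction):

```python
# Hedged sketch: P(first success occurs on trial k) for independent
# Bernoulli trials with success probability p.

def geometric_pmf(k, p):
    """(1 - p)^(k - 1) * p, for k = 1, 2, 3, ..."""
    return (1 - p) ** (k - 1) * p

p = 0.7  # e.g., a drug effective 70% of the time
# Probability that the first person the drug works on is the third to take it:
print(geometric_pmf(3, p))  # 0.3 * 0.3 * 0.7 = 0.063
print(1 / p)                # E[X] = 1/p trials on average
```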
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

3.4 Poisson Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 14: Poisson Distributions” and “Lecture 15: Poisson Distributions (continued)”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 14: Poisson Distributions” (YouTube) and “Lecture 15: Poisson Distributions (continued)” (YouTube)
Instructions: Watch these video lectures for an introduction to Poisson distributions. Poisson experiments are experiments that give the number of outcomes occurring during a given time interval or in a specific region. The random variables that count those outcomes are called Poisson random variables.
For example, a random variable that counts the number of games postponed due to rain during a football season, or one that counts the number of phone calls a secretary receives per hour, is a Poisson random variable. Formulas for the distribution will be given, as well as its expected value and variance, and specific examples will illustrate the concepts.
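The pmf the lectures derive can be written down directly; here is a hedged Python sketch (our own code, with an invented call rate):

```python
# Sketch (ours): the Poisson pmf, where lam is the average number of
# occurrences per interval; for a Poisson variable, E[X] = Var(X) = lam.
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(k occurrences) = e^(-lam) * lam^k / k!"""
    return exp(-lam) * lam**k / factorial(k)

lam = 4.0  # e.g., an office that averages 4 phone calls per hour
print(poisson_pmf(0, lam))                         # chance of a call-free hour
print(sum(poisson_pmf(k, lam) for k in range(3)))  # P(at most 2 calls)
```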
Watching these videos and taking notes should take approximately 3 hours.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 20: Poisson Distribution, Approximation of Binomial Distribution, Normal Distribution”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 20: Poisson Distribution, Approximation of Binomial Distribution, Normal Distribution” (PDF)
Instructions: Read this lecture for a further discussion of Poisson distributions and how to use them to approximate the binomial distribution. The normal distribution, which will be studied in Unit 4, will be briefly introduced, and you will be shown how to use it to approximate the binomial distribution.
Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.1: Important Distributions”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.1: Important Distributions” (PDF)
Instructions: Read Section 5.1 “Important Distributions” on pages 183-195 in “Chapter 5: Distributions and Densities.” This reading will help you review the distributions of the previous three subunits and learn about the definition of Poisson distributions as well as other distributions, such as the uniform, negative binomial, and hypergeometric distributions. It is a reinforcement of the concepts you saw in the previous videos and lecture.
Reading this chapter and taking notes should take approximately 3 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.1: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.1: Exercises” (PDF)
Instructions: Go to the Section 5.1 exercises on pages 197-204 and complete exercises 1, 4, 7, 13, 14, 18, 21, 27, 28, and 38. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 3 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

Unit 4: Continuous Distributions
This unit will introduce you to several important continuous probability distributions, including exponential distributions and normal distributions. You will also learn about standard normal distributions (i.e., a normal distribution with a mean of 0 and a standard deviation of 1). Normal distributions can be converted to standard normal distributions by using a standard formula.
Unit 4 Time Advisory show close
Unit 4 Learning Outcomes show close

4.1 Probability Density Functions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 16: Density Function” and “Lecture 17: Exponential Distributions”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 16: Density Function” (YouTube) and “Lecture 17: Exponential Distributions” (YouTube)
Instructions: Watch these videos for an introduction to probability density functions of continuous random variables. The first video explains the difference between discrete and continuous random variables and shows how density functions are used to study continuous random variables. You will see several nice examples and revisit the definition of the expected value, or mean, of a continuous random variable.
The second video is a continuation of the first. It shows how to compute the variance and standard deviation of a continuous random variable. At minute 28, the exponential distribution, the subject of the next subunit, is introduced as an example. It is worthwhile to watch it now, but if you would rather wait until the next subunit, you can stop watching after minute 27.
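For a continuous random variable, the sums of the discrete case become integrals: E[X] is the integral of x·f(x) dx. Here is a numerical sketch (our own code, using the uniform density on [0, 1] as the example):

```python
# Hedged numerical sketch: approximating E[X] and Var(X) for a continuous
# random variable by integrating against its density with the midpoint rule.

def integrate(g, a, b, n=10000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 1.0  # density of the uniform distribution on [0, 1]
mean = integrate(lambda x: x * f(x), 0.0, 1.0)
var = integrate(lambda x: (x - mean) ** 2 * f(x), 0.0, 1.0)
print(mean, var)  # close to the exact values 1/2 and 1/12
```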
Watching these videos and taking notes should take approximately 3 hours.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

4.2 Exponential Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 17: Exponential Distributions” and “Lecture 18: Normal Distributions”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 17: Exponential Distributions” (YouTube) and “Lecture 18: Normal Distribution” (YouTube)
Instructions: Watch these video lectures for an introduction to common properties of exponential distributions. If you did not watch the last part of Lecture 17 in the previous subunit, please do so now, starting at minute 28. This section of the video defines the exponential distribution and derives some of its properties, such as its expected value and variance. The second video shows applications of the exponential distribution and presents some very nice real-life examples.
Watching these videos and taking notes should take approximately 3 hours.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.

4.3 Normal Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 19: Normal Distribution (continued)”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 19: Normal Distribution (continued)” (YouTube)
Instructions: Watch this video lecture for an introduction to normal distributions. This is the most important distribution in probability theory, and it appears in almost every area where probability is applied. You will learn about its properties, and you will see its expected value, variance, and standard deviation.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 21: Normal Distribution, Central Limit Theorem”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 21: Normal Distribution, Central Limit Theorem” (PDF)
Instructions: Read the section on normal distribution on pages 63-64 of this lecture. This reading will strengthen your understanding of the concepts you saw in the previous video.
Reading this section and taking notes should take approximately 1 hour.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

4.4 Standard Normal Distributions
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 20: Standard Normal Distributions”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 20: Standard Normal Distributions” (YouTube)
Instructions: Watch this video lecture for an introduction to standard normal distributions and transformations between standard normal distributions and other normal distributions. The standard normal distribution is a normal distribution with mean 0 and standard deviation 1. If you take any normal random variable, subtract its mean, and divide the result by its standard deviation, you get a standard normal random variable. It therefore suffices to study standard normal random variables, for which probability tables are available to compute the probability that the variable is less than a given value.
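The standardization step can be sketched in Python (our own code; it uses the error function in place of a printed table, and the height numbers are invented):

```python
# Hedged sketch: converting any normal probability question into a standard
# normal one via Z = (X - mu) / sigma.
from math import erf, sqrt

def standard_normal_cdf(z):
    """P(Z <= z) for standard normal Z, via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ Normal(mu, sigma), by standardizing first."""
    return standard_normal_cdf((x - mu) / sigma)

# Example: if heights are normal with mean 170 cm and standard deviation
# 10 cm, then P(height <= 180) equals P(Z <= 1), about 0.8413.
print(normal_cdf(180, 170, 10))
```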
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.2: Important Densities”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.2: Important Densities” (PDF)
Instructions: Read Section 5.2 “Important Densities” on pages 205-219 in “Chapter 5: Distributions and Densities.” This reading will help you review the distributions of the previous subunits and learn about the definition of normal distributions as well as other distributions, such as the uniform, gamma, and chi-square distributions. It is a reinforcement of the concepts you saw in the previous videos and lecture.
Reading this chapter and taking notes should take approximately 4 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.2: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 5, Section 5.2: Exercises” (PDF)
Instructions: Go to the Section 5.2 exercises on pages 219-224 and complete exercises 1, 7, 14, 16, 17, 21, 25, 27, 29, and 30. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 3 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

Unit 5: Law of Large Numbers and Central Limit Theorem
In this unit, you will learn about two important theorems of probability theory: the law of large numbers and the central limit theorem. The law of large numbers describes the long-run behavior of the average outcome when the same experiment is performed a large number of times. The central limit theorem states the conditions under which the mean of a large number of random variables is approximately normally distributed.
Unit 5 Time Advisory show close
Unit 5 Learning Outcomes show close

5.1 Law of Large Numbers
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 8, Section 8.1: Law of Large Numbers for Discrete Random Variables” and “Chapter 8, Section 8.2: Law of Large Numbers for Continuous Random Variables”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 8, Section 8.1: Law of Large Numbers for Discrete Random Variables” (PDF) and “Chapter 8, Section 8.2: Law of Large Numbers for Continuous Random Variables” (PDF)
Instructions: Read Section 8.1 and Section 8.2 of “Chapter 8: Law of Large Numbers” on pages 305-320. This reading discusses the law of large numbers. If you have n independent, identically distributed random variables X1, X2, ..., Xn, let S be their sum. The law of large numbers says, roughly, that if n is very large, then S/n is a good approximation of the common mean E(X1). It is given as a limit theorem, and some nice applications are presented.
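The statement is easy to watch in action in a simulation; here is a hedged Python sketch (our own, with a fixed seed for reproducibility):

```python
# Sketch (ours): the running average S/n of fair-die rolls approaches the
# common mean 3.5 as n grows, as the law of large numbers predicts.
import random

random.seed(42)  # fixed seed so the run is reproducible

def running_average(n):
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (10, 1000, 100000):
    print(n, running_average(n))  # drifts toward 3.5 as n grows
```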
Reading these textbook sections and taking notes should take approximately 4 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 18: Law of Large Numbers, Median”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 18: Law of Large Numbers, Median” (PDF)
Instructions: Read the section on the law of large numbers on pages 53-54 of this lecture. This reading will further your understanding of the law of large numbers.
Reading this lecture and taking notes should take approximately 2 hours.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 8, Section 8.1: Exercises” and “Chapter 8, Section 8.2: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 8, Section 8.1: Exercises” (PDF) and “Chapter 8, Section 8.2: Exercises” (PDF)
Instructions: Go to the Section 8.1 exercises on pages 312-313 and complete exercises 1, 5, and 11. Then go to the Section 8.2 exercises on pages 321-322 and complete exercises 1, 2, 4, and 10. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 2 hours and 30 minutes.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.

5.2 Central Limit Theorem
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 21: Central Limit Theorem”
Link: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 21: Central Limit Theorem” (YouTube)
Instructions: Watch this video for an introduction to the central limit theorem. The central limit theorem roughly says that if you select random samples of size n from a population of X values, where X is a random variable with any distribution, then as n gets very large, the distribution of the sample means approaches a normal distribution. Formulas for the mean and standard deviation of that normal distribution will be given. This is one of the most important results in statistics.
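A short simulation (our own sketch, with a fixed seed) shows the effect: sample means drawn from a decidedly non-normal population still cluster around the population mean, with spread close to sigma divided by the square root of n:

```python
# Hedged simulation: means of samples from a uniform(0, 1) population
# concentrate near 0.5 with standard deviation close to sigma / sqrt(n),
# where sigma = sqrt(1/12) is the population standard deviation.
import random

random.seed(0)
n, trials = 30, 2000
sample_means = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]

grand_mean = sum(sample_means) / trials
spread = (sum((m - grand_mean) ** 2 for m in sample_means) / trials) ** 0.5
print(grand_mean)  # near 0.5
print(spread)      # near (1/12) ** 0.5 / 30 ** 0.5, about 0.053
```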
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
Terms of Use: This resource is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. It is attributed to UCLA and Dr. Herbert Enderton, and the original version can be found here.
 Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 9, Section 9.1: Central Limit Theorem for Bernoulli Trials,” “Chapter 9, Section 9.2: Central Limit Theorem for Discrete Independent Trials,” and “Chapter 9, Section 9.3: Central Limit Theorem for Continuous Independent Trials”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 9, Section 9.1: Central Limit Theorem for Bernoulli Trials” (PDF), “Chapter 9, Section 9.2: Central Limit Theorem for Discrete Independent Trials” (PDF), and “Chapter 9, Section 9.3: Central Limit Theorem for Continuous Independent Trials” (PDF)
Instructions: Read Section 9.1 to Section 9.3 of “Chapter 9: Central Limit Theorem” on pages 325-361. This reading further discusses the second fundamental theorem of probability: the central limit theorem.
Reading these textbook sections and taking notes should take approximately 4 hours.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 9, Section 9.1: Exercises”
Link: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 9, Section 9.1: Exercises” (PDF)
Instructions: Go to the Section 9.1 exercises on pages 338-339 and complete exercises 1, 2, 3, 5, 8, and 14. You can then check your answers to odd-numbered questions here.
Completing this assessment should take approximately 2 hours and 30 minutes.
Terms of Use: This resource is licensed under a GNU Free Documentation License (FDL). It is attributed to Charles M. Grinstead, J. Laurie Snell and Dartmouth College.
 Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 21: Central Limit Theorem”

5.3 Other Distributions
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 22: Central Limit Theorem, Gamma Distribution, Beta Distribution”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 22: Central Limit Theorem, Gamma Distribution, Beta Distribution” (PDF)
Instructions: Read pages 66–69 of this lecture. This reading presents two other important distributions, the Gamma and Beta distributions, along with their key properties, including their expected values.
Reading this section of the lecture and taking notes should take approximately 2 hours.
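As a quick companion to this reading, the sketch below (a hypothetical illustration assuming NumPy, not part of the lecture notes) draws Gamma and Beta samples and compares their sample means to the standard formulas E[Gamma(α, θ)] = αθ and E[Beta(a, b)] = a/(a + b).

```python
# Illustrative sketch: compare simulated Gamma and Beta sample means with
# their theoretical expected values.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Gamma with shape alpha and scale theta: E[X] = alpha * theta.
alpha, theta = 3.0, 2.0
gamma_mean = rng.gamma(alpha, theta, size=n).mean()
print(gamma_mean)   # should be close to alpha * theta = 6.0

# Beta(a, b): E[X] = a / (a + b).
a, b = 2.0, 5.0
beta_mean = rng.beta(a, b, size=n).mean()
print(beta_mean)    # should be close to 2/7
```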
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 26: Confidence Intervals for Parameters of Normal Distribution”
Link: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 26: Confidence Intervals for Parameters of Normal Distribution” (PDF)
Instructions: Read pages 78 and 79 of this lecture. This reading gives a brief introduction to two further distributions and their properties, namely the t-distribution and the Chi-square distribution, which are widely used in statistics.
Reading this section of the lecture and taking notes should take approximately 2 hours.
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.
 Lecture: YouTube: Harvard University: Professor Joseph Blitzstein’s Statistics 110: Probability: “Lecture 30: Chi-square, Student-t, Multivariate Normal Distributions”
Link: YouTube: Harvard University: Professor Joseph Blitzstein’s Statistics 110: Probability: “Lecture 30: Chi-square, Student-t, Multivariate Normal Distributions” (YouTube)
Instructions: Watch this video for an introduction to other distributions and their properties, namely the Chi-square and t-distributions, which are widely used in statistics. You will see that both are derived from the normal distribution.
Watching this video and taking notes should take approximately 1 hour and 30 minutes.
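The lecture's point that these distributions are offshoots of the normal can be made concrete with a short simulation. The sketch below (an illustration assuming NumPy, not part of the lecture) builds Chi-square and Student's t samples directly from standard normal draws and checks their moments against theory.

```python
# Illustrative sketch: construct Chi-square and Student's t variables from
# standard normal draws, as described in the lecture.
import numpy as np

rng = np.random.default_rng(2)
n, k = 200_000, 10

# Chi-square with k degrees of freedom: sum of k squared N(0, 1) draws.
chi2 = (rng.standard_normal((n, k)) ** 2).sum(axis=1)
print(chi2.mean(), chi2.var())   # theory: mean = k, variance = 2k

# Student's t with k degrees of freedom: N(0, 1) divided by
# sqrt(chi2_k / k), with an independent Chi-square in the denominator.
t = rng.standard_normal(n) / np.sqrt(
    (rng.standard_normal((n, k)) ** 2).sum(axis=1) / k
)
print(t.mean(), t.var())         # theory: mean = 0, variance = k/(k-2)
```

Note that the t-sample's variance exceeds 1, reflecting its heavier tails relative to the standard normal.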
Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Harvard University and Dr. Joseph Blitzstein, and the original version can be found here.

Final Exam
 Final Exam: The Saylor Foundation’s “MA252 Final Exam”
Link: The Saylor Foundation’s “MA252 Final Exam” (HTML)
Instructions: You must be logged into your Saylor Foundation School account in order to access this exam. If you do not yet have an account, you will be able to create one, free of charge, after clicking on the link.
Completing this assessment should take approximately 1 hour.