In this course we learn to:
- Describe & apply the Bayesian approach to statistics.
- Explain the key differences between Bayesian and Frequentist approaches.
- Master the basics of the R computing environment.
We will build the following skills:
- Regression Analysis
- Statistics
- Probability
- R Programming
- Data Analysis
- Statistical Inference
- Statistical Modeling
- Bayesian Statistics
- Statistical Analysis
- Probability Distribution
- Microsoft Excel
There are four modules in this course:
Probability and Bayes’ Theorem: In this module, we review the basics of probability and Bayes’ theorem. In Lesson 1, we introduce the different paradigms or definitions of probability and discuss why probability provides a coherent framework for dealing with uncertainty. In Lesson 2, we review the rules of conditional probability and introduce Bayes’ theorem. Lesson 3 reviews common probability distributions for discrete and continuous random variables.
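The update at the heart of this module can be sketched numerically. The diagnostic-test numbers below are purely illustrative (and the sketch is in Python, although the course itself uses R):

```python
# Bayes' theorem: P(D|+) = P(+|D) P(D) / P(+)
# Hypothetical diagnostic-test numbers, chosen only for illustration.
p_disease = 0.01          # prior P(D)
p_pos_given_d = 0.95      # sensitivity P(+|D)
p_pos_given_not_d = 0.05  # false-positive rate P(+|not D)

# Total probability of a positive test, then the posterior by Bayes' theorem.
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)
p_d_given_pos = p_pos_given_d * p_disease / p_pos
```

Even with a fairly accurate test, the posterior probability of disease stays modest because the prior is small, which is exactly the kind of coherent uncertainty accounting the module motivates.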
Statistical Inference: In this module, we introduce concepts of statistical inference from both frequentist and Bayesian perspectives. Lesson 4 takes the frequentist view, demonstrating maximum likelihood estimation and confidence intervals for binomial data. Lesson 5 introduces the fundamentals of Bayesian inference. Beginning with a binomial likelihood and prior probabilities for simple hypotheses, you will learn how to use Bayes’ theorem to update the prior with data to obtain posterior probabilities. This framework is then extended with the continuous version of Bayes’ theorem to estimate continuous model parameters and to calculate posterior probabilities and credible intervals.
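The two perspectives from Lessons 4 and 5 can be sketched side by side for binomial data. The data, the two simple hypotheses, and their prior weights below are assumptions chosen only for illustration (Python here, not the course's R):

```python
from math import comb

# Binomial data: y successes in n trials (illustrative numbers).
n, y = 10, 7

# Frequentist view (Lesson 4): the MLE of theta is the sample proportion.
theta_mle = y / n

# Bayesian view (Lesson 5): two simple hypotheses for theta with prior
# probabilities, updated via Bayes' theorem with the binomial likelihood.
def binom_lik(theta):
    return comb(n, y) * theta**y * (1 - theta)**(n - y)

priors = {0.5: 0.5, 0.7: 0.5}   # hypothesized theta values and prior weights
marginal = sum(binom_lik(t) * p for t, p in priors.items())
posteriors = {t: binom_lik(t) * p / marginal for t, p in priors.items()}
```

The data (7 successes in 10 trials) shift posterior weight toward the hypothesis theta = 0.7, which also happens to be the MLE here.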
Priors and Models for Discrete Data: In this module, you will learn methods for selecting prior distributions and building models for discrete data. Lesson 6 introduces prior selection and predictive distributions as a means of evaluating priors. Lesson 7 demonstrates Bayesian analysis of Bernoulli data and introduces the computationally convenient concept of conjugate priors. Lesson 8 builds a conjugate model for Poisson data and discusses strategies for selection of prior hyperparameters.
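The computational convenience of conjugacy from Lesson 7 can be sketched in a few lines. With a Beta(a, b) prior on a Bernoulli success probability, the posterior after y successes in n trials is Beta(a + y, b + n − y), so updating is just addition. The hyperparameters and data counts below are illustrative assumptions:

```python
# Conjugate Beta-Bernoulli update (illustrative numbers).
a, b = 1, 1       # uniform Beta(1, 1) prior on the success probability
n, y = 20, 12     # observed: y successes in n Bernoulli trials

# Posterior is Beta(a + y, b + n - y); no numerical integration needed.
a_post, b_post = a + y, b + n - y
post_mean = a_post / (a_post + b_post)   # posterior mean of theta
```

The same pattern recurs in Lesson 8, where a Gamma prior plays this role for Poisson data.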
Models for Continuous Data: In this module, we cover conjugate and objective Bayesian analysis for continuous data. Lesson 9 presents the conjugate model for exponentially distributed data. Lesson 10 discusses models for normally distributed data, which play a central role in statistics. In Lesson 11, we return to prior selection and discuss ‘objective’ or ‘non-informative’ priors. Lesson 12 presents Bayesian linear regression with non-informative priors, which yield results comparable to those of classical regression.
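The conjugate normal model of Lesson 10 can be sketched for the known-variance case, where the posterior mean is a precision-weighted compromise between the prior mean and the sample mean. The prior, the known variance, and the data below are assumptions chosen for illustration:

```python
# Conjugate update for normal data with known variance (illustrative numbers):
# prior mu ~ N(m0, s0_sq), data y_i ~ N(mu, sigma_sq) with sigma_sq known.
m0, s0_sq = 0.0, 1.0        # assumed prior mean and variance
sigma_sq = 4.0              # assumed known data variance
data = [2.9, 3.4, 3.1, 2.6]
n, ybar = len(data), sum(data) / len(data)

# Posterior precision is the sum of the prior and data precisions;
# the posterior mean weights m0 and ybar by those precisions.
post_prec = 1 / s0_sq + n / sigma_sq
post_var = 1 / post_prec
post_mean = post_var * (m0 / s0_sq + n * ybar / sigma_sq)
```

With these numbers the prior and the data carry equal precision, so the posterior mean lands halfway between the prior mean and the sample mean.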
Introduction
In these notes I supplement the course material with my own notes establishing an axiomatic foundation for probability. These foundations were conspicuously omitted from the specialization material, though they are alluded to in many places as hand-waving justification for introducing results. I found their absence so irksome that I decided to add them to my notes.
In reality, probability theory is a beautiful part of mathematics that brings together many results from analysis, topology, functional analysis, and integration theory. I hope that adding this material will make your journey easier rather than more challenging. In the future I hope to dive a little deeper, as progressing through the specialization has uncovered additional topics in probability theory that might be useful to review. When that happens, I will move most of these extras into their own appendices.