Introduction to Statistical Computing
Graduate Course, Fudan University, School of Data Science, 2023
This course covers Bayesian statistics, probabilistic graphical models, and the inference algorithms used to work with them, from variational methods to Monte Carlo sampling.
Contents
- Introduction
- Conjugate priors
- Linear Gaussian system
- The exponential family distribution
- Bayesian statistics
- Bayesian statistics vs. frequentist statistics
- Bayesian model selection (BIC)
- Hierarchical Bayes (empirical Bayes)
- Generalized linear models
- Bayesian linear regression
- Bayesian logistic regression
- Directed graphical models
- d-separation
- Markov blanket
- Mixture models and the EM algorithm
- Gaussian processes
- Kernels
- GP for regression
- Markov and hidden Markov models
- Markov models
- Hidden Markov model
- State space model
- Linear dynamical system
- Kalman filtering and smoothing
- Markov random fields
- The Hammersley-Clifford theorem
- The Ising model and related examples
- Variational inference
- The mean field method
- Expectation propagation
- Monte Carlo inference
- Sampling from standard distributions
- Rejection sampling (adaptive rejection sampling)
- Importance sampling
- MCMC
- Gibbs sampling
- Metropolis-Hastings algorithm (a minimal sketch follows this contents list)
- MCMC using Hamiltonian dynamics
- Clustering
- Dirichlet process mixture model
- Structured data
- Restricted Boltzmann machines (contrastive divergence)
- Online learning
- Multi-armed bandit
- Thompson sampling
- Bayesian optimization (expected improvement, entropy search)
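As a small illustration of the Monte Carlo inference topics listed above, the sketch below implements a basic Metropolis-Hastings sampler. It is a minimal, self-contained example rather than course code: the standard normal target, the Gaussian random-walk proposal, and the step size are assumptions chosen here purely for illustration.

```python
# Minimal Metropolis-Hastings sketch (illustrative only, not course-provided code).
# Assumptions: an unnormalized 1-D target (standard normal as a toy example) and
# a symmetric Gaussian random-walk proposal with step size 0.5.
import numpy as np

def log_target(x):
    """Unnormalized log-density of the target (standard normal toy example)."""
    return -0.5 * x**2

def metropolis_hastings(n_samples=5000, step=0.5, x0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        # Propose a move from the symmetric random-walk proposal.
        x_new = x + step * rng.standard_normal()
        # Accept with probability min(1, p(x_new) / p(x)); the symmetric
        # proposal density cancels, so only the target ratio remains.
        if np.log(rng.uniform()) < log_target(x_new) - log_target(x):
            x = x_new
        samples[i] = x
    return samples

if __name__ == "__main__":
    draws = metropolis_hastings()
    print("sample mean:", draws.mean(), "sample std:", draws.std())
```

Running the script draws 5000 correlated samples and prints their mean and standard deviation, which should be close to 0 and 1 for this toy target.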
References
- Kevin Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press, 2012