Probability and Information Theory in Machine Learning
Probabilistic tools for machine learning and the analysis of real-world datasets. Introductory topics include classification, regression, probability theory, decision theory and quantifying information with entropy, relative entropy and mutual information. Additional topics include naive Bayes, probabilistic graphical models, discriminant analysis, logistic regression, expectation maximization, source coding and variational inference. Previous exposure to numerical computing (e.g., MATLAB, Python, Julia or R) is required.
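As a flavor of the information-theoretic topics listed above, here is a minimal Python sketch of entropy and relative entropy for discrete distributions; the function names and examples are illustrative only, not part of the course materials:

```python
import math

def entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum_i p_i log2(p_i / q_i), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin flip carries exactly 1 bit of information.
print(entropy([0.5, 0.5]))
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))
# Relative entropy measures how far the biased coin is from fair;
# it is zero only when the two distributions coincide.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

Quantities like these underpin the course's treatment of source coding and variational inference, where entropy and KL divergence appear as code lengths and objective terms respectively.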