EE763: SCIENCE OF INFORMATION, STATISTICS AND LEARNING
Information Theory basics: Bayes’ theorem, random variables, independence and conditioning, Shannon entropy, relative entropy, mutual information, Markov chains, Sanov’s theorem.
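As a flavor of this unit, a minimal sketch in Python (assuming numpy is available; the joint distribution below is an arbitrary illustrative example, not course material) computing Shannon entropy, relative entropy, and mutual information for a pair of binary random variables:

    import numpy as np

    def entropy(p):
        # Shannon entropy H(p) = -sum_x p(x) log2 p(x); zero-probability terms contribute 0
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def kl(p, q):
        # Relative entropy D(p || q); assumes q(x) > 0 wherever p(x) > 0
        m = p > 0
        return np.sum(p[m] * np.log2(p[m] / q[m]))

    # Illustrative joint distribution of two binary variables X and Y
    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    # Mutual information I(X;Y) = D( p(x,y) || p(x) p(y) )
    print(entropy(px), kl(pxy.ravel(), np.outer(px, py).ravel()))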
Statistics: linear regression, statistical models, exponential families, sampling, Monte Carlo methods, inference, maximum likelihood estimation, maximum a posteriori estimation, Bayesian inference.
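To connect linear regression with maximum likelihood estimation, a small sketch (Python with numpy assumed; the data-generating parameters are invented for illustration): under i.i.d. Gaussian noise, the MLE for the regression coefficients coincides with ordinary least squares.

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic data y = 2x + 1 + Gaussian noise (illustrative parameters)
    x = rng.uniform(0.0, 1.0, size=200)
    y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=200)

    # With Gaussian noise, maximizing the likelihood = minimizing squared error
    X = np.column_stack([x, np.ones_like(x)])
    theta_mle, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(theta_mle)  # close to [2.0, 1.0]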
Inference: the MaxEnt algorithm, the relation between Bayesian and MaxEnt methods, statistical mechanics, Ising models, graphical models, the Hammersley-Clifford theorem, the EM algorithm, belief propagation.
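As one concrete instance from this unit, a bare-bones EM iteration for a two-component Gaussian mixture with unit variances (a sketch under those simplifying assumptions, not the general algorithm; data and initial guesses are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(1)
    # Illustrative data drawn from two Gaussians with unknown means
    data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])

    mu = np.array([-1.0, 1.0])   # initial guesses for the means
    w = np.array([0.5, 0.5])     # mixing weights
    for _ in range(50):
        # E-step: responsibility of each component for each data point
        lik = w * np.exp(-0.5 * (data[:, None] - mu) ** 2)
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and means from the responsibilities
        w = resp.mean(axis=0)
        mu = (resp * data[:, None]).sum(axis=0) / resp.sum(axis=0)
    print(w, mu)  # roughly [0.3, 0.7] and [-2, 3]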
Learning: introduction to neural networks, the single neuron as a classifier, capacity of a single neuron, learning as inference, Hopfield networks, Boltzmann machines, supervised learning in multilayer networks, Gaussian processes, deconvolution.
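In the spirit of “the single neuron as a classifier”, a short sketch (Python with numpy assumed; the separating rule is invented for illustration) training one sigmoid neuron by gradient descent on the cross-entropy loss:

    import numpy as np

    rng = np.random.default_rng(2)
    # Toy linearly separable data: label 1 when x0 + x1 > 1 (invented rule)
    X = rng.uniform(0.0, 1.0, size=(200, 2))
    t = (X.sum(axis=1) > 1.0).astype(float)

    w, b = np.zeros(2), 0.0
    for _ in range(2000):
        y = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
        w -= 1.0 * (X.T @ (y - t)) / len(t)      # gradient of cross-entropy loss
        b -= 1.0 * np.mean(y - t)
    print(((y > 0.5) == (t > 0.5)).mean())       # training accuracy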
Application to Chemical Reaction Networks: introduction to chemical reaction networks, mass-action kinetics, the chemical master equation, Birch’s theorem, connection to exponential families, the MLE algorithm using reaction networks, current topics in molecular intelligence.
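For this last unit, a minimal sketch of deterministic mass-action kinetics (forward-Euler integration of an invented reaction A + B -> C with rate constant k; rate constant, step size, and initial concentrations are purely illustrative):

    # Mass-action kinetics for A + B -> C, integrated with forward Euler
    k, dt = 1.0, 1e-3
    a, b, c = 1.0, 0.8, 0.0   # initial concentrations (arbitrary example)
    for _ in range(20000):
        flux = k * a * b       # mass-action rate law
        a -= flux * dt
        b -= flux * dt
        c += flux * dt
    print(a, b, c)             # B depletes; a approaches 0.2, c approaches 0.8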
Texts and References:
David J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, fourth printing, 2005.
T. M. Cover and J. A. Thomas, Elements of Information Theory, second edition, Wiley Student Edition, 2006.
Larry Wasserman, All of Statistics: A Concise Course in Statistical Inference, Springer Science and Business Media, 2013.
Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.
Manoj Gopalkrishnan, "A Scheme for Molecular Computation of Maximum Likelihood Estimators for Log-Linear Models," Springer LNCS Proceedings of the 22nd International Conference on DNA Computing and Molecular Programming, 2016; arXiv:1506.03172 (2015).
Shun-ichi Amari, Information Geometry and Its Applications, Springer Applied Mathematical Sciences volume 194, 2016.
Edwin T. Jaynes, "Information theory and statistical mechanics," Physical Review 106.4 (1957): 620.