From Atoms to Intelligence

Atoms come together to form molecules. Molecules self-organize to form crystals, and stars, and planets, and galaxies, and living systems. Living systems self-organize to display a wealth of sophisticated behavior, leading to the evolution of marvels such as the human brain. What algorithmic processes drive this arrow of sophistication? Can we get a computer to replicate such processes?

My broad research goals are:

to study the principles behind self-organization from the vantage point of the information sciences (algorithms, machine learning, information theory), and 

to discover new principles for the design of algorithms and learning systems from the study of self-organization.

For beginning researchers: The interdisciplinary nature of this research area requires a willingness to engage with multiple disciplines, including Machine Learning, Statistics, Statistical Mechanics, Information Theory, Information Geometry, Cognitive Psychology, Neuroscience, Evolution, Chemical Reaction Network Theory, and others. If you want to work in this area, you should have a foundation in the physical, mathematical, and information sciences, and be a proficient programmer. Here are some resources that I strongly recommend to every beginning researcher who wishes to work in this area.

1. The Feynman Lectures on Physics: Perhaps the greatest exposition of Physics at the undergraduate level by one of the all-time greats of physics and physics communication, this book never goes out of fashion! Tl;dr: the chapters on Thermodynamics are a must-read.

2. Elements of Information Theory by Cover and Thomas: This book has its fans and its detractors. It is written in a somewhat terse style that can be intimidating to the newbie, and it gives almost no physical intuition for the mathematical definitions. It also doesn't adequately cover some of the latest topics in Information Theory, such as one-shot information theory and network information theory. On the plus side, it covers a lot of ground, and the proofs are clean and mathematically parsimonious (though often mysterious). To benefit maximally from this book, use it more as a list of topics to study, and supplement your reading with in-class instruction from a good instructor or video lectures, as well as readings from other books on information theory, including the one by Csiszar. Each chapter ends with a great list of exercises, and a thoughtful problem solver will get a lot from them. Tl;dr: Chapter 2 and Chapter 12 give a crash course in Statistics and its connections to Information Theory.

3. Smooth Dynamical Systems by M C Irwin: This beauty of a book (Dover publication!) will give a quick but insightful introduction to the geometric approach to dynamical systems. 

4. Information Theory and Statistical Mechanics by E T Jaynes: This is a good follow-up to reading Feynman's lectures on Thermodynamics and Chapter 2 of Cover and Thomas. There is something intoxicating about Jaynes' prose: he makes you want to believe him. You feel like you understand everything he's saying, but it will take a few re-reads before you start to realize you don't, and then a few more to start getting the picture.

5. Information Geometry and Its Applications by Amari: Amari has been one of the pioneers of Information Geometry, a subject which promises to bring the powers of global analysis in geometry to the information point of view. If you have tried reading his early papers and books, you may have found that they make for difficult reading. This one, however, is much more accessible, with many applications, as promised in the title. Be warned: this book requires more prerequisites, including a course in Riemannian geometry and some familiarity with the language of connections, topics from statistics and machine learning, and some intuition from thermodynamics and statistical mechanics. But if you can work through this book, the reward is worth all the effort. Tl;dr: For mathematically mature audiences only. Keep it on your shelf, and read a chapter as and when it starts to make sense.

6. The Selfish Gene by Richard Dawkins: Absolutely everyone who has read it calls this book a masterpiece on the theory of evolution. I must confess I have never read it, because I felt I had already learnt its contents through second-hand sources such as Genome by Matt Ridley and other books. Such is the shadow this book has cast on ideas about evolution that every subsequent book, and many subsequent papers, refer to it, often many times. Tl;dr: Evolution from the point of view of information processing.

Honourable mentions: Deep Learning by Goodfellow, Bengio, & Courville; Information Theory, Inference, and Learning Algorithms by David MacKay; Algorithms by Dasgupta, Papadimitriou, and Vazirani


My research publications are listed on my Google Scholar page.