Information theory has traditionally been studied in the context of communication theory and statistical physics. This talk will present applications of information theory in three other fields:

1) Statistics: The Hirschfeld-Gebelein-Rényi maximal correlation is an important tool in statistics that has found applications ranging from correspondence analysis to the detection of non-linear patterns in data. We will describe a simple information-theoretic proof of a fundamental result on maximal correlation due to Dembo, Kagan, and Shepp (2001).

2) Computer Science: Boolean functions are fundamental objects in theoretical computer science. We will show how information-theoretic tools can complement Fourier-analytic tools in their study. Specifically, we will consider the problem of correlation between Boolean functions on a noisy hypercube graph.

3) Mathematics: Hypercontractivity and reverse hypercontractivity are powerful tools for studying concentration of measure and extremal questions in the geometry of high-dimensional spaces, both discrete and continuous. We will describe a recent result characterizing hypercontractivity in terms of information measures, extend it to reverse hypercontractivity, and discuss its implications.

Throughout, the presentation will rely on two measures of correlation: the maximal correlation and the so-called strong data processing constant. (Based on joint work with Venkat Anantharam, Amin Gohari, and Chandra Nair.)
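For readers unfamiliar with the first of these correlation measures: for random variables on finite alphabets, the HGR maximal correlation can be computed as the second-largest singular value of the normalized joint distribution matrix Q[x, y] = P(x, y) / sqrt(P(x) P(y)) (a classical fact due to Witsenhausen). A minimal sketch in Python, where the joint distribution P is a made-up illustrative example, not one from the talk:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) over binary alphabets,
# chosen only to illustrate the computation.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

px = P.sum(axis=1)  # marginal distribution of X
py = P.sum(axis=0)  # marginal distribution of Y

# Normalized matrix Q[x, y] = P(x, y) / sqrt(P(x) * P(y)).
Q = P / np.sqrt(np.outer(px, py))

# The largest singular value of Q is always 1 (achieved by the
# constant functions); the second-largest is the HGR maximal
# correlation rho(X; Y).
s = np.linalg.svd(Q, compute_uv=False)
rho = s[1]
print(rho)  # for this P: 0.6
```

For this symmetric binary example the maximal correlation coincides with the ordinary Pearson correlation of the ±1-valued encodings, which is a useful sanity check.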
Dr. Sudeep Kamath received his BTech (EE, 2008) from IIT Bombay, and his MS (EECS, 2011), MA (Statistics, 2013), and PhD (EECS, 2013) from the University of California, Berkeley. He was a postdoctoral fellow at the University of California, San Diego (2013-14) and is currently a postdoctoral fellow at Princeton University. He is the winner of the Eliahu Jury Award (2013) at the University of California, Berkeley, and the Institute Silver Medal (EE, 2013) at IIT Bombay. His research interests are in information theory and coding.