About
Statistics and information theory are deeply intertwined, and they have influenced each other's development for over four decades. Concepts such as maximum likelihood estimation and Shannon coding illustrate this close relationship and have led to breakthroughs in handling model uncertainty. Through these connections, fundamental ideas such as Fisher information and Shannon information offer insights across disciplines. In today's data-driven era, these tools are essential for pushing the boundaries of learning, particularly in complex data environments.
Our conference convenes leading researchers and emerging data science leaders to celebrate and explore these foundational connections. With a focus on new developments in information-theoretic methods, neural network approximations, and statistical learning, the event promises rich discussions and opportunities for interdisciplinary collaboration. Hosted by Yale University, renowned for its commitment to data science education and research, the conference provides a platform to shape the future of statistics, probability, and information theory.
Organizing committee (in alphabetical order):
- Joseph Chang (Yale University)
- Bertrand Clarke (University of Nebraska–Lincoln)
- Feng Liang (University of Illinois Urbana-Champaign)
- Curtis McDonald (Yale University)
- Cynthia Rush (Columbia University)
- Dan Spielman (Yale University)
- Yuhong Yang (University of Minnesota)
- Emily Hau, Associate Director, Yale Institute for Foundations of Data Science
- Mandy Singer, Administrative Assistant, Yale Institute for Foundations of Data Science
At Yale University, Andrew Barron regularly teaches courses in Information Theory, Theory of Statistics, High-Dimensional Function Estimation, and Artificial Neural Networks. He has served terms as department chair, director of graduate studies, and director of undergraduate studies in both Statistics and Applied Mathematics, and he holds a courtesy appointment as Professor of Electrical Engineering.
Prior to his appointment as Professor of Statistics and Data Science at Yale University in 1992, Andrew Barron was a faculty member in Statistics and in Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign. He received his MS and PhD degrees in Electrical Engineering from Stanford University, completing the PhD in 1985 under the direction of Tom Cover, and a Bachelor's degree in Mathematical Sciences and Electrical Engineering from Rice University in 1981. A Fellow of the IEEE, Barron is a Medallion Prize winner of the Institute of Mathematical Statistics and, together with Bertrand Clarke, a winner of the IEEE Thompson Prize. He will be the Shannon Lecturer at the 2024 IEEE International Symposium on Information Theory.
Andrew Barron has proudly mentored 20 PhD students. Often working with those students and other colleagues, he is known for several specific research accomplishments:
- generalizing the asymptotic equipartition property (AEP) to continuous-valued ergodic processes
- proving an information-theoretic Central Limit Theorem
- determining information-theoretic aspects of portfolio estimation
- formulating the index of resolvability and providing an associated characterization of the performance of Minimum Description Length (MDL) estimators
- determining the asymptotics of universal data compression in parametric families
- characterizing the concentration of Bayesian posteriors in the vicinity of parameters in the information support of the prior
- providing an information-theoretic determination of the minimax rates of function estimation
- providing information-theoretic characterization of statistical efficiency
- providing an early unifying view of statistical learning networks
- developing approximation and estimation bounds for artificial neural networks, with recent extensions to deep learning
- advancing greedy algorithms for training neural networks
- developing information-theoretic aggregation of least squares regressions
- formulating sparse regression codes and proving that they achieve capacity for Gaussian noise communication channels