Full Name
Max Raginsky
Job Title
Professor; William L. Everitt Fellow
Company
University of Illinois at Urbana-Champaign
Speaker Bio
Maxim Raginsky received the B.S. and M.S. degrees in 2000 and the Ph.D. degree in 2002 from Northwestern University, all in Electrical Engineering. He has held research positions with Northwestern, the University of Illinois at Urbana-Champaign (where he was a Beckman Foundation Postdoctoral Fellow from 2004 to 2007), and Duke University. In 2012, he returned to UIUC, where he is currently a Professor and William L. Everitt Fellow with the Department of Electrical and Computer Engineering and the Coordinated Science Laboratory. He also holds a courtesy appointment with the Department of Computer Science. Prof. Raginsky's research interests cover probability and stochastic processes, deterministic and stochastic control, machine learning, optimization, and information theory. Much of his recent research is motivated by fundamental questions in modeling, learning, and simulation of nonlinear dynamical systems, with applications to advanced electronics, autonomy, and artificial intelligence.
Speaking At
Abstract
In a remarkable paper published in 1993, Andrew Barron showed how one can avoid, or at least mitigate, the curse of dimensionality when approximating functions by neural nets with sigmoidal activations. The main conceptual innovation in that work was a probabilistic selection argument: instead of considering the problem of approximating arbitrary continuous functions, which inevitably requires exponentially many neurons in the worst case, one restricts attention to functions that can be represented as expectations of nonlinearities depending on a random finite-dimensional parameter. Any such function can then be approximated by a finite sum via random sampling, with quantitative guarantees on the approximation error. This construction relies on a fortuitous confluence of structure (encoded in the properties of the nonlinearity) and randomness (encoded in the choice of the probability law of the random parameter). In this talk, I will give an overview of function approximation results that make use of this construction and discuss several examples, from neural nets to diffusion processes.
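To make the sampling argument concrete, here is a minimal sketch of the probabilistic selection step in standard notation; the symbols $f$, $\varphi$, $\mu$, and the uniform bound $C$ are illustrative placeholders, not notation taken from the talk. Suppose $f$ admits the integral representation
\[
  f(x) \;=\; \int_{\Theta} \varphi(x,\theta)\, \mu(\mathrm{d}\theta)
  \;=\; \mathbb{E}_{\theta \sim \mu}\big[\varphi(x,\theta)\big],
  \qquad \sup_{x,\theta} |\varphi(x,\theta)| \le C .
\]
Draw $\theta_1, \dots, \theta_n$ i.i.d. from $\mu$ and form the empirical average
\[
  \hat f_n(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} \varphi(x, \theta_i).
\]
Then, for any probability measure $\nu$ on the input space,
\[
  \mathbb{E}\,\| f - \hat f_n \|_{L^2(\nu)}^2
  \;=\; \frac{1}{n} \int \operatorname{Var}_{\theta \sim \mu}\!\big(\varphi(x,\theta)\big)\, \nu(\mathrm{d}x)
  \;\le\; \frac{C^2}{n},
\]
so at least one realization of $(\theta_1,\dots,\theta_n)$ achieves $\| f - \hat f_n \|_{L^2(\nu)} \le C/\sqrt{n}$, a bound whose rate does not depend on the input dimension. In Barron's setting, the nonlinearity $\varphi$ is a sigmoidal neuron and the functions admitting such a representation are, roughly, those whose Fourier transform has a finite first moment, with that moment playing the role of $C$.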