The Mathematical Theory of Deep Neural Networks

Tue, Mar 20, 2018, 10:00 am
Princeton Neuroscience Institute, Lecture Hall A32

Recent advances in deep networks, combined with open, easily accessible implementations, have pushed the field's empirical results far ahead of its formal understanding. The lack of rigorous analysis of these techniques limits their use in addressing scientific questions in the physical and biological sciences, and prevents systematic design of the next generation of networks. Recently, long-overdue theoretical results have begun to emerge. These results, and those that will follow in their wake, will begin to shed light on the properties of large, adaptive, distributed learning architectures, and stand to revolutionize how computer science and neuroscience understand these systems.

This intensive one-day technical workshop, generously supported by the Princeton Neuroscience Institute (PNI), the Center for Statistics and Machine Learning (CSML), and Princeton Psychology, will focus on the state of the art in the theoretical understanding of deep learning. We aim to bring together researchers from PNI and CSML at Princeton University and from the theoretical machine learning group at the Institute for Advanced Study (IAS) who are interested in a more rigorous understanding of deep networks, fostering increased discussion and collaboration across these closely related groups. For more information and registration, please visit: (space is limited, but all talks will be live-streamed).