Marco Mondelli (EPFL 2019)

Foundations of Deep Learning

Today, we are witnessing a revolution in information technology, and deep learning is the leading candidate to exploit the newly available datasets across a wide variety of fields. However, a coherent framework for understanding the limits of deep learning methods, devising algorithms, and analyzing their performance has remained elusive.

In fact, the choice of the neural network architecture, as well as its optimization, typically requires an extensive phase of numerical simulations. This approach can be prohibitively expensive in terms of computational and human resources, and it is currently preventing an even wider adoption of deep learning. The overall goal of this project is to build theoretical foundations for the design of more scalable and efficient deep learning systems. In particular, practitioners have observed that gradient descent succeeds in problems with billions of parameters. A central part of our study consists of developing powerful analytical tools to understand why and how this is possible. Furthermore, most of the problems encountered in deep learning concern high-dimensional data. Thus, this project aims to provide an accurate characterization of performance in such high-dimensional tasks.
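
As a point of reference for the setting described above, the following is a minimal, purely illustrative sketch (not part of the project) of plain gradient descent training a two-layer ReLU network on synthetic high-dimensional data; the dimensions, loss, and learning rate are arbitrary choices made for the example.

# Illustrative only: gradient descent on a two-layer ReLU network with
# synthetic high-dimensional inputs. All hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n, d, m = 200, 1000, 64          # samples, input dimension, hidden width
X = rng.standard_normal((n, d))  # high-dimensional inputs
y = np.sign(X[:, 0])             # simple synthetic labels

# Two-layer network: f(x) = a^T relu(W x)
W = rng.standard_normal((m, d)) / np.sqrt(d)
a = rng.standard_normal(m) / np.sqrt(m)

lr = 0.05
for step in range(500):
    Z = X @ W.T                  # pre-activations, shape (n, m)
    H = np.maximum(Z, 0.0)       # ReLU activations
    pred = H @ a                 # network outputs, shape (n,)
    resid = pred - y             # residuals for the squared loss

    # Gradients of the mean squared loss with respect to a and W.
    grad_a = H.T @ resid / n
    grad_W = ((resid[:, None] * (Z > 0)) * a).T @ X / n

    a -= lr * grad_a
    W -= lr * grad_W

    if step % 100 == 0:
        print(f"step {step:4d}  loss {0.5 * np.mean(resid**2):.4f}")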

http://marcomondelli.com