Mathematics of Deep Learning 



BERLIN, 19–30 August 2019, at Zuse Institute Berlin 



Background 



Deep learning (DL) methodologies are currently showing tremendous success in a variety of applications. In many cases, DL-based methods outperform traditional approaches by far. At the same time, they still lack a rigorous mathematical foundation. 

Recently, various researchers from the mathematical community have begun developing mathematical theories for DL from different angles: approaches analyzing the abstract approximation power of deep neural networks, approaches to understanding the convergence of the numerical minimization methods used in DL, and mathematical frameworks for convolutional neural networks such as the scattering transform and the convolutional sparse coding approach, which open a door to the methodology of compressed sensing. 


Topics 



The summer school will offer lectures on the theory of deep neural networks, on related questions such as generalization, expressivity, and explainability, as well as on applications of deep neural networks (e.g., to PDEs, inverse problems, or specific real-world problems). 



Format 



The school runs for two weeks: the first week is devoted to the theory of deep neural networks, and the second week focuses on applications. The format is dominated by 1.5-hour lectures given by international experts.

The schedule can be downloaded here.

You can download the slides of the talks here once the speakers have provided them to the BMS: Summer School Talks (password protected area)

Speakers and Organizers

Leonid Berlyand (Penn State): PDE techniques in deep learning: convergence & stability of neural net classifiers

Taco Cohen (Qualcomm): Learning of equivariant representations for data-efficient deep learning, medical imaging

François Fleuret (IDIAP, EPF Lausanne): Statistical learning techniques, mainly for computer vision

Eldad Haber (University of British Columbia): Using PDEs for designing stable DNN architectures

Robert Jenssen (Tromsø): Next-generation machine learning and data analytics methodology, health data analytics

Andreas Krause (ETH Zurich): Large-scale Machine Learning, Probabilistic Modeling and Inference, Sequential Decision Making, Crowds, Learning and Incentives

Gitta Kutyniok (TU Berlin): Theory of DL, Explainability, Applications to Inverse Problems

Ben Leimkuhler (U Edinburgh): Molecular dynamics, Bayesian parameterisation of complex models

Klaus-Robert Müller (TU Berlin): Machine Learning

Frank Noé (FU Berlin)

Christof Schütte (FU Berlin, ZIB): Applications of DL to molecular dynamics

Vladimir Spokoiny (HU Berlin, WIAS)

René Vidal (Johns Hopkins University): Optimization and DNNs

 

How to get there

Berlin has two airports: Tegel (TXL) and Schönefeld (SXF). TXL is closer to the city, but both airports are easily accessible by public transport. You can find detailed information for your journey on this website: www.bvg.de/en

A weekly ticket for zones A and B costs 30€. The ZIB is located in zone B.

The Konrad Zuse Institute Berlin (ZIB) is located on the campus of the Freie Universität Berlin in Dahlem. It can be reached by U-Bahn line U3 (station: Dahlem-Dorf) and bus X83 (stop: Arnimallee). Bus X83 departs from the U/S-Bahn station Rathaus Steglitz (U9, S1).

The ZIB is accessible via two footpaths from Arnimallee 6 and 10. Takustraße is closed due to construction. You can find a map here: https://www.zib.de/sites/default/files/page_attachments/Plan_new_0.png

Questions 



Please direct questions about the school to the organizers by email.

Coffee breaks will be provided.