Oct. 2020: Marc Niethammer's 2019 visit led to a result, obtained jointly with François-Xavier Vialard (LIGM, Bézout Labex) and other colleagues, which will be presented as an oral at NeurIPS 2020 (top 1.1% of submissions).
Deep neural networks typically have many layers and hence a large number of parameters, which are optimized for the learning task at hand. This work raises two questions: can a deep neural network be parametrized with far fewer parameters, and can the complexity of the resulting network maps be controlled? Leveraging ideas from optimal control, the authors answer both questions in the affirmative. By regularizing the parameters and optimizing only over the “optimal paths” in parameter space, they parametrize the network using only “initial conditions”, and the complexity of the map is controlled explicitly in terms of these initial conditions. Promising experiments are reported, and this work may open up a fertile new line of research on the parametrization of deep networks.
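To make the idea concrete, here is a minimal toy sketch (not the authors' code) of the general principle: rather than learning a separate weight matrix per layer, one learns only an initial condition and generates all per-layer weights by integrating a fixed dynamical system forward. The specific dynamics below (a toy linear Hamiltonian-like system) and all function names are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def shoot_weights(w0, p0, n_layers, dt=0.1):
    """Generate one weight matrix per layer from the initial conditions only.

    The per-layer weights follow Euler steps of the toy dynamics
    dw/dt = p, dp/dt = -w, so the whole stack of layers is determined
    by the pair (w0, p0) alone.
    """
    w, p = w0.copy(), p0.copy()
    weights = []
    for _ in range(n_layers):
        weights.append(w.copy())
        w, p = w + dt * p, p - dt * w  # one Euler step of the toy dynamics
    return weights

def forward(x, w0, p0, n_layers):
    """Apply the generated layers with a tanh nonlinearity."""
    for w in shoot_weights(w0, p0, n_layers):
        x = np.tanh(w @ x)
    return x

rng = np.random.default_rng(0)
d = 4
w0 = rng.normal(size=(d, d)) * 0.5   # learnable initial weight
p0 = rng.normal(size=(d, d)) * 0.5   # learnable initial momentum
x = rng.normal(size=d)
y = forward(x, w0, p0, n_layers=10)
print(y.shape)
```

Here a 10-layer network is described by just the two matrices (w0, p0), so the parameter count is independent of depth, and any bound on the initial conditions translates into an explicit bound on the generated weights.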