For 2018-2019, the “mathematics and computer science track” is a track of the M2 “Computer Science” master’s program aiming at a dual training in mathematics and computer science, with courses at the border of the two disciplines (one mathematics professor and one computer science professor coordinating each course). Lectures will be in English. The track is expected to also become part of the “Mathematics” master’s program from 2019-2020 on.

The persons in charge are Matthieu Fradelizi (LAMA), Cyril Nicaud (LIGM), and, until June 2019, Jean-Christophe Novelli (LIGM).

### Schedule

(Note: UE = “Unité d’enseignement”, an indivisible block of lectures; HETD = “Heure équivalent TD”: 1 HETD corresponds to 40 minutes of lecture.)

- 4 weeks of basics: complements in mathematics and in computer science. Each UE is worth 4 ECTS and 40 HETD. (In 2018-2019: from September 17 to October 12.)
- 10 weeks of broad background: data sciences, probabilistic methods, discrete mathematics and geometric calculus. Each UE is worth 6 ECTS and 60 HETD, split into 2 courses of 3 ECTS and 30 HETD each. Only the three best grades of this part are taken into account; the fourth grade is ignored. (In 2018-2019: from October 15 to December 21; exams from January 7 to 11, 2019.)
- 8 weeks for two specialization UE chosen by the students among four, each worth 6 ECTS and 40 HETD. (In 2018-2019: from January 14 to March 15.)
- A research memoir worth 18 ECTS. (In 2018-2019: from April 1.)
- English lessons must be taken during the first 12 weeks and are worth 4 ECTS.

### Preliminary list of courses

| Semester | Name | ECTS | Hours |
|---|---|---|---|
| 1 | UE Basics Mathematics (analysis, algebra, probability, geometry) | 4 | 40 |
| 1 | UE Basics Computer Science (complexity, algorithms, programming, graphs) | 4 | 40 |
| 1 | UE English | 4 | 40 |
| 1 | UE Discrete and continuous optimisation | 6 | 60 |
| | – Discrete optimisation | 3 | 30 |
| | – Continuous optimisation | 3 | 30 |
| 1 | UE Probabilistic methods | 6 | 60 |
| | – Concentration inequalities and applications | 3 | 30 |
| | – Algorithms and probability | 3 | 30 |
| 1 | UE Discrete Mathematics | 6 | 60 |
| | – Combinatorics | 3 | 30 |
| | – Discrete isoperimetry | 3 | 30 |
| 1 | UE Geometric calculus | 6 | 60 |
| | – Discrete differential geometry | 3 | 30 |
| | – Discrete systolic geometry and algorithms | 3 | 30 |
| 2 | UE Data Sciences | 6 | 40 |
| | – Statistical learning | 3 | 20 |
| | – Deep learning methods | 3 | 20 |
| 2 | UE Maths specialization: Random graphs and graphons | 6 | 40 |
| | – Introduction to graphons | 3 | 20 |
| | – Applications | 3 | 20 |
| 2 | UE CS specialization: Algebraic combinatorics and formal calculus | 6 | 40 |
| | – Combinatorial Hopf algebras | 3 | 20 |
| | – Operads | 3 | 20 |
| 2 | UE Large random matrices and applications | 6 | 40 |
| | | 3 | 20 |
| | | 3 | 20 |

### Detailed description of the courses of the first semester

**Basics in Mathematics.**

**Algebra and linear algebra (Fradelizi):**

– Groups: order, quotient groups, cyclic groups, finite groups, finite abelian groups, group actions.

– Rings, polynomials and fields: ideals, principal ideal domains, finite fields.

– Linear algebra: endomorphisms, eigenvectors, spectral theorem.

**Analysis (Beaulieu):**

– Normed vector spaces: equivalent norms, topology, continuous functions, compactness, the finite-dimensional case.

– Examples of metric spaces.

– Differential calculus. Extremum problems.

– Convex functions, convexity inequalities.

**Probability (Samson):**

– Random experiments and probability spaces. Law, mean value, moments, … of a random variable. Applications to combinatorics on graphs.

– Deviation inequalities, concentration inequalities (Markov, Chebyshev, Hoeffding, …).

– Martingales, martingale inequalities.

– Markov chains.

References: The Probabilistic Method (N. Alon, J. H. Spencer).
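To give a concrete feel for such deviation bounds, here is a minimal illustrative Python sketch (not course material; function names are ours) comparing Hoeffding’s inequality with the empirical deviation of fair coin flips:

```python
import math
import random

def hoeffding_bound(n, t):
    """Hoeffding: P(|S_n/n - 1/2| >= t) <= 2*exp(-2*n*t^2)
    for n i.i.d. fair coin flips in {0, 1}."""
    return 2 * math.exp(-2 * n * t * t)

def empirical_deviation(n, t, trials=2000, seed=0):
    """Fraction of trials in which the empirical mean of n fair
    coin flips deviates from 1/2 by at least t."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.randint(0, 1) for _ in range(n)) / n
        if abs(mean - 0.5) >= t:
            hits += 1
    return hits / trials

# The observed deviation frequency stays below the Hoeffding bound.
assert empirical_deviation(200, 0.1) <= hoeffding_bound(200, 0.1)
```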

**Geometry (Hauswirth and Romon):**

Geometry and topology of curves. We shall study discrete curves in the plane and their deformations, in particular the curvature flow and the isometric flow. These are used in computer graphics for classification, reconstruction and smoothing. We will introduce the relevant concepts of length, area, gradient and differential equation, and show how to implement them on a computer. All the exercises will be done using the programming language Processing (based on Java and very simple to use).
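As a taste of what “deforming a discrete curve by curvature” means, here is a minimal sketch (in Python rather than Processing, and with illustrative function names) of one explicit step of a discrete curve-shortening flow:

```python
import math

def curvature_flow_step(points, dt=0.2):
    """One explicit step of a discrete curve-shortening flow on a
    closed polygon: each vertex moves toward the midpoint of its two
    neighbours, a simple discretization of motion by curvature."""
    n = len(points)
    new_points = []
    for i, (x, y) in enumerate(points):
        (xp, yp), (xn, yn) = points[i - 1], points[(i + 1) % n]
        mx, my = (xp + xn) / 2, (yp + yn) / 2
        new_points.append((x + dt * (mx - x), y + dt * (my - y)))
    return new_points

def perimeter(points):
    """Total edge length of a closed polygon."""
    n = len(points)
    return sum(math.dist(points[i], points[(i + 1) % n]) for i in range(n))

# A regular 12-gon shrinks under the flow, so its perimeter decreases.
curve = [(math.cos(2 * math.pi * k / 12), math.sin(2 * math.pi * k / 12))
         for k in range(12)]
shrunk = curvature_flow_step(curve)
assert perimeter(shrunk) < perimeter(curve)
```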

**Basics in Computer Science.**

**Algorithmics: data structures (Nicaud):**

The data structures studied include (in parentheses, only if time allows):

– arrays, lists, stacks, …

– dynamic arrays

– trees, balanced trees

– heaps, priority queues

– hash tables

– range minimum queries

– suffix arrays

– suffix trees
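To illustrate the last items in the list, here is a naive illustrative sketch (our own, not course material) of a suffix array and the binary search it enables; the course covers asymptotically faster constructions:

```python
def suffix_array(s):
    """Naive suffix array: indices of the suffixes of s in
    lexicographic order (the course covers faster constructions)."""
    return sorted(range(len(s)), key=lambda i: s[i:])

def contains(s, sa, pattern):
    """Binary search for pattern among the sorted suffixes;
    returns True iff pattern occurs in s."""
    lo, hi = 0, len(sa)
    while lo < hi:
        mid = (lo + hi) // 2
        if s[sa[mid]:sa[mid] + len(pattern)] < pattern:
            lo = mid + 1
        else:
            hi = mid
    return lo < len(sa) and s[sa[lo]:sa[lo] + len(pattern)] == pattern

sa = suffix_array("banana")
assert sa == [5, 3, 1, 0, 4, 2]
assert contains("banana", sa, "nan") and not contains("banana", sa, "nab")
```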

**Complexity (Thapper):**

The course is an introduction to computational complexity theory. We will cover the following notions:

– Turing machines, the Church-Turing thesis, (un)decidability, the halting problem.

– P, NP, polynomial-time reductions, NP-completeness, the Cook-Levin theorem, co-NP.

– PSPACE.

– The time and space hierarchy theorems, Ladner’s theorem.

– The polynomial hierarchy and collapses.

– Approximation of NP-hard problems.

References: Introduction to the Theory of Computation (Michael Sipser).

**Programming (Borie):**

Python and Sage. Quick review of the basics of programming. Solving simple mathematical-algorithmic problems with Python (gcd, solving f(x) = 0, numerical integration, knapsack, backtracking, …). Getting started with Sage, a computer algebra system. A programming project at the interface of mathematics and computer science.

**Graph theory (Bulteau and Weller):**

– Fundamentals

– Connectivity

– Planar graphs

– Flows and cuts

– Examples of graph classes

– Examples of problems

– Matchings

– P/NP, reductions

– Parameterized algorithms

– Examples of parameters

– Kernels

– Minors

**Discrete and continuous optimization.**

**Discrete optimization (Meunier):**

Min-max results in combinatorial optimization provide elegant mathematical statements, are often related to the existence of efficient algorithms, and illustrate well the power of duality in optimization. The course aims to be a gentle introduction to the richness of this type of results, especially those that belong to the theory of perfect graphs. It will make connections with the continuous optimization course, in particular regarding linear programming and polyhedra, and will rely on concrete examples from industry that illustrate the relevance of combinatorial optimization tools for real-world applications.

The preliminary plan of the course is as follows:

– Discrete optimization in bipartite graphs: Hall’s marriage theorem, König’s theorems, algorithms.

– Chains and antichains in posets: theorems of Dilworth and Mirsky.

– Chordal graphs: interval graphs, coloring, duality, decomposition.

– Perfect graphs: definition, weak and strong theorems.

– Perfect graphs: polyhedra, algorithms.

– Lovász’ theta function: definition, computation, sandwich theorem, Shannon capacity.
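As a small illustration of the min-max duality that opens the course, here is an illustrative Python sketch (our own names and example) computing a maximum matching by augmenting paths and checking König’s theorem, that in a bipartite graph the maximum matching and the minimum vertex cover have the same size, by brute force on a tiny instance:

```python
from itertools import combinations

def max_matching(adj, n_right):
    """Maximum matching in a bipartite graph via augmenting paths.
    adj[u] lists the right-side neighbours of left vertex u."""
    match_right = [None] * n_right

    def try_augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if match_right[v] is None or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

def min_vertex_cover_size(adj, n_right):
    """Brute-force minimum vertex cover, to check König's theorem
    on small instances only."""
    edges = [(u, v) for u in range(len(adj)) for v in adj[u]]
    vertices = ([('L', u) for u in range(len(adj))] +
                [('R', v) for v in range(n_right)])
    for k in range(len(vertices) + 1):
        for cover in combinations(vertices, k):
            cset = set(cover)
            if all(('L', u) in cset or ('R', v) in cset for u, v in edges):
                return k

# König's theorem on a 3+3 bipartite graph.
adj = [[0, 1], [0], [0, 2]]
assert max_matching(adj, 3) == min_vertex_cover_size(adj, 3) == 3
```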

**Continuous optimization (Sandier):**

The course will cover theoretical and algorithmic aspects of convex optimization in a finite-dimensional setting. Important applications will be discussed.

The tentative list of topics is as follows:

– Linear programming, the simplex algorithm, totally unimodular matrices. Applications to network flows, optimal strategies, allocation…

– Convex functions, convexity-preserving transformations. Beyond linear programming: semi-definite programming, convex programming. Applications in statistics and interpolation.

– Necessary conditions for optimality, Karush-Kuhn-Tucker conditions.

– Weak and strong duality, Farkas’ lemma and its various formulations. Duality examples: shadow prices, max-flow min-cut.

– Algorithms for unconstrained optimization: line search, descent methods (gradient, steepest descent, Newton).

– Sparse solutions via L1 penalization. LASSO method.

– Interior point algorithms for constrained optimization. The cases of linear and semi-definite programming.
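The simplest of the descent methods listed above can be sketched in a few lines of Python; this is an illustrative example of ours (not course material), minimizing a smooth convex quadratic with a fixed step size:

```python
def grad_descent(grad, x0, step=0.1, iters=500):
    """Fixed-step gradient descent: repeatedly move against the
    gradient. Converges for smooth convex f when step < 2/L,
    L being the Lipschitz constant of the gradient."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 1)^2 + 2*(y + 3)^2, a smooth convex
# function whose unique minimizer is (1, -3).
grad_f = lambda p: [2 * (p[0] - 1), 4 * (p[1] + 3)]
x = grad_descent(grad_f, [0.0, 0.0])
assert abs(x[0] - 1) < 1e-6 and abs(x[1] + 3) < 1e-6
```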

**Probabilistic methods.**

**Concentration inequalities and applications (Guédon):**

We will study various subjects related to the phenomenon of concentration of measure.

I) We will present some classical concentration inequalities for the sum of independent random variables, like Hoeffding, Bennett, or Bernstein inequalities. We will focus on the particular case of Gaussian or sub-Gaussian random variables and apply these concentration inequalities to the Johnson-Lindenstrauss lemma.

A classical reference is the book “Concentration inequalities: a nonasymptotic theory of independence” by Boucheron, Lugosi and Massart.

II) We will study the problem of reconstructing sparse vectors via the ℓ1-minimization method, and present its links with the study of convex bodies and their Euclidean sections.

A classical reference is the book “Interactions between compressed sensing, random matrices and high dimensional geometry” by Chafaï, Guédon, Lecué and Pajor.

III) We will study the volume of convex bodies and the links with nets and entropy numbers. We will discuss some algorithmic problems related to the geometry of convex bodies and present the ellipsoid method (a counterpart to the simplex method).

A classical reference is the book “Geometric algorithms and combinatorial optimization” by Grötschel, Lovász and Schrijver.

**Randomized algorithms (Nicaud):**

This course is about randomized algorithms, which rely on randomness to speed up their running time. It is based on the classical textbook “Randomized Algorithms” by Motwani and Raghavan.

The methods:

– Las Vegas and Monte Carlo algorithms

– complexity classes for randomized algorithms

– lower bounds: Yao’s Minimax principle

– some probabilistic settings useful in algorithms: coupon collector, birthday problem, …

Applications:

– analysis of Quicksort and Quickselect algorithms

– stable marriage problem

– probabilistic data structures (hash tables, skip lists, treaps, …)

– probabilistic counting

– graph algorithms

– …
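As a flavor of the Las Vegas paradigm and of the Quickselect analysis mentioned above, here is a short illustrative Python sketch (our own, not course material): a randomized selection that is always correct, with randomness affecting only the expected running time:

```python
import random

def quickselect(items, k, rng=random.Random(0)):
    """Las Vegas randomized selection: return the k-th smallest
    element (0-indexed). The answer is always correct; the random
    pivot choice only affects the running time (expected O(n))."""
    items = list(items)
    while True:
        pivot = rng.choice(items)
        less = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        if k < len(less):
            items = less
        elif k < len(less) + len(equal):
            return pivot
        else:
            items = [x for x in items if x > pivot]
            k -= len(less) + len(equal)

data = [9, 1, 8, 2, 7, 3, 6, 4, 5]
assert [quickselect(data, k) for k in range(9)] == list(range(1, 10))
```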

**Discrete mathematics.**

**Combinatorics (Novelli):**

The lectures on enumerative combinatorics will consist of the study of:

– classical objects: permutations, trees, partitions, parking functions, …

– classical sequences: factorial, Catalan, Schröder, …

– classical methods: bijections, group actions, induction, generating series.

The lectures will be heavily based on the study of various examples, some very easy and others trickier.
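As one very easy example in this spirit (ours, not taken from the course), the Catalan sequence can be computed from its recurrence and cross-checked against the closed form:

```python
from math import comb

def catalan_by_recurrence(n):
    """Catalan numbers via the recurrence C_0 = 1,
    C_{m+1} = sum_{i=0}^{m} C_i * C_{m-i},
    which counts, e.g., binary trees by splitting at the root."""
    c = [1]
    for m in range(n):
        c.append(sum(c[i] * c[m - i] for i in range(m + 1)))
    return c[n]

# Cross-check with the closed form C_n = binomial(2n, n) / (n + 1).
for n in range(10):
    assert catalan_by_recurrence(n) == comb(2 * n, n) // (n + 1)
```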

**Discrete isoperimetric inequalities (Fradelizi and Hubard):**

The classical isoperimetric inequality asserts that among all sets of a given surface area, the Euclidean ball maximizes the volume. During the last decades, inequalities relating the size of the boundary of a set to the size of the set have emerged in diverse fields of mathematics and computer science. In particular, isoperimetric inequalities bridge geometric invariants with probabilistic and algebraic ones. In the first part of this course we focus on exact isoperimetric inequalities on the sphere, the Euclidean space and the discrete hypercube. Specifically, we will present the inequalities of Brunn-Minkowski, Loomis-Whitney, Sauer-Shelah, Harper and Kruskal-Katona, together with applications such as the flattening lemma and the epsilon-net theorem. We then develop the basics of Boolean analysis on the hypercube and give an application to random graphs.

The second part of the course concentrates on isoperimetric inequalities on families of graphs. Here we relate geometric invariants of graphs to algebraic properties of their Laplacians and to the behavior of random walks on them. After introducing the basics of spectral graph theory, we will prove the Cheeger inequality and concentrate on two families of graphs that behave very differently with respect to isoperimetric inequalities. On the one hand, we will see that planar graphs have small separators via the circle packing theorem; on the other hand, we will construct expander graphs, which have rapid mixing time. Each of these behaviors turns out to have applications in computer science and scientific computing.
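To make the hypercube case concrete, here is a small illustrative brute-force check of ours (not course material) of the edge-isoperimetric inequality on the discrete hypercube: among all sets of 4 vertices in Q_3, subcubes minimize the edge boundary:

```python
from itertools import combinations

def edge_boundary(subset, d):
    """Number of hypercube edges with exactly one endpoint in subset.
    Vertices of Q_d are the integers 0..2^d-1; two vertices are
    adjacent iff they differ in exactly one bit."""
    s = set(subset)
    return sum(1 for v in s for i in range(d) if v ^ (1 << i) not in s)

d, size = 3, 4
best = min(edge_boundary(a, d) for a in combinations(range(2 ** d), size))
# Harper-type edge isoperimetry: for |A| = 2^k in Q_d, subcubes are
# optimal, with boundary 2^k * (d - k); here 4 * (3 - 2) = 4.
assert best == 4
```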

**Geometric calculus.**

**Discrete differential geometry (Hauswirth and Romon):**

Discrete differential geometry describes how to extend the notion of curvature to discrete objects such as polyhedra or graphs. Geometric flows along the curvature are useful for segmenting or denoising images. Geometric flows under constraints can be used to preserve a texture mapping along a deformation. Extensions to graphs aim at isolating subgraphs relevant for topological data analysis.

Program:

– Discrete surface theory, topology and Gaussian curvature, the Gauss-Bonnet theorem.

– Discrete differential calculus: Laplacian, divergence operator, differential forms, Poincaré’s lemma.

– Discrete mean curvature on triangulated surfaces.

– Parametrization by lines of curvature, quad surfaces and applications to architecture.

– Intrinsic curvature and applications to graph theory.
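The discrete Gauss-Bonnet theorem can be verified by hand on the simplest example; the following few lines (an illustration of ours) check it for the cube, where curvature concentrates at the vertices as angle defects:

```python
import math

# Discrete Gauss-Bonnet: for a polyhedral surface, the angle defects
# (2*pi minus the sum of the face angles at each vertex) add up to
# 2*pi*chi, where chi = V - E + F is the Euler characteristic.
# For the cube, 3 right angles meet at each of the 8 vertices.
defect_per_vertex = 2 * math.pi - 3 * (math.pi / 2)
total_defect = 8 * defect_per_vertex

chi = 8 - 12 + 6  # V - E + F = 2 for the cube (a topological sphere)
assert abs(total_defect - 2 * math.pi * chi) < 1e-12
```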

**Discrete systolic geometry and algorithms (Colin de Verdière, Hubard and Sabourau):**

This course, located at the interface between mathematics and computer science, is part of the research topics “Images and Geometry” and “Discrete Mathematics and Algorithms” of Labex Bézout.

This course is an introduction to systolic geometry and its ramifications based on the study of polyhedral objects. This approach makes it possible to directly manipulate the notions of length, area and volume without going through the formalism of differential geometry. We will also be interested in algorithmic problems related to geometric constructions of systolic geometry. The theme of this course is to relate the notion of systole, defined as the minimum length of a closed noncontractible curve, to other geometric invariants such as volume or diameter across different inequalities, and to study the algorithmic aspects of these inequalities.

This area of research, at the crossroads of geometry, topology and algorithmics, is developing strongly with many ramifications. We will present a panorama of classical and more recent results in systolic geometry based on various techniques. The topics covered include algebraic topology (homotopy and homology) in a simple form, systolic inequalities on graphs, on surfaces and in higher dimensions, pants decompositions, probabilistic and arithmetic constructions of almost extremal surfaces, min-max inequalities related to families of curves, as well as algorithmic problems related to these notions. We will also present some open questions.

This course requires no advanced knowledge.