
Deep Learning in Practice

Lecturer: Guillaume Charpiat (exercise sessions: Wenzhuo Liu and Nilo Schwencke)


The course is taught in English, unless all attendees are French speakers.

Course description:

Despite impressive, widely publicized results, deep learning methods are still poorly understood, neural networks are often difficult to train, and their results are black boxes lacking explanations, which is problematic given the societal impact of machine learning today (used to assist decisions in medicine, hiring, bank loans...). Besides, real-world problems usually do not fit the standard assumptions or frameworks of the most famous academic works (e.g., in data quantity and quality). This course aims at providing insights and tools to address these practical aspects, based on mathematical concepts.
We will first highlight the gap between practice and classical theory (e.g., regarding the number of parameters), and reconcile them using recent theoretical advances.

After a review of recent architectures, we will study visualization techniques and check that undesired biases present in the dataset (e.g., sensitivity to gender when matching CVs to job offers) are not reproduced by the trained model.
We will then investigate practical issues arising when training neural networks, in particular data quantity (small or big data), applications to reinforcement learning and to physical problems, and automatic hyper-parameter tuning.
  

Prerequisites:


  • The "Introduction to Deep Learning" course by Vincent Lepetit (taking place during the 1st semester)
  • Notions of machine learning, Bayesian statistics, analysis, and differential calculus.
  • Basic knowledge of Python and PyTorch.
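
For reference, the following minimal PyTorch sketch (a hypothetical example, not part of the course material) illustrates the expected level: defining a small network, running one training step on random data on CPU, and counting its trainable parameters.

    import torch
    import torch.nn as nn

    # Hypothetical example: a small fully-connected classifier
    model = nn.Sequential(
        nn.Linear(10, 32),
        nn.ReLU(),
        nn.Linear(32, 2),
    )

    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    # One training step on random data (CPU only, as in the practical sessions)
    x = torch.randn(64, 10)          # batch of 64 inputs with 10 features
    y = torch.randint(0, 2, (64,))   # random binary labels

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

    # Number of trainable parameters, a quantity discussed in the course
    print(sum(p.numel() for p in model.parameters() if p.requires_grad))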

Structure:

The course will comprise lectures as well as practical and theoretical exercises (in PyTorch), which will be evaluated.
The practical sessions will be run on personal laptops (no GPU needed).
Announcements are made through a dedicated mailing-list; questions and discussions take place on an online forum.
All details are available on the course webpage, https://www.lri.fr/~gcharpia/deeppractice/, or upon subscription to the mailing-list.

Evaluation:

The evaluation of the course is based on 6 exercises.

Resources:

  • Related conferences:
    – machine learning: NIPS / NeurIPS, ICML, ICLR
    – computer vision: CVPR, ICCV, ECCV

  • Related books:
    – « Deep Learning » by I. Goodfellow, Y. Bengio and A. Courville

  • Examples of references:

– « Do Deep Nets Really Need to be Deep? »; L. J. Ba, R. Caruana; NIPS 2014
– « Graph Attention Networks »; P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, Y. Bengio; ICLR 2018
– « Women also Snowboard: Overcoming Bias in Captioning Models »; L. A. Hendricks, K. Burns, K. Saenko, T. Darrell, A. Rohrbach; FAT-ML 2018
– « Scaling description of generalization with number of parameters in deep learning »; M. Geiger, A. Jacot, S. Spigler, F. Gabriel, L. Sagun, S. d’Ascoli, G. Biroli, C. Hongler, M. Wyart; arXiv 2019
– « Deep Sets »; M. Zaheer, S. Kottur, S. Ravanbakhsh, B. Poczos, R. Salakhutdinov, A. Smola; NIPS 2017
– « DGM: A deep learning algorithm for solving partial differential equations »; J. Sirignano and K. Spiliopoulos; Journal of Computational Physics 2018

Schedule:

  • Course period: from January to March 2020, 9:00 to 12:15 on Mondays (January 13, February 3+10+17+24+27, March 9+16)
  • Maximum number of students enrolled in the course: no limit
  • Number of *in-person* hours per student: 8 sessions of 3 hours each, most often a 1.5h lecture followed by 1.5h of exercises
  • Course shared within the Master's programs of Université Paris-Saclay: NO (not to my knowledge)
  • Course shared with an engineering curriculum: YES: with CentraleSupelec (OMA track)
  • Institution(s) covering the teaching hours: CentraleSupelec (about 8 * (1.5*1.5 + 1.5) = 30 HTD)