Generative Deep Learning
A.Y. 2021/2022
News
We are pleased to host two webinars by international experts in the generative deep learning research field, from Google Brain and the Stanford AI Lab. Further information here.
This course has been awarded the TensorFlow Faculty Award 2021!
General Information
Teacher: Danilo Comminiello
Credits: 3 CFU
Scientific sector: ING-IND/31
Course language: English
Offered programs: PhD Course in Information and Communication Technology (ICT).
Calendar: June 6, 7, 13, 15, 20, 21, 2022
Class timing: 10:00-13:00
Lecture modality: Hybrid: DIET Reading Room (DIET Dept, II floor, Via Eudossiana 18) / Online via Zoom
Classroom page: Download slides, video lectures, homework assignments, and additional material. Access code: rg2mczw. Participants are invited to register here.
GitHub repository: Python course notebooks, additional material, project notebooks.
Office hours: by appointment
Course Description
Generative deep learning represents one of the most promising paradigms of modern artificial intelligence. Generative models aim to learn the true data distribution of the training set in an unsupervised fashion and to generate new data points with some variations, by leveraging the capabilities of deep neural networks.
In this course, we will study the foundations and the main models of generative deep learning, including autoregressive models, variational autoencoders, generative adversarial networks, flow-based models, and energy-based models. The course will also discuss some applications related to information and communication technology (ICT) that benefit from deep generative learning.
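As a minimal formalization (standard notation, not taken from the course slides): a generative model with parameters θ defines a distribution p_θ(x) over data and is trained so that it approximates the unknown data distribution p_data, for example by maximizing the log-likelihood of the training set:

\max_{\theta} \; \mathbb{E}_{x \sim p_{\mathrm{data}}(x)} \big[ \log p_{\theta}(x) \big] \;\approx\; \max_{\theta} \; \frac{1}{N} \sum_{n=1}^{N} \log p_{\theta}(x_n)

New data points are then obtained by sampling x ~ p_θ(x). Adversarial models replace the explicit likelihood objective with a learned critic, but the goal of matching p_data is the same.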
Application examples will be addressed using TensorFlow. Fields of application can be agreed with the teacher on the basis of each student's own PhD research topics. Please inform the teacher in advance if you are interested.
TensorFlow Faculty Award
The Generative Deep Learning course has been awarded the TensorFlow Faculty Award 2021 by Google, in support of the development of new teaching courses on emerging machine learning topics that also promote diversity initiatives aimed at widening access to machine learning education. To this end, this edition of the course will also feature some extra initiatives (e.g., invited seminars) and additional material and tutorials from Google TensorFlow.
Prerequisites
Knowledge of machine learning is strongly recommended.
Program
INTRODUCTION TO GENERATIVE DEEP LEARNING. Generative modeling and probabilistic generative models. Deep neural networks. Building deep network models. Learning latent representations.
GENERATIVE AUTOREGRESSIVE MODELS. Autoregressive models. Parameterization by neural networks. Generation examples.
VARIATIONAL AUTOENCODERS. Probabilistic principal component analysis. Variational inference. Variational autoencoders (VAEs) and modern deep VAE architectures.
GENERATIVE ADVERSARIAL NETWORKS. Generator and critic. Training generative adversarial networks (GANs). Optimization and loss functions of GANs.
ENERGY-BASED AND DIFFUSION MODELS. Likelihood-based learning. Energy-based models (EBMs) with latent variables. Self-supervised learning with EBMs. Diffusion and score-based models.
APPLICATIONS. Python examples of generative deep learning models, including music generation, image style transfer, text generation, video generation, anomaly detection, data augmentation, inverse problem solving, and medical imaging applications, among others. An illustrative sketch of the kind of example covered in the labs is given below.
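To give a flavor of the lab material, the following is a minimal, illustrative TensorFlow sketch of adversarial training (a tiny GAN) on a toy two-dimensional dataset. It is a simplified, hypothetical example and not one of the official course notebooks, which are available on the GitHub repository.

```python
# Illustrative sketch only: a tiny GAN whose generator learns to mimic
# samples from a toy 2-D Gaussian distribution (not an official course notebook).
import tensorflow as tf

LATENT_DIM = 8

# Generator: maps random noise to 2-D points.
generator = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(LATENT_DIM,)),
    tf.keras.layers.Dense(2),
])

# Critic/discriminator: scores how "real" a 2-D point looks.
critic = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(2,)),
    tf.keras.layers.Dense(1),
])

g_opt = tf.keras.optimizers.Adam(1e-3)
d_opt = tf.keras.optimizers.Adam(1e-3)
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)


@tf.function
def train_step(real_batch):
    noise = tf.random.normal([tf.shape(real_batch)[0], LATENT_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_batch = generator(noise, training=True)
        real_logits = critic(real_batch, training=True)
        fake_logits = critic(fake_batch, training=True)
        # Critic: push real samples toward 1 and generated samples toward 0.
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        # Generator: try to make the critic label generated samples as real.
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, critic.trainable_variables),
                              critic.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss


# Toy "real" data: samples from a Gaussian centered at (2, 2).
for step in range(1000):
    real = tf.random.normal([64, 2], mean=2.0)
    d_loss, g_loss = train_step(real)

# After training, new points are generated by sampling latent noise.
samples = generator(tf.random.normal([5, LATENT_DIM]))
```

The actual lab notebooks apply the same ideas to richer data (images, audio, text) and to the other model families covered in the program.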
Class Schedule
The course will be held in June 2022 with the following schedule:
Monday, June 6 10:00-13:00
Tuesday, June 7 10:00-13:00
Monday, June 13 10:00-13:00
Wednesday, June 15 10:00-13:00
Monday, June 20 10:00-13:00
Tuesday, June 21 10:00-13:00
The course will be held in hybrid modality:
In-person: DIET Reading Room, DIET Dept., II floor, Via Eudossiana 18
Online: via Zoom at the following link:
https://uniroma1.zoom.us/j/86596351314?pwd=T0pja1RKLzFTd2ZabWN0TS9yN2tMQT09
Final Examination
Small project assignment on one of the course topics. The topic can also be agreed upon based on the student's PhD research program.
This year, research projects may also be arranged for groups of students from different PhD programs, under the supervision of a tutor.
Textbooks and Material
Course slides, lecture notes and lab notebooks by the instructor.
All the material can be found on the Classroom page of the course (code rg2mczw). Participants are invited to register.
Textbooks:
Jakub M. Tomczak, "Deep Generative Modeling", Springer, 2022.
Kevin P. Murphy, "Probabilistic Machine Learning: Advanced Topics", The MIT Press, 2022.
David Foster, "Generative Deep Learning: Teaching Machines to Paint, Write, Compose, and Play", O'Reilly Media, Inc., June 2019.