Generative Deep Learning
A.Y. 2020/2021
News
This course has been awarded the TensorFlow Faculty Award 2021!
General Information
Teacher: Danilo Comminiello
Credits: 3 CFU
Scientific sector: ING-IND/31
Course language: English
Offered programs: PhD Course in Information and Communication Technology (ICT).
Calendar: June 14, 15, 21, 22, 30 and July 1, 2021
Class hours: 10:00-13:00 (online)
Classroom page: Download slides, video lectures, Python notebooks, homework assignments, and additional material. Access code: f5saj7t. Participants are invited to register here.
Online lessons: Connect with your Sapienza account at this Zoom link.
Office hours: by appointment
Course Description
Generative deep learning is one of the most promising paradigms of modern artificial intelligence. Generative models aim to learn the true data distribution of the training set in an unsupervised fashion and to generate new data points with some variation by leveraging the capabilities of deep neural networks.
In this course, we will study the foundations and the main models of generative deep learning, including variational autoencoders, generative adversarial networks, normalizing flows, and energy-based models. The course will also discuss applications related to information and communication technology (ICT) that benefit from generative deep learning.
Application examples can be agreed upon with the teacher on the basis of each student's own PhD research topic. Please inform the teacher in advance.
Prerequisites
Knowledge of machine learning is strongly recommended, as well as basic programming skills.
Exam assessment and grade evaluation
Small project assignment on one of the course topics. The topic can also be agreed upon on the basis of the student's PhD research program.
Program
INTRODUCTION TO GENERATIVE DEEP LEARNING. Generative modeling and probabilistic generative models. Deep neural networks. Building deep network models. Learning latent representations.
VARIATIONAL AUTOENCODERS. Autoencoders. Variational inference. Variational autoencoders (VAEs) and modern deep VAE architectures.
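As a flavor of the hands-on notebooks for this module, the sketch below shows the VAE reparameterization trick and the (negative) ELBO loss in TensorFlow/Keras. It is a minimal illustration only, not the course material: the 784-dimensional input (e.g., flattened MNIST digits), the layer sizes, and the latent dimension are all assumptions.

```python
import tensorflow as tf

latent_dim = 2  # dimensionality of the latent space (assumption)

# Encoder outputs both the mean and the log-variance of q(z|x), hence 2 * latent_dim units.
encoder = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(2 * latent_dim),
])

# Decoder maps a latent sample back to pixel space.
decoder = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(784, activation="sigmoid"),
])

def negative_elbo(x):
    z_mean, z_log_var = tf.split(encoder(x), 2, axis=-1)
    # Reparameterization trick: z = mu + sigma * eps keeps the sampling step differentiable.
    eps = tf.random.normal(tf.shape(z_mean))
    z = z_mean + tf.exp(0.5 * z_log_var) * eps
    x_hat = decoder(z)
    # Reconstruction term: Bernoulli cross-entropy averaged over pixels, rescaled to a sum over the 784 pixels.
    recon = 784.0 * tf.keras.losses.binary_crossentropy(x, x_hat)
    # KL divergence between q(z|x) and the standard normal prior N(0, I).
    kl = -0.5 * tf.reduce_sum(1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=-1)
    return tf.reduce_mean(recon + kl)

# Example: evaluate the loss on a random stand-in batch of 32 "images".
loss = negative_elbo(tf.random.uniform((32, 784)))
```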
GENERATIVE ADVERSARIAL NETWORKS. Discriminator and generator. Training generative adversarial networks (GANs). Optimization and loss functions of GANs.
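Again as a minimal illustration rather than course material, a single GAN training step with the standard non-saturating loss could be sketched as follows (architectures and hyperparameters are assumptions):

```python
import tensorflow as tf

latent_dim = 64  # noise dimensionality (assumption)

# Generator: noise vector -> fake sample; discriminator: sample -> real/fake logit.
generator = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(784, activation="sigmoid"),
])
discriminator = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1),
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

def train_step(real_batch):
    noise = tf.random.normal((tf.shape(real_batch)[0], latent_dim))
    with tf.GradientTape() as d_tape, tf.GradientTape() as g_tape:
        fake_batch = generator(noise)
        real_logits = discriminator(real_batch)
        fake_logits = discriminator(fake_batch)
        # Discriminator: push real samples toward label 1 and generated samples toward label 0.
        d_loss = bce(tf.ones_like(real_logits), real_logits) + \
                 bce(tf.zeros_like(fake_logits), fake_logits)
        # Generator (non-saturating loss): try to make the discriminator output 1 on fakes.
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss
```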
NORMALIZING FLOWS AND ENERGY-BASED MODELS. Invertible transformations of complex distributions. Autoregressive and normalizing flows. Likelihood-based learning. Energy-based models (EBMs) with latent variables. Self-supervised learning with EBMs.
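For likelihood-based learning with an invertible transformation, a one-layer element-wise affine flow makes the change-of-variables formula concrete. The NumPy sketch below is purely illustrative; the function name and shapes are assumptions.

```python
import numpy as np

def affine_flow_log_prob(x, shift, log_scale):
    """Exact log p(x) for the flow x = exp(log_scale) * z + shift with z ~ N(0, I).

    Change of variables: log p(x) = log p_base(z) + log |det dz/dx| = log p_base(z) - sum(log_scale).
    """
    z = (x - shift) * np.exp(-log_scale)                            # invert the flow: data -> base space
    log_base = -0.5 * np.sum(z**2 + np.log(2.0 * np.pi), axis=-1)   # standard normal log-density
    log_det = -np.sum(log_scale)                                    # log-determinant of the inverse Jacobian
    return log_base + log_det

# Example: log-likelihood of a random 2-D batch under an (untrained) identity flow.
x = np.random.randn(32, 2)
print(affine_flow_log_prob(x, shift=np.zeros(2), log_scale=np.zeros(2)))
```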
APPLICATIONS. Python examples of generative deep learning models, such as music generation, image style transfer, text generation, fake video generation, network anomaly detection, data augmentation, and biomedical image generation, among others.
Textbook and Material
Main textbook:
David Foster, "Generative Deep Learning: Teaching Machines to Paint, Write, Compose, and Play", O'Reilly Media, Inc., June 2019.
Supplementary material (e.g., course slides, papers, notebooks) will be provided by the instructor on the Classroom page (access code: f5saj7t).