Seminars and Talks

Generative Modeling by Estimating Gradients of the Data Distribution
by Yang Song
Date: Thursday, Jun. 24
Time: 17:30
Location: Online Call via Zoom

Our guest speaker is Yang Song from Stanford University. You are all cordially invited to the CVG Seminar on June 24th at 5:30 p.m. CEST on Zoom (passcode is 299064), where Yang will give a talk titled “Generative Modeling by Estimating Gradients of the Data Distribution”.

Abstract

Existing generative methods are typically based on training explicit probability representations with maximum likelihood (e.g., VAEs) or on learning implicit sampling procedures with adversarial training (e.g., GANs). The former requires variational inference or special model architectures for tractable training, while the latter can be unstable. To address these difficulties, we explore an alternative approach based on estimating gradients of probability densities. We can estimate the gradients of distributions by training flexible neural network models with denoising score matching, and then use these models for sample generation, exact likelihood computation, posterior inference, and data manipulation by leveraging techniques from MCMC and stochastic differential equations. Our framework enables free-form model architectures, requires no adversarial optimization, and achieves state-of-the-art performance in many applications such as image and audio generation.
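To make the core idea concrete, here is a minimal sketch (not Yang's implementation) of denoising score matching and Langevin-dynamics sampling on toy 2-D data; the architecture, noise level, and step sizes are illustrative assumptions:

    # Sketch: denoising score matching + Langevin sampling (illustrative hyperparameters).
    import torch
    import torch.nn as nn

    dim, sigma = 2, 0.25
    data = torch.randn(10000, dim) * 0.5 + 2.0        # toy data distribution

    score_net = nn.Sequential(nn.Linear(dim, 128), nn.SiLU(), nn.Linear(128, dim))
    opt = torch.optim.Adam(score_net.parameters(), lr=1e-3)

    for _ in range(2000):
        x = data[torch.randint(len(data), (128,))]
        eps = torch.randn_like(x)
        x_tilde = x + sigma * eps
        target = -eps / sigma                          # score of the Gaussian noising kernel
        loss = ((score_net(x_tilde) - target) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()

    # Langevin dynamics: repeatedly follow the learned gradients plus injected noise.
    x, step = torch.randn(64, dim), 1e-2
    for _ in range(500):
        with torch.no_grad():
            x = x + 0.5 * step * score_net(x) + step ** 0.5 * torch.randn_like(x)

Full-scale score-based models replace this single noise level with a sequence (or continuum) of noise scales, which is where the stochastic-differential-equation view comes in.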

Bio

Yang Song is a fifth-year PhD student in Computer Science at Stanford University, advised by Stefano Ermon. His research focuses on deep generative models, with applications in robust machine learning and inverse problems. He is a recipient of the inaugural Apple PhD Fellowship in AI/ML and a J.P. Morgan PhD Fellowship. His research on score-based generative models has been recognized with an oral presentation at NeurIPS 2019 and an Outstanding Paper Award at ICLR 2021.

Causality and Distribution Generalization
by Professor Jonas Peters
Date: Thursday, May. 27
Time: 14:30
Location: Online Call via Zoom

Our guest speaker is Professor Jonas Peters from the Department of Mathematical Sciences at the University of Copenhagen. You are all cordially invited to the CVG Seminar on May 27th at 2:30 p.m. CEST on Zoom (passcode is 486210), where Jonas will give a talk titled “Causality and Distribution Generalization”.

Abstract

Purely predictive methods do not perform well when the test distribution changes too much from the training distribution. Causal models are known to be stable with respect to distributional shifts such as arbitrarily strong interventions on the covariates, but they do not perform well when the test distribution differs only mildly from the training distribution. We discuss methods such as Anchor Regression, Stabilized Regression, and CausalKinetiX that trade off between causal and predictive models to obtain favorable generalization properties. We also discuss possible extensions to nonlinear models and the theoretical limitations of such methodology.
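As a concrete illustration of this trade-off, here is a small NumPy sketch of the Anchor Regression estimator (following the formulation of Rothenhäusler, Meinshausen, Bühlmann, and Peters); the toy data and the choice of gamma are made up for illustration. Gamma = 1 recovers ordinary least squares, while larger gamma buys robustness against shifts generated by the anchor variable:

    # Sketch of Anchor Regression on toy data (gamma and data are illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 500, 3
    A = rng.normal(size=(n, 1))                      # anchor (exogenous) variable
    X = A @ rng.normal(size=(1, p)) + rng.normal(size=(n, p))
    Y = X @ np.array([1.0, 0.0, -2.0]) + A[:, 0] + rng.normal(size=n)

    gamma = 5.0                                      # 1 = OLS; larger = more invariance
    P = A @ np.linalg.solve(A.T @ A, A.T)            # projection onto the anchor
    W = np.eye(n) + (np.sqrt(gamma) - 1.0) * P       # anchor transformation of the data
    beta = np.linalg.lstsq(W @ X, W @ Y, rcond=None)[0]
    print(beta)

The transformation W turns the penalized objective into an ordinary least-squares problem on modified data, which is what makes the estimator cheap to compute.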

Bio

Jonas is a professor of statistics at the Department of Mathematical Sciences at the University of Copenhagen. Previously, he was a group leader at the Max Planck Institute for Intelligent Systems in Tübingen and a Marie Curie fellow at the Seminar for Statistics, ETH Zurich. He studied mathematics at the University of Heidelberg and the University of Cambridge and obtained his Ph.D. jointly from the MPI and ETH. He is interested in inferring causal relationships from different types of data and in building statistical methods that are robust to distributional shifts. In his research, Jonas seeks to combine theory, methodology, and applications. His work relates to areas such as computational statistics, causal inference, graphical models, independence testing, and high-dimensional statistics.

Uncovering the Intrinsic Structures: Representation Learning and Its Applications
by Dr. Shuai Zhang
Date: Friday, Apr. 30
Time: 14:30
Location: Online Call via Zoom

Our guest speaker is Dr. Shuai Zhang from the Department of Computer Science at ETH Zurich. You are all cordially invited to the CVG Seminar on April 30th at 2:30 p.m. CEST on Zoom (passcode is 765585), where Shuai will give a talk titled “Uncovering the Intrinsic Structures: Representation Learning and Its Applications”.

Abstract

Learning suitable data representations lies at the heart of many intelligent applications. The quality of the learned representations is determined by how well the model uncovers the intrinsic structures of the data.
In this talk, I will first describe our recent work on geometry-oriented representation learning and demonstrate how applications that rely heavily on representation learning can benefit from it. In particular, I will present a data-driven approach, switch spaces, a novel way of combining spherical, Euclidean, and hyperbolic spaces in a single model with specialization. Using switch spaces, we obtain state-of-the-art performance on knowledge graph completion and recommender systems. Then, I will introduce our ICLR 2021 work on learning representations in hypercomplex space, including the parameterized hypercomplex multiplication (PHM) layer and its applications to LSTMs and Transformers.
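As a rough illustration of the second part, here is a sketch of a PHM-style layer in PyTorch, in which the weight matrix is built as a learned sum of Kronecker products; the sizes and initialization are illustrative assumptions, not the paper's exact recipe:

    # Sketch of a PHM layer: weight = sum of n Kronecker products A_i (x) B_i,
    # using roughly 1/n of the parameters of a dense layer of the same shape.
    import torch
    import torch.nn as nn

    class PHMLinear(nn.Module):
        def __init__(self, n, in_features, out_features):
            super().__init__()
            assert in_features % n == 0 and out_features % n == 0
            self.A = nn.Parameter(0.1 * torch.randn(n, n, n))
            self.B = nn.Parameter(0.1 * torch.randn(n, out_features // n, in_features // n))

        def forward(self, x):
            W = torch.stack([torch.kron(a, b) for a, b in zip(self.A, self.B)]).sum(0)
            return x @ W.T

    layer = PHMLinear(n=4, in_features=8, out_features=8)
    print(layer(torch.randn(2, 8)).shape)            # torch.Size([2, 8])

With n = 4 and fixed rather than learned "rule" matrices A, this construction recovers quaternion multiplication as a special case.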

Bio

Shuai Zhang is a postdoctoral researcher in the Department of Computer Science at ETH Zurich, where he works with Prof. Ce Zhang. He received his Ph.D. in computer science from the University of New South Wales under the supervision of Prof. Lina Yao. His current research lies in geometry-oriented representation learning and its applications in information filtering, knowledge graph completion, and reasoning. He is a recipient of the Outstanding Paper Award at ICLR 2021 and the Best Paper Award runner-up at WSDM 2020.

Distributions and Geometry
by Emiel Hoogeboom
Date: Friday, Mar. 26
Time: 13:30
Location: Online Call via Zoom

Our guest speaker is Emiel Hoogeboom from the University of Amsterdam. You are all cordially invited to the CVG Seminar on March 26th at 1:30 p.m. CET on Zoom (passcode is 809447), where Emiel will give a talk titled “Distributions and Geometry”.

Abstract

Deep generative models aim to model complicated high-dimensional distributions. Among these are Normalizing Flows, a rich family of distributions for many different types of geometry. Normalizing Flows are attractive because in many cases they admit exact likelihood evaluation and can be designed for fast inference and sampling. Modelling high-dimensional distributions has many applications, such as representation learning, outlier detection, variance reduction in estimators, and (conditional) generation. In this talk, we will visit applications of flows on hyperspheres and flows for discrete spaces. Additionally, we will talk about graph neural networks with rotational and translational symmetries.
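For readers new to flows, here is a minimal sketch of the ingredient that makes exact likelihoods possible, the change-of-variables formula log p(x) = log p(z) + log|det dz/dx|, using a single elementwise affine bijection on toy data; the model and hyperparameters are illustrative, and real flows stack many richer invertible layers:

    # Sketch: exact maximum likelihood with a one-layer normalizing flow.
    import math
    import torch
    import torch.nn as nn

    class AffineFlow(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.mu = nn.Parameter(torch.zeros(dim))
            self.log_s = nn.Parameter(torch.zeros(dim))

        def log_prob(self, x):                       # change of variables
            z = (x - self.mu) * torch.exp(-self.log_s)
            log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * z.shape[-1] * math.log(2 * math.pi)
            return log_base - self.log_s.sum()       # + log|det dz/dx|

        def sample(self, n):                         # invert the map on base samples
            return torch.randn(n, self.mu.numel()) * torch.exp(self.log_s) + self.mu

    flow = AffineFlow(2)
    opt = torch.optim.Adam(flow.parameters(), lr=1e-2)
    data = torch.randn(1000, 2) * 1.5 + 3.0
    for _ in range(500):
        loss = -flow.log_prob(data).mean()           # exact negative log-likelihood
        opt.zero_grad(); loss.backward(); opt.step()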

Bio

Emiel is a PhD student at the University of Amsterdam, working on deep generative modelling under the supervision of Max Welling. Recent works include "Integer Discrete Flows", "Argmax Flows", and "E(n) Equivariant Graph Neural Networks".

Modeling and optimizing set functions via RKHS embeddings
by Prof. David Ginsbourger
Date: Thursday, Feb. 25
Time: 13:00
Location: Online Call via Zoom

Hi everyone! We are thrilled to announce our new monthly CVG Seminars!

Our first guest speaker is Prof. David Ginsbourger from the University of Bern. You are all cordially invited to the CVG Seminar on February 25th at 1:00 p.m. CET on Zoom (passcode is 004934), where David will give a talk titled “Modeling and optimizing set functions via RKHS embeddings”.

Abstract

We consider the issue of modeling and optimizing set functions, with a main focus on kernel methods for expensive objective functions taking finite sets as inputs. Based on recent developments on embeddings of probability distributions in Reproducing Kernel Hilbert Spaces, we explore adaptations of Gaussian process modeling and Bayesian optimization to the framework of interest. In particular, combining RKHS embeddings with positive definite kernels on Hilbert spaces delivers a promising class of kernels, as illustrated on two test cases from mechanical engineering and contaminant source localization, respectively. The talk is based on several collaborations, notably the paper "Kernels over sets of finite sets using RKHS embeddings, with application to Bayesian (combinatorial) optimization" with Poompol Buathong and Tipaluck Krityakierne (AISTATS 2020).
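A toy sketch of the central construction: a positive definite kernel between finite sets obtained by composing a Gaussian kernel with the RKHS distance between mean embeddings. The base kernel and all hyperparameters below are illustrative assumptions, not the paper's exact choices:

    # Sketch: kernel between finite sets via RKHS mean embeddings.
    import numpy as np

    def rbf(x, y, gamma=1.0):
        # base kernel between individual points
        d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def set_kernel(S, T, gamma=1.0, ell=1.0):
        # squared RKHS distance between the mean embeddings of S and T
        d2 = rbf(S, S, gamma).mean() + rbf(T, T, gamma).mean() - 2 * rbf(S, T, gamma).mean()
        return np.exp(-d2 / (2 * ell ** 2))          # Gaussian kernel on the embeddings

    rng = np.random.default_rng(0)
    S = rng.normal(size=(5, 2))                      # a set of 5 points in R^2
    T = rng.normal(size=(8, 2)) + 1.0                # a shifted set of 8 points
    print(set_kernel(S, T))

Such a set kernel can then be plugged into a standard Gaussian process regression or Bayesian optimization loop in place of a kernel on vectors.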

Bio

David Ginsbourger works at the Institute of Mathematical Statistics and Actuarial Sciences of the University of Bern, where he leads a research group focusing on uncertainty quantification and statistical data science. A significant part of his research deals with Gaussian process modeling and adaptive design of experiments. Further interests encompass kernel design and fitting, and connections between spatial statistics and functional analysis. On the application side, he has been working with colleagues from various disciplines pertaining to engineering and, increasingly, to geosciences. He completed his PhD in applied mathematics at Ecole Nationale Supérieure des Mines de Saint-Etienne in 2009 and his habilitation in statistics and applied probability at UniBE in 2014. From 2015 to 2020 he was mainly affiliated with the Idiap Research Institute, where he headed the uncertainty quantification and optimal design group. In 2018 he received a titular professorship from UniBE, where he is now an associate (assoziierter) professor.