
Estimating Gradients of Distributions for Generative Modeling - Livestream

Yang Song

The gradient of a log probability density function (also known as the score function) is a very useful quantity in generative modeling. One existing method for estimating the score function from data is score matching, yet it is computationally prohibitive for complex, high-dimensional datasets. To alleviate this difficulty, we propose sliced score matching, a new approach based on random projections that scales much better than score matching, enjoys strong theoretical guarantees, and suffers almost no performance loss. We demonstrate the efficacy of this method on learning deep energy-based models and training variational / Wasserstein autoencoders with implicit encoders. By directly estimating score functions from i.i.d. data samples, we propose a new framework for generative modeling that allows flexible energy-based / non-normalized model architectures, requires no sampling during training, and uses no adversarial optimization. Using annealed Langevin dynamics, we are able to produce image samples of quality comparable to GANs on the MNIST, CelebA, and CIFAR-10 datasets.
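For a concrete picture of the two techniques named in the abstract, here is a minimal PyTorch sketch of the sliced score matching objective, which replaces the full Jacobian trace of plain score matching with random projections so that each update needs only one extra backward pass. The names `score_net` and `sliced_score_matching_loss` are illustrative assumptions, not the speaker's reference code.

```python
import torch

def sliced_score_matching_loss(score_net, x, n_projections=1):
    """Sketch of the sliced score matching loss:
    E_v E_x [ v^T (grad_x s(x)) v + 0.5 * (v^T s(x))^2 ],
    with projection directions v drawn from a standard Gaussian."""
    x = x.detach().requires_grad_(True)
    total = 0.0
    for _ in range(n_projections):
        v = torch.randn_like(x)                       # random projection direction
        s = score_net(x)                              # s_theta(x), shape (B, D)
        sv = (s * v).sum()                            # sum over batch of v^T s(x)
        # One backward pass yields v^T grad_x s(x); no full Jacobian is formed.
        grad_sv = torch.autograd.grad(sv, x, create_graph=True)[0]
        total = total + (grad_sv * v).sum(dim=-1) + 0.5 * (s * v).sum(dim=-1) ** 2
    return (total / n_projections).mean()
```

And a sketch of annealed Langevin dynamics, which samples using only the learned score by running Langevin updates at a decreasing sequence of noise levels, warm-starting each level from the previous one. The noise-conditional signature `score_net(x, sigma)` and the step-size schedule are assumptions for illustration.

```python
@torch.no_grad()
def annealed_langevin_sample(score_net, shape, sigmas, n_steps=100, eps=2e-5):
    """Sketch of annealed Langevin dynamics; `sigmas` is sorted high to low."""
    x = torch.rand(shape)                             # initialize from uniform noise
    for sigma in sigmas:
        alpha = eps * (sigma / sigmas[-1]) ** 2       # shrink step size with sigma
        for _ in range(n_steps):
            z = torch.randn_like(x)
            x = x + 0.5 * alpha * score_net(x, sigma) + (alpha ** 0.5) * z
    return x
```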

Speaker: Yang Song, Stanford

See the event website for Zoom information

Monday, 04/06/20


Cost: Free


Berkeley Institute for Data Science


Berkeley, CA
