ICERM Generative Models Discussion Group
https://stanniszhou.github.io/discussion-group/
Recent content on ICERM Generative Models Discussion Group
Sun, 20 Aug 2017 21:38:52 +0800

Papers
https://stanniszhou.github.io/discussion-group/papers/
Here is a list of suggested papers, loosely organized by topic.
Compositional Generative Models

Lake, Brenden M., Ruslan Salakhutdinov, and Joshua B. Tenenbaum. 2015. “Human-Level Concept Learning through Probabilistic Program Induction.” Science 350 (6266): 1332–38. link
George, D., W. Lehrach, K. Kansky, M. Lázaro-Gredilla, C. Laan, B. Marthi, X. Lou, et al. 2017. “A Generative Vision Model That Trains with High Data Efficiency and Breaks Text-Based CAPTCHAs.”

Schedule
https://stanniszhou.github.io/discussion-group/schedule/
Past Schedule

Tutorial on Probabilistic Programming
When: Thursday 2019-02-28, 10 am to 11 am
Where: ICERM 11th Floor Lecture Hall
Presenter: Daniel Ritchie (Slides)
Scribe: Theresa Barton (Blog)
Overview of Deep Generative Models and Discussion
When: Tuesday 2019-03-05, 10 am to 11 am
Where: ICERM 11th Floor Lecture Hall
Presenter: Guangyao Zhou (Slides)
Scribe: Guangyao Zhou (Blog)
Flow-based Generative Models and Discussion
When: Thursday 2019-03-14, 10 am to 11 am

About
https://stanniszhou.github.io/discussion-group/about/
The ICERM Generative Models Discussion Group is part of the Semester Program in Computer Vision at ICERM. We meet most weeks during the Spring 2019 semester in the 11th-floor lecture hall at ICERM. The discussion group focuses on the general research area of probabilistic generative models.
For questions, please contact Guangyao Zhou.

Problems and Advances of Wasserstein GAN
https://stanniszhou.github.io/discussion-group/post/wgan/
Thu, 11 Apr 2019

Introduction

Since Generative Adversarial Nets (GAN) [1] were proposed in 2014, there has been a great deal of research on, and many applications of, GANs [2, 3]. Generative and discriminative models were, however, studied before GAN was proposed [4]. Some problems of GANs are summarized in [5].
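A central difficulty motivating Wasserstein GAN is that standard divergences can be degenerate when the model and data distributions have disjoint supports. A toy numerical sketch (the two point-mass distributions here are hypothetical, chosen only for illustration):

```python
import numpy as np

# Two distributions on the points {0, 1} with fully disjoint supports.
p = np.array([1.0, 0.0])   # all mass on point 0
q = np.array([0.0, 1.0])   # all mass on point 1

def kl(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i); it is infinite
    # whenever q puts zero mass where p puts positive mass.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p / q), 0.0)
    return terms.sum()

print(kl(p, q))   # inf

# By contrast, the 1-Wasserstein distance on the line is just the cost
# of transporting the mass from point 0 to point 1: |0 - 1| * 1 = 1,
# which is finite and varies smoothly as the supports move.
```

This is the degenerate regime in which KL stops providing a useful training signal, while a transport-based distance still does.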
The basic idea of the classical GAN is to minimize the Kullback–Leibler (KL) divergence. However, the KL divergence may be undefined (or simply infinite), for example when the two distributions have disjoint supports.

Digging Deeper Into Flow-based Generative Models
https://stanniszhou.github.io/discussion-group/post/ffjord/
Thu, 04 Apr 2019

Summary
In this week’s meeting, we discussed free-form Jacobian of reversible dynamics (FFJORD) [1] and the closely related neural ordinary differential equations (neural ODEs) [2]. In this blog post, we summarize the main points of these two papers.
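The key computational idea shared by these papers is the instantaneous change-of-variables formula: when a sample evolves under dz/dt = f(z, t), its log-density evolves under d log p(z(t))/dt = -tr(∂f/∂z), so state and log-density change can be integrated jointly. A minimal 1-D sketch (the dynamics f(z) = -z is a hypothetical stand-in for a learned network):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy continuous-time flow: dz/dt = f(z) = -z.
# We integrate the state z together with delta_logp, where
#   d(delta_logp)/dt = -tr(df/dz).

def dynamics(t, state):
    z, delta_logp = state
    f = -z                       # dz/dt
    trace_df_dz = -1.0           # df/dz for f(z) = -z
    return [f, -trace_df_dz]     # joint ODE for (z, delta_logp)

z0 = 0.7
sol = solve_ivp(dynamics, (0.0, 1.0), [z0, 0.0], rtol=1e-8, atol=1e-8)
z1, delta_logp = sol.y[0][-1], sol.y[1][-1]

# Analytic check: z1 = z0 * exp(-1) and delta_logp = +1,
# matching the exact change of variables for this linear flow.
print(z1, delta_logp)
```

FFJORD's contribution is making the trace term tractable in high dimensions via a stochastic (Hutchinson) estimator; this sketch just shows the joint integration that both papers build on.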
Overview

Recall that the essential idea of flow-based generative models is to model a complicated target distribution as the result of applying a reversible, differentiable transformation to a simple base distribution. The base distribution should be easy to sample from, so that applying the transformation to a base sample yields a sample from the target distribution.

GANs that work well empirically
https://stanniszhou.github.io/discussion-group/post/gan-empirical/
Thu, 28 Mar 2019

Overview

Generative Adversarial Networks (GANs) are a class of deep learning methods first proposed by Ian Goodfellow and fellow researchers at the University of Montreal in 2014 [1]. Two neural networks, a generator and a discriminator, learn in a zero-sum game framework.
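The value of this zero-sum game, V(D, G) = E_x[log D(x)] + E_z[log(1 - D(G(z)))], can be estimated by Monte Carlo for any fixed pair of networks. A toy sketch (the discriminator D and generator G below are hypothetical fixed functions, not learned models):

```python
import numpy as np

rng = np.random.default_rng(0)

def D(x):
    # Toy discriminator: sigmoid score in (0, 1), higher for larger x.
    return 1.0 / (1.0 + np.exp(-x))

def G(z):
    # Toy generator: a fixed shift-and-scale of the noise.
    return 0.5 * z - 1.0

x = rng.normal(loc=1.0, size=100_000)   # "data" samples
z = rng.normal(size=100_000)            # noise samples

# Monte Carlo estimate of V(D, G); both log terms are negative,
# so V is always negative for this choice of D and G.
V = np.mean(np.log(D(x))) + np.mean(np.log(1.0 - D(G(z))))
print(V)
```

Training alternates gradient steps: the discriminator ascends V while the generator descends it; this sketch only evaluates the objective at one fixed point of the game.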
The loss formulation of GAN is as follows:
$$ \min_{G} \max_{D} V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)} \big[ \log D(x) \big] + \mathbb{E}_{z \sim p_{z}(z)} \big[ \log (1 - D(G(z))) \big] $$

Overview of Deep Generative Models
https://stanniszhou.github.io/discussion-group/post/dgm-overview/
Tue, 05 Mar 2019

Summary
This week Stannis gave a high-level overview of three popular families of deep generative models. The discussion was mainly based on the original papers [1, 2]. The goal was to point out the commonalities and differences between these models, and to discuss in detail the different learning methods they employ.
Overview

When using latent variable models for probabilistic modeling, the objects of interest are the latent variables (denoted $z$) and the observed variables (denoted $x$).

Tutorial on Probabilistic Programming
https://stanniszhou.github.io/discussion-group/post/ppls/
Thu, 28 Feb 2019

Summary
This week, Daniel gave a tutorial on probabilistic programming and its use in generative modeling.
What is a PPL? Probabilistic programming languages (PPLs) leverage powerful programming concepts such as recursion, abstraction, and modularity to define user-specified distributions, sample from them, and perform inference on statistical models.
For example, here is a program written in WebPPL:
var geometric = function() {
  // flip(p) flips a coin that comes up true with probability p
  // (p = .5 assumed here; the original listing is truncated).
  return flip(.5) ? 0 : geometric() + 1;
};
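The same idea carries over to ordinary languages: a recursive sampling procedure implicitly defines a distribution. A Python analogue of the WebPPL program above (the helper `flip` is ours, mimicking WebPPL's primitive):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def flip(p=0.5):
    # Bernoulli coin flip: True with probability p.
    return random.random() < p

def geometric():
    # Number of failures before the first success of a fair coin:
    # a geometric distribution defined purely by recursion.
    return 0 if flip(0.5) else geometric() + 1

samples = [geometric() for _ in range(10_000)]
# Empirically, P(geometric() == 0) should be close to 0.5.
print(sum(s == 0 for s in samples) / len(samples))
```

What a PPL adds on top of this is generic inference: rather than only sampling forward, WebPPL can condition such a program on observations and compute posterior distributions automatically.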