Deep generative models complement Markov chain Monte Carlo methods for efficient sampling from high-dimensional distributions. Among these methods, explicit generators such as Normalising Flows (NFs), in combination with the Metropolis-Hastings algorithm, have been extensively applied to draw unbiased samples from target distributions, such as those found in lattice theories in physics. I will discuss central problems in conditional NFs, such as high variance, mode collapse, and data efficiency, and our work on adversarial training for NFs to ameliorate these problems. I will also briefly discuss our other recent work on the inverse problem, viz., estimating the underlying density from data samples.
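The NF-plus-Metropolis-Hastings idea mentioned above can be sketched as an independence sampler: the flow proposes samples, and an accept/reject step corrects any mismatch with the target so the chain is asymptotically unbiased. The sketch below is a minimal illustration, with a simple affine map of a standard Gaussian standing in for a trained flow (the `q_sample`/`q_log_prob` interface and the Gaussian target are illustrative assumptions, not the speaker's actual setup).

```python
import math
import random

random.seed(0)

# Hypothetical stand-in for a trained normalising flow: an affine map of a
# standard normal. A real NF would expose sample() and log_prob() similarly.
MU_Q, SIG_Q = 0.5, 1.2

def q_sample():
    return MU_Q + SIG_Q * random.gauss(0.0, 1.0)

def q_log_prob(x):
    z = (x - MU_Q) / SIG_Q
    return -0.5 * z * z - math.log(SIG_Q) - 0.5 * math.log(2 * math.pi)

def target_log_prob(x):
    # Unnormalised target density: a standard normal for this toy example.
    return -0.5 * x * x

def independence_mh(n_steps, x0=0.0):
    """Independence Metropolis-Hastings with the flow as proposal.

    Accept x' with probability min(1, p(x') q(x) / (p(x) q(x'))),
    which makes the chain's stationary distribution the target p."""
    x, chain = x0, []
    for _ in range(n_steps):
        xp = q_sample()
        log_a = (target_log_prob(xp) - target_log_prob(x)
                 + q_log_prob(x) - q_log_prob(xp))
        if math.log(random.random()) < log_a:
            x = xp
        chain.append(x)
    return chain

chain = independence_mh(20000)
print(sum(chain) / len(chain))  # empirical mean; should be near the target mean of 0
```

The accept/reject correction is what makes the scheme exact: even an imperfect flow yields unbiased samples, at the cost of a lower acceptance rate when the proposal poorly matches the target.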