– Europe/Lisbon
Room P3.10, Mathematics Building
— Online
How quickly does the Gibbs sampler converge for log-concave distributions?
The Gibbs sampler (also known as Glauber dynamics or the heat-bath algorithm) is a popular Markov chain Monte Carlo algorithm that iteratively samples from the conditional distributions of the probability measure of interest. Under the assumption of log-concavity, we provide a sharp bound on the speed of convergence in relative entropy for its random-scan version. Assuming that evaluating conditionals is cheap compared to evaluating the joint density, our results imply that the number of full evaluations required for the Gibbs sampler to mix grows linearly with the condition number and is independent of the dimension. This contrasts with gradient-based methods such as overdamped Langevin or Hamiltonian Monte Carlo (HMC), whose mixing times typically grow with the dimension. Our techniques also allow us to analyze Metropolis-within-Gibbs schemes, as well as the Hit-and-Run algorithm. This is joint work with Filippo Ascolani and Giacomo Zanella.
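As a minimal illustration of the algorithm discussed in the talk (not the speaker's actual setup or analysis), the sketch below implements a random-scan Gibbs sampler for a zero-mean Gaussian target, a simple log-concave case where each coordinate's conditional is a one-dimensional Gaussian that can be sampled exactly. The function name `random_scan_gibbs` and the example covariance are ours, chosen purely for illustration.

```python
import numpy as np

def random_scan_gibbs(precision, n_steps, rng=None):
    """Random-scan Gibbs sampler for a Gaussian N(0, precision^{-1}).

    Illustrative special case: the target is log-concave and every
    conditional is an exact 1-D Gaussian, so each update is cheap
    compared to evaluating the full joint density.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = precision.shape[0]
    x = np.zeros(d)
    samples = np.empty((n_steps, d))
    for t in range(n_steps):
        i = rng.integers(d)  # random scan: pick one coordinate uniformly
        cond_var = 1.0 / precision[i, i]
        # conditional mean of x_i given the other coordinates:
        # -(1/Lambda_ii) * sum_{j != i} Lambda_ij x_j
        cond_mean = -cond_var * (precision[i] @ x - precision[i, i] * x[i])
        x[i] = cond_mean + np.sqrt(cond_var) * rng.standard_normal()
        samples[t] = x
    return samples

# usage: a correlated 2-D Gaussian target (condition number 9)
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
chain = random_scan_gibbs(np.linalg.inv(Sigma), n_steps=20000,
                          rng=np.random.default_rng(0))
```

After discarding a burn-in, the empirical covariance of the chain should approximate `Sigma`, reflecting the fast mixing one expects for a well-conditioned log-concave target.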