Aaron Smith
Error Lower Bounds for Approximate MCMC
Hamiltonianizing a Piecewise Deterministic Markov Process: A Bouncy Particle Sampler with "Inertia"
The Mixing Time of Metropolized Hamiltonian Monte Carlo
We analyze the mixing time of Metropolized Hamiltonian Monte Carlo (HMC) with the leapfrog integrator for sampling from a distribution whose log-density is smooth, has a Lipschitz Hessian in Frobenius norm, and satisfies isoperimetry. From a warm start, we bound the gradient complexity to reach ϵ error in total variation distance by Õ(d^{1/4} polylog(1/ϵ)), and we demonstrate the benefit of choosing the number of leapfrog steps to be larger than 1.
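For readers who want a concrete reference point, here is a minimal sketch of one Metropolized HMC iteration with the leapfrog integrator. It illustrates the standard algorithm, not the specific implementation analyzed in the talk; `logp`, `grad_logp`, and the tuning parameters are placeholders.

```python
import numpy as np

def hmc_step(x, logp, grad_logp, step_size, n_leapfrog, rng):
    """One Metropolized HMC iteration with the leapfrog integrator."""
    p = rng.standard_normal(x.shape)                  # resample momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog: half momentum step, alternating full position/momentum steps.
    p_new += 0.5 * step_size * grad_logp(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_logp(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_logp(x_new)
    # Metropolis correction removes the leapfrog discretization bias.
    log_accept = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
    return x_new if np.log(rng.uniform()) < log_accept else x
```

The `n_leapfrog` parameter is the quantity the abstract refers to: the analysis shows a provable benefit to taking more than one leapfrog step per proposal.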
Statistical Inference with Stochastic Gradient Algorithms
This talk focuses on the asymptotic properties of stochastic gradient algorithms (SGAs) used as sampling algorithms. We present a Bernstein–von Mises-like theorem for the scaling limit of the paths of SGAs, showing that they converge to an Ornstein–Uhlenbeck process. Then, using these large-sample asymptotics, we demonstrate how to properly tune SGAs for various desiderata, including matching the asymptotics of the posterior distribution.
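A toy sketch of the phenomenon (an illustration under simple assumptions, not the talk's construction): for a quadratic loss, constant-step-size SGD behaves near the optimum like a discretized Ornstein–Uhlenbeck process, fluctuating around the minimizer rather than converging to it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: Gaussian observations, so the per-point loss is quadratic and
# the stationary behaviour of constant-step SGD is an AR(1) process whose
# continuum limit is an Ornstein-Uhlenbeck process.
n, dim = 10_000, 2
theta_true = np.array([1.0, -2.0])
data = theta_true + rng.standard_normal((n, dim))

def sgd_path(step_size, batch_size, n_iter):
    """Constant-step-size SGD on 0.5*||theta - x||^2; returns all iterates."""
    theta = np.zeros(dim)
    path = np.empty((n_iter, dim))
    for t in range(n_iter):
        batch = data[rng.integers(n, size=batch_size)]
        theta = theta - step_size * (theta - batch.mean(axis=0))
        path[t] = theta
    return path

path = sgd_path(step_size=0.05, batch_size=32, n_iter=5_000)
# After burn-in the iterates hover around the empirical minimizer with
# OU-like fluctuations; rescaling these paths is what the theorem makes precise.
print("mean of late iterates:", path[2_500:].mean(axis=0))
print("empirical minimizer:  ", data.mean(axis=0))
```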
Bayesian Modeling and Computation in Causal Inference – Applications in Sequential Decision-Making
Diagnostics for Inexact Monte Carlo
Breaking the Communication Barrier
The global communication barrier is a fundamental performance limit arising in several annealing methods. A natural question is: can this barrier be broken?
I will talk about several approaches to tackling this problem, including a perspective on variational inference based on (a weird kind of) statistical estimation instead of optimization. This perspective allows us to scale to large problems while avoiding the headaches of tuning stochastic optimization methods.
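For context, in the parallel tempering literature the global communication barrier is typically estimated from the swap rejection rates between adjacent annealing distributions. The sketch below illustrates that estimator; the rejection rates are hypothetical, and the round-trip formula is the one from the non-reversible parallel tempering literature.

```python
import numpy as np

def estimate_barrier(rejection_rates):
    """Estimate the global communication barrier as the sum of the mean swap
    rejection rates between adjacent annealing distributions (a Riemann-sum
    estimator of the integral defining the barrier)."""
    barrier = float(np.sum(rejection_rates))
    # Under an equi-rejection schedule, non-reversible parallel tempering
    # achieves a round-trip rate of roughly 1 / (2 + 2 * barrier).
    round_trip_rate = 1.0 / (2.0 + 2.0 * barrier)
    return barrier, round_trip_rate

# Hypothetical rejection rates observed between adjacent chains:
barrier, rate = estimate_barrier([0.11, 0.14, 0.09, 0.16])
print(f"estimated barrier: {barrier:.2f}, round-trip rate: {rate:.3f}")
```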
Efficient Shape-Constrained Inference with Applications in Autocovariance Sequence Estimation