Abstract: 

Sampling from the posterior probability distribution of the latent states of a Hidden Markov Model (HMM) is a non-trivial problem, even in the context of Markov Chain Monte Carlo. To address this, Andrieu et al. (2010) proposed a way of using a Particle Filter to construct a Markov kernel that leaves this posterior distribution invariant. Recent theoretical results establish the uniform ergodicity of this Markov kernel and show that its mixing rate does not deteriorate provided the number of particles grows at least linearly with the number of latent states. However, this gives rise to a cost per application of the kernel that is quadratic in the number of latent states, which can be prohibitive for long observation sequences. Using blocking strategies, we devise samplers that have a stable mixing rate at a cost per iteration that is linear in the number of latent states and that are, furthermore, easily parallelizable. We then extend our method to sample from the posterior distribution of an HMM in the setting where the state transition model cannot be evaluated but can be simulated from; this class of HMMs is said to have an intractable transition density.
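
As a rough illustration of the kind of kernel described above, the sketch below implements a conditional sequential Monte Carlo (particle Gibbs) update for a toy linear-Gaussian HMM. All model choices here (random-walk transition, Gaussian observation noise, the parameters sigma_x and sigma_y, and the function name conditional_smc) are illustrative assumptions, not taken from the talk or the underlying papers, and the blocking and intractable-transition extensions discussed in the abstract are not shown:

import numpy as np

def conditional_smc(y, x_ref, n_particles, sigma_x=1.0, sigma_y=1.0, rng=None):
    # One conditional-SMC (particle Gibbs) update for a toy HMM with
    # x_t = x_{t-1} + N(0, sigma_x^2) and y_t = x_t + N(0, sigma_y^2).
    # Particle 0 is pinned to the reference trajectory x_ref, which is what
    # makes the resulting Markov kernel leave p(x_{1:T} | y_{1:T}) invariant.
    rng = np.random.default_rng() if rng is None else rng
    T, N = len(y), n_particles
    particles = np.empty((T, N))
    ancestors = np.zeros((T, N), dtype=int)

    for t in range(T):
        if t == 0:
            particles[0] = rng.normal(0.0, sigma_x, size=N)
        else:
            # Multinomial resampling, then propagation through the transition model.
            probs = np.exp(logw - logw.max())
            probs /= probs.sum()
            ancestors[t] = rng.choice(N, size=N, p=probs)
            particles[t] = particles[t - 1, ancestors[t]] + rng.normal(0.0, sigma_x, size=N)
        particles[t, 0] = x_ref[t]      # retain the conditioned (reference) path
        ancestors[t, 0] = 0
        logw = -0.5 * ((y[t] - particles[t]) / sigma_y) ** 2   # observation log-weights

    # Draw one trajectory by sampling a final particle and tracing its ancestry.
    probs = np.exp(logw - logw.max())
    probs /= probs.sum()
    k = rng.choice(N, p=probs)
    x_new = np.empty(T)
    for t in reversed(range(T)):
        x_new[t] = particles[t, k]
        k = ancestors[t, k]
    return x_new

# Example: one sweep of the kernel on simulated data.
# rng = np.random.default_rng(0)
# y = np.cumsum(rng.normal(size=50)) + rng.normal(size=50)
# x = np.zeros(50)                    # arbitrary initial reference trajectory
# x = conditional_smc(y, x, n_particles=100, rng=rng)

A blocked sampler in the spirit of the abstract would apply an update of this form to overlapping sub-blocks of 1:T, conditioning on the states at the block boundaries, so that the cost per sweep grows linearly in the number of latent states and the blocks can be processed in parallel.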

Speaker

Sumeetpal Sidhu Singh

Affiliation

University of Cambridge

Date

Thu, 04/08/2016 - 4:00pm

Venue

RC-2063, The Red Centre, UNSW