We investigate the stability of a Sequential Monte Carlo (SMC) method applied to the problem of sampling from a target distribution on $\mathbb{R}^d$ for large $d$. It is well known that using a single importance sampling step, one produces an approximation of the target that deteriorates as the dimension $d$ increases, unless the number of Monte Carlo samples $N$ increases at an exponential rate in $d$. We show that this degeneracy can be avoided by introducing a sequence of artificial targets, starting from a `simple' density and moving to the one of interest, and using an SMC method to sample from the sequence. Using this class of SMC methods with a fixed number of samples, one can produce an approximation for which the effective sample size (ESS) converges to a random variable as $d\rightarrow\infty$. This convergence is achieved with a computational cost proportional to $Nd^2$. A variety of other results are proved and imply that, in high dimensions, SMC algorithms can efficiently control the variability of the importance sampling weights and estimate fixed-dimensional marginals at a cost which is less than exponential in $d$. All of our analysis is made under the assumption that the target density is i.i.d. This is joint work with Alex Beskos (UCL) and Dan Crisan (Imperial College London).
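The tempering idea described above can be illustrated with a minimal sketch, not the authors' exact algorithm: particles start from a `simple' standard Gaussian, are reweighted through a sequence of tempered intermediate densities, resampled when the ESS degenerates, and moved with a random-walk Metropolis kernel. The toy i.i.d. target (a product of $N(1,1)$ marginals), the linear temperature schedule, the ESS/2 resampling threshold, and the step size 0.5 are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ess(logw):
    """Effective sample size computed from unnormalised log-weights."""
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def smc_tempering(log_target, d, N=1000, n_steps=20):
    """Move N particles from N(0, I_d) to the target through tempered
    densities pi_t ∝ prior^(1 - beta_t) * target^(beta_t) (a sketch)."""
    x = rng.standard_normal((N, d))   # draws from the 'simple' density
    logw = np.zeros(N)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    log_prior = lambda y: -0.5 * np.sum(y ** 2, axis=1)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # incremental weight: ratio of consecutive tempered densities
        logw += (b1 - b0) * (log_target(x) - log_prior(x))
        if ess(logw) < N / 2:         # resample when ESS degenerates
            w = np.exp(logw - logw.max())
            w /= w.sum()
            x = x[rng.choice(N, size=N, p=w)]
            logw = np.zeros(N)
        # one random-walk Metropolis move invariant for the current
        # tempered density (illustrative step size)
        log_pi = lambda y: (1 - b1) * log_prior(y) + b1 * log_target(y)
        prop = x + 0.5 * rng.standard_normal((N, d))
        accept = np.log(rng.random(N)) < log_pi(prop) - log_pi(x)
        x[accept] = prop[accept]
    return x, logw

# toy i.i.d. target: product of N(1, 1) marginals in d dimensions
d = 10
log_target = lambda x: -0.5 * np.sum((x - 1.0) ** 2, axis=1)
x, logw = smc_tempering(log_target, d)
```

With the single-step importance sampler (`n_steps=1`), the same code exhibits the weight degeneracy discussed above as $d$ grows; the intermediate targets keep consecutive densities close, so the incremental weights remain stable.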


Assoc. Prof. Ajay Jasra


National University of Singapore, Department of Statistics and Applied Probability


Mon, 08/09/2014 - 4:00pm


RC-4082, The Red Centre (East Wing), UNSW