Abstract

Approximate Bayesian computation (ABC) is the standard method for performing Bayesian inference in the presence of a computationally intractable likelihood function. ABC proceeds by comparing the observed and simulated counterparts of a carefully chosen statistic believed to be informative about the parameter of interest. Despite the success of ABC, it is known to scale poorly with the dimension of the statistic, as it essentially uses ideas from non-parametric kernel density estimation. An alternative approach, called the synthetic likelihood (SL), assumes a multivariate normal distribution for the model statistic, with a mean and covariance allowed to be parameter dependent. We have found the Bayesian version of synthetic likelihood (BSL) to be more computationally efficient and to require less tuning than ABC, and to be very effective when the model statistic is roughly Gaussian. However, BSL has some limitations. It relies on the Gaussian assumption, is still simulation-intensive, and performs poorly when the model is misspecified. In this talk, I will discuss some recent developments with my group and collaborators for alleviating these limitations.
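To make the synthetic likelihood idea concrete, the sketch below (not part of the talk) shows a minimal plug-in estimator in Python: the model is simulated repeatedly at a candidate parameter value, the sample mean and covariance of the simulated statistics are computed, and a Gaussian log-density is evaluated at the observed statistic. The function simulate_statistic and all names are hypothetical placeholders, assumed to return one simulated summary-statistic vector per call.

import numpy as np
from scipy.stats import multivariate_normal

def synthetic_log_likelihood(theta, s_obs, simulate_statistic, n_sim=100, rng=None):
    """Plug-in estimate of the synthetic log-likelihood of theta given the
    observed summary statistic s_obs, based on n_sim model simulations.

    simulate_statistic(theta, rng) is assumed (hypothetical interface) to
    return one simulated summary-statistic vector at parameter theta.
    """
    rng = rng or np.random.default_rng()
    # Simulate n_sim summary statistics from the model at theta
    sims = np.array([simulate_statistic(theta, rng) for _ in range(n_sim)])
    # Gaussian approximation: sample mean and sample covariance of the statistics
    mu_hat = sims.mean(axis=0)
    sigma_hat = np.cov(sims, rowvar=False)
    # Evaluate the multivariate normal log-density at the observed statistic
    return multivariate_normal.logpdf(s_obs, mean=mu_hat, cov=sigma_hat)

In a BSL-style algorithm this noisy log-likelihood estimate would typically be embedded inside an MCMC sampler over theta; the sketch above only illustrates the Gaussian assumption on the model statistic described in the abstract.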

Speaker

Christopher Drovandi

Affiliation

Queensland University of Technology

Date

Tue, 19/02/2019 - 4:00pm

Venue

RC-3085, The Red Centre, UNSW