Markov Chain Monte Carlo in Small Worlds

Yongtao Guan, Roland Fleissner, Paul Joyce, Stephen M. Krone

Abstract

As the number of applications for Markov chain Monte Carlo (MCMC) grows, the power of these methods as well as their shortcomings become more apparent. While MCMC yields an almost automatic way to sample a space according to some distribution, its implementations often fall short of this task, as they may lead to chains that converge too slowly or get trapped within one mode of a multi-modal space. Moreover, it may be difficult to determine whether a chain is only sampling a certain area of the space or has indeed reached stationarity. In this paper, we show how a simple modification of the proposal mechanism results in better mixing behaviour of the chain and helps to circumvent the problems described above. This mechanism, which is based on an idea from the field of "small-world" networks, amounts to adding occasional "wild" proposals to a local proposal scheme. We demonstrate through both theory and extensive simulations that these new proposal distributions greatly outperform the traditional local proposals when it comes to exploring complex heterogeneous spaces and multi-modal distributions. Our method can easily be applied to most, if not all, problems involving MCMC and, unlike many other remedies that improve the performance of MCMC, it preserves the simplicity of the underlying algorithm.
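To make the proposal mechanism concrete, the following is a minimal sketch of a Metropolis sampler whose proposal mixes frequent local moves with occasional "wild" long-range moves, in the spirit of the small-world idea described above. The specific choices here (a bimodal Gaussian target, a Gaussian local step, a Cauchy-distributed wild step, and the parameter names p_wild, local_scale, wild_scale) are illustrative assumptions for this sketch, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Illustrative unnormalized target: mixture of two well-separated Gaussians,
    # the kind of multi-modal distribution that traps purely local chains.
    return np.logaddexp(-0.5 * (x + 10.0) ** 2, -0.5 * (x - 10.0) ** 2)

def small_world_step(x, local_scale=1.0, wild_scale=50.0, p_wild=0.05):
    # With small probability p_wild take a heavy-tailed "wild" jump;
    # otherwise take an ordinary local random-walk step.
    if rng.random() < p_wild:
        proposal = x + wild_scale * rng.standard_cauchy()
    else:
        proposal = x + local_scale * rng.normal()
    # Both mixture components are symmetric in (x, proposal), so the mixture
    # proposal is symmetric and the plain Metropolis acceptance rule applies.
    if np.log(rng.random()) < log_target(proposal) - log_target(x):
        return proposal
    return x

x, samples = 0.0, []
for _ in range(50_000):
    x = small_world_step(x)
    samples.append(x)

samples = np.array(samples)
print("fraction of samples in each mode:",
      np.mean(samples < 0), np.mean(samples > 0))
```

With p_wild set to zero the chain reduces to a standard local random-walk Metropolis sampler and, for modes this far apart, will typically remain stuck in whichever mode it starts in; the occasional wild proposals are what allow it to cross between modes.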