Abstract: |
This talk considers a novel approach to using Markov chain Monte Carlo (MCMC) in contexts where one may adopt
multilevel (ML) and multi-index (MI) Monte Carlo. The underlying problem is to approximate expectations w.r.t.~a
probability measure that is associated with a continuum problem, such as a continuous-time stochastic process.
It is then assumed that the associated probability measure can
only be used (e.g.~sampled) under a discretized approximation. In such scenarios, it is known that, to achieve a target error, the computational
effort can be reduced when using MLMC or MIMC, relative to exact sampling from the most accurate discretized probability measure.
The ideas rely upon introducing a hierarchy of discretizations; assuming that less accurate approximations cost less
to compute, one can introduce an appropriate collapsing-type (telescoping) sum expression for the target expectation, as sketched below. If a suitable
coupling of exact sampling from the probability measures in the hierarchy is achieved, then the reduction in cost is possible.
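For orientation, the collapsing sum referred to above can be written as the standard multilevel telescoping identity; the notation here ($\pi_l$ for the level-$l$ discretized measure, $\varphi$ for a test function and $L$ for the most accurate level) is illustrative rather than taken from the talk:
\[
\mathbb{E}_{\pi_L}[\varphi(X)] \;=\; \mathbb{E}_{\pi_0}[\varphi(X)] \;+\; \sum_{l=1}^{L}\Big\{\mathbb{E}_{\pi_l}[\varphi(X)]-\mathbb{E}_{\pi_{l-1}}[\varphi(X)]\Big\},
\]
where each difference is estimated using coupled samples, so that its variance is small when consecutive levels are close.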
This talk focuses on the case where such exact sampling is not possible. We show that, given only access to MCMC kernels
which are invariant with respect to each discretized probability measure, such couplings are possible. We prove, under assumptions,
that this coupled MCMC approach can reduce the cost to achieve a given error, relative to exact sampling, in an ML context. Our approach is illustrated
on several examples. |
|