DIY Markov Chains.
This package provides a simple combinator language for Markov transition operators that are useful in MCMC.
Any transition operators sharing the same stationary distribution and obeying the Markov and reversibility properties can be combined in a couple of ways, such that the resulting operator preserves the stationary distribution and the properties that make it suitable for MCMC.
We can deterministically concatenate operators end-to-end, or sample from a collection of them according to some probability distribution. See Geyer (2005) for details.
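Concretely, these two modes of composition correspond to the combinators used in the examples below. As a minimal sketch (the step sizes here are arbitrary, and only operators that appear in this README are used):

-- Deterministic composition: a Metropolis step followed by a slice step on
-- every iteration.  Each preserves the target distribution, so the composite
-- does too.
stepwise = concatAllT [metropolis 1.0, slice 0.5]

-- Probabilistic composition: choose one of the two transitions at random on
-- each iteration.  A mixture of target-preserving transitions also preserves
-- the target.
mixture = sampleT (metropolis 1.0) (slice 0.5)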
A useful strategy is to hedge one's 'sampling risk' by occasionally interleaving a computationally expensive transition (such as a gradient-based algorithm like Hamiltonian Monte Carlo or NUTS) with cheap Metropolis transitions.
transition = frequency [
    (9, metropolis 1.0)       -- cheap Metropolis transition, step size 1.0
  , (1, hamiltonian 0.05 20)  -- HMC transition: step size 0.05, 20 leapfrog steps
  ]
Alternatively: sample consecutively using the same algorithm, but over a range of different proposal distributions.
transition = concatAllT [
    slice 0.5
  , slice 1.0
  , slice 2.0
  ]
Or just mix and match and see what happens!
transition =
  sampleT
    (sampleT (metropolis 0.5) (slice 0.1))
    (sampleT (hamiltonian 0.01 20) (metropolis 2.0))
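Assuming sampleT chooses between its two arguments with equal probability, the nested expression above gives each of the four base transitions an equal share of iterations, so a flat frequency with uniform weights should behave similarly:

-- Hypothetical flattening of the nested sampleT expression above.
transition = frequency [
    (1, metropolis 0.5)
  , (1, slice 0.1)
  , (1, hamiltonian 0.01 20)
  , (1, metropolis 2.0)
  ]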
Check the test suite for example usage.
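For orientation, here is a rough end-to-end sketch of how a combined transition might be driven. The Target, Transition, and Chain types, the mcmc driver, and its argument order are recalled from the supporting libraries' documentation and may not match the current API exactly (createSystemRandom comes from mwc-random); treat the test suite as authoritative.

import Numeric.MCMC
import Data.Sampling.Types                     -- Target, Chain, Transition (mcmc-types)
import qualified System.Random.MWC as MWC      -- PRNG used by the samplers

-- Log-density of a simple two-dimensional Gaussian, standing in for a real target.
lpdf :: [Double] -> Double
lpdf [x, y] = negate (x * x / 2 + y * y / 8)
lpdf _      = error "lpdf: expected a two-dimensional argument"

-- No gradient is supplied here, so only gradient-free transitions are used;
-- a hamiltonian transition would also need the gradient of the log-density.
gaussian :: Target [Double]
gaussian = Target lpdf Nothing

-- Alternate a cheap Metropolis step and a slice step on every iteration.
transition :: Transition IO (Chain [Double] b)
transition = concatAllT [metropolis 1.0, slice 2.0]

main :: IO ()
main = do
  gen <- MWC.createSystemRandom
  -- Run the chain for 10000 iterations, starting from the origin.
  mcmc 10000 [0, 0] transition gaussian gen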