Implementation of MCMC with the Metropolis algorithm.

We will follow the steps described in the excellent introductory article on MCMC by David W. Hogg and Daniel Foreman-Mackey, available here. Section 3 will be particularly useful for this example.

One-dimensional density function (Problems 2 and 3 of the article)

Gaussian density

For this first exercise, we will implement the Metropolis algorithm and apply it to a one-dimensional normal distribution.

Use the following information:

  • The density function $p(\theta)$ is a one-dimensional Gaussian with mean $\mu = 2$ and variance $\sigma^2 = 2$.
  • The proposal distribution $q(\theta' \mid \theta)$ is a Gaussian centered on the current $\theta$ with standard deviation $\sigma = 1$ (see the random-walk form written after this list).
  • The initial point of the MCMC is $\theta_0 = 0$.
  • The MCMC must perform $10^4$ iterations.
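
For reference, and assuming the proposal is centered on the current value (as the implementation below does), this proposal can be written as a Gaussian random-walk step:

$$q(\theta' \mid \theta) = \mathcal{N}(\theta';\, \theta,\, 1^2), \qquad \text{i.e.} \qquad \theta' = \theta + \varepsilon, \quad \varepsilon \sim \mathcal{N}(0, 1).$$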

The probability density of the normal distribution is

$$p(\theta) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[-\frac{(\theta - \mu)^2}{2\sigma^2}\right].$$

However, to avoid numerical errors, we work with its logarithm. Write a function directly for $\ln p(\theta)$.
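
Taking the logarithm of the density above gives the expression that the function below implements:

$$\ln p(\theta) = -\frac{(\theta - \mu)^2}{2\sigma^2} - \ln\left(\sqrt{2\pi}\,\sigma\right).$$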

Code
function log_gaussian(x, μ, σ)
    -0.5 * ((x - μ) / σ)^2 + log(1 / (sqrt(2 * π) * σ))
end
log_gaussian (generic function with 1 method)
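
As a quick, optional sanity check (the test values here are only illustrative), the density at $\theta = \mu$ should equal the peak value $1 / (\sqrt{2\pi}\,\sigma)$:

Code
# Illustrative check: at x = μ the Gaussian PDF reaches its peak value 1 / (√(2π) σ)
μ_test, σ_test = 2.0, sqrt(2.0)
exp(log_gaussian(μ_test, μ_test, σ_test)) ≈ 1 / (sqrt(2π) * σ_test)   # expected: true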

We can now implement the Metropolis algorithm.

We want our algorithm to be applicable to any (log-)probability density that accepts a scalar argument $\theta$. We can therefore pass log_density (our probability function above) as an argument to the function.
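
As a reminder, the Metropolis algorithm accepts a proposal $\theta'$ with probability

$$\alpha = \min\left(1, \frac{p(\theta')}{p(\theta)}\right).$$

In log space, as in the implementation below, we compute $\ln p(\theta') - \ln p(\theta)$, draw $u \sim \mathcal{U}(0, 1)$, and accept whenever $\ln u$ is smaller than this difference; this is equivalent because $\ln u \le 0$.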

Code
"""
Metropolis MCMC algorithm for sampling from a probability distribution.

Arguments:
- log_density: log-density function, accepts a theta argument
- θ₀: initial value of theta for the MCMC
- n: number of steps to take in the MCMC
- q_scale: standard deviation of the proposal distribution
- verbose: if true, print the acceptance rate at the end of the run
"""
function mcmc_metropolis(log_density, θ₀, n, q_scale=1.0, verbose=false)
    chain = zeros(n + 1)
    chain[1] = θ₀
    p_c = log_density(θ₀)

    n_accepted = 0
    for i in 1:n
        proposal = chain[i] + q_scale * randn()
        p_i = log_density(proposal)
        log_alpha = p_i - p_c

        if log(rand()) < log_alpha # Accept the proposal
            chain[i+1] = proposal
            p_c = p_i
            n_accepted += 1
        else # Reject the proposal, stay at current position
            chain[i+1] = chain[i]
        end
    end

    # Print acceptance rate for diagnostics
    verbose && println("Acceptance rate: $(n_accepted / n)")

    return chain
end
Main.Notebook.mcmc_metropolis
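
As an optional experiment (the scale values below are arbitrary), the q_scale argument and the verbose flag make it easy to see how the proposal scale affects the acceptance rate:

Code
# Run short chains with different proposal scales and print the acceptance
# rate for each one (verbose=true). Very small scales accept almost every
# proposal but move slowly; very large scales reject most proposals.
for scale in (0.1, 1.0, 10.0)
    println("q_scale = $scale")
    mcmc_metropolis(θ -> log_gaussian(θ, 2.0, sqrt(2.0)), 0.0, 1000, scale, true)
end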

Apply the algorithm to obtain 10,000 samples.

Then display a histogram and compare it with the analytical PDF. Finally, display the evolution of the chain (θ vs. k).

Code
using CairoMakie

μ = 2.0
σ = sqrt(2.0)
log_density(θ) = log_gaussian(θ, μ, σ)

# Initial conditions
θ₀ = 0.0
nsteps = 10000
q_scale = 1.0

# Run the MCMC algorithm
samples = mcmc_metropolis(log_density, θ₀, nsteps, q_scale)

# Create a figure for the histogram and analytical PDF comparison
f = Figure()
ax1 = Axis(f[1, 1], xlabel="θ", ylabel="Density", title="MCMC Sampling vs Analytical PDF")

hist!(ax1, samples, bins=50, normalization=:pdf, color=(:blue, 0.5), label="MCMC Samples")

# Overlay the analytical PDF
θ_range = range(extrema(samples)..., length=1000)
analytical_pdf = [exp(log_density(θ)) for θ in θ_range]
lines!(ax1, θ_range, analytical_pdf, color=:red, linewidth=2, label="Analytical PDF")

axislegend(ax1)
f

Code
ax2 = Axis(f[2, 1], xlabel="Iteration (k)", ylabel="θ", title="MCMC Trace Plot")
lines!(ax2, 0:nsteps, samples)
f
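
As an additional, optional check (not required for the plots above), the sample mean and standard deviation can be compared to the analytical values $\mu = 2$ and $\sigma = \sqrt{2}$:

Code
using Statistics

# Empirical moments of the chain; up to Monte Carlo error they should be
# close to the true parameters μ = 2 and σ = √2.
println("sample mean: ", mean(samples), "  (true: ", μ, ")")
println("sample std:  ", std(samples), "  (true: ", σ, ")")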