Bayesian Inference of ODE
In this tutorial, we show how to use Bayesian inference to estimate the parameters of the Lotka-Volterra equations with each of the three backends:
- Turing.jl
- Stan.jl
- DynamicHMC.jl
Setup
First, let's set up our ODE and the data. For the data, we will simply solve the ODE and take that solution at some known parameters as the dataset. This looks like the following:
using DiffEqBayes, ParameterizedFunctions, OrdinaryDiffEq, RecursiveArrayTools, Distributions
# Lotka-Volterra with a single free parameter `a`; the remaining rate constants are fixed
f1 = @ode_def LotkaVolterra begin
    dx = a * x - x * y
    dy = -3 * y + x * y
end a
p = [1.5]
u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
prob1 = ODEProblem(f1, u0, tspan, p)
σ = 0.01 # noise, fixed for now
t = collect(1.0:10.0) # observation times
sol = solve(prob1, Tsit5())
priors = [Normal(1.5, 1)]
randomized = VectorOfArray([(sol(t[i]) + σ * randn(2)) for i in 1:length(t)])
data = convert(Array, randomized)

2×10 Matrix{Float64}:
 2.78418   6.77929  0.97305  1.88555   …  4.35307  3.24765  1.02305
 0.272611  2.0173   1.92325  0.337052     0.30974  4.55022  0.920505
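As a quick sanity check before running inference, the noisy observations can be plotted against the underlying solution they were generated from. A minimal sketch, assuming Plots.jl is available (it is not loaded in the setup above):

using Plots # assumed available, not part of the setup above

plot(sol, label = ["x(t) true" "y(t) true"])
scatter!(t, data[1, :], label = "x observations")
scatter!(t, data[2, :], label = "y observations")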
Inference Methods
Stan
using StanSample # required for the Stan backend
bayesian_result_stan = stan_inference(prob1, :rk45, t, data, priors)

Chains MCMC chain (1000×3×1 Array{Float64, 3}):
Iterations = 1:1:1000
Number of chains = 1
Samples per chain = 1000
parameters = sigma1.1, sigma1.2, theta_1
internals =
Summary Statistics
parameters mean std mcse ess_bulk ess_tail rhat e ⋯
Symbol Float64 Float64 Float64 Float64 Float64 Float64 ⋯
sigma1.1 0.2707 0.0836 0.0034 693.1996 610.0000 1.0013 ⋯
sigma1.2 0.2552 0.0772 0.0033 652.3373 589.7251 1.0047 ⋯
theta_1 1.5010 0.0057 0.0002 720.2230 632.9344 0.9995 ⋯
1 column omitted
Quantiles
parameters 2.5% 25.0% 50.0% 75.0% 97.5%
Symbol Float64 Float64 Float64 Float64 Float64
sigma1.1 0.1470 0.2126 0.2570 0.3116 0.4849
sigma1.2 0.1477 0.2017 0.2416 0.2937 0.4497
theta_1 1.4899 1.4974 1.5011 1.5044 1.5126
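The Stan result is returned as an MCMCChains Chains object (as shown by the printout above), so the standard chain utilities can be used to summarize and visualize it. A minimal sketch, assuming MCMCChains and StatsPlots are available:

using MCMCChains, StatsPlots # assumed available for post-processing

describe(bayesian_result_stan)  # summary statistics and quantiles
mean(bayesian_result_stan)      # posterior means of all parameters
plot(bayesian_result_stan)      # trace and density plots for each parameter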
Turing
bayesian_result_turing = turing_inference(prob1, Tsit5(), t, data, priors)

Chains MCMC chain (1000×16×1 Array{Float64, 3}):
Iterations = 501:1:1500
Number of chains = 1
Samples per chain = 1000
Wall duration = 15.15 seconds
Compute duration = 15.15 seconds
parameters = theta[1], σ[1]
internals = n_steps, is_accept, acceptance_rate, log_density, hamiltonian_energy, hamiltonian_energy_error, max_hamiltonian_energy_error, tree_depth, numerical_error, step_size, nom_step_size, logprior, loglikelihood, logjoint
Summary Statistics
parameters mean std mcse ess_bulk ess_tail rhat e ⋯
Symbol Float64 Float64 Float64 Float64 Float64 Float64 ⋯
theta[1] 1.2601 0.3868 0.2185 6.8790 24.4990 1.3619 ⋯
σ[1] 0.7849 0.9754 0.5609 4.0131 9.5758 1.3177 ⋯
1 column omitted
Quantiles
parameters 2.5% 25.0% 50.0% 75.0% 97.5%
Symbol Float64 Float64 Float64 Float64 Float64
theta[1] 0.4814 0.8079 1.4977 1.5010 1.5028
σ[1] 0.1548 0.1614 0.1846 1.8586 2.8301
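Note that the diagnostics for this particular run (rhat well above 1 and effective sample sizes below 10) indicate poor mixing, so in practice you would want a longer run or a differently tuned sampler. A hedged sketch, assuming turing_inference accepts num_samples and sampler keyword arguments (check the DiffEqBayes documentation for the exact API):

using Turing # assumed available, needed only to construct the sampler

# num_samples and sampler are assumed keyword arguments of turing_inference
bayesian_result_turing = turing_inference(prob1, Tsit5(), t, data, priors;
                                          num_samples = 10_000,
                                          sampler = Turing.NUTS(0.65))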
DynamicHMC
We can use DynamicHMC.jl as the backend for sampling with the dynamichmc_inference function. It is used similarly:
bayesian_result_hmc = dynamichmc_inference(prob1, Tsit5(), t, data, priors)

The returned value is a NamedTuple with fields posterior (a vector of draws, each a NamedTuple with parameters and σ), posterior_matrix, tree_statistics (the NUTS tree statistics from DynamicHMC), logdensities, κ (the adapted Gaussian kinetic energy), and ϵ (the adapted step size, here ≈ 0.547). In this run the draws of the parameter concentrate tightly around 1.500, with the noise scales σ on the order of 0.01.
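Because the posterior here is a plain vector of NamedTuples, summaries can be computed directly with base Julia and the Statistics standard library. A minimal sketch:

using Statistics

# Each posterior draw is a NamedTuple with fields `parameters` and `σ`
theta_draws = [draw.parameters[1] for draw in bayesian_result_hmc.posterior]
sigma_draws = [draw.σ[1] for draw in bayesian_result_hmc.posterior]

mean(theta_draws), std(theta_draws)  # posterior mean and standard deviation of a
mean(sigma_draws)                    # posterior mean of the first noise scale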
More Information
For a better idea of the summary statistics and plotting, you can take a look at the benchmarks.