# Bayesian Inference of ODE

In this tutorial, we will show how to use Bayesian inference to estimate the parameters of the Lotka-Volterra equations with each of three backends:
- Turing.jl
- Stan.jl
- DynamicHMC.jl
## Setup
First, let's set up our ODE and the data. For the data, we will simply solve the ODE at some known parameters and take that solution, with a little added noise, as the dataset. This looks like the following:
```julia
using DiffEqBayes, ParameterizedFunctions, OrdinaryDiffEq, RecursiveArrayTools,
      Distributions

f1 = @ode_def LotkaVolterra begin
    dx = a * x - x * y
    dy = -3 * y + x * y
end a

p = [1.5]
u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
prob1 = ODEProblem(f1, u0, tspan, p)

σ = 0.01              # observation noise, fixed for now
t = collect(1.0:10.0) # observation times
sol = solve(prob1, Tsit5())
priors = [Normal(1.5, 1)]
randomized = VectorOfArray([(sol(t[i]) + σ * randn(2)) for i in 1:length(t)])
data = convert(Array, randomized)
```
```
2×10 Matrix{Float64}:
 2.79517   6.77211  0.958822  1.88719   …  4.356     3.25658  1.03072
 0.275446  2.00148  1.92502   0.337545     0.317308  4.53178  0.917278
```
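For readers unfamiliar with the `VectorOfArray`/`convert` idiom above, the same "stack noisy snapshots into a matrix" step can be done with Base Julia alone. A minimal sketch, where the `snapshots` values are made up and merely stand in for the solver calls `sol(t[i])`:

```julia
using Random, Statistics

Random.seed!(1234)
σ = 0.01
# made-up stand-ins for the solution snapshots sol(t[i])
snapshots = [[1.0 + 0.1i, 2.0 - 0.1i] for i in 1:10]
# add Gaussian observation noise, then stack the vectors as columns
noisy = [s .+ σ .* randn(2) for s in snapshots]
data = reduce(hcat, noisy)
size(data) # (2, 10)
```

Each observation time contributes one column, giving the same `2×10` layout as the dataset above.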
## Inference Methods
### Stan
```julia
using CmdStan # required for using the Stan backend

bayesian_result_stan = stan_inference(prob1, :rk45, t, data, priors)
```
```
Chains MCMC chain (1000×3×1 Array{Float64, 3}):

Iterations        = 1:1:1000
Number of chains  = 1
Samples per chain = 1000
parameters        = sigma1.1, sigma1.2, theta_1
internals         =

Summary Statistics
  parameters      mean       std      mcse    ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64     Float64    Float64   Float64     ⋯

    sigma1.1    0.2664    0.0718    0.0035    441.2790   418.8726    1.0052     ⋯
    sigma1.2    0.2540    0.0705    0.0039    352.4781   449.2813    1.0028     ⋯
     theta_1    1.5008    0.0056    0.0002    949.9723   716.6565    1.0005     ⋯
                                                                1 column omitted

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

    sigma1.1    0.1608    0.2144    0.2568    0.3018    0.4430
    sigma1.2    0.1517    0.2032    0.2409    0.2887    0.4247
     theta_1    1.4896    1.4971    1.5006    1.5044    1.5116
```
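The `mean`, `std`, and quantile columns in these tables are ordinary sample statistics over the posterior draws, so they can be reproduced with the `Statistics` standard library. A minimal sketch using synthetic draws (not the actual chain) standing in for the `theta_1` column:

```julia
using Statistics, Random

Random.seed!(1234)
# synthetic draws standing in for the theta_1 column of a chain
draws = 1.5 .+ 0.005 .* randn(1000)

m = mean(draws)
s = std(draws)
q = quantile(draws, [0.025, 0.25, 0.5, 0.75, 0.975])
```

The diagnostic columns (`mcse`, `ess_bulk`, `ess_tail`, `rhat`) account for autocorrelation within and between chains, so they need a dedicated MCMC package rather than plain sample statistics.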
### Turing
```julia
bayesian_result_turing = turing_inference(prob1, Tsit5(), t, data, priors)
```
```
Chains MCMC chain (1000×14×1 Array{Float64, 3}):

Iterations        = 501:1:1500
Number of chains  = 1
Samples per chain = 1000
Wall duration     = 13.34 seconds
Compute duration  = 13.34 seconds
parameters        = theta[1], σ[1]
internals         = lp, n_steps, is_accept, acceptance_rate, log_density, hamiltonian_energy, hamiltonian_energy_error, max_hamiltonian_energy_error, tree_depth, numerical_error, step_size, nom_step_size

Summary Statistics
  parameters      mean       std      mcse     ess_bulk   ess_tail      rhat   ⋯
      Symbol   Float64   Float64   Float64      Float64    Float64   Float64   ⋯

    theta[1]    1.5001    0.0035    0.0001    1045.9073   736.8116    1.0014   ⋯
        σ[1]    0.1497    0.0341    0.0019     285.3026   296.4165    1.0018   ⋯
                                                                1 column omitted

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

    theta[1]    1.4941    1.4978    1.4999    1.5022    1.5073
        σ[1]    0.1001    0.1275    0.1438    0.1660    0.2412
```
### DynamicHMC

We can use DynamicHMC.jl as the backend for sampling with the `dynamichmc_inference` function, which is used similarly:
```julia
bayesian_result_hmc = dynamichmc_inference(prob1, Tsit5(), t, data, priors)
```
```
(posterior = @NamedTuple{parameters::Vector{Float64}, σ::Vector{Float64}}[(parameters = [1.5002789851223284], σ = [0.015313158010534931, 0.016089601837499284]), (parameters = [1.5010242722835874], σ = [0.015337330724393588, 0.016114269053692835]), (parameters = [1.4995423757503143], σ = [0.014792348296473478, 0.021669746405726182]), (parameters = [1.5008544195783835], σ = [0.014718084289703594, 0.021660958503722447]), (parameters = [1.4997502717285727], σ = [0.01468779424717361, 0.021605476323549457]), (parameters = [1.4998597371704312], σ = [0.012951282593052678, 0.02428609504630794]), (parameters = [1.500219588277488], σ = [0.012093811718833364, 0.02163400395889436]), (parameters = [1.4998446264736425], σ = [0.014532081681065076, 0.011716733341768506]), (parameters = [1.4998446264736425], σ = [0.014532081681065076, 0.011716733341768506]), (parameters = [1.5004258310467673], σ = [0.018330955320478185, 0.018456361151126952]) … (parameters = [1.500156496185466], σ = [0.011029192445282287, 0.021757563973950195]), (parameters = [1.4999182849903254], σ = [0.011027899552194046, 0.02177685430840904]), (parameters = [1.5003037010477729], σ = [0.00997115479920601, 0.022230117234737497]), (parameters = [1.4998679592268036], σ = [0.009982769379106539, 0.02226770435586875]), (parameters = [1.4995795522277782], σ = [0.01031153820579554, 0.02177563680896988]), (parameters = [1.5002414420936545], σ = [0.010316474484965565, 0.02168823442671804]), (parameters = [1.5000587719310965], σ = [0.014179143055928795, 0.020755604195314678]), (parameters = [1.5004485124961149], σ = [0.014186937110833762, 0.020743522083406005]), (parameters = [1.4996595029386712], σ = [0.014141349446711481, 0.020654518257246048]), (parameters = [1.5004266950046117], σ = [0.012792915912118471, 0.012395138866145149])], posterior_matrix = [0.4056510808957057 0.4061477232624769 … 0.4052380842993236 0.40574953099254857; -4.179042819483899 -4.177465505720367 … -4.25865218836353 -4.358863705629173; -4.1295820642443 -4.128050122876067 … -3.879821181705036 -4.3904509101575675], tree_statistics = DynamicHMC.TreeStatisticsNUTS[DynamicHMC.TreeStatisticsNUTS(40.258948222351705, 7, turning at positions -196:-199, 0.9557769606285563, 223, DynamicHMC.Directions(0xf7b46618)), DynamicHMC.TreeStatisticsNUTS(37.705030052876225, 2, turning at positions -1:2, 0.49756442973177645, 3, DynamicHMC.Directions(0x307c8fee)), DynamicHMC.TreeStatisticsNUTS(36.59222060574335, 7, turning at positions -114:-117, 0.9859844286926779, 195, DynamicHMC.Directions(0xf595294e)), DynamicHMC.TreeStatisticsNUTS(38.778777679537264, 2, turning at positions 0:3, 0.853725623958627, 3, DynamicHMC.Directions(0x1dc16edb)), DynamicHMC.TreeStatisticsNUTS(39.83891995881921, 2, turning at positions 4:7, 0.9999999999999999, 7, DynamicHMC.Directions(0x5e54ab4f)), DynamicHMC.TreeStatisticsNUTS(38.378239037012605, 7, turning at positions 157:160, 0.9530214781362079, 223, DynamicHMC.Directions(0x0edd34c0)), DynamicHMC.TreeStatisticsNUTS(39.97160847855848, 8, turning at positions 205:460, 0.8312860658878597, 511, DynamicHMC.Directions(0xd0e247cc)), DynamicHMC.TreeStatisticsNUTS(41.52126714676729, 9, turning at positions -134:377, 0.9813225016112509, 511, DynamicHMC.Directions(0x76b37579)), DynamicHMC.TreeStatisticsNUTS(40.987988805770556, 2, turning at positions -2:1, 0.8198629247544962, 3, DynamicHMC.Directions(0x9b374bfd)), DynamicHMC.TreeStatisticsNUTS(41.28959077888655, 9, turning at positions -100:411, 0.9443949650425898, 511, DynamicHMC.Directions(0x177be79b)) … DynamicHMC.TreeStatisticsNUTS(40.39918966072918, 8, turning at positions -218:-221, 0.8828612292157036, 283, DynamicHMC.Directions(0x44d93e3e)), DynamicHMC.TreeStatisticsNUTS(41.1381144325348, 2, turning at positions 0:3, 0.8109804022986292, 3, DynamicHMC.Directions(0x05c688cb)), DynamicHMC.TreeStatisticsNUTS(40.30581382863675, 7, turning at positions 79:206, 0.9950900487111555, 255, DynamicHMC.Directions(0x1accf0ce)), DynamicHMC.TreeStatisticsNUTS(39.477731071316555, 2, turning at positions -3:0, 0.6294084335527638, 3, DynamicHMC.Directions(0xcbfe2d40)), DynamicHMC.TreeStatisticsNUTS(37.76454865976372, 5, turning at positions 23:26, 0.7052198848944128, 47, DynamicHMC.Directions(0xdda5a3ea)), DynamicHMC.TreeStatisticsNUTS(39.35399274812504, 2, turning at positions 1:4, 0.9705013473751215, 7, DynamicHMC.Directions(0x01366014)), DynamicHMC.TreeStatisticsNUTS(41.276164167271546, 8, turning at positions 247:502, 0.9999938705286218, 511, DynamicHMC.Directions(0xe906a7f6)), DynamicHMC.TreeStatisticsNUTS(41.661230021316264, 2, turning at positions 2:5, 0.8732839564604812, 7, DynamicHMC.Directions(0xb308445d)), DynamicHMC.TreeStatisticsNUTS(39.270650228723504, 2, turning at positions 0:3, 0.7937029899381386, 3, DynamicHMC.Directions(0x97dad04f)), DynamicHMC.TreeStatisticsNUTS(41.169226207955745, 9, turning at positions -133:378, 0.990436714878458, 511, DynamicHMC.Directions(0x7a8bbf7a))], κ = Gaussian kinetic energy (Diagonal), √diag(M⁻¹): [0.03059818969906924, 0.2736708414762136, 0.29425504784995715], ϵ = 0.006526057651795619)
```
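Unlike the other two backends, which return an MCMC `Chains` object, this result is a plain `NamedTuple` whose `posterior` field is a vector of per-draw `NamedTuple`s, so point estimates are extracted by hand. A minimal sketch over synthetic draws that mimic the shape of the output above (field names `parameters` and `σ` taken from it; all numbers here are made up):

```julia
using Statistics, Random

Random.seed!(1234)
# synthetic draws with the same shape as bayesian_result_hmc.posterior
posterior = [(parameters = [1.5 + 0.003 * randn()],
              σ = [0.015, 0.02] .+ 0.001 .* randn(2)) for _ in 1:1000]

# posterior mean of the single model parameter
theta_mean = mean(draw.parameters[1] for draw in posterior)
# per-component posterior means of the noise parameters
σ_mean = mean(reduce(hcat, (draw.σ for draw in posterior)); dims = 2)
```

The same pattern applies to the real result: replace `posterior` with `bayesian_result_hmc.posterior`.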
## More Information
For a better idea of the summary statistics and of plotting the results, you can take a look at the benchmarks.