# Bayesian Inference of ODE

For this tutorial, we will show how to do Bayesian inference to infer the parameters of the Lotka-Volterra equations using each of the three backends:

- Turing.jl
- Stan.jl
- DynamicHMC.jl

## Setup
First, let's set up our ODE and the data. For the data, we will simply solve the ODE and take that solution at some known parameters as the dataset. This looks like the following:
```julia
using DiffEqBayes, ParameterizedFunctions, OrdinaryDiffEq, RecursiveArrayTools,
      Distributions

f1 = @ode_def LotkaVolterra begin
    dx = a * x - x * y
    dy = -3 * y + x * y
end a

p = [1.5]
u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
prob1 = ODEProblem(f1, u0, tspan, p)

σ = 0.01 # noise, fixed for now
t = collect(1.0:10.0) # observation times
sol = solve(prob1, Tsit5())
priors = [Normal(1.5, 1)]
randomized = VectorOfArray([(sol(t[i]) + σ * randn(2)) for i in 1:length(t)])
data = convert(Array, randomized)
```
```
2×10 Matrix{Float64}:
 2.77429   6.78941  0.959473  1.8917    …  4.35205   3.23001  1.02767
 0.274488  2.0194   1.92066   0.325267     0.317497  4.55407  0.913721
```
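All three inference functions fit essentially the same generative model: solve the ODE at candidate parameter values and compare the solution to the data under i.i.d. Gaussian observation noise. For intuition, here is a hand-written Turing.jl sketch of such a model, reusing `data`, `prob1`, and `t` from above. The model name, the `InverseGamma` noise prior, and the `NUTS(0.65)` settings are illustrative assumptions for this sketch, not the exact DiffEqBayes internals:

```julia
using Turing, OrdinaryDiffEq, Distributions, LinearAlgebra

# A hand-rolled sketch of the kind of model the backends fit internally.
@model function lv_model(data, prob, t)
    a ~ Normal(1.5, 1)           # same prior as `priors = [Normal(1.5, 1)]`
    σ ~ InverseGamma(2, 3)       # assumed noise prior for this sketch
    # Re-solve the ODE at the candidate parameter, saving at the data times.
    predicted = solve(prob, Tsit5(); p = [a], saveat = t)
    for i in eachindex(t)
        # Each observed column is the true state plus Gaussian noise.
        data[:, i] ~ MvNormal(predicted[i], σ^2 * I)
    end
end

chain = sample(lv_model(data, prob1, t), NUTS(0.65), 1000)
```

Each backend below automates this construction (prior setup, likelihood, and sampler configuration) from the `ODEProblem` and the list of priors.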
## Inference Methods

### Stan
```julia
using CmdStan # required for using the Stan backend
bayesian_result_stan = stan_inference(prob1, t, data, priors)
```
```
Chains MCMC chain (1000×3×1 Array{Float64, 3}):

Iterations        = 1:1:1000
Number of chains  = 1
Samples per chain = 1000
parameters        = sigma1.1, sigma1.2, theta_1
internals         =

Summary Statistics
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64     ⋯

    sigma1.1    0.6813    0.8689    0.4347     5.5793    18.8637    1.2134     ⋯
    sigma1.2    0.5259    0.6450    0.3179     5.9507    19.1806    1.1476     ⋯
     theta_1    1.3124    0.3892    0.1987     8.3993    16.7839    1.2772     ⋯
                                                                1 column omitted

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

    sigma1.1    0.1663    0.2384    0.2777    0.3887    3.0476
    sigma1.2    0.1497    0.1820    0.2418    0.3338    2.1849
     theta_1    0.4795    1.4940    1.4994    1.5039    1.5097
```
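The result is an `MCMCChains.Chains` object, so the usual MCMCChains utilities apply for working with the draws programmatically. For example, assuming `bayesian_result_stan` from the call above is in scope:

```julia
using MCMCChains, Statistics

# Pull out the draws for theta_1 and summarize them by hand.
theta = vec(bayesian_result_stan[:theta_1])
println("theta_1: ", mean(theta), " ± ", std(theta))

# Or regenerate the full summary/quantile tables shown above:
describe(bayesian_result_stan)
```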
### Turing
```julia
bayesian_result_turing = turing_inference(prob1, Tsit5(), t, data, priors)
```
```
Chains MCMC chain (1000×14×1 Array{Float64, 3}):

Iterations        = 501:1:1500
Number of chains  = 1
Samples per chain = 1000
Wall duration     = 12.88 seconds
Compute duration  = 12.88 seconds
parameters        = theta[1], σ[1]
internals         = lp, n_steps, is_accept, acceptance_rate, log_density, hamiltonian_energy, hamiltonian_energy_error, max_hamiltonian_energy_error, tree_depth, numerical_error, step_size, nom_step_size

Summary Statistics
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64     ⋯

    theta[1]    1.3203    0.3615    0.1887    10.3864    14.4210    1.3167     ⋯
        σ[1]    0.7945    0.7476    0.4012     2.5531    11.3926    1.9327     ⋯
                                                                1 column omitted

Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5%
      Symbol   Float64   Float64   Float64   Float64   Float64

    theta[1]    0.4851    1.4894    1.4980    1.5051    1.5176
        σ[1]    0.3334    0.3831    0.4404    0.5058    2.6923
```
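Since the data were generated at `a = 1.5`, a quick posterior-predictive sanity check is to re-solve the ODE at draws from the chain and measure the misfit against the observations. A sketch reusing `prob1`, `t`, and `data` from the setup; the `Symbol("theta[1]")` lookup and `StatsBase.sample` are the assumed idioms here:

```julia
using OrdinaryDiffEq, Statistics, StatsBase

# Re-solve the ODE at 50 posterior draws of theta[1] and compute the RMSE
# of each predicted trajectory against the noisy observations.
draws = sample(vec(bayesian_result_turing[Symbol("theta[1]")]), 50)
rmses = map(draws) do a
    s = solve(prob1, Tsit5(); p = [a], saveat = t)
    sqrt(mean(abs2, Array(s) .- data))
end
println("mean posterior-predictive RMSE: ", mean(rmses))
```

With draws near the true parameter, the RMSE should be on the order of the observation noise σ.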
### DynamicHMC
We can use DynamicHMC.jl as the backend for sampling with the `dynamichmc_inference` function. It is used similarly to the other backends:
```julia
bayesian_result_hmc = dynamichmc_inference(prob1, Tsit5(), t, data, priors)
```
```
(posterior = @NamedTuple{parameters::Vector{Float64}, σ::Vector{Float64}}[(parameters = [1.5005069090610041], σ = [0.013269361874373931, 0.009160483150090831]), (parameters = [1.5002657640434525], σ = [0.012506454666742493, 0.008201418089571562]), (parameters = [1.4999080934047058], σ = [0.012667120435714337, 0.008285586541695816]), (parameters = [1.5008382680862462], σ = [0.015044007114196673, 0.007607383998794528]), (parameters = [1.500158257116188], σ = [0.010356579236617307, 0.009748944245146312]), (parameters = [1.5006905148715308], σ = [0.012467126473818238, 0.007716372291010076]), (parameters = [1.5002924645828732], σ = [0.008928970721068206, 0.005991507855097853]), (parameters = [1.500189002081964], σ = [0.009224265625728341, 0.004694232550726952]), (parameters = [1.5003340234386817], σ = [0.01647284892586232, 0.026648609377828678]), (parameters = [1.5004037464013122], σ = [0.02185753352110219, 0.026913912666465232]) … (parameters = [1.5000229570758783], σ = [0.009776011132812833, 0.007535072265388466]), (parameters = [1.500448485413541], σ = [0.014432303511321517, 0.012407180456402976]), (parameters = [1.5000388316587274], σ = [0.014450473964361563, 0.01240264877701568]), (parameters = [1.5008244370297719], σ = [0.01318750854998743, 0.011078029270035705]), (parameters = [1.5002531726591561], σ = [0.013519768384758418, 0.011777701627111496]), (parameters = [1.5004523999004056], σ = [0.014363315006680807, 0.00864098426625317]), (parameters = [1.5004720614623255], σ = [0.012829487726099235, 0.008586137760910141]), (parameters = [1.500627195200073], σ = [0.009095448422327762, 0.011899854573356789]), (parameters = [1.5003847384606446], σ = [0.009030039952521678, 0.012373282217504541]), (parameters = [1.5003499825728546], σ = [0.010860085611086328, 0.010922994968577448])], posterior_matrix = [0.40580299039351825 0.4056422684433138 … 0.40572156752673266 0.4056984026081226; -4.322297519633469 -4.381510394609317 … -4.707198487142025 -4.522661081350299; -4.692856356043997 -4.803448201911309 … -4.392215790869185 -4.516885081747272], tree_statistics = DynamicHMC.TreeStatisticsNUTS[DynamicHMC.TreeStatisticsNUTS(49.08907073469188, 9, turning at positions -376:135, 0.9875306784579895, 511, DynamicHMC.Directions(0xa98ac087)), DynamicHMC.TreeStatisticsNUTS(49.81024964096152, 8, turning at positions 339:342, 0.98860481391063, 435, DynamicHMC.Directions(0xa97499a2)), DynamicHMC.TreeStatisticsNUTS(47.62122844236009, 4, turning at positions -9:-12, 0.751977248098556, 27, DynamicHMC.Directions(0x3905998f)), DynamicHMC.TreeStatisticsNUTS(46.25038737919384, 8, turning at positions -174:-177, 0.9541211974384797, 263, DynamicHMC.Directions(0xaabfdc56)), DynamicHMC.TreeStatisticsNUTS(46.516297703797534, 8, turning at positions -243:-246, 0.9584481776830001, 327, DynamicHMC.Directions(0x66196251)), DynamicHMC.TreeStatisticsNUTS(48.67887643233524, 8, turning at positions -286:-289, 0.9947261269557669, 395, DynamicHMC.Directions(0x3e52b06a)), DynamicHMC.TreeStatisticsNUTS(47.49325094538792, 8, turning at positions -188:-191, 0.9852807598210194, 299, DynamicHMC.Directions(0xb5626e6c)), DynamicHMC.TreeStatisticsNUTS(44.663024458938395, 8, turning at positions -224:31, 0.701782122531228, 255, DynamicHMC.Directions(0xc3af4e1f)), DynamicHMC.TreeStatisticsNUTS(41.34193743511337, 9, turning at positions 31:542, 0.45142154895963316, 1023, DynamicHMC.Directions(0x0e35121e)), DynamicHMC.TreeStatisticsNUTS(40.98782468772613, 8, turning at positions -61:194, 0.9981101958715148, 255, DynamicHMC.Directions(0x2ffb7ac2)) … DynamicHMC.TreeStatisticsNUTS(47.12501973281833, 9, turning at positions -225:286, 0.8836276808862655, 511, DynamicHMC.Directions(0x4d2b491e)), DynamicHMC.TreeStatisticsNUTS(45.588269256854616, 8, turning at positions 152:155, 0.9629813636773791, 379, DynamicHMC.Directions(0xbf89db1f)), DynamicHMC.TreeStatisticsNUTS(47.037816781510465, 2, turning at positions 0:3, 0.8463471058208972, 3, DynamicHMC.Directions(0x58fee5bf)), DynamicHMC.TreeStatisticsNUTS(47.047584632479506, 8, turning at positions -153:-408, 0.9690464944371295, 511, DynamicHMC.Directions(0x39755a67)), DynamicHMC.TreeStatisticsNUTS(47.41322734986437, 8, turning at positions 185:192, 0.9976591946853847, 319, DynamicHMC.Directions(0xc9313d80)), DynamicHMC.TreeStatisticsNUTS(48.833450204262654, 9, turning at positions -20:491, 0.9941153869127529, 511, DynamicHMC.Directions(0x2b99adeb)), DynamicHMC.TreeStatisticsNUTS(50.12204823447008, 7, turning at positions 34:41, 0.9994102974674915, 159, DynamicHMC.Directions(0x9dde9689)), DynamicHMC.TreeStatisticsNUTS(48.44367301129119, 8, turning at positions 198:325, 0.9907423783276301, 383, DynamicHMC.Directions(0xf90cb1c5)), DynamicHMC.TreeStatisticsNUTS(44.27437886937356, 5, turning at positions 46:49, 0.4814075134368171, 51, DynamicHMC.Directions(0x18ad96fd)), DynamicHMC.TreeStatisticsNUTS(47.6166316557638, 7, turning at positions 123:126, 0.9865363739554034, 223, DynamicHMC.Directions(0x85238f9e))], κ = Gaussian kinetic energy (Diagonal), √diag(M⁻¹): [0.026525303517445142, 0.3089348478322596, 0.23724420400887697], ϵ = 0.0055947543430102395)
```
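Unlike the other two backends, `dynamichmc_inference` returns a plain `NamedTuple` rather than a `Chains` object, so summaries are computed by hand from the `posterior` field. For example, assuming `bayesian_result_hmc` from above:

```julia
using Statistics

# Each element of `posterior` is a NamedTuple with fields `parameters` and `σ`.
thetas = [draw.parameters[1] for draw in bayesian_result_hmc.posterior]
println("posterior mean of a: ", mean(thetas))
println("posterior std of a:  ", std(thetas))
```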
## More Information
For a better idea of the summary statistics and plotting, you can take a look at the benchmarks.