Implicit Nonlinear Dynamics: Michaelis-Menten
What if you want to estimate an implicitly defined system of the form $f(u_t, u, p, t) = 0$? The solution: implicit sparse identification. This method was originally described in this paper, and robust algorithms now exist to identify such systems. We will focus on Michaelis-Menten kinetics. As before, we will define the DataDrivenProblem and a Basis containing possible candidate functions for our sparse regression. Let's generate some data! We will use two experiments starting from different initial conditions.
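To see why a basis of monomials in $u$ multiplied by the derivative will suffice here, note that multiplying the Michaelis-Menten rate law by its denominator yields an implicit equation that is polynomial in both the state and its derivative:

```math
\dot{u} = 0.6 - \frac{1.5u}{0.3 + u}
\quad \Longrightarrow \quad
(0.3 + u)\,\dot{u} - 0.18 + 0.9u = 0.
```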
using DataDrivenDiffEq
using LinearAlgebra
using OrdinaryDiffEq
using DataDrivenSparse
using Plots
function michaelis_menten(u, p, t)
    [0.6 - 1.5u[1] / (0.3 + u[1])]
end
u0 = [0.5]
ode_problem = ODEProblem(michaelis_menten, u0, (0.0, 4.0));
Since we have multiple trajectories at hand, we define a DataDrivenDataset, which collects multiple problems but handles them as a single unit during processing.
prob = DataDrivenDataset(map(1:2) do i
    solve(remake(ode_problem, u0 = i * u0),
          Tsit5(), saveat = 0.1, tspan = (0.0, 4.0))
end...)
plot(prob)
Next, we define our Basis. Since we want to identify an implicit system, we have to include candidate terms that take the derivative $D(u)$ as an argument, and we have to inform the constructor which variables are implicit.
@parameters t
@variables u(t)[1:1]
u = collect(u)
D = Differential(t)
h = [monomial_basis(u[1:1], 4)...]
basis = Basis([h; h .* (D(u[1]))], u, implicits = D.(u), iv = t)
Model ##Basis#349 with 10 equations
States : (u(t))[1]
Independent variable: t
Equations
φ₁ = 1
φ₂ = (u(t))[1]
φ₃ = (u(t))[1]^2
φ₄ = (u(t))[1]^3
...
φ₁₀ = ((u(t))[1]^4)*Differential(t)((u(t))[1])
Next, we define the ImplicitOptimizer and solve the problem. It wraps a standard optimizer, by default STLSQ, and performs implicit sparse regression over the selected basis.
opt = ImplicitOptimizer(1e-1:1e-1:5e-1)
res = solve(prob, basis, opt)
"DataDrivenSolution{Float64}" with 1 equations and 6 parameters.
Returncode: Success
Residual sum of squares: 7.149470902887641e-22
Let's check the summary statistics of the solution, which summarize the residuals of the fit.
summarystats(res)
1-element Vector{StatsBase.SummaryStats{Float64}}:
Summary Stats:
Length: 82
Missing Count: 0
Mean: 0.000000
Std. Deviation: 0.000000
Minimum: 0.000000
1st Quartile: 0.000000
Median: 0.000000
3rd Quartile: 0.000000
Maximum: 0.000000
We could also check different metrics as described in the DataDrivenSolution section, e.g. aic or bic. As we can see, the DataDrivenSolution scores well on these metrics.
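Continuing the session above, these metrics can be queried directly on the solution, which follows the StatsBase-style API (as the `summarystats` call above suggests):

```julia
# Information criteria of the recovered model; lower values indicate a
# better trade-off between goodness of fit and model complexity.
aic(res)
bic(res)

# Residual sum of squares, matching the value reported in the solution summary.
rss(res)
```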
Furthermore, inspection of the underlying system shows that the original equations have been recovered correctly:
system = get_basis(res)
plot(
    plot(prob), plot(res), layout = (1, 2)
)
Copy-Pasteable Code
using DataDrivenDiffEq
using LinearAlgebra
using OrdinaryDiffEq
using DataDrivenSparse
function michaelis_menten(u, p, t)
    [0.6 - 1.5u[1] / (0.3 + u[1])]
end
u0 = [0.5]
ode_problem = ODEProblem(michaelis_menten, u0, (0.0, 4.0));
prob = DataDrivenDataset(map(1:2) do i
    solve(remake(ode_problem, u0 = i * u0),
          Tsit5(), saveat = 0.1, tspan = (0.0, 4.0))
end...)
@parameters t
@variables u(t)[1:1]
u = collect(u)
D = Differential(t)
h = [monomial_basis(u[1:1], 4)...]
basis = Basis([h; h .* (D(u[1]))], u, implicits = D.(u), iv = t)
opt = ImplicitOptimizer(1e-1:1e-1:5e-1)
res = solve(prob, basis, opt)
summarystats(res)
# This file was generated using Literate.jl, https://github.com/fredrikekre/Literate.jl
This page was generated using Literate.jl.