DiffEqFlux: High Level Pre-Built Architectures for Implicit Deep Learning

DiffEqFlux.jl is an implicit deep learning library built on the SciML ecosystem. It provides a high-level interface that pulls together all of these tools, with heuristics and helper functions, to make training deep implicit layer models fast and easy.


DiffEqFlux.jl is only for pre-built architectures and utility functions for deep implicit learning, mixing differential equations with machine learning. For details on automatic differentiation of equation solvers and adjoint techniques, and on using these methods for tasks like calibrating models to data, nonlinear optimal control, and PDE-constrained optimization, see SciMLSensitivity.jl.

Pre-Built Architectures

This package focuses on the easy and efficient training of Universal Differential Equations (UDEs). DiffEqFlux.jl provides architectures that match the interfaces of machine learning libraries such as Flux.jl and Lux.jl, making it easy to build continuous-time machine learning layers into larger machine learning applications.
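For instance, a neural ODE layer can be constructed and then initialized and called like any other layer. The following is a minimal sketch assuming the Lux.jl backend and the Tsit5 solver from OrdinaryDiffEq.jl; the network sizes and solver options are illustrative choices, not requirements:

```julia
using DiffEqFlux, Lux, OrdinaryDiffEq, Random

rng = Random.default_rng()

# A small Lux network defining the ODE right-hand side du/dt = f(u)
dudt = Chain(Dense(2 => 16, tanh), Dense(16 => 2))

# Wrap it as a continuous-time layer that solves the ODE over tspan
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)

# Initialize parameters and state just like any Lux layer
ps, st = Lux.setup(rng, node)

u0 = Float32[2.0, 0.0]      # initial condition
sol, _ = node(u0, ps, st)   # solve the ODE; sol holds the trajectory
```

Because the layer follows the standard Lux interface, `ps` can be passed to any optimizer and the whole layer composed inside a larger `Chain`.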

The following layer functions exist:

  • Lagrangian Neural Networks
  • Galerkin Neural ODEs

Examples of how to build architectures from scratch, with tutorials on topics like Graph Neural ODEs, can be found in the SciMLSensitivity.jl documentation.


If you use DiffEqFlux.jl or are influenced by its ideas, please cite:

@article{rackauckas2020universal,
  title={Universal differential equations for scientific machine learning},
  author={Rackauckas, Christopher and Ma, Yingbo and Martensen, Julius and Warner, Collin and Zubov, Kirill and Supekar, Rohit and Skinner, Dominic and Ramadhan, Ali},
  journal={arXiv preprint arXiv:2001.04385},
  year={2020}
}