Implicit Layer Deep Learning
Implicit layer deep learning is a field that defines the layers of neural networks implicitly, through the solutions of equations such as differential equations and nonlinear systems, rather than through explicit computations. This approach makes it possible to adapt network depth automatically and can improve training performance. SciML's differentiable solver ecosystem is specifically designed to accommodate implicit layer methodologies and provides libraries with pre-built layers for common methods.
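To make the idea concrete, here is a minimal sketch (in plain Python, not the SciML API) of a layer defined by a differential equation: the layer's output is the state of an ODE integrated over a fixed time span, so "depth" corresponds to integration time rather than a fixed stack of explicit transformations. The scalar state, the parameters `w` and `b`, and the forward-Euler integrator are all illustrative assumptions.

```python
import math

def neural_ode_layer(x, w=0.5, b=0.1, steps=100):
    """Toy implicit layer: output is z(1) where dz/dt = tanh(w*z + b), z(0) = x.

    A real neural ODE replaces tanh(w*z + b) with a neural network and uses
    an adaptive, differentiable ODE solver instead of fixed-step Euler.
    """
    z, dt = x, 1.0 / steps
    for _ in range(steps):
        z += dt * math.tanh(w * z + b)  # one explicit Euler step of the layer ODE
    return z

y = neural_ode_layer(1.0)
```

Because the layer is the solution of an equation, the same definition can be solved to any accuracy: refining `steps` (or using an adaptive solver) changes the cost and precision of the layer without changing its parameters.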
DiffEqFlux.jl: High Level Pre-Built Architectures for Implicit Deep Learning
DiffEqFlux.jl is a library of pre-built architectures for implicit deep learning, including layer definitions for methods like:
- Neural Ordinary Differential Equations (Neural ODEs)
- Collocation-Based Neural ODEs (Neural ODEs fit without an ODE solver, by far the fastest approach!)
- Multiple Shooting Neural Ordinary Differential Equations
- Neural Stochastic Differential Equations (Neural SDEs)
- Neural Differential-Algebraic Equations (Neural DAEs)
- Neural Delay Differential Equations (Neural DDEs)
- Augmented Neural ODEs
- Hamiltonian Neural Networks (with specialized second order and symplectic integrators)
- Continuous Normalizing Flows (CNF) and FFJORD
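The collocation idea mentioned above can be sketched in a few lines of plain Python: instead of solving the ODE inside the training loop, estimate du/dt directly from trajectory data and fit the model to those derivative estimates. The data, the central-difference derivative estimates, and the one-parameter linear model `f(u) = a*u` are illustrative assumptions (DiffEqFlux's collocation uses smoothing kernels and neural networks), but the structure is the same: no ODE solve is needed during fitting.

```python
import math

# Synthetic trajectory data from the true dynamics du/dt = -0.5 * u.
t = [0.1 * i for i in range(50)]
u = [math.exp(-0.5 * ti) for ti in t]

# Central-difference estimates of du/dt at the interior sample points.
du = [(u[i + 1] - u[i - 1]) / (t[i + 1] - t[i - 1]) for i in range(1, 49)]
ui = u[1:49]

# Least-squares fit of the model f(u) = a*u to the derivative estimates:
# a = sum(du * u) / sum(u * u). No ODE solver appears anywhere in the fit.
a = sum(d * x for d, x in zip(du, ui)) / sum(x * x for x in ui)
```

The recovered coefficient `a` lands close to the true value -0.5, showing how collocation turns an ODE-fitting problem into a plain regression on derivative estimates.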
DeepEquilibriumNetworks.jl: Deep Equilibrium Models Made Fast
DeepEquilibriumNetworks.jl is a library of optimized layer implementations for Deep Equilibrium Models (DEQs). It uses specialized training techniques, such as implicit-explicit regularization, to accelerate convergence relative to traditional implementations, all while using the optimized and flexible SciML libraries under the hood.
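A DEQ layer's output is defined as the fixed point of a weight-tied transformation, i.e. the limit of applying the same layer "infinitely deep." The following plain-Python sketch (not the library's API) shows the idea with a scalar state and naive fixed-point iteration; `w` and the `tanh` update are illustrative assumptions, and DeepEquilibriumNetworks.jl solves for the fixed point with SciML's nonlinear solvers instead.

```python
import math

def deq_layer(x, w=0.5, tol=1e-10, max_iter=1000):
    """Toy DEQ layer: returns z* satisfying z* = tanh(w*z* + x).

    With |w| < 1 the update is a contraction, so plain fixed-point
    iteration converges; real DEQs use Newton/Anderson-type solvers.
    """
    z = 0.0
    for _ in range(max_iter):
        z_new = math.tanh(w * z + x)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z

z_star = deq_layer(1.0)
```

Because the output is defined by an equation rather than a computation graph, gradients can be obtained via the implicit function theorem at the fixed point, which is what makes differentiable nonlinear solvers the natural backbone for these models.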