While SciML is not primarily a machine learning ecosystem, it provides many libraries for doing machine learning with its equation solvers, along with machine learning libraries that integrate directly into those solvers.
DiffEqFlux.jl is a library of pre-built architectures for implicit deep learning, including layer definitions for methods like the following (a minimal Neural ODE sketch is shown after the list):
- Neural Ordinary Differential Equations (Neural ODEs)
- Collocation-Based Neural ODEs (Neural ODEs without a solver, by far the fastest way!)
- Multiple Shooting Neural Ordinary Differential Equations
- Neural Stochastic Differential Equations (Neural SDEs)
- Neural Differential-Algebraic Equations (Neural DAEs)
- Neural Delay Differential Equations (Neural DDEs)
- Augmented Neural ODEs
- Hamiltonian Neural Networks (with specialized second order and symplectic integrators)
- Continuous Normalizing Flows (CNF) and FFJORD
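For concreteness, here is a minimal Neural ODE sketch in the style of the DiffEqFlux.jl tutorials; the layer sizes, time span, and save points are illustrative assumptions, not requirements:

```julia
using DiffEqFlux, Lux, OrdinaryDiffEq, Random

rng = Random.default_rng()
# A small network defining the ODE right-hand side du/dt = model(u)
model = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
tspan = (0.0f0, 1.0f0)
node = NeuralODE(model, tspan, Tsit5(); saveat = 0.1f0)

ps, st = Lux.setup(rng, model)  # explicit parameters and state
u0 = Float32[2.0, 0.0]          # initial condition
sol, st = node(u0, ps, st)      # solve the Neural ODE forward
```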
ReservoirComputing.jl is a library for doing machine learning with reservoir computing techniques, such as Echo State Networks (ESNs). Its reservoir computing methods are stabilized for use with difficult equations, including stiff dynamics, chaotic systems, and more.
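To make the idea concrete, here is a self-contained sketch of an echo state network in plain Julia (deliberately not the ReservoirComputing.jl API): a fixed random reservoir drives a ridge-regression readout, and only the readout is trained.

```julia
using LinearAlgebra, Random

# Build a fixed random reservoir with spectral radius below 1.
rng = MersenneTwister(42)
res_size = 100
Win = 0.1 .* (rand(rng, res_size, 1) .- 0.5)  # input weights
W = randn(rng, res_size, res_size)
W .*= 0.9 / maximum(abs, eigvals(W))          # echo state property

# Drive the reservoir with an input signal, collecting its states.
function run_reservoir(W, Win, u)
    X = zeros(size(W, 1), length(u))
    x = zeros(size(W, 1))
    for t in eachindex(u)
        x = tanh.(W * x .+ Win * [u[t]])      # reservoir state update
        X[:, t] = x
    end
    return X
end

u = sin.(0.1 .* (1:400))                      # toy input signal
X = run_reservoir(W, Win, u)

# Train only the linear readout by ridge regression to predict
# the next input value from the current reservoir state.
Xt = X[:, 1:end-1]
Wout = u[2:end]' * Xt' / (Xt * Xt' + 1e-6 * I)
```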
FastDEQ.jl is a library of optimized layer implementations for Deep Equilibrium Models (DEQs). It uses specialized training techniques, such as implicit-explicit regularization, to accelerate convergence over traditional implementations, all while using the optimized and flexible SciML libraries under the hood.
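As a rough illustration of the DEQ concept itself (plain Julia, not the FastDEQ.jl API): the "layer" output is the fixed point z* = f(z*, x), found below by naive iteration. Real implementations use faster nonlinear solvers and implicit differentiation.

```julia
using LinearAlgebra, Random

# The implicit layer: its output is the fixed point of f in z.
f(z, x, W, U, b) = tanh.(W * z .+ U * x .+ b)

function deq_forward(x, W, U, b; iters = 100, tol = 1e-6)
    z = zeros(size(W, 1))
    for _ in 1:iters
        z_new = f(z, x, W, U, b)
        norm(z_new - z) < tol && return z_new
        z = z_new
    end
    return z
end

rng = MersenneTwister(0)
W = 0.1 .* randn(rng, 8, 8)   # small weights keep f contractive
U = randn(rng, 8, 4)
b = zeros(8)
z_star = deq_forward(randn(rng, 4), W, U, b)
```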
Flux.jl is the most popular machine learning library in the Julia programming language. SciML's libraries are heavily tested with it and with its automatic differentiation engine, Zygote.jl, for composability and compatibility.
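A minimal sketch of that composition, assuming only standard Flux usage (the layer sizes and data here are placeholders):

```julia
using Flux

model = Chain(Dense(2 => 16, tanh), Dense(16 => 1))
x = rand(Float32, 2, 10)
y = rand(Float32, 1, 10)
loss(m) = Flux.Losses.mse(m(x), y)
grads = Flux.gradient(loss, model)  # gradients computed via Zygote
```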
Lux.jl is a library for fully explicitly parameterized neural networks. While alternative interfaces are required to use Flux with many equation solvers (e.g. Flux.destructure), Lux.jl's explicit design meshes naturally with the SciML equation solver libraries. For this reason, SciML's libraries are also heavily tested with Lux to ensure compatibility with its neural network definitions.
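A minimal sketch of Lux's explicit style, where parameters and state live outside the model (layer sizes are illustrative):

```julia
using Lux, Random

rng = Random.default_rng()
model = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
ps, st = Lux.setup(rng, model)       # explicit parameters and state
x = rand(rng, Float32, 2, 10)
y, st = Lux.apply(model, x, ps, st)  # pure call: no hidden state
```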
SimpleChains.jl is a library specialized for small-scale machine learning. It uses non-allocating mutating forms to be highly efficient in the regimes where matrix multiplication kernels are not able to overcome the common overheads of machine learning libraries. Thus, for SciML use cases with small neural networks (layers of fewer than 100 nodes) and non-batched usage (many, if not most, use cases), SimpleChains.jl can be the fastest choice for the neural network definitions.
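A minimal sketch, assuming the documented SimpleChains.jl API (layer sizes are illustrative):

```julia
using SimpleChains

# The static input size lets SimpleChains fully specialize the kernels.
model = SimpleChain(static(2),
                    TurboDense(tanh, 16),
                    TurboDense(identity, 2))
p = SimpleChains.init_params(model)
x = rand(Float32, 2)
model(x, p)   # non-allocating forward pass
```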
NNlib.jl is the core library which defines the handling of common functions, like conv, and how they map to device accelerators such as NVIDIA's cuDNN. This library can thus be used to directly grab many of the core functions used in machine learning, such as common activation functions and gather/scatter operations, without depending on the style of any given machine learning library.
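For example, core primitives can be pulled straight from NNlib without committing to a full framework; the array shapes below follow the usual width × height × channels × batch convention:

```julia
using NNlib

x = randn(Float32, 28, 28, 1, 1)  # one single-channel 28×28 input
w = randn(Float32, 3, 3, 1, 4)    # a 3×3 kernel mapping 1 → 4 channels
y = NNlib.conv(x, w)              # plain convolution, (26, 26, 4, 1)
z = relu.(y)                      # a common activation function
```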
GeometricFlux.jl is a library for graph neural networks and geometric deep learning. It is the library used and tested by the SciML developers for mixing graph neural networks with equation solver applications.
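As a self-contained sketch of the graph-convolution idea such libraries implement (plain Julia, not the GeometricFlux.jl API): one GCN layer computes H' = σ(D^{-1/2} (A + I) D^{-1/2} H W).

```julia
using LinearAlgebra

A = [0 1 0; 1 0 1; 0 1 0]            # toy 3-node graph adjacency
Ahat = A + I                          # add self-loops
Dhat = Diagonal(vec(sum(Ahat, dims = 2)))
Anorm = inv(sqrt(Dhat)) * Ahat * inv(sqrt(Dhat))  # symmetric normalization

H = rand(3, 4)                        # node features
W = rand(4, 2)                        # layer weights
H2 = tanh.(Anorm * H * W)             # one graph-convolution layer
```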
AbstractGPs.jl is the fast and flexible Gaussian Process library that is used by the SciML packages and recommended for downstream usage.
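A minimal Gaussian process regression sketch with AbstractGPs.jl; the kernel choice, noise level, and toy data are illustrative assumptions:

```julia
using AbstractGPs, Statistics

f = GP(SqExponentialKernel())    # GP prior with an RBF kernel
x = rand(10)
y = sin.(2π .* x) .+ 0.05 .* randn(10)
fx = f(x, 0.05^2)                # finite projection with observation noise
f_post = posterior(fx, y)        # exact GP conditioning on the data
m = mean(f_post(range(0, 1; length = 50)))  # posterior mean at new points
```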
MLDatasets.jl is a common interface for accessing common machine learning datasets. For example, if you want to run a test on MNIST data, MLDatasets is the quickest way to obtain it.
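For example, a sketch of loading MNIST (the data is downloaded on first use):

```julia
using MLDatasets

trainset = MNIST(split = :train)
x, y = trainset.features, trainset.targets
size(x)  # (28, 28, 60000)
```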
MLUtils.jl is a library of utility functions that make writing common machine learning pipelines easier. This includes functionality for the following (a short usage sketch follows the list):
- An extensible dataset interface (numobs and getobs)
- Data iteration and dataloaders (eachobs and DataLoader)
- Lazy data views (obsview)
- Resampling procedures (undersample and oversample)
- Train/test splits (splitobs)
- Data partitioning and aggregation tools (batch, unbatch, chunk)
- Folds for cross-validation (kfolds, leavepout)
- Lazy dataset transformations (mapobs, filterobs)
- Toy datasets for demonstration purposes
- Other data handling utilities (flatten, normalise, unsqueeze)
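Below is a short sketch exercising a few of the utilities above (splitobs and DataLoader); the array sizes and batch settings are placeholder assumptions:

```julia
using MLUtils

X = rand(Float32, 4, 100)  # 100 observations with 4 features each
y = rand(Bool, 100)

train, test = splitobs((X, y); at = 0.8)  # 80/20 train/test split
loader = DataLoader(train; batchsize = 16, shuffle = true)
for (xb, yb) in loader
    # xb is 4×16 on full batches; yb is the matching label batch
end
```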