API Reference
Pre-Built Architectures
NeuralOperators.NOMAD — Type
NOMAD(approximator, decoder)
Constructs a NOMAD from approximator and decoder architectures. Make sure that the output of approximator, once combined with the coordinate dimension, has a size compatible with the input of decoder.
Arguments
approximator: Lux network to be used as the approximator net.
decoder: Lux network to be used as the decoder net.
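To make the size requirement concrete, here is a minimal bookkeeping sketch, under the assumption that NOMAD concatenates the approximator output with the query coordinates along the first (feature) dimension before calling the decoder:
julia> latent = rand(Float32, 16, 5);   # hypothetical approximator output: 16 features, batch of 5
julia> coords = rand(Float32, 2, 5);    # query coordinates: 2 components per sample
julia> size(vcat(latent, coords))       # the decoder must therefore accept 18 = 16 + 2 inputs
(18, 5)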
References
[1] Jacob H. Seidman, Georgios Kissas, Paris Perdikaris and George J. Pappas, "NOMAD: Nonlinear Manifold Decoders for Operator Learning", arXiv: https://arxiv.org/abs/2206.03551
Example
julia> using Lux, NeuralOperators, Random
julia> approximator_net = Chain(Dense(8 => 32), Dense(32 => 32), Dense(32 => 16));
julia> decoder_net = Chain(Dense(18 => 16), Dense(16 => 16), Dense(16 => 8));  # 18 = 16 approximator outputs + 2 coordinate components
julia> nomad = NOMAD(approximator_net, decoder_net);
julia> ps, st = Lux.setup(Xoshiro(), nomad);
julia> u = rand(Float32, 8, 5);  # 8 input features, batch of 5
julia> y = rand(Float32, 2, 5);  # 2 coordinate components per sample
julia> size(first(nomad((u, y), ps, st)))
(8, 5)
NeuralOperators.DeepONet — Type
DeepONet(branch, trunk, additional)
Constructs a DeepONet from branch and trunk architectures. Make sure that the outputs of both nets have the same first dimension.
Arguments
branch: Lux network to be used as the branch net.
trunk: Lux network to be used as the trunk net.
References
[1] Lu Lu, Pengzhan Jin and George Em Karniadakis, "DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators", arXiv: https://arxiv.org/abs/1910.03193
Input Output Dimensions
Consider a transient 1D advection problem ∂ₜu + u ⋅ ∇u = 0, with an IC u(x, 0) = g(x). We are given several (b = 200) instances of the IC, each discretized at 50 points, and want to query the solution at 100 different points in space and time t ∈ [0, 1].
That makes the branch input of shape [50 x 200] and the trunk input of shape [2 x 100], so the first-layer input size is 50 for the branch net and 2 for the trunk net.
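As a sketch of that setup (the hidden-layer widths are illustrative; both nets must end in the same embedding size, here 32):
julia> using Lux, NeuralOperators, Random
julia> branch = Chain(Dense(50 => 64), Dense(64 => 32));
julia> trunk = Chain(Dense(2 => 64), Dense(64 => 32));
julia> don = DeepONet(branch, trunk);
julia> ps, st = Lux.setup(Xoshiro(), don);
julia> u = rand(Float32, 50, 200);   # 200 IC instances, 50 sensor points each
julia> y = rand(Float32, 2, 100);    # 100 (x, t) query points
julia> size(first(don((u, y), ps, st)))
(100, 200)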
Example
julia> using Lux, NeuralOperators, Random
julia> branch_net = Chain(Dense(64 => 32), Dense(32 => 32), Dense(32 => 16));
julia> trunk_net = Chain(Dense(1 => 8), Dense(8 => 8), Dense(8 => 16));
julia> deeponet = DeepONet(branch_net, trunk_net);
julia> ps, st = Lux.setup(Xoshiro(), deeponet);
julia> u = rand(Float32, 64, 5);   # 64 sensor points, batch of 5
julia> y = rand(Float32, 1, 10);   # 1 query coordinate, 10 query points
julia> size(first(deeponet((u, y), ps, st)))
(10, 5)
NeuralOperators.FourierNeuralOperator — Type
FourierNeuralOperator(
σ=gelu;
chs::Dims{C}=(2, 64, 64, 64, 64, 64, 128, 1),
modes::Dims{M}=(16,),
kwargs...
) where {C, M}
The Fourier neural operator is an operator learning model that uses a Fourier kernel to perform spectral convolutions. It is a promising approach for surrogate methods, and can be regarded as a physics operator.
The model is composed of a Dense layer to lift a (d + 1)-dimensional vector field to an n-dimensional vector field, an integral kernel operator which consists of four Fourier kernels, and two Dense layers to project data back to the scalar field of the space of interest.
Arguments
σ: Activation function for all layers in the model.
Keyword Arguments
chs: A Tuple or Vector giving the channel size of each stage of the model (8 entries in the default configuration).
modes: The modes to be preserved. A tuple of length d, where d is the dimension of the data. For example, one-dimensional data would have a 1-element tuple, and two-dimensional data would have a 2-element tuple (see the sketch below).
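For instance, a sketch of a two-dimensional configuration (channel widths are illustrative; this assumes the data layout (spatial dims..., channels, batch) used in the example below, with the number of input channels equal to chs[1]):
julia> using Lux, NeuralOperators, Random
julia> fno2d = FourierNeuralOperator(gelu; chs=(3, 32, 32, 64, 1), modes=(8, 8));
julia> ps, st = Lux.setup(Xoshiro(), fno2d);
julia> u = rand(Float32, 32, 32, 3, 5);   # 32×32 grid, 3 input channels, batch of 5
julia> size(first(fno2d(u, ps, st)))      # expected: one output channel per grid point
(32, 32, 1, 5)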
Example
julia> using Lux, NeuralOperators, Random
julia> fno = FourierNeuralOperator(gelu; chs=(2, 64, 64, 128, 1), modes=(16,));
julia> ps, st = Lux.setup(Xoshiro(), fno);
julia> u = rand(Float32, 1024, 2, 5);  # 1024 spatial points, 2 input channels (= chs[1]), batch of 5
julia> size(first(fno(u, ps, st)))
(1024, 1, 5)
Building Blocks
NeuralOperators.OperatorConv — Type
OperatorConv(
ch::Pair{<:Integer, <:Integer}, modes::Dims, tr::AbstractTransform;
init_weight=glorot_uniform
)
Arguments
ch: A Pair of input and output channel sizes ch_in => ch_out, e.g. 64 => 64.
modes: The modes to be preserved. A tuple of length d, where d is the dimension of the data.
tr: The transform used to perform the transformation.
Keyword Arguments
init_weight: Function used to initialize the parameters.
Example
julia> OperatorConv(2 => 5, (16,), FourierTransform{ComplexF32}((16,)));
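For two-dimensional data, both the modes tuple and the transform get one entry per spatial dimension, e.g. (illustrative sizes):
julia> OperatorConv(2 => 5, (16, 16), FourierTransform{ComplexF32}((16, 16)));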
NeuralOperators.SpectralConv — Function
SpectralConv(args...; kwargs...)
Constructs an OperatorConv with FourierTransform{ComplexF32} as the transform. See OperatorConv for the individual arguments.
Example
julia> SpectralConv(2 => 5, (16,));
NeuralOperators.OperatorKernel — Type
OperatorKernel(
ch::Pair{<:Integer, <:Integer}, modes::Dims, transform::AbstractTransform,
act=identity; kwargs...
)
Arguments
ch: A Pair of input and output channel sizes ch_in => ch_out, e.g. 64 => 64.
modes: The modes to be preserved. A tuple of length d, where d is the dimension of the data.
transform: The transform used to perform the transformation.
act: Activation function.
All the keyword arguments are passed to the OperatorConv constructor.
Example
julia> OperatorKernel(2 => 5, (16,), FourierTransform{ComplexF64}((16,)));
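The activation is passed as the fourth positional argument; for example (illustrative):
julia> OperatorKernel(2 => 5, (16,), FourierTransform{ComplexF64}((16,)), gelu);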
NeuralOperators.SpectralKernel — Function
SpectralKernel(args...; kwargs...)
Constructs an OperatorKernel with FourierTransform{ComplexF32} as the transform. See OperatorKernel for the individual arguments.
Example
julia> SpectralKernel(2 => 5, (16,));
NeuralOperators.GridEmbedding — Type
GridEmbedding(grid_boundaries::Vector{<:Tuple{<:Real,<:Real}})
Appends a uniform grid embedding to the input data along the penultimate dimension.
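A construction sketch, assuming one (min, max) boundary pair per spatial dimension of the data:
julia> GridEmbedding([(0.0f0, 1.0f0)]);                     # 1D grid on [0, 1]
julia> GridEmbedding([(0.0f0, 1.0f0), (-1.0f0, 1.0f0)]);    # 2D grid on [0, 1] × [-1, 1]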
NeuralOperators.ComplexDecomposedLayer — Type
ComplexDecomposedLayer(layer::AbstractLuxLayer)
Decomposes complex activations into real and imaginary parts, applies the given layer to each component separately, and then recombines the two parts.
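For example (a sketch), wrapping a Dense layer so that it can process complex-valued inputs componentwise:
julia> ComplexDecomposedLayer(Dense(4 => 4));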
NeuralOperators.SoftGating — Type
SoftGating(chs::Integer, ndims::Integer; kwargs...)
Constructs a wrapper over Scale with dims = (ntuple(Returns(1), ndims)..., chs). All keyword arguments are passed to the Scale constructor.
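For example (a sketch, reading ndims as the number of dimensions preceding the channel dimension):
julia> SoftGating(16, 3);   # Scale over parameters of size (1, 1, 1, 16)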
Transform API
NeuralOperators.AbstractTransform — Type
AbstractTransform
Interface
Base.ndims(<:AbstractTransform): Number of dimensions of the modes
transform(<:AbstractTransform, x::AbstractArray): Apply the transform to x
truncate_modes(<:AbstractTransform, x_transformed::AbstractArray): Truncate modes that contribute to the noise
inverse(<:AbstractTransform, x_transformed::AbstractArray): Apply the inverse transform to x_transformed
NeuralOperators.FourierTransform — Type
FourierTransform{T}(modes, shift::Bool=false)
A concrete implementation of AbstractTransform for Fourier transforms.
If shift is true, an fftshift is applied before truncating the modes.
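For example (illustrative), a transform that keeps 16 modes and applies the fftshift before truncation:
julia> FourierTransform{ComplexF32}((16,), true);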