API Reference

Pre-Built Architectures

NeuralOperators.NOMAD (Type)
NOMAD(approximator, decoder)

Constructs a NOMAD from approximator and decoder architectures. Make sure that the output of approximator, concatenated with the coordinates, has a size compatible with the input of decoder.

Arguments

  • approximator: Lux network to be used as approximator net.
  • decoder: Lux network to be used as decoder net.

References

[1] Jacob H. Seidman, Georgios Kissas, Paris Perdikaris, and George J. Pappas, "NOMAD: Nonlinear Manifold Decoders for Operator Learning", arXiv: https://arxiv.org/abs/2206.03551

Example

julia> approximator_net = Chain(Dense(8 => 32), Dense(32 => 32), Dense(32 => 16));

julia> decoder_net = Chain(Dense(18 => 16), Dense(16 => 16), Dense(16 => 8));

julia> nomad = NOMAD(approximator_net, decoder_net);

julia> ps, st = Lux.setup(Xoshiro(), nomad);

julia> u = rand(Float32, 8, 5);

julia> y = rand(Float32, 2, 5);

julia> size(first(nomad((u, y), ps, st)))
(8, 5)

NeuralOperators.DeepONet (Type)
DeepONet(branch, trunk, additional)

Constructs a DeepONet from branch and trunk architectures. Make sure that both nets produce outputs with the same first dimension (the embedding size).

Arguments

  • branch: Lux network to be used as branch net.
  • trunk: Lux network to be used as trunk net.

References

[1] Lu Lu, Pengzhan Jin, and George Em Karniadakis, "DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators", arXiv: https://arxiv.org/abs/1910.03193

Input/Output Dimensions

Consider a transient 1D advection problem ∂ₜu + u ⋅ ∇u = 0 with an initial condition u(x, 0) = g(x). We are given several (b = 200) instances of the IC, each discretized at 50 points, and we want to query the solution at 100 different (x, t) pairs with t ∈ [0, 1].

That makes the branch input of shape [50 x 200] and the trunk input of shape [2 x 100]. So the input size is 50 for the branch net and 2 for the trunk net.
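
For concreteness, here is a minimal sketch of that setup (the layer widths and the embedding size of 16 are arbitrary choices, not prescribed by the problem):

julia> branch_net = Chain(Dense(50 => 32), Dense(32 => 16));  # 50 sensor points per IC

julia> trunk_net = Chain(Dense(2 => 32), Dense(32 => 16));    # 2 coordinates: (x, t)

julia> deeponet = DeepONet(branch_net, trunk_net);

julia> ps, st = Lux.setup(Xoshiro(), deeponet);

julia> u = rand(Float32, 50, 200);  # 200 IC instances

julia> y = rand(Float32, 2, 100);   # 100 query points

julia> size(first(deeponet((u, y), ps, st)))
(100, 200)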

Example

julia> branch_net = Chain(Dense(64 => 32), Dense(32 => 32), Dense(32 => 16));

julia> trunk_net = Chain(Dense(1 => 8), Dense(8 => 8), Dense(8 => 16));

julia> deeponet = DeepONet(branch_net, trunk_net);

julia> ps, st = Lux.setup(Xoshiro(), deeponet);

julia> u = rand(Float32, 64, 5);

julia> y = rand(Float32, 1, 10);

julia> size(first(deeponet((u, y), ps, st)))
(10, 5)

NeuralOperators.FourierNeuralOperator (Type)
FourierNeuralOperator(
    σ=gelu;
    chs::Dims{C}=(2, 64, 64, 64, 64, 64, 128, 1),
    modes::Dims{M}=(16,),
    kwargs...
) where {C, M}

The Fourier neural operator is an operator-learning model that uses a Fourier kernel to perform spectral convolutions. It is a promising model for surrogate methods and can be regarded as a learned physics operator.

The model is composed of a Dense layer that lifts a (d + 1)-dimensional vector field to an n-dimensional vector field, an integral kernel operator consisting of four Fourier kernels, and two Dense layers that project the data back to the scalar field of interest.

Arguments

  • σ: Activation function for all layers in the model.

Keyword Arguments

  • chs: A Tuple or Vector of channel sizes for the successive layers (the default has 8 entries).
  • modes: The modes to be preserved. A tuple of length d, where d is the dimension of data. For example, one-dimensional data would have a 1-element tuple, and two-dimensional data would have a 2-element tuple.

Example

julia> fno = FourierNeuralOperator(gelu; chs=(2, 64, 64, 128, 1), modes=(16,));

julia> ps, st = Lux.setup(Xoshiro(), fno);

julia> u = rand(Float32, 1024, 2, 5);

julia> size(first(fno(u, ps, st)))
(1024, 1, 5)
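
Since modes is a tuple of length d, two-dimensional data takes a 2-element tuple. A hypothetical 2D sketch (channel sizes chosen arbitrarily, assuming the same (x..., channel, batch) data layout as the example above):

julia> fno2d = FourierNeuralOperator(gelu; chs=(3, 32, 32, 64, 1), modes=(16, 16));

julia> ps, st = Lux.setup(Xoshiro(), fno2d);

julia> u = rand(Float32, 64, 64, 3, 5);

julia> size(first(fno2d(u, ps, st)))
(64, 64, 1, 5)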

Building Blocks

NeuralOperators.OperatorConv (Type)
OperatorConv(
    ch::Pair{<:Integer, <:Integer}, modes::Dims, tr::AbstractTransform;
    init_weight=glorot_uniform
)

Arguments

  • ch: A Pair of input and output channel size ch_in => ch_out, e.g. 64 => 64.
  • modes: The modes to be preserved. A tuple of length d, where d is the dimension of data.
  • tr: The transform used to compute the spectral convolution (e.g. a FourierTransform).

Keyword Arguments

  • init_weight: Function used to initialize the weights.

Example

julia> OperatorConv(2 => 5, (16,), FourierTransform{ComplexF32}((16,)));
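
A sketch of a forward pass, assuming the (spatial..., channels, batch) data layout used elsewhere in this reference; the spatial size of 32 is arbitrary but must be large enough to hold the 16 requested modes:

julia> layer = OperatorConv(2 => 5, (16,), FourierTransform{ComplexF32}((16,)));

julia> ps, st = Lux.setup(Xoshiro(), layer);

julia> x = rand(Float32, 32, 2, 5);

julia> size(first(layer(x, ps, st)))
(32, 5, 5)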

NeuralOperators.SpectralConv (Function)
SpectralConv(args...; kwargs...)

Construct an OperatorConv with FourierTransform{ComplexF32} as the transform. See OperatorConv for the individual arguments.

Example

julia> SpectralConv(2 => 5, (16,));

NeuralOperators.OperatorKernel (Type)
OperatorKernel(
    ch::Pair{<:Integer, <:Integer}, modes::Dims, transform::AbstractTransform,
    act=identity; kwargs...
)

Arguments

  • ch: A Pair of input and output channel size ch_in => ch_out, e.g. 64 => 64.
  • modes: The modes to be preserved. A tuple of length d, where d is the dimension of data.
  • transform: The transform used to compute the spectral convolution (e.g. a FourierTransform).
  • act: Activation function.

All the keyword arguments are passed to the OperatorConv constructor.

Example

julia> OperatorKernel(2 => 5, (16,), FourierTransform{ComplexF64}((16,)));

NeuralOperators.SpectralKernel (Function)
SpectralKernel(args...; kwargs...)

Construct an OperatorKernel with FourierTransform{ComplexF32} as the transform. See OperatorKernel for the individual arguments.

Example

julia> SpectralKernel(2 => 5, (16,));

NeuralOperators.GridEmbedding (Type)
GridEmbedding(grid_boundaries::Vector{<:Tuple{<:Real,<:Real}})

Appends a uniform grid embedding to the input data along the penultimate dimension.
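
A hypothetical sketch for 1D data on [0, 1], assuming the usual (spatial..., channels, batch) layout, so the grid coordinate is appended as one extra channel:

julia> embedding = GridEmbedding([(0.0f0, 1.0f0)]);

julia> ps, st = Lux.setup(Xoshiro(), embedding);

julia> x = rand(Float32, 64, 2, 5);

julia> size(first(embedding(x, ps, st)))
(64, 3, 5)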

NeuralOperators.ComplexDecomposedLayer (Type)
ComplexDecomposedLayer(layer::AbstractLuxLayer)

Decomposes complex activations into real and imaginary parts, applies the given layer to each component separately, and then recombines them into a complex output.
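
A minimal sketch, assuming the wrapped layer (here a Dense, chosen arbitrarily) is applied to the real and imaginary parts as ordinary real-valued inputs:

julia> layer = ComplexDecomposedLayer(Dense(4 => 4));

julia> ps, st = Lux.setup(Xoshiro(), layer);

julia> x = rand(ComplexF32, 4, 5);

julia> size(first(layer(x, ps, st)))
(4, 5)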

NeuralOperators.SoftGating (Type)
SoftGating(chs::Integer, ndims::Integer; kwargs...)

Constructs a wrapper over Scale with dims = (ntuple(Returns(1), ndims)..., chs). All keyword arguments are passed to the Scale constructor.
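
A minimal sketch for 2D data with 4 channels, so the underlying Scale has dims = (1, 1, 4) and learns one weight per channel:

julia> layer = SoftGating(4, 2);

julia> ps, st = Lux.setup(Xoshiro(), layer);

julia> x = rand(Float32, 16, 16, 4, 5);

julia> size(first(layer(x, ps, st)))
(16, 16, 4, 5)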


Transform API

NeuralOperators.AbstractTransform (Type)
AbstractTransform

Interface

  • Base.ndims(<:AbstractTransform): Number of dimensions of the modes
  • transform(<:AbstractTransform, x::AbstractArray): Apply the transform to x
  • truncate_modes(<:AbstractTransform, x_transformed::AbstractArray): Truncate the modes that mainly contribute to noise
  • inverse(<:AbstractTransform, x_transformed::AbstractArray): Apply the inverse transform to x_transformed
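
As a rough standalone illustration of what these interface functions correspond to in the Fourier case (using FFTW directly; this is not the package's implementation):

julia> using FFTW

julia> x = rand(Float32, 64, 2, 5);  # (spatial, channels, batch)

julia> x_hat = rfft(x, 1);           # transform: real FFT along the spatial dimension

julia> x_trunc = x_hat[1:16, :, :];  # truncate_modes: keep the 16 lowest-frequency modes

julia> x_pad = cat(x_trunc, zeros(ComplexF32, size(x_hat, 1) - 16, 2, 5); dims=1);

julia> size(irfft(x_pad, 64, 1))     # inverse: back to physical space
(64, 2, 5)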

NeuralOperators.FourierTransform (Type)
FourierTransform{T}(modes, shift::Bool=false)

A concrete implementation of AbstractTransform for Fourier transforms.

If shift is true, an fftshift is applied before truncating the modes.
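
For example, a shifted transform is constructed following the signature above:

julia> FourierTransform{ComplexF32}((16,), true);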
