APIs

Layers

Operator convolutional layer

\[F(s) = \mathcal{F} \{ v(x) \} \\ F'(s) = g(F(s)) \\ v'(x) = \mathcal{F}^{-1} \{ F'(s) \}\]

where $v(x)$ and $v'(x)$ denote the input and output functions, and $\mathcal{F} \{ \cdot \}$ and $\mathcal{F}^{-1} \{ \cdot \}$ are the Fourier transform and inverse Fourier transform, respectively. The function $g$ is a linear transform applied to the lowest Fourier modes.
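
The three steps above can be sketched in plain Julia, assuming FFTW.jl for the Fourier transforms; `W` and `modes` are hypothetical stand-ins for the learned linear transform $g$ and the number of retained modes:

```julia
# Minimal 1-D sketch of the spectral convolution; W and modes are
# hypothetical stand-ins for the learned transform g and the mode cutoff.
using FFTW, LinearAlgebra

function spectral_conv(v::Vector{Float64}, W::Matrix{ComplexF64}, modes::Int)
    F = fft(v)                        # F(s) = 𝓕{v(x)}
    F′ = zeros(ComplexF64, length(F))
    F′[1:modes] = W * F[1:modes]      # F′(s) = g(F(s)): act only on low modes
    return real(ifft(F′))             # v′(x) = 𝓕⁻¹{F′(s)}
end

v = sin.(range(0, 2π; length = 64))
W = Matrix{ComplexF64}(I, 16, 16)     # identity as a placeholder weight
v′ = spectral_conv(v, W, 16)          # a low-pass-filtered copy of v
```

With an identity `W` the layer reduces to a low-pass filter; training replaces `W` with a learned weight per mode.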

NeuralOperators.OperatorConv — Type
OperatorConv(
    ch, modes, transform;
    init=c_glorot_uniform, permuted=false, T=ComplexF32
)

Arguments

  • ch: Input and output channel size, e.g. 64=>64.
  • modes: The modes to be preserved.
  • transform: The transform used by the layer, e.g. FourierTransform.
  • permuted: Whether the dimensions are permuted. If permuted=true, the layer accepts data in the order (ch, ..., batch); otherwise the order is (..., ch, batch).

Example

julia> OperatorConv(2=>5, (16, ), FourierTransform)
OperatorConv(2 => 5, (16,), FourierTransform, permuted=false)

julia> OperatorConv(2=>5, (16, ), FourierTransform, permuted=true)
OperatorConv(2 => 5, (16,), FourierTransform, permuted=true)

Reference: FNO2021


Operator kernel layer

\[v_{t+1}(x) = \sigma(W v_t(x) + \mathcal{K} \{ v_t(x) \} )\]

where $v_t(x)$ is the input function of the $t$-th layer and $\mathcal{K} \{ \cdot \}$ denotes the spectral convolution layer. The activation function $\sigma$ can be an arbitrary non-linear function.
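
Written schematically in Julia, one kernel-layer update combines a local linear transform with the spectral convolution; `W`, `𝓚`, and `σ` below are hypothetical stand-ins for the learned weight, the spectral convolution, and the activation:

```julia
# One operator-kernel update vₜ₊₁ = σ(W vₜ + 𝓚{vₜ}); all arguments are
# hypothetical stand-ins for the learned components.
operator_kernel(vₜ, W, 𝓚, σ) = σ.(W * vₜ .+ 𝓚(vₜ))

# e.g. with trivial pieces (zero weight, identity convolution):
vₜ = rand(4)
vₜ₊₁ = operator_kernel(vₜ, zeros(4, 4), identity, tanh)
```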

NeuralOperators.OperatorKernel — Type
OperatorKernel(ch, modes, σ=identity; permuted=false)

Arguments

  • ch: Input and output channel size for spectral convolution, e.g. 64=>64.
  • modes: The Fourier modes to be preserved for spectral convolution.
  • σ: Activation function.
  • permuted: Whether the dimensions are permuted. If permuted=true, the layer accepts data in the order (ch, ..., batch); otherwise the order is (..., ch, batch).

Example

julia> OperatorKernel(2=>5, (16, ), FourierTransform)
OperatorKernel(2 => 5, (16,), FourierTransform, σ=identity, permuted=false)

julia> using Flux

julia> OperatorKernel(2=>5, (16, ), FourierTransform, relu)
OperatorKernel(2 => 5, (16,), FourierTransform, σ=relu, permuted=false)

julia> OperatorKernel(2=>5, (16, ), FourierTransform, relu, permuted=true)
OperatorKernel(2 => 5, (16,), FourierTransform, σ=relu, permuted=true)

Reference: FNO2021


Graph kernel layer

\[v_{t+1}(x_i) = \sigma(W v_t(x_i) + \frac{1}{|\mathcal{N}(x_i)|} \sum_{x_j \in \mathcal{N}(x_i)} \kappa \{ v_t(x_i), v_t(x_j) \} )\]

where $v_t(x_i)$ is the input function of the $t$-th layer, $x_i$ is the node feature of the $i$-th node, and $\mathcal{N}(x_i)$ is the set of neighbors of $x_i$. The activation function $\sigma$ can be an arbitrary non-linear function.
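
The update above is a mean-aggregated message-passing step. A plain-Julia sketch, assuming an adjacency list `neighbors[i]` and a pairwise kernel `κ(vᵢ, vⱼ)`; all names are hypothetical stand-ins for the learned components:

```julia
# One graph-kernel update per node: a local linear transform W plus the
# mean of kernel messages from each neighbor. v is a vector of node
# feature vectors; neighbors[i] lists the indices adjacent to node i.
function graph_kernel_step(v, neighbors, W, κ, σ)
    map(eachindex(v)) do i
        msg = sum(κ(v[i], v[j]) for j in neighbors[i]) / length(neighbors[i])
        σ.(W * v[i] .+ msg)
    end
end
```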

NeuralOperators.GraphKernel — Type
GraphKernel(κ, ch, σ=identity)

Graph kernel layer.

Arguments

  • κ: A neural network layer used to approximate the kernel, e.g. a Dense layer or an MLP.
  • ch: Channel size for linear transform, e.g. 32.
  • σ: Activation function.

Reference: NO2020


Models

Fourier neural operator

NeuralOperators.FourierNeuralOperator — Function
FourierNeuralOperator(;
    ch=(2, 64, 64, 64, 64, 64, 128, 1),
    modes=(16, ),
    σ=gelu
)

The Fourier neural operator learns an operator with a Dirichlet kernel, which forms a Fourier transform. It operates across infinite-dimensional function spaces and learns more effectively than the graph-based neural operator.
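
A usage sketch with the defaults above; the input layout (channel, grid, batch) is an assumption inferred from the layer descriptions, not a guaranteed contract:

```julia
# Build the default 1-D FNO and run a forward pass; the (ch, grid, batch)
# input layout is an assumption based on the layer docs above.
using NeuralOperators, Flux

model = FourierNeuralOperator()   # defaults as shown in the signature
x = rand(Float32, 2, 1024, 5)     # 2 input channels, 1024 grid points, batch of 5
y = model(x)                      # maps to 1 output channel per grid point
```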

Reference: FNO2021


Markov neural operator

NeuralOperators.MarkovNeuralOperator — Function
MarkovNeuralOperator(;
    ch=(1, 64, 64, 64, 64, 64, 1),
    modes=(24, 24),
    σ=gelu
)

The Markov neural operator learns a neural operator built from Fourier operator layers. Trained on single time-step transitions only, it can predict the next several steps with low loss by chaining applications of the learned operator into a Markov chain.
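
The Markov-chain rollout amounts to applying the learned one-step operator repeatedly. A sketch, where the (channel, x, y, batch) input layout is an assumption based on the default 2-D `modes=(24, 24)`:

```julia
# Rollout sketch: each application of the model advances the state by
# one time step; the input layout is an assumption, not a guarantee.
using NeuralOperators

model = MarkovNeuralOperator()
u = rand(Float32, 1, 64, 64, 1)   # initial 2-D state, one channel, batch of 1
for _ in 1:5
    u = model(u)                  # uₜ₊₁ = model(uₜ)
end
```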

Reference: MNO2021