API

ModelingToolkitNeuralNets.NeuralNetworkBlock - Function

```julia
NeuralNetworkBlock(; n_input = 1, n_output = 1,
    chain = multi_layer_feed_forward(n_input, n_output),
    rng = Xoshiro(0),
    init_params = Lux.initialparameters(rng, chain),
    eltype = Float64,
    name)
```

Create a neural network component as a `System`.

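As a sketch of how the block might be constructed (the `@named` usage and the exact connector layout are illustrative assumptions based on typical ModelingToolkit component patterns, not part of this docstring):

```julia
using ModelingToolkit, ModelingToolkitNeuralNets
using Random: Xoshiro

# Build a small chain and wrap it as a system component.
# The resulting system is assumed to expose input/output
# connectors that can be hooked into a larger model.
chain = multi_layer_feed_forward(2, 1)
@named nn_block = NeuralNetworkBlock(; n_input = 2, n_output = 1,
    chain, rng = Xoshiro(0))
```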
ModelingToolkitNeuralNets.SymbolicNeuralNetwork - Function

```julia
SymbolicNeuralNetwork(; n_input = 1, n_output = 1,
    chain = multi_layer_feed_forward(n_input, n_output),
    rng = Xoshiro(0),
    init_params = Lux.initialparameters(rng, chain),
    nn_name = :NN,
    nn_p_name = :p,
    eltype = Float64)
```

Create a symbolic parameter representing a neural network and another representing its parameters. Example:

```julia
chain = multi_layer_feed_forward(2, 2)
NN, p = SymbolicNeuralNetwork(; chain, n_input = 2, n_output = 2, rng = StableRNG(42))
```

`NN` and `p` are symbolic parameters that can be used later as part of a system. To change the names of the symbolic variables, use `nn_name` and `nn_p_name`. To get the predictions of the neural network, use

```julia
pred ~ NN(input, p)
```

where `pred` and `input` are symbolic vector variables of lengths `n_output` and `n_input`, respectively.
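A brief sketch of how `NN` and `p` might appear inside a system's equations (the variable names, the `t`/`D` setup, and the coupling into the dynamics are illustrative assumptions, not part of this docstring):

```julia
using ModelingToolkit, ModelingToolkitNeuralNets
using ModelingToolkit: t_nounits as t, D_nounits as D
using StableRNGs: StableRNG

chain = multi_layer_feed_forward(2, 2)
NN, p = SymbolicNeuralNetwork(; chain, n_input = 2, n_output = 2, rng = StableRNG(42))

@variables x(t)[1:2] pred(t)[1:2]
eqs = [pred ~ NN(x, p)      # network prediction as an equation
       D(x) ~ pred]         # feed the prediction into the dynamics
```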

To use this outside of an equation, you can get the default values for the symbols and make a similar call:

```julia
defaults(sys)[sys.NN](input, nn_p)
```

where `sys` is a system (e.g. an `ODESystem`) that contains `NN`, `input` is a vector of length `n_input`, and `nn_p` is a vector of parameter values for the neural network.
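Concretely, the out-of-equation call might look like this (a sketch; the assumption that the parameter symbol is stored as `sys.p` and the input values are illustrative):

```julia
nn = defaults(sys)[sys.NN]     # the callable network stored as a default
nn_p = defaults(sys)[sys.p]    # initial parameter values (assumed symbol name)
input = [0.5, 1.0]             # any vector of length n_input
pred = nn(input, nn_p)
```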

To get the underlying Lux model, you can use `get_network(defaults(sys)[sys.NN])`.

ModelingToolkitNeuralNets.@SymbolicNeuralNetwork - Macro

```julia
@SymbolicNeuralNetwork
```

Macro interface to the `SymbolicNeuralNetwork` function that handles automatic naming of the symbolic variables. It takes a single input, the Lux chain from which to construct the symbolic neural network, and returns the corresponding symbolic parameters.

Example:

```julia
chain = Lux.Chain(
    Lux.Dense(1 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 1, Lux.softplus, use_bias = false)
)
@SymbolicNeuralNetwork NN, p = chain
```

is equivalent to

```julia
chain = Lux.Chain(
    Lux.Dense(1 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 1, Lux.softplus, use_bias = false)
)
NN, p = SymbolicNeuralNetwork(; chain, n_input = 1, n_output = 1, nn_name = :NN, nn_p_name = :p)
```

Here, `@SymbolicNeuralNetwork` takes the neural network chain as its input and:

  1. Automatically infers `nn_name` and `nn_p_name` from the variable names on the left-hand side of the assignment.
  2. Automatically infers `n_input` and `n_output` from the chain structure.

Designating the `rng`. The only other option `@SymbolicNeuralNetwork` currently accepts is a random number generator, which is simply provided as a second input to `@SymbolicNeuralNetwork`:

```julia
chain = Lux.Chain(
    Lux.Dense(1 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 3, Lux.softplus, use_bias = false),
    Lux.Dense(3 => 1, Lux.softplus, use_bias = false)
)
rng = Xoshiro(0)
@SymbolicNeuralNetwork NN, p = chain rng
```

Notes:

  • The first and last layers of the chain must be one of the following types: `Lux.Dense`. For other first-layer types, use the `SymbolicNeuralNetwork` function directly.
  • Types that are intended to be supported in the first layer in future updates include `Lux.Bilinear`, `Lux.RNNCell`, `Lux.LSTMCell`, and `Lux.GRUCell`.

ModelingToolkitNeuralNets.multi_layer_feed_forward - Function

```julia
multi_layer_feed_forward(; n_input, n_output, width::Int = 4,
    depth::Int = 1, activation = tanh, use_bias = true, initial_scaling_factor = 1e-8)
```

Create a Lux.jl `Chain` for use in a `NeuralNetworkBlock`. The weights of the last layer are multiplied by `initial_scaling_factor` to make the initial contribution of the network small, which helps achieve a stable starting point for training.
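For example, a deeper, wider network could be requested as follows (a sketch using the keyword form shown in the signature above; the specific values are illustrative):

```julia
using ModelingToolkitNeuralNets

# 2-input, 1-output network with two hidden layers of width 8.
# The tiny initial_scaling_factor keeps the network's initial
# output close to zero.
chain = multi_layer_feed_forward(; n_input = 2, n_output = 1,
    width = 8, depth = 2, initial_scaling_factor = 1e-8)
```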
