# ESN Layers

## Input Layers

### ReservoirComputing.WeightedLayer (Type)

```julia
WeightedInput(scaling)
WeightedInput(;scaling=0.1)
```

Returns a weighted layer initializer object that will produce a weighted input matrix with random non-zero elements drawn from [-scaling, scaling], as described in [1]. The scaling factor can be given as an arg or kwarg.

[1] Lu, Zhixin, et al. "Reservoir observers: Model-free inference of unmeasured variables in chaotic systems." Chaos: An Interdisciplinary Journal of Nonlinear Science 27.4 (2017): 041102.

### ReservoirComputing.DenseLayer (Type)

```julia
DenseLayer(scaling)
DenseLayer(;scaling=0.1)
```

Returns a fully connected layer initializer object that will produce a weighted input matrix with random non-zero elements drawn from [-scaling, scaling]. The scaling factor can be given as an arg or kwarg. This is the default choice in the ESN construction.

### ReservoirComputing.SparseLayer (Type)

```julia
SparseLayer(scaling, sparsity)
SparseLayer(scaling; sparsity=0.1)
SparseLayer(;scaling=0.1, sparsity=0.1)
```

Returns a sparsely connected layer initializer object that will produce a random sparse input matrix with random non-zero elements drawn from [-scaling, scaling] and the given sparsity. The scaling and sparsity factors can be given as args or kwargs.
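The scaling-based input layers above all share the same underlying idea, which the following standalone sketch illustrates (this is not the package implementation; `sketch_input_matrix` is a made-up name, and whether `sparsity` denotes the fraction of non-zero or zero entries should be checked against the package convention — here it is read as the fraction of non-zero entries):

```julia
# Standalone sketch of a scaled, optionally sparse input matrix:
# entries uniform in [-scaling, scaling], with roughly a fraction
# `sparsity` of the entries kept non-zero.
function sketch_input_matrix(res_size, in_size; scaling=0.1, sparsity=1.0)
    W_in = (rand(res_size, in_size) .* 2 .- 1) .* scaling  # uniform in [-scaling, scaling]
    mask = rand(res_size, in_size) .< sparsity             # keep ~sparsity of the entries
    return W_in .* mask
end

W_in = sketch_input_matrix(300, 3; scaling=0.1, sparsity=0.1)
```

With `sparsity=1.0` this reduces to a dense layer in the spirit of `DenseLayer`.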

### ReservoirComputing.InformedLayer (Type)

```julia
InformedLayer(model_in_size; scaling=0.1, gamma=0.5)
```

Returns a weighted input layer initializer with random non-zero elements drawn from [-scaling, scaling], where a fraction γ of the reservoir nodes is connected exclusively to the raw inputs and the rest to the outputs of the prior knowledge model, as described in [1].

[1] Pathak, Jaideep, et al. "Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model." Chaos: An Interdisciplinary Journal of Nonlinear Science 28.4 (2018): 041101.

### ReservoirComputing.MinimumLayer (Type)

```julia
MinimumLayer(weight, sampling)
MinimumLayer(weight; sampling=BernoulliSample(0.5))
MinimumLayer(;weight=0.1, sampling=BernoulliSample(0.5))
```

Returns a fully connected layer initializer object. Every entry of the matrix constructed with this initializer has the same absolute value, set by the weight factor; the sign of each entry is decided by the sampling struct. Construction is detailed in [1] and [2].

[1] Rodan, Ali, and Peter Tino. "Minimum complexity echo state network." IEEE transactions on neural networks 22.1 (2010): 131-144.

[2] Rodan, Ali, and Peter Tiňo. "Simple deterministically constructed cycle reservoirs with regular jumps." Neural computation 24.7 (2012): 1822-1852.

The signs in the MinimumLayer are chosen according to the following methods:

### ReservoirComputing.BernoulliSample (Type)

```julia
BernoulliSample(p)
BernoulliSample(;p=0.5)
```

Returns a Bernoulli sign constructor for the MinimumLayer call. The factor p determines the probability of drawing a positive sign, as in a Bernoulli distribution. The value can be passed as an arg or kwarg. This sign determination for input layers was introduced in [1].

[1] Rodan, Ali, and Peter Tino. "Minimum complexity echo state network." IEEE transactions on neural networks 22.1 (2010): 131-144.

### ReservoirComputing.IrrationalSample (Type)

```julia
IrrationalSample(irrational, start)
IrrationalSample(;irrational=pi, start=1)
```

Returns an irrational sign constructor for the `MinimumLayer` call. The values can be passed as args or kwargs. The signs of the weights are decided from the decimal expansion of the given irrational: starting from the start-th digit, each digit is thresholded at 4.5, so the n-th sign is + if the corresponding digit is greater than 4.5 and - otherwise.

[1] Rodan, Ali, and Peter Tiňo. "Simple deterministically constructed cycle reservoirs with regular jumps." Neural computation 24.7 (2012): 1822-1852.
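The digit-thresholding idea can be sketched in a few lines of plain Julia. This is a standalone illustration, not the package code: `sketch_irrational_signs` is a made-up name, and whether the package counts the integer-part digit or only the fractional digits may differ from this sketch.

```julia
# Standalone sketch of the IrrationalSample idea: take the decimal
# expansion of an irrational number and threshold each digit at 4.5
# to decide the sign of the corresponding weight.
function sketch_irrational_signs(n; irrational=pi, start=1)
    # Enough bits of precision to cover the requested decimal digits.
    big_val = BigFloat(irrational; precision=64 + 4 * (start + n))
    digits_str = filter(isdigit, string(big_val))
    return [parse(Int, d) > 4.5 ? 1.0 : -1.0 for d in digits_str[start:start + n - 1]]
end

signs = sketch_irrational_signs(8)  # first digits of pi: 3,1,4,1,5,9,2,6
```

For π the first eight signs come out as −, −, −, −, +, +, −, +, since only the digits 5, 9, and 6 exceed the 4.5 threshold.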

To derive the matrix one can call the following function:

### ReservoirComputing.create_layer (Function)

```julia
create_layer(input_layer::AbstractLayer, res_size, in_size)
```

Returns a res_size × in_size matrix, built according to the input_layer initializer.

To create new input layers, it suffices to define a new struct containing the needed parameters of the new input layer. This struct will need to subtype AbstractLayer, so that the create_layer function can be dispatched over it. The workflow should follow this snippet:

```julia
#creation of the new struct for the layer
struct MyNewLayer <: AbstractLayer
    #the layer params go here
end

#dispatch over the function to actually build the layer matrix
function create_layer(input_layer::MyNewLayer, res_size, in_size)
    #the new algorithm to build the input layer goes here
end
```
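A filled-in version of this workflow might look as follows. In real use, `AbstractLayer` and `create_layer` come from ReservoirComputing; minimal stand-ins are defined here so the snippet runs on its own, and `ConstantLayer` is an invented layer chosen purely for illustration.

```julia
# Self-contained sketch of the custom-layer workflow. Stand-ins for the
# package's AbstractLayer and create_layer, so the snippet is runnable alone.
abstract type AbstractLayer end
function create_layer end

# The new layer: a constant-valued input matrix (illustrative only).
struct ConstantLayer <: AbstractLayer
    value::Float64
end

function create_layer(input_layer::ConstantLayer, res_size, in_size)
    return fill(input_layer.value, res_size, in_size)
end

W_in = create_layer(ConstantLayer(0.05), 100, 3)
```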

## Reservoirs

### ReservoirComputing.RandSparseReservoir (Type)

```julia
RandSparseReservoir(res_size, radius, sparsity)
RandSparseReservoir(res_size; radius=1.0, sparsity=0.1)
```

Returns a random sparse reservoir initializer that will produce a matrix with the given sparsity and spectral radius scaled according to radius. This is the default choice in the ESN construction.
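The key step is rescaling a random sparse matrix so its spectral radius matches the requested radius. The following is a standalone sketch of that idea (not the package implementation; `sketch_rand_sparse_reservoir` is a made-up name):

```julia
# Standalone sketch of a random sparse reservoir: draw a sparse random
# matrix, then rescale it so its spectral radius equals `radius`.
using LinearAlgebra

function sketch_rand_sparse_reservoir(res_size; radius=1.0, sparsity=0.1)
    W = (rand(res_size, res_size) .* 2 .- 1) .* (rand(res_size, res_size) .< sparsity)
    rho = maximum(abs.(eigvals(W)))  # current spectral radius
    return W .* (radius / rho)
end

W = sketch_rand_sparse_reservoir(100; radius=1.2, sparsity=0.1)
```

After rescaling, `maximum(abs.(eigvals(W)))` equals the requested radius up to floating-point error.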

### ReservoirComputing.PseudoSVDReservoir (Type)

```julia
PseudoSVDReservoir(max_value, sparsity, sorted, reverse_sort)
PseudoSVDReservoir(max_value, sparsity; sorted=true, reverse_sort=false)
```

Returns an initializer to build a sparse reservoir matrix with the given sparsity, created using SVD as described in [1].

[1] Yang, Cuili, et al. "Design of polynomial echo state networks for time series prediction." Neurocomputing 290 (2018): 148-160.

### ReservoirComputing.DelayLineReservoir (Type)

```julia
DelayLineReservoir(res_size, weight)
DelayLineReservoir(res_size; weight=0.1)
```

Returns a Delay Line Reservoir matrix constructor to obtain a deterministic reservoir as described in [1]. The weight can be passed as an arg or kwarg; it determines the absolute value of all the connections in the reservoir.

[1] Rodan, Ali, and Peter Tino. "Minimum complexity echo state network." IEEE transactions on neural networks 22.1 (2010): 131-144.
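Structurally, a delay line reservoir is just a subdiagonal matrix: each unit feeds only the next one, all with the same weight. A standalone sketch (not the package code; `sketch_delay_line` is a made-up name):

```julia
# Standalone sketch of a delay line reservoir: the only connections are
# on the subdiagonal, all with the same absolute weight.
function sketch_delay_line(res_size; weight=0.1)
    W = zeros(res_size, res_size)
    for i in 1:res_size-1
        W[i + 1, i] = weight  # unit i feeds unit i+1
    end
    return W
end

W = sketch_delay_line(5; weight=0.1)
```

The backward variant below adds a superdiagonal with `fb_weight` on top of this structure.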

### ReservoirComputing.DelayLineBackwardReservoir (Type)

```julia
DelayLineBackwardReservoir(res_size, weight, fb_weight)
DelayLineBackwardReservoir(res_size; weight=0.1, fb_weight=0.2)
```

Returns a Delay Line Reservoir constructor to create a matrix with backward connections, as described in [1]. The weight and fb_weight can be passed as either args or kwargs; they determine the absolute values of the forward and feedback connections in the reservoir.

[1] Rodan, Ali, and Peter Tino. "Minimum complexity echo state network." IEEE transactions on neural networks 22.1 (2010): 131-144.

### ReservoirComputing.SimpleCycleReservoir (Type)

```julia
SimpleCycleReservoir(res_size, weight)
SimpleCycleReservoir(res_size; weight=0.1)
```

Returns a Simple Cycle Reservoir constructor to build a reservoir matrix as described in [1]. The weight can be passed as an arg or kwarg; it determines the absolute value of all the connections in the reservoir.

[1] Rodan, Ali, and Peter Tino. "Minimum complexity echo state network." IEEE transactions on neural networks 22.1 (2010): 131-144.

### ReservoirComputing.CycleJumpsReservoir (Type)

```julia
CycleJumpsReservoir(res_size, cycle_weight, jump_weight, jump_size)
CycleJumpsReservoir(res_size; cycle_weight=0.1, jump_weight=0.1, jump_size=3)
```

Returns a Cycle Reservoir with Jumps constructor to create a reservoir matrix as described in [1]. The cycle_weight and jump_weight can be passed as args or kwargs; they determine the absolute values of all the connections in the reservoir. The jump_size can also be passed as either an arg or kwarg; it determines the distance between jump connections.

[1] Rodan, Ali, and Peter Tiňo. "Simple deterministically constructed cycle reservoirs with regular jumps." Neural computation 24.7 (2012): 1822-1852.
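The cycle-with-jumps structure can be sketched as a simple cycle plus bidirectional shortcut connections every `jump_size` units. This is a standalone illustration (not the package code; `sketch_cycle_jumps` is a made-up name, and the exact placement of the jumps in the package and in [1] may differ in detail):

```julia
# Standalone sketch of a cycle reservoir with jumps: a simple cycle with
# weight `cycle_weight`, plus bidirectional "jump" connections every
# `jump_size` units with weight `jump_weight`.
function sketch_cycle_jumps(res_size; cycle_weight=0.1, jump_weight=0.1, jump_size=3)
    W = zeros(res_size, res_size)
    for i in 1:res_size
        W[mod1(i + 1, res_size), i] = cycle_weight  # the main cycle
    end
    for i in 1:jump_size:res_size
        j = mod1(i + jump_size, res_size)
        W[j, i] = jump_weight  # jump forward
        W[i, j] = jump_weight  # and back
    end
    return W
end

W = sketch_cycle_jumps(9; cycle_weight=0.1, jump_weight=0.05, jump_size=3)
```

With `jump_weight=0` this reduces to the structure of `SimpleCycleReservoir`.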

Like for the input layers, to actually build the matrix of the reservoir one can call the following function:

### ReservoirComputing.create_reservoir (Function)

```julia
create_reservoir(reservoir::AbstractReservoir, res_size)
create_reservoir(reservoir, args...)
```

Given an AbstractReservoir constructor and the reservoir size, it returns the corresponding matrix. Alternatively, it accepts an already built matrix.

To create a new reservoir, the procedure is similar to the one for the input layers. First, a new struct of type AbstractReservoir holding the reservoir parameters needs to be defined. Then, dispatching over the create_reservoir function makes the model actually build the reservoir matrix. An example of the workflow is given in the following snippet:

```julia
#creation of the new struct for the reservoir
struct MyNewReservoir <: AbstractReservoir
    #the reservoir params go here
end

#dispatch over the function to build the reservoir matrix
function create_reservoir(reservoir::MyNewReservoir, res_size)
    #the new algorithm to build the reservoir matrix goes here
end
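A filled-in version of this workflow might look as follows. As with the input layer example, `AbstractReservoir` and `create_reservoir` come from ReservoirComputing in real use; stand-ins are defined locally so the snippet runs on its own, and `ScaledIdentityReservoir` is an invented reservoir chosen purely for illustration.

```julia
# Self-contained sketch of the custom-reservoir workflow. Stand-ins for
# the package's AbstractReservoir and create_reservoir, so it runs alone.
abstract type AbstractReservoir end
function create_reservoir end

# Illustrative reservoir: a scaled identity matrix.
struct ScaledIdentityReservoir <: AbstractReservoir
    scale::Float64
end

function create_reservoir(reservoir::ScaledIdentityReservoir, res_size)
    W = zeros(res_size, res_size)
    for i in 1:res_size
        W[i, i] = reservoir.scale
    end
    return W
end

W = create_reservoir(ScaledIdentityReservoir(0.9), 50)
```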