States Modifications

ReservoirComputing.Pad — Type
Pad(padding=1.0)

Padding layer that appends a constant value to the state (and hence to the layer output).

\[\tilde{x} = \begin{bmatrix} x \\ \text{padding} \end{bmatrix}\]

Arguments

  • padding: value to append. Default is 1.0.

Forward

pad(state)

Arguments

  • state: The reservoir computing state.

Returns

  • A vector or matrix with the chosen padding appended as a final element (vectors) or final row (matrices), increasing the first dimension by one.

Examples

julia> pad = Pad(1.0)
(::Pad{Float64}) (generic function with 2 methods)

julia> x_old = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
10-element Vector{Int64}:
 0
 1
 2
 3
 4
 5
 6
 7
 8
 9

julia> x_new = pad(x_old)
11-element Vector{Int64}:
 0
 1
 2
 3
 4
 5
 6
 7
 8
 9
 1

julia> mat_old = [1  2  3;
                   4  5  6;
                   7  8  9;
                  10 11 12;
                  13 14 15;
                  16 17 18;
                  19 20 21]
7×3 Matrix{Int64}:
  1   2   3
  4   5   6
  7   8   9
 10  11  12
 13  14  15
 16  17  18
 19  20  21

julia> mat_new = pad(mat_old)
8×3 Matrix{Int64}:
  1   2   3
  4   5   6
  7   8   9
 10  11  12
 13  14  15
 16  17  18
 19  20  21
  1   1   1
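The behavior shown above can be sketched in plain base Julia; `pad_state` is a hypothetical helper for illustration, not part of the package API:

```julia
# Hypothetical base-Julia equivalent of Pad: append the padding value
# to a vector, or a full row of padding to a matrix.
pad_state(x::AbstractVector, padding) = vcat(x, padding)
pad_state(x::AbstractMatrix, padding) = vcat(x, fill(padding, 1, size(x, 2)))

pad_state([0, 1, 2], 1)        # [0, 1, 2, 1]
pad_state([1 2 3; 4 5 6], 1)   # appends the row [1 1 1]
```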
ReservoirComputing.Extend — Type
Extend(op)

Wrapper layer that concatenates the reservoir state produced by op with the input that Extend receives.

For an input vector or matrix x and a wrapped layer producing state s, Extend computes:

\[\begin{bmatrix} x \\ s \end{bmatrix}\]

Arguments

  • op: the wrapped layer whose output state will be concatenated with the input.

Examples

esn = ReservoirChain(
    Extend(
        StatefulLayer(
            ESNCell(3 => 300;
                init_reservoir = rand_sparse(; radius = 1.2, sparsity = 6 / 300))
        )
    ),
    NLAT2(),
    LinearReadout(300 + 3 => 3)
)

In this example the input to Extend is the initial value fed to ReservoirChain. After Extend, the value flowing through the chain is the state returned by the StatefulLayer concatenated, via vcat, with that input.
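The concatenation itself amounts to a vcat of the input and the wrapped-layer state; a minimal sketch with made-up values (the names x and s are illustrative, not package API):

```julia
x = [1.0, 2.0, 3.0]          # input arriving at Extend
s = [0.5, -0.2, 0.9, 0.1]    # state returned by the wrapped layer (made up)
extended = vcat(x, s)        # value passed down the chain; length 3 + 4 = 7
```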

ReservoirComputing.NLAT1 — Function
NLAT1()

NLAT1 implements the T₁ transformation algorithm introduced in (Chattopadhyay et al., 2020) and (Pathak et al., 2017). The T₁ algorithm squares the elements of every odd-indexed row of the input array, leaving even-indexed rows unchanged.

\[\tilde{r}_{i,j} = \begin{cases} r_{i,j} \times r_{i,j}, & \text{if } i \text{ is odd}; \\ r_{i,j}, & \text{if } i \text{ is even}. \end{cases}\]

Arguments

None

Forward

nlat1(state)

Arguments

  • state: The reservoir computing state.

Returns

  • A vector or matrix with elements transformed according to NLAT1, with the same dimensions as the original.

Example

julia> nlat1 = NLAT1()
NLAT1 (generic function with 3 methods)

julia> x_old = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
10-element Vector{Int64}:
 0
 1
 2
 3
 4
 5
 6
 7
 8
 9

julia> n_new = nlat1(x_old)
10-element Vector{Int64}:
  0
  1
  4
  3
 16
  5
 36
  7
 64
  9

julia> mat_old = [1  2  3;
                   4  5  6;
                   7  8  9;
                  10 11 12;
                  13 14 15;
                  16 17 18;
                  19 20 21]
7×3 Matrix{Int64}:
  1   2   3
  4   5   6
  7   8   9
 10  11  12
 13  14  15
 16  17  18
 19  20  21

julia> mat_new = nlat1(mat_old)
7×3 Matrix{Int64}:
   1    4    9
   4    5    6
  49   64   81
  10   11   12
 169  196  225
  16   17   18
 361  400  441
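The transform can be reproduced in base Julia; `t1` below is an illustrative reimplementation, not the package function:

```julia
# Square every odd-indexed row; even-indexed rows pass through unchanged.
# Works for vectors too, since size(y, 2) == 1 in that case.
function t1(x::AbstractVecOrMat)
    y = copy(x)
    for i in 1:2:size(y, 1), j in 1:size(y, 2)
        y[i, j] = y[i, j]^2
    end
    return y
end

t1([0, 1, 2, 3, 4])   # [0, 1, 4, 3, 16]
```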
ReservoirComputing.NLAT2 — Function
NLAT2()

NLAT2 implements the T₂ transformation algorithm as defined in (Chattopadhyay et al., 2020). This transformation modifies the reservoir states by replacing each odd-indexed row after the first with the product of its two preceding rows.

\[\tilde{r}_{i,j} = \begin{cases} r_{i-1,j} \times r_{i-2,j}, & \text{if } i > 1 \text{ is odd}; \\ r_{i,j}, & \text{if } i = 1 \text{ or } i \text{ is even}. \end{cases}\]

Arguments

None

Forward

nlat2(state)

Arguments

  • state: The reservoir computing state.

Returns

  • A vector or matrix with elements transformed according to NLAT2, with the same dimensions as the original.

Example

julia> nlat2 = NLAT2()
NLAT2 (generic function with 3 methods)

julia> x_old = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
10-element Vector{Int64}:
 0
 1
 2
 3
 4
 5
 6
 7
 8
 9

julia> n_new = nlat2(x_old)
10-element Vector{Int64}:
  0
  1
  0
  3
  6
  5
 20
  7
 42
  9

julia> mat_old = [1  2  3;
                   4  5  6;
                   7  8  9;
                  10 11 12;
                  13 14 15;
                  16 17 18;
                  19 20 21]
7×3 Matrix{Int64}:
  1   2   3
  4   5   6
  7   8   9
 10  11  12
 13  14  15
 16  17  18
 19  20  21

julia> mat_new = nlat2(mat_old)
7×3 Matrix{Int64}:
  1   2    3
  4   5    6
  4  10   18
 10  11   12
 70  88  108
 16  17   18
 19  20   21
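A base-Julia sketch of the same transform; `t2` is an illustrative reimplementation, not the package function. The loop's upper bound mirrors the matrix example above, where the final row is left unchanged even though it is odd-indexed:

```julia
# Replace each odd-indexed row after the first with the product of the
# two preceding rows, reading from the original (untransformed) state.
function t2(x::AbstractVecOrMat)
    y = copy(x)
    for i in 3:2:(size(y, 1) - 1), j in 1:size(y, 2)
        y[i, j] = x[i - 1, j] * x[i - 2, j]
    end
    return y
end

t2([0, 1, 2, 3, 4, 5])   # [0, 1, 0, 3, 6, 5]
```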
ReservoirComputing.NLAT3 — Function
NLAT3()

Implements the T₃ transformation algorithm as detailed in (Chattopadhyay et al., 2020). This algorithm modifies the reservoir states by replacing each odd-indexed row after the first with the product of the immediately preceding and immediately following rows. A final odd-indexed row with no following row is left unchanged.

\[\tilde{r}_{i,j} = \begin{cases} r_{i-1,j} \times r_{i+1,j}, & \text{if } i > 1 \text{ is odd}; \\ r_{i,j}, & \text{if } i = 1 \text{ or } i \text{ is even}. \end{cases}\]

Arguments

None

Forward

nlat3(state)

Arguments

  • state: The reservoir computing state.

Returns

  • A vector or matrix with elements transformed according to NLAT3, with the same dimensions as the original.

Example

julia> nlat3 = NLAT3()
NLAT3 (generic function with 3 methods)

julia> x_old = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
10-element Vector{Int64}:
 0
 1
 2
 3
 4
 5
 6
 7
 8
 9

julia> n_new = nlat3(x_old)
10-element Vector{Int64}:
  0
  1
  3
  3
 15
  5
 35
  7
 63
  9

julia> mat_old = [1  2  3;
                   4  5  6;
                   7  8  9;
                  10 11 12;
                  13 14 15;
                  16 17 18;
                  19 20 21]
7×3 Matrix{Int64}:
  1   2   3
  4   5   6
  7   8   9
 10  11  12
 13  14  15
 16  17  18
 19  20  21

julia> mat_new = nlat3(mat_old)
7×3 Matrix{Int64}:
   1    2    3
   4    5    6
  40   55   72
  10   11   12
 160  187  216
  16   17   18
  19   20   21
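A base-Julia sketch of the same transform; `t3` is an illustrative reimplementation, not the package function:

```julia
# Replace each odd-indexed row after the first with the product of its
# two neighbouring rows; a final odd row with no successor is kept as is.
function t3(x::AbstractVecOrMat)
    y = copy(x)
    for i in 3:2:(size(y, 1) - 1), j in 1:size(y, 2)
        y[i, j] = x[i - 1, j] * x[i + 1, j]
    end
    return y
end

t3([0, 1, 2, 3, 4, 5])   # [0, 1, 3, 3, 15, 5]
```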
ReservoirComputing.PartialSquare — Type
PartialSquare(eta)

Implements a partial squaring of the states as described in (Barbosa et al., 2021).

\[ \begin{equation} g(r_i) = \begin{cases} r_i^2, & \text{if } i \leq \eta_r N, \\ r_i, & \text{if } i > \eta_r N. \end{cases} \end{equation}\]

Arguments

  • eta: fraction (between 0 and 1) of the state elements to square; η_r in the equation above.

Forward

partialsq(state)

Arguments

  • state: The reservoir computing state.

Returns

  • A vector or matrix with the leading η_r fraction of components squared, with the same dimensions as the original.

Example

julia> partialsq = PartialSquare(0.6)
PartialSquare(0.6)

julia> x_old = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
10-element Vector{Int64}:
 0
 1
 2
 3
 4
 5
 6
 7
 8
 9

julia> x_new = partialsq(x_old)
10-element Vector{Int64}:
  0
  1
  4
  9
 16
 25
  6
  7
  8
  9
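A base-Julia sketch of the same operation; `partial_square` is illustrative, and the cutoff index is assumed to be ⌊η·N⌋ (the example above, with η = 0.6 and N = 10, does not distinguish floor from round):

```julia
# Square the first floor(eta * N) elements (assumed cutoff rule);
# the remaining elements pass through unchanged.
function partial_square(x::AbstractVector, eta)
    cutoff = floor(Int, eta * length(x))
    y = copy(x)
    y[1:cutoff] .= y[1:cutoff] .^ 2
    return y
end

partial_square(collect(0:9), 0.6)   # [0, 1, 4, 9, 16, 25, 6, 7, 8, 9]
```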
ReservoirComputing.ExtendedSquare — Function
ExtendedSquare()

Extension of the Lu initialization proposed in (Herteux and Räth, 2020). The state vector is extended by appending the square of each of its elements.

\[\begin{equation} \vec{x} = \{x_1, x_2, \dots, x_N, x_1^2, x_2^2, \dots, x_N^2\} \end{equation}\]

Arguments

None

Forward

extendedsq(state)

Arguments

  • state: The reservoir computing state.

Returns

  • A vector or matrix with the original elements concatenated with their squares. The first dimension is twice that of the original.

Example

julia> extendedsq = ExtendedSquare()
ExtendedSquare()

julia> x_old = [1, 2, 3, 4, 5, 6, 7, 8, 9]
9-element Vector{Int64}:
 1
 2
 3
 4
 5
 6
 7
 8
 9

julia> x_new = extendedsq(x_old)
18-element Vector{Int64}:
  1
  2
  3
  4
  5
  6
  7
  8
  9
  1
  4
  9
 16
 25
 36
 49
 64
 81
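The transform is a one-line vcat in base Julia; `extended_square` is an illustrative name, not the package function:

```julia
# Append the elementwise squares below the original state; works for
# vectors and for matrices (squares are stacked as extra rows).
extended_square(x) = vcat(x, x .^ 2)

extended_square([1, 2, 3])   # [1, 2, 3, 1, 4, 9]
```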

References