Training Algorithms

Linear Models

ReservoirComputing.StandardRidgeType
StandardRidge(regularization_coeff)
StandardRidge(;regularization_coeff=0.0)

Ridge regression training for all the models in the library. `regularization_coeff` sets the regularization strength and can be passed either as a positional argument or as a keyword argument.
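As a minimal sketch of what ridge regression computes here (pure Julia, not library code): the readout is the closed-form Tikhonov solution over the collected reservoir states. Matrix shapes and the regularization value below are illustrative.

```julia
using LinearAlgebra

# Closed-form ridge solution W = Y Xᵀ (X Xᵀ + λI)⁻¹,
# the computation a StandardRidge(λ) training step relies on.
X = rand(50, 200)   # reservoir states: features × timesteps
Y = rand(3, 200)    # targets: outputs × timesteps
λ = 1e-6            # regularization_coeff
W = (Y * X') / (X * X' + λ * I)
@assert size(W) == (3, 50)
```

With `λ = 0` this reduces to ordinary least squares, matching the default `regularization_coeff=0.0`.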

source
ReservoirComputing.LinearModelType
LinearModel(;regression=LinearRegression, 
    solver=Analytical(), 
    regression_kwargs=(;))

Linear regression training based on MLJLinearModels for all the models in the library. All parameters except the solver choice have to be passed through `regression_kwargs`. MLJLinearModels.jl needs to be loaded in order to use these models.
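A hedged sketch of constructing this trainer, assuming MLJLinearModels.jl is installed and loaded; the regression type, solver, and `lambda` value are illustrative choices, not defaults of the library:

```julia
using MLJLinearModels  # must be loaded before using LinearModel

# Lasso regression solved with proximal gradient descent;
# everything but the solver goes through regression_kwargs.
training_method = LinearModel(; regression = LassoRegression,
                              solver = MLJLinearModels.ProxGrad(),
                              regression_kwargs = (; lambda = 1e-3))
```

Any regression type and solver exposed by MLJLinearModels can be substituted in the same way.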

source

Gaussian Regression

ReservoirComputing.GaussianProcessType
GaussianProcess(mean, kernel;
    lognoise=-2, 
    optimize=false,
    optimizer=Optim.LBFGS())

Wrapper around GaussianProcesses that makes it possible to train every model in the library using Gaussian regression. GaussianProcesses.jl needs to be loaded in order to use these models. The use of Gaussian regression for ESNs was first explored in [1].

[1] Chatzis, Sotirios P., and Yiannis Demiris. "Echo state Gaussian process." IEEE Transactions on Neural Networks 22.9 (2011): 1435-1445.
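A hedged construction sketch, assuming GaussianProcesses.jl and Optim.jl are installed and loaded; the zero mean and squared-exponential kernel are illustrative choices:

```julia
using GaussianProcesses, Optim  # must be loaded before using GaussianProcess

# Zero-mean GP with a squared-exponential kernel; hyperparameters
# are tuned with LBFGS when optimize=true.
training_method = GaussianProcess(MeanZero(), SE(0.0, 0.0);
                                  lognoise = -2,
                                  optimize = true,
                                  optimizer = Optim.LBFGS())
```

Any mean and kernel provided by GaussianProcesses.jl can be passed in the same way.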

source

Support Vector Regression

Support Vector Regression is possible through a direct call to the LIBSVM regression methods. There is no dedicated wrapper: please refer to the use of `LIBSVM.AbstractSVR` in the original library.
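As a sketch, assuming LIBSVM.jl is installed and loaded, any `LIBSVM.AbstractSVR` subtype can be constructed directly and passed as the training method; the hyperparameter values below are illustrative:

```julia
using LIBSVM  # the SVR types come directly from LIBSVM.jl

# Epsilon-SVR with an RBF kernel (LIBSVM's default kernel).
training_method = LIBSVM.EpsilonSVR(cost = 10.0, gamma = 0.1)
# training_method = LIBSVM.NuSVR()  # alternative AbstractSVR subtype
```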