ReservoirPy Nodes (reservoirpy.nodes)#

Reservoirs#

Reservoir([units, lr, sr, input_bias, ...])

Pool of leaky-integrator neurons with random recurrent connections.

NVAR(delay, order[, strides])

Non-linear Vector AutoRegressive machine.

IPReservoir([units, sr, lr, mu, sigma, ...])

Pool of neurons with random recurrent connections, tuned using Intrinsic Plasticity.
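For example, a minimal sketch of building a reservoir and running it on a toy sequence (parameter values are illustrative, not recommended defaults):

    import numpy as np
    from reservoirpy.nodes import Reservoir

    # 100 leaky-integrator units, leak rate 0.3, spectral radius 0.9
    reservoir = Reservoir(units=100, lr=0.3, sr=0.9)

    # toy univariate time series, shape (timesteps, features)
    X = np.sin(np.linspace(0, 6 * np.pi, 200)).reshape(-1, 1)

    states = reservoir.run(X)   # one 100-dimensional state per timestep
    print(states.shape)         # (200, 100)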

Offline readouts#

Ridge([output_dim, ridge, Wout, bias, ...])

A single layer of neurons whose connections are learned offline with Tikhonov (ridge) linear regression.

ScikitLearnNode(model[, model_hypers, ...])

A node interfacing a scikit-learn linear model that can be used as an offline readout node.
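A sketch of the offline workflow, chaining a reservoir into a Ridge readout for one-step-ahead prediction (data and hyperparameters are illustrative):

    import numpy as np
    from reservoirpy.nodes import Reservoir, Ridge

    X = np.sin(np.linspace(0, 12 * np.pi, 500)).reshape(-1, 1)

    reservoir = Reservoir(units=100, lr=0.3, sr=0.9)
    readout = Ridge(ridge=1e-6)               # Tikhonov regularization strength

    esn = reservoir >> readout                # connect nodes into a Model
    esn = esn.fit(X[:-1], X[1:], warmup=10)   # one-shot (offline) training
    y_pred = esn.run(X[:-1])                  # one-step-ahead predictions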

Online readouts#

LMS([output_dim, alpha, Wout, bias, ...])

Single layer of neurons learning connections using the Least Mean Squares (LMS) algorithm.

RLS([output_dim, alpha, Wout, bias, ...])

Single layer of neurons learning connections using the Recursive Least Squares (RLS) algorithm.

FORCE([output_dim, alpha, rule, Wout, bias, ...])

Single layer of neurons learning connections through online learning rules.
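A sketch of the online workflow, assuming the train method is used for step-by-step weight updates (hyperparameters are illustrative):

    import numpy as np
    from reservoirpy.nodes import Reservoir, RLS

    X = np.sin(np.linspace(0, 12 * np.pi, 500)).reshape(-1, 1)

    reservoir = Reservoir(units=100, lr=0.3, sr=0.9)
    readout = RLS()                  # recursive least squares readout

    esn = reservoir >> readout
    esn.train(X[:-1], X[1:])         # weights updated timestep by timestep
    y_pred = esn.run(X[:-1])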

Optimized ESN#

ESN([reservoir_method, learning_method, ...])

Echo State Network as a single Node, with parallelized state updates.
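A sketch of building the optimized node; the keyword routing below (units/sr/lr for the reservoir, ridge for the readout, workers for parallel workers) is an assumption, so check the class reference for the exact signature:

    from reservoirpy.nodes import ESN

    # assumed parameter names; workers=-1 is taken to mean "all CPU cores"
    esn = ESN(units=500, sr=0.9, lr=0.3, ridge=1e-6, workers=-1)
    # then esn.fit(X_train, Y_train) and esn.run(X_test), as with any Model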

Activation functions#

Tanh(**kwargs)

Hyperbolic tangent activation function.

Sigmoid(**kwargs)

Sigmoid activation function.

Softmax([beta])

Softmax activation function.

Softplus(**kwargs)

Softplus activation function.

ReLU(**kwargs)

ReLU activation function.

Identity(**kwargs)

Identity function.
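Activation nodes apply an elementwise function and compose with other nodes like any other; a minimal sketch on a single timestep:

    import numpy as np
    from reservoirpy.nodes import Tanh

    tanh = Tanh()
    y = tanh(np.array([[-2.0, 0.0, 2.0]]))   # elementwise hyperbolic tangent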

Input and Output#

Input([input_dim, name])

Node feeding input data to other nodes in a model.

Output([name])

Convenience node which can be used to add an output to a model.
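A minimal sketch where Input and Output mark the entry and exit points of a model (hyperparameters are illustrative):

    from reservoirpy.nodes import Input, Output, Reservoir, Ridge

    model = Input() >> Reservoir(100, sr=0.9) >> Ridge(ridge=1e-6) >> Output()
    # model.fit(X, Y) and model.run(X) as usual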

Operators#

Concat([axis, name])

Concatenate vectors of data along the feature axis.

Delay([delay, initial_values, input_dim, dtype])

Delays the data transmitted through this node by a fixed number of timesteps, without transformation.
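A sketch of Concat merging two branches so the readout sees both the raw input and the reservoir states; the list-to-node composition syntax is assumed from the Model API, and the Delay line only illustrates construction:

    from reservoirpy.nodes import Input, Reservoir, Concat, Ridge, Delay

    data = Input()
    reservoir = Reservoir(100, sr=0.9)
    readout = Ridge(ridge=1e-6)

    # the readout receives [input features ; reservoir states]
    model = [data, data >> reservoir] >> Concat() >> readout

    delayed = Delay(delay=3)   # outputs the input received 3 timesteps earlier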