ReservoirCogs: Advanced Reservoir Computing with OpenCog AtomSpace Integration
A flexible library for Reservoir Computing architectures such as Echo State Networks (ESNs), with deep symbolic AI integration through OpenCog AtomSpace.
Tutorials: Open in Colab
Documentation: Getting Started | User Guide | AtomSpace Integration
C++ API: High-performance reservoir computing with OpenCog AtomSpace integration
Project Board: Development Roadmap | GitHub Project
Tip
Exciting news! We just launched a new beta tool based on a Large Language Model! You can chat with ReservoirChat and ask anything about Reservoir Computing and ReservoirPy! Don't miss out, it's available for a limited time!
https://chat.reservoirpy.inria.fr
NEW: ReservoirChat Playground - Mindbendingly Amazing Interactive Experience!
ReservoirCogs now includes deep integration with OpenCog AtomSpace for symbolic AI capabilities:
- Symbolic Representation: Store reservoir states and dynamics as AtomSpace knowledge
- Temporal Logic: Reason about sequences and temporal patterns
- Knowledge Extraction: Extract learned patterns as symbolic concepts
- High Performance: C++ implementation for production systems
- Hybrid AI: Combine neural reservoir computing with symbolic reasoning
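The symbolic-representation idea above can be illustrated with a tiny, library-agnostic sketch: discretize a reservoir state vector into symbolic facts that a system like AtomSpace could then reason over. The function name `state_to_symbols` and the thresholds are our own illustration, not part of the ReservoirCogs API:

```python
# Illustrative sketch only: mapping continuous reservoir activations to
# discrete symbolic "facts" is the kind of bridge AtomSpace integration builds on.

def state_to_symbols(state, high=0.5, low=-0.5):
    """Map each reservoir unit's activation to a discrete symbol."""
    symbols = []
    for i, x in enumerate(state):
        if x > high:
            symbols.append(("unit", i, "HIGH"))
        elif x < low:
            symbols.append(("unit", i, "LOW"))
    return symbols

state = [0.9, -0.7, 0.1, 0.6]
print(state_to_symbols(state))
# [('unit', 0, 'HIGH'), ('unit', 1, 'LOW'), ('unit', 3, 'HIGH')]
```

Each resulting triple could then be asserted as a predicate in a knowledge base and combined with temporal logic over successive states.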
Feature overview:
- Easy creation of complex architectures with multiple reservoirs (e.g. deep reservoirs) and readouts
- Feedback loops and advanced temporal processing
- OpenCog AtomSpace integration for symbolic AI and knowledge representation
- Differential Emotion Theory Framework for affective computing and emotionally aware AI
- High-performance C++ implementation alongside Python compatibility
- Intrinsic plasticity for adaptive reservoir dynamics
- Online learning for real-time applications
- Evolutionary optimization of reservoir topologies
- Symbolic pattern recognition and temporal logic reasoning
- Hybrid neural-symbolic architectures
- Offline and online training
- Parallel implementation
- Sparse matrix computation
- Advanced learning rules (e.g. Intrinsic Plasticity, Local Plasticity or NVAR (Next-Generation RC))
- Interfacing with scikit-learn models
- And many more!
Moreover, graphical tools are included to easily explore hyperparameters with the help of the hyperopt library.
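As background for the hyperparameters used throughout (leak rate `lr`, spectral radius `sr`), here is a minimal numpy sketch of the standard leaky-integrator ESN update they control. This illustrates the textbook equation, not the library's internal implementation:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
units, lr, sr = 100, 0.3, 0.9

# Random recurrent weights, rescaled so the spectral radius equals sr
W = rng.normal(size=(units, units))
W *= sr / np.max(np.abs(np.linalg.eigvals(W)))
Win = rng.normal(size=(units, 1))  # input weights for a 1-D input

def update(x, u):
    # Leaky-integrator update: blend the previous state (weight 1 - lr)
    # with the new nonlinear activation (weight lr)
    return (1 - lr) * x + lr * np.tanh(W @ x + Win @ u)

x = np.zeros(units)
for u in np.sin(np.arange(50)):
    x = update(x, np.array([u]))
```

A spectral radius below 1 and a small leak rate both slow the fading of past inputs, which is what gives the reservoir its short-term memory.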
```shell
pip install reservoirpy
```

To build the C++ library with OpenCog AtomSpace integration:

```shell
# Install OpenCog dependencies
sudo apt-get install opencog-dev

# Clone and build ReservoirCogs
git clone https://github.com/HyperCogWizard/reservoircogs.git
cd reservoircogs
mkdir build && cd build
cmake ..
make -j4
sudo make install
```

Python quickstart:

```python
import reservoirpy as rpy
from reservoirpy.nodes import Reservoir, Ridge

# Create a simple ESN
reservoir = Reservoir(100, lr=0.3, sr=0.9)
readout = Ridge(ridge=1e-6)

# Connect and train
model = reservoir >> readout
model.fit(X_train, y_train)
predictions = model.run(X_test)
```

C++ quickstart:

```cpp
#include <opencog/reservoir/nodes/ReservoirNode.h>

// Create AtomSpace and reservoir
auto atomspace = createAtomSpace();
auto esn = std::make_shared<EchoStateNetwork>(atomspace, 100, 3, 1);

// Configure and train
esn->set_leaking_rate(0.3);
algorithms::ReservoirTrainer trainer(atomspace);
trainer.train_esn_ridge_regression(esn, inputs, targets);
auto predictions = esn->predict(test_input);
```

For a general introduction to reservoir computing and ReservoirPy features, take a look at the tutorials.
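Both quickstarts train their readout with ridge regression. As a point of reference, here is the closed-form solution that this kind of readout training computes, sketched with numpy on toy data (illustrative only, not the library's or the C++ trainer's actual code):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
states = rng.normal(size=(200, 100))          # collected reservoir states
targets = states @ rng.normal(size=(100, 1))  # toy targets, linear in the states

ridge = 1e-6
# Closed-form ridge regression: Wout = (S^T S + ridge * I)^-1 S^T Y
Wout = np.linalg.solve(
    states.T @ states + ridge * np.eye(states.shape[1]),
    states.T @ targets,
)
predictions = states @ Wout
```

The `ridge` term regularizes the inversion; larger values trade training fit for robustness to noisy or collinear reservoir states.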
```python
from reservoirpy.datasets import mackey_glass, to_forecasting
from reservoirpy.nodes import Reservoir, Ridge
from reservoirpy.observables import rmse, rsquare

### Step 1: Load the dataset
X = mackey_glass(n_timesteps=2000)  # (2000, 1)-shaped array

# create y by shifting X, and train/test split
x_train, x_test, y_train, y_test = to_forecasting(X, test_size=0.2)

### Step 2: Create an Echo State Network
# 100-neuron reservoir, spectral radius = 1.25, leak rate = 0.3
reservoir = Reservoir(units=100, sr=1.25, lr=0.3)

# feed-forward layer of neurons, trained with L2 regularization
readout = Ridge(ridge=1e-5)

# connect the two nodes
esn = reservoir >> readout

### Step 3: Fit, run and evaluate the ESN
esn.fit(x_train, y_train, warmup=100)
predictions = esn.run(x_test)

print(f"RMSE: {rmse(y_test, predictions)}; R^2 score: {rsquare(y_test, predictions)}")
# RMSE: 0.0020282; R^2 score: 0.99992
```

- 1 - Getting started with ReservoirPy
- 2 - Advanced features
- 3 - General introduction to Reservoir Computing
- 4 - Understand and optimise hyperparameters
- 5 - Classification with reservoir computing
- 6 - Interfacing ReservoirPy with scikit-learn
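The `rmse` and `rsquare` observables used in the forecasting example above are standard error metrics; here is a hedged numpy sketch of what they compute (our own helper names, not the library's implementations):

```python
import numpy as np

def rmse_metric(y_true, y_pred):
    # Root mean squared error over all timesteps and dimensions
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def rsquare_metric(y_true, y_pred):
    # Coefficient of determination: 1 - residual variance / total variance
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1 - ss_res / ss_tot)

y = np.array([1.0, 2.0, 3.0, 4.0])
print(rmse_metric(y, y), rsquare_metric(y, y))  # perfect prediction: 0.0 1.0
```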
For advanced users, we also showcase partial reproduction of papers on reservoir computing to demonstrate some features of the library.
- Improving reservoir using Intrinsic Plasticity (Schrauwen et al., 2008)
- Interactive reservoir computing for chunking information streams (Asabuki et al., 2018)
- Next-Generation reservoir computing (Gauthier et al., 2021)
- Edge of stability Echo State Network (Ceni et al., 2023)
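The next-generation RC paper reproduced above (Gauthier et al., 2021) replaces the random reservoir with polynomial features of time-delayed inputs feeding a linear readout. A toy numpy sketch of that feature construction (our own `nvar_features` helper, not the library's NVAR node):

```python
import numpy as np

def nvar_features(u, delay=2):
    """Linear part: the current input plus `delay` past values; quadratic part:
    all unique pairwise products of the linear features. Toy sketch only."""
    T = len(u) - delay
    # Columns: u[t], u[t-1], ..., u[t-delay] for t = delay .. len(u)-1
    lin = np.column_stack([u[delay - d : delay - d + T] for d in range(delay + 1)])
    quad = np.column_stack([lin[:, i] * lin[:, j]
                            for i in range(lin.shape[1])
                            for j in range(i, lin.shape[1])])
    # Constant bias column, linear features, quadratic features
    return np.column_stack([np.ones(T), lin, quad])

u = np.sin(0.1 * np.arange(100))
feats = nvar_features(u, delay=2)  # (98, 10): 1 bias + 3 linear + 6 quadratic
```

A ridge readout trained on these features plays the role of the reservoir-plus-readout pair in a classical ESN.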
ReservoirCogs development is organized through a comprehensive GitHub Project that orchestrates both short-term and long-term implementation of our feature portfolio.
Active development focusing on production-ready capabilities:
- GraphRAG Integration: Knowledge graph-based retrieval-augmented generation
- Codestral AI Engine: Specialized language model for technical documentation
- AtomSpace Intelligence: OpenCog symbolic AI reasoning capabilities
- Hybrid AI Architecture: Neural-symbolic fusion implementation
Research-driven features for long-term innovation:
- P-Systems Membrane Computing with P-lingua integration
- B-Series Rooted Tree Gradient Descent with Runge-Kutta methods (research implementation available)
- J-Surface Julia Differential Equations with DifferentialEquations.jl
- Differential Emotion Theory Framework for affective computing (research implementation available)
- GitHub Project Board: Complete project tracking and coordination
- Development Roadmap: Detailed timeline and milestones
- Issue Templates: Structured feature requests and bug reports
- Automation Workflows: Automated project management and CI/CD
Our project uses advanced GitHub Project features including custom fields, automated workflows, and multiple views (Board, Table, Timeline) to ensure efficient coordination of complex, interdisciplinary development spanning traditional software engineering and cutting-edge AI research.
If you want your paper to appear here, please contact us (see contact link below).
- ( HAL | PDF | Code ) Leger et al. (2024) Evolving Reservoirs for Meta Reinforcement Learning. EvoAPPS 2024
- ( arXiv | PDF ) Chaix-Eichel et al. (2022) From implicit learning to explicit representations. arXiv preprint arXiv:2204.02484.
- ( HTML | HAL | PDF ) Trouvain & Hinaut (2021) Canary Song Decoder: Transduction and Implicit Segmentation with ESNs and LSTMs. ICANN 2021
- ( HTML ) Pagliarini et al. (2021) Canary Vocal Sensorimotor Model with RNN Decoder and Low-dimensional GAN Generator. ICDL 2021.
- ( HAL | PDF ) Pagliarini et al. (2021) What does the Canary Say? Low-Dimensional GAN Applied to Birdsong. HAL preprint.
- ( HTML | HAL | PDF ) Hinaut & Trouvain (2021) Which Hype for My New Task? Hints and Random Search for Echo State Networks Hyperparameters. ICANN 2021
We also provide a curated list of tutorials, papers, projects and tools for Reservoir Computing (not necessarily related to ReservoirPy) here:
https://github.com/reservoirpy/awesome-reservoir-computing
If you have a question regarding the library, please open an issue.
If you have a more general question or feedback, you can contact us by email at xavier dot hinaut the-famous-home-symbol inria dot fr.
Trouvain, N., Pedrelli, L., Dinh, T. T., Hinaut, X. (2020) ReservoirPy: an efficient and user-friendly library to design echo state networks. In International Conference on Artificial Neural Networks (pp. 494-505). Springer, Cham. ( HTML | HAL | PDF )
If you're using ReservoirPy in your work, please cite our package using the following bibtex entry:
@incollection{Trouvain2020,
doi = {10.1007/978-3-030-61616-8_40},
url = {https://doi.org/10.1007/978-3-030-61616-8_40},
year = {2020},
publisher = {Springer International Publishing},
pages = {494--505},
author = {Nathan Trouvain and Luca Pedrelli and Thanh Trung Dinh and Xavier Hinaut},
title = {{ReservoirPy}: An Efficient and User-Friendly Library to Design Echo State Networks},
booktitle = {Artificial Neural Networks and Machine Learning {\textendash} {ICANN} 2020}
}
This package is developed and supported by Inria in Bordeaux, France, within the Mnemosyne group. Inria is a French research institute in digital sciences (computer science, mathematics, robotics, ...).