RecurrentLayers.jl extends Flux.jl's recurrent layer offering by providing implementations of additional recurrent layers not available in base deep learning libraries.
The package offers multiple recurrent layers for Flux.jl. Currently there are 30+ cells implemented, together with multiple higher-level layer implementations:
Short name | Publication venue | Official implementation |
---|---|---|
AntisymmetricRNN/GatedAntisymmetricRNN | ICLR 2019 | – |
ATR | EMNLP 2018 | bzhangGo/ATR |
BR/BRC | PLOS ONE 2021 | nvecoven/BRC |
CFN | ICLR 2017 | – |
coRNN | ICLR 2021 | tk-rusch/coRNN |
FastRNN/FastGRNN | NeurIPS 2018 | Microsoft/EdgeML |
FSRNN | NeurIPS 2017 | amujika/Fast-Slow-LSTM |
IndRNN | CVPR 2018 | Sunnydreamrain/IndRNN_Theano_Lasagne |
JANET | arXiv 2018 | JosvanderWesthuizen/janet |
LEM | ICLR 2022 | tk-rusch/LEM |
LiGRU | IEEE Transactions on Emerging Topics in Computational Intelligence 2018 | mravanelli/theano-kaldi-rnn |
LightRU | MDPI Electronics 2023 | – |
MinimalRNN | NeurIPS 2017 | – |
MultiplicativeLSTM | Workshop ICLR 2017 | benkrause/mLSTM |
MGU | International Journal of Automation and Computing 2016 | – |
MUT1/MUT2/MUT3 | ICML 2015 | – |
NAS | arXiv 2016 | tensorflow_addons/rnn |
OriginalLSTM | Neural Computation 1997 | – |
PeepholeLSTM | JMLR 2002 | – |
RAN | arXiv 2017 | kentonl/ran |
RHN | ICML 2017 | jzilly/RecurrentHighwayNetworks |
SCRN | ICLR 2015 | facebookarchive/SCRNNs |
SGRN | IET 2018 | – |
STAR | IEEE Transactions on Pattern Analysis and Machine Intelligence 2022 | 0zgur0/STAckable-Recurrent-network |
Typed RNN / GRU / LSTM | ICML 2016 | – |
UGRNN | ICLR 2017 | – |
UnICORNN | ICML 2021 | tk-rusch/unicornn |
WMCLSTM | Neural Networks 2021 | – |
- Additional wrappers: Stacked RNNs and Multiplicative RNN (see the sketch below).
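A minimal sketch of the stacked wrapper, assuming a `StackedRNN(layer_type, in => out; num_layers)` constructor; check the package docstrings for the exact signature:

```julia
using Flux
using RecurrentLayers

# Assumed signature: StackedRNN(layer_type, in => out; num_layers = n).
# This stacks two MGU layers, feeding each layer's hidden-state sequence
# into the next.
stack = StackedRNN(MGU, 4 => 8; num_layers = 2)
```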
You can install RecurrentLayers using either of:

```julia
using Pkg
Pkg.add("RecurrentLayers")
```

or, from the Pkg REPL:

```
julia> ]
pkg> add RecurrentLayers
```
The workflow is identical to any Flux recurrent layer: just plug a new recurrent layer into your model and test it out!
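For instance, here is a minimal sketch of swapping one of the provided cells into a small Flux model. The `MGU(input_size => hidden_size)` constructor convention and the `features × timesteps × batch` input layout are assumptions based on Flux's current recurrent API:

```julia
using Flux
using RecurrentLayers

input_size, hidden_size = 4, 8

# Assumption: MGU follows the `in => out` convention of Flux.RNN/GRU/LSTM
# and maps a whole input sequence to the sequence of hidden states,
# shaped hidden_size × timesteps × batch.
model = Chain(
    MGU(input_size => hidden_size),
    x -> x[:, end, :],               # keep only the last time step
    Dense(hidden_size => 1),
)

x = rand(Float32, input_size, 10, 3)  # features × timesteps × batch
y = model(x)                          # 1 × 3 prediction matrix
```

Training then proceeds exactly as with any other Flux model, e.g. via `Flux.train!` with a loss defined on `model(x)`.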
This project is licensed under the MIT License, except for `nas_cell.jl`, which is licensed under the Apache License, Version 2.0.

- `nas_cell.jl` is a reimplementation of the NASCell from TensorFlow and is licensed under the Apache License 2.0. See the file header and `LICENSE-APACHE` for details.
- All other files are licensed under the MIT License. See `LICENSE-MIT` for details.
- LuxRecurrentLayers.jl: Equivalent library, providing recurrent layers for Lux.jl.
- torchrecurrent: Recurrent layers for PyTorch.
- ReservoirComputing.jl: Reservoir computing utilities for scientific machine learning; essentially neural networks trained without gradient descent.