RecurrentLayers.jl extends Flux.jl's recurrent layer offering by providing implementations of bleeding-edge recurrent layers not commonly available in base deep learning libraries. It is designed for seamless integration with the larger Flux ecosystem, enabling researchers and practitioners to leverage the latest developments in recurrent neural networks.
The following layers are currently available or are short-term work in progress:
- Minimal gated unit (MGU) arxiv
- Light gated recurrent unit (LiGRU) arxiv
- Independently recurrent neural networks (IndRNN) arxiv
- Recurrent additive networks (RAN) arxiv
- Recurrent highway network (RHN) arxiv
- Light recurrent unit (LightRU) pub
- Neural architecture search unit (NAS) arxiv
- Evolving recurrent neural networks (MUT1/2/3) pub
- Structurally constrained recurrent neural network (SCRN) arxiv
- Peephole long short-term memory (PeepholeLSTM) pub
- FastRNN and FastGRNN arxiv
- Minimal gated recurrent unit (minGRU) and minimal long short-term memory (minLSTM) arxiv
You can install RecurrentLayers using either of:

```julia
using Pkg
Pkg.add("RecurrentLayers")
```

or, from the Julia REPL package mode:

```julia-repl
julia> ]
pkg> add RecurrentLayers
```
The workflow is identical to that of any Flux recurrent layer: simply swap one of these layers into your model and test it out!
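For instance, here is a minimal sketch of swapping a built-in Flux recurrent layer for one of the layers listed above. It assumes the `input => hidden` constructor convention mirrors Flux's `GRU`; check the package documentation for the exact forward-pass input shapes.

```julia
using Flux
using RecurrentLayers

# Sizes are arbitrary; any layer from the list above can be swapped in
# the same way.
input_size, hidden_size, output_size = 3, 16, 2

# A standard Flux model using the built-in GRU ...
flux_model = Chain(
    GRU(input_size => hidden_size),
    Dense(hidden_size => output_size),
)

# ... and the same model with an MGU from RecurrentLayers swapped in
# (constructor convention assumed to match Flux's GRU).
model = Chain(
    MGU(input_size => hidden_size),
    Dense(hidden_size => output_size),
)
```

Training and inference then proceed exactly as with the built-in Flux recurrent layers.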
This project is licensed under the MIT License, except for `nas_cell.jl`, which is licensed under the Apache License, Version 2.0.

- `nas_cell.jl` is a reimplementation of the NASCell from TensorFlow and is licensed under the Apache License 2.0. See the file header and LICENSE-APACHE for details.
- All other files are licensed under the MIT License. See LICENSE-MIT for details.
If you have any questions, issues, or feature requests, please open an issue or contact us via email.