# Batch-Normalized LSTM for Torch

Torch implementation of batch-normalized LSTMs, as described in:

> Recurrent Batch Normalization
> Tim Cooijmans, Nicolas Ballas, César Laurent, Çağlar Gülçehre, Aaron Courville
> http://arxiv.org/abs/1603.09025
## Usage

Clone the repository from https://github.com/iassael/torch-bnlstm, then construct the module:

```lua
local rnn = nn.LSTM(input_size, rnn_size, n, dropout, bn)
```

where:

- `n` — number of stacked LSTM layers (1 to N)
- `dropout` — probability of dropping a neuron (0 to 1)
- `bn` — whether to apply batch normalization (`true` or `false`)
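As a minimal sketch, the constructor above might be used as follows. The specific values for `input_size` and `rnn_size`, and the `require` lines, are illustrative assumptions; consult the repository for the exact module path and the expected input/state format.

```lua
-- Sketch only: assumes Torch with 'nn'/'nngraph' installed and the
-- repository's LSTM definition on the Lua path (assumption).
require 'nn'
require 'nngraph'
require 'LSTM'  -- file from this repository defining nn.LSTM (assumption)

local input_size = 128   -- dimensionality of each input vector (example value)
local rnn_size   = 256   -- number of hidden units per layer (example value)
local n          = 2     -- number of stacked LSTM layers
local dropout    = 0.5   -- probability of dropping a neuron
local bn         = true  -- enable recurrent batch normalization

-- Build the batch-normalized LSTM module.
local rnn = nn.LSTM(input_size, rnn_size, n, dropout, bn)
```

With `bn = false` the module should behave as a standard stacked LSTM, so the flag makes it easy to compare the two variants on the same task.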
## Example

For a complete training example, see the char-rnn fork: https://github.com/iassael/char-rnn
## Performance

Validation scores on char-rnn with default options.