
Batch-Normalized LSTM for Torch

Written by iassael on 16/04/2016. Posted in computing, machine learning

A Torch implementation of the batch-normalized LSTM proposed in:

Recurrent Batch Normalization
Tim Cooijmans, Nicolas Ballas, César Laurent, Çağlar Gülçehre, Aaron Courville
http://arxiv.org/abs/1603.09025
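
In short, the paper applies batch normalization to both the input-to-hidden and hidden-to-hidden transformations of the LSTM, and to the cell state before the output gate. Schematically, following the paper's notation:

(\tilde{f}_t, \tilde{i}_t, \tilde{o}_t, \tilde{g}_t) = \mathrm{BN}(W_h h_{t-1}; \gamma_h) + \mathrm{BN}(W_x x_t; \gamma_x) + b

c_t = \sigma(\tilde{f}_t) \odot c_{t-1} + \sigma(\tilde{i}_t) \odot \tanh(\tilde{g}_t)

h_t = \sigma(\tilde{o}_t) \odot \tanh(\mathrm{BN}(c_t; \gamma_c, \beta_c))

The first two BN transforms have their shift parameters fixed to zero (so b is the only bias), and normalization statistics are estimated separately per timestep.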

Usage

Clone from: https://github.com/iassael/torch-bnlstm

local rnn = nn.LSTM(input_size, rnn_size, n, dropout, bn)

input_size = dimensionality of each input (e.g. the vocabulary size in char-rnn)

rnn_size = number of hidden units per layer

n = number of layers (1-N)

dropout = probability of dropping a neuron (0-1)

bn = use batch normalization (true/false)
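
For orientation, here is a minimal construction-and-forward sketch. The require path, the sizes, and the single-timestep input/output convention are assumptions carried over from char-rnn, on which the implementation is based (see the Example section below):

require 'nn'
require 'nngraph'
require 'LSTM'  -- assumed require; registers nn.LSTM from the cloned repository

-- hypothetical sizes: a 65-character vocabulary, 128 hidden units
local input_size, rnn_size = 65, 128
local n, dropout, bn = 2, 0.5, true
local rnn = nn.LSTM(input_size, rnn_size, n, dropout, bn)

-- assuming char-rnn's single-timestep convention:
--   inputs:  {x, prev_c_1, prev_h_1, ..., prev_c_n, prev_h_n}
--   outputs: {next_c_1, next_h_1, ..., next_c_n, next_h_n, log-probabilities}
local batch_size = 32
local x = torch.Tensor(batch_size):random(1, input_size)   -- character indices
local states = {}
for L = 1, n do
  table.insert(states, torch.zeros(batch_size, rnn_size))  -- prev_c[L]
  table.insert(states, torch.zeros(batch_size, rnn_size))  -- prev_h[L]
end
local outputs = rnn:forward({x, unpack(states)})

Passing bn = false should recover the standard char-rnn LSTM, which makes a like-for-like comparison straightforward.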

Example

A complete training example (a char-rnn fork that integrates this module): https://github.com/iassael/char-rnn

Performance

Validation scores on char-rnn with default options

[Figure: bnlstm_val_loss — validation loss curves on char-rnn]

Tags: batch, bn, char-rnn, lstm, normalisation, normalization, torch
