
History

13 Commits
Feb 13, 2015
Feb 13, 2015
Feb 13, 2015
Feb 16, 2015
May 20, 2015
Mar 27, 2015
May 20, 2015

Repository files navigation

Long Short-Term Memory Units

This is a self-contained package for training a word-level language model on the Penn Treebank dataset. It reaches a perplexity of 115 with a small model in about an hour, and 81 with a big model in about a day. An ensemble of 38 big models reaches a perplexity of 69. This code is derived from https://github.com/wojciechz/learning_to_execute (the same author, but a different company).

More information: http://arxiv.org/pdf/1409.2329v4.pdf
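The perplexity numbers above come from the standard definition (the exponential of the mean per-word negative log-likelihood), and the ensemble result is obtained by averaging the models' predicted word probabilities before scoring. The repo itself is Lua/Torch, but as a minimal illustration, here is a sketch in Python; the probability values are made up for the example and do not correspond to any real model:

```python
import math

def perplexity(word_probs):
    """Perplexity = exp(mean negative log-likelihood per word)."""
    nll = -sum(math.log(p) for p in word_probs) / len(word_probs)
    return math.exp(nll)

# Hypothetical per-word probabilities two models assign to the same test stream.
model_a = [0.2, 0.1, 0.4]
model_b = [0.1, 0.3, 0.2]

# Ensembling: average the predicted probabilities word by word, then score.
ensemble = [(a + b) / 2 for a, b in zip(model_a, model_b)]

print(perplexity(model_a))   # 5.0 (= (1 / (0.2 * 0.1 * 0.4)) ** (1/3))
print(perplexity(ensemble))  # lower than either individual model here
```

Averaging probabilities (rather than log-probabilities) is the usual choice for language-model ensembles, and is consistent with how the paper linked above reports its ensemble results.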
