
OpenNMT: Open-Source Neural Machine Translation

OpenNMT is a full-featured, open-source (MIT) neural machine translation system built on the Torch mathematical toolkit.

The system is designed to be simple to use and easy to extend, while maintaining efficiency and state-of-the-art translation accuracy. Features include:

  • Speed and memory optimizations for high-performance GPU training.
  • Simple general-purpose interface, requiring only source/target data files.
  • C-only decoder implementation for easy deployment.
  • Extensions to allow other sequence generation tasks such as summarization and image captioning.

Installation

OpenNMT only requires a vanilla torch/cutorch install. It uses nn, nngraph, and cunn. Alternatively there is a (CUDA) Docker container.
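If Torch is already installed, the remaining Lua dependencies can be pulled in with LuaRocks. A minimal sketch (assuming the standard Torch distribution, which ships its own `luarocks`; `cunn`/`cutorch` are only needed for GPU training):

```shell
# Core neural network and graph modules used by OpenNMT
luarocks install nn
luarocks install nngraph

# CUDA backends -- skip these on a CPU-only machine
luarocks install cutorch
luarocks install cunn
```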

Quickstart

OpenNMT consists of three commands:

  1. Preprocess the data.

th preprocess.lua -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo

  2. Train the model.

th train.lua -data data/demo-train.t7 -save_model model

  3. Translate sentences.

th translate.lua -model model_final.t7 -src data/src-test.txt -output pred.txt
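The three steps above can be chained into a single script. A minimal sketch, assuming the demo file layout shown above and an OpenNMT checkout as the working directory (the `model_final.t7` name will vary with training settings):

```shell
#!/bin/sh
set -e  # stop if any step fails

# 1. Preprocess: build vocabularies and serialize the training data
th preprocess.lua -train_src data/src-train.txt -train_tgt data/tgt-train.txt \
   -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo

# 2. Train: produces checkpoint files named after -save_model
th train.lua -data data/demo-train.t7 -save_model model

# 3. Translate the test set with the final checkpoint
th translate.lua -model model_final.t7 -src data/src-test.txt -output pred.txt
```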

See the guide for more details.
