This project is a simple implementation of Tensor2tensor (https://github.com/tensorflow/tensor2tensor) for machine translation.
- Preprocessing. Prepare the parallel data (tokenization, BPE, vocabulary, and so on), then run `./datagen.sh` to generate the training data. A rough sketch of the vocabulary step is given after this list.
- Training. Modify the model parameters (`transformer_params_big` or `transformer_params_base`; the basic parameters are set in `models/common_hparms.py`), then run `./train.sh`. A sketch of what such a parameter set might contain also follows this list.
- Inference. Run `./test.sh` to translate source sentences.
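The preprocessing details (tokenization, BPE, vocabulary generation) are handled inside `./datagen.sh` and are not reproduced here. Purely as an illustration of the "vocab" part of that step, the following minimal Python sketch counts tokens in an already-tokenized file and writes a frequency-sorted vocabulary; the file names, special symbols, and vocabulary size are assumptions, not necessarily what `datagen.sh` produces.

```python
# Hypothetical illustration of vocabulary generation during preprocessing.
# File names, special symbols, and vocabulary size are assumptions; the
# real pipeline is whatever ./datagen.sh implements.
from collections import Counter

def build_vocab(tokenized_path, vocab_path, vocab_size=32000):
    """Count whitespace-separated tokens and write the most frequent ones."""
    counts = Counter()
    with open(tokenized_path, encoding="utf-8") as f:
        for line in f:
            counts.update(line.split())

    specials = ["<pad>", "<unk>", "<s>", "</s>"]  # assumed special symbols
    with open(vocab_path, "w", encoding="utf-8") as out:
        for token in specials:
            out.write(token + "\n")
        for token, _ in counts.most_common(vocab_size - len(specials)):
            out.write(token + "\n")

if __name__ == "__main__":
    build_vocab("train.bpe.src", "vocab.src")  # assumed file names
```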
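For the training step, the exact contents of `transformer_params_base` and `transformer_params_big` live in `models/common_hparms.py` in this repository. As a hedged sketch only, the values below mirror the defaults of the upstream tensor2tensor Transformer (hidden size 512 vs. 1024, 8 vs. 16 attention heads, and so on); the actual field names and values in this project may differ.

```python
# Hedged sketch of what the base/big parameter sets might look like.
# Field names and values follow the upstream tensor2tensor defaults and
# are assumptions about this repository's models/common_hparms.py.
from dataclasses import dataclass, replace

@dataclass
class HParams:
    hidden_size: int = 512
    filter_size: int = 2048      # feed-forward inner dimension
    num_heads: int = 8
    num_hidden_layers: int = 6
    dropout: float = 0.1
    batch_size: int = 4096       # tokens per batch (assumed)
    learning_rate_warmup_steps: int = 4000

def transformer_params_base() -> HParams:
    return HParams()

def transformer_params_big() -> HParams:
    # The "big" setting widens the model and uses more attention heads.
    return replace(transformer_params_base(),
                   hidden_size=1024, filter_size=4096, num_heads=16,
                   dropout=0.3)
```

To train with the big configuration, you would select `transformer_params_big` in the training configuration before running `./train.sh`.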
Once you have trained the model, you can also use a C++/CUDA Transformer translator to translate the source language.
If you have questions, suggestions, or bug reports, please email [email protected] or [email protected].