A Transformer implementation built from scratch, following the paper 'Attention Is All You Need' (Vaswani et al., 2017), for tasks such as neural machine translation and grammatical error correction.
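The core operation behind the Transformer is scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. As a rough sketch of that idea only (not the exact code used in this repository, and assuming PyTorch as the framework), it can be written as:

```python
# Minimal sketch of scaled dot-product attention, for illustration only;
# this is not necessarily how the repository implements it.
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    """query, key, value: tensors of shape (batch, heads, seq_len, d_k)."""
    d_k = query.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = query @ key.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        # Positions where the mask is 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float('-inf'))
    weights = F.softmax(scores, dim=-1)
    # Weighted sum of the value vectors.
    return weights @ value, weights
```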
- Clone this repo:
git clone https://github.com/minhnguyent546/nmt-en-vi.git
cd nmt-en-vi
- Install required dependencies:
pip install -r requirements.txt
- Please take a look at the config file located at config/config.yaml, and change the train, test, and validation paths to your local files (see the sketch after this list for what these entries might look like).
- To preprocess the data:
python preprocess_nmt.py --config 'config/config.yaml'
- To train the model:
python train_nmt.py --config 'config/config.yaml'
- To test the model:
python test_nmt.py --config 'config/config.yaml'
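For reference, the dataset path entries mentioned above might look roughly like the following sketch of config/config.yaml. The key names and paths here are illustrative assumptions, not the repository's actual schema; check the real file for the exact keys it uses.

```yaml
# Hypothetical fragment of config/config.yaml; key names and paths are
# illustrative only. Adjust them to match the actual config schema.
train: data/train.en-vi.txt
test: data/test.en-vi.txt
validation: data/valid.en-vi.txt
```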