Group members: Yijun Lin ([email protected]), Min Namgung ([email protected])
This repo compares Transformer and LSTM models on time series forecasting. The dataset is generated based on this resource: https://github.com/CVxTz/time_series_forecasting. Sample data can be found HERE.
You can generate your data using the code in the /scripts folder. See the sample commands in /bash/generate_data0.sh.
The images below demonstrate that the Transformer produced more accurate predictions than the LSTM. A more detailed metrics comparison can be found below.
Transformer | LSTM |
---|---|
The synthetic data are generated based on: A * cos(Bx + C) + D,
where A controls the amplitude, B controls the period of the function (e.g., 7, 14, 28), C is the horizontal shift, and D is the vertical shift.
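As a minimal sketch of this kind of generator (the actual code in /scripts may differ in details such as noise and parameter ranges, and the function name and noise term here are assumptions): treating B as the period in time steps gives an angular frequency of 2π/B.

```python
import numpy as np

def generate_series(length=365, A=1.0, B=7, C=0.0, D=0.0, noise_std=0.1, seed=0):
    """Generate one synthetic series of the form A * cos(2*pi*x/B + C) + D plus noise.

    B is interpreted as the period in time steps (e.g., 7, 14, 28), so the
    literal coefficient on x is 2*pi/B.
    """
    rng = np.random.default_rng(seed)
    x = np.arange(length)
    y = A * np.cos(2 * np.pi * x / B + C) + D
    return y + rng.normal(0.0, noise_std, size=length)
```

With `noise_std=0`, the series repeats exactly every B steps, which is what the model is expected to learn.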
So, we intentionally inject this periodic information into the Transformer as a positional encoding.
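One simple way to encode known periods as positional features is a sin/cos pair per period, which the model's embeddings can then attend to. This is a hedged sketch of the idea, not necessarily the exact encoding used in this repo; the function name and default periods are assumptions.

```python
import numpy as np

def periodic_positional_encoding(seq_len, periods=(7, 14, 28)):
    """Build a (seq_len, 2 * len(periods)) encoding with one sin/cos pair
    per known period, so positions one full period apart encode identically."""
    t = np.arange(seq_len)[:, None]       # time steps, shape (seq_len, 1)
    p = np.asarray(periods)[None, :]      # known periods, shape (1, n_periods)
    angle = 2 * np.pi * t / p             # phase within each period
    return np.concatenate([np.sin(angle), np.cos(angle)], axis=1)
```

This matrix can be added to (or concatenated with) the input embeddings in place of, or alongside, the standard sinusoidal positional encoding.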
Sign | Description |
---|---|
AUX | Auxiliary Features (e.g., Hour, Day) |
Penc | Periodic as Positional Encoding |
0 | Not used |
1 | Used |
The images below demonstrate that the Transformer with periodic positional encoding produced more accurate predictions.
ID | Transformer W/WO Positional Encoding |
---|---|
Without | |
With | |
Model | Using Auxiliary Features | Using Periodic as Positional Encoding | MAE | SMAPE |
---|---|---|---|---|
LSTM | 1 | 0 | 0.29625 | 47.51880 |
Transformer | 1 | 0 | 0.23089 | 37.93381 |
Transformer | 1 | 1 | 0.19829 | 34.05033 |
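For reference, the two metrics in the table can be computed as follows. This is a sketch; the repo's exact SMAPE variant may differ (SMAPE has several common definitions), and the function names here are assumptions.

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean Absolute Error."""
    return np.mean(np.abs(np.asarray(y_true, float) - np.asarray(y_pred, float)))

def smape(y_true, y_pred):
    """Symmetric Mean Absolute Percentage Error, in percent.

    This variant divides by the mean of |y_true| and |y_pred|; zero
    denominators are replaced with 1 to avoid division by zero."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2
    return 100 * np.mean(np.abs(y_true - y_pred) / np.where(denom == 0, 1, denom))
```

Lower is better for both, which is why the Transformer with both auxiliary features and periodic positional encoding (last row) is the best configuration.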
Please refer to the /bash folder for instructions on how to train and test the models.