# CSCI 5525: Homework 4

Title: A Comparison of Transformer and LSTM Time-series Data Forecasting

Group member: Yijun Lin ([email protected]), Min Namgung ([email protected])

Date: Dec 17, 2021

This repo compares Transformer and LSTM models on time-series forecasting. The dataset is generated based on the resource: https://github.com/CVxTz/time_series_forecasting. Sample data can be found HERE.

You can generate your data using the code in the /scripts folder. See the sample commands in /bash/generate_data0.sh.

## Experiment 1: Transformer vs. LSTM

Results:

The images demonstrate that the Transformer produces more accurate predictions than the LSTM.

A more detailed metrics comparison can be found below.

| Transformer model | LSTM model |
| --- | --- |
| (prediction plot 1) | (prediction plot 2) |
| (prediction plot 3) | (prediction plot 4) |

## Experiment 2: Adding Periodic Positional Encoding to the Transformer

The synthetic data are generated based on: A · cos(Bx + C) + D

where A controls the amplitude, B controls the period of the function (e.g., 7, 14, or 28), C is the horizontal shift, and D is the vertical shift.
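As a concrete illustration, here is a minimal sketch of how such a series could be generated. The function name, parameter values, and noise term are illustrative assumptions, not the exact settings used in /scripts; it also assumes B is derived from a period measured in time steps (e.g., 7, 14, 28), which is one plausible reading of the description above.

```python
import numpy as np

# Minimal sketch of a synthetic signal of the form A * cos(B*x + C) + D.
# Parameter values and the added noise are illustrative assumptions.
def make_series(n_steps=365, A=1.0, period=7, C=0.0, D=0.0, noise_std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = np.arange(n_steps)
    B = 2 * np.pi / period            # map a period in time steps (e.g., 7, 14, 28) to angular frequency
    y = A * np.cos(B * x + C) + D     # base periodic signal
    return y + rng.normal(0.0, noise_std, size=n_steps)  # optional observation noise

series = make_series(A=2.0, period=14, C=0.5, D=1.0)
```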

So, we intentionally inject this periodic information into the Transformer as a positional encoding.
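One way to realize this idea is sketched below: the known period (e.g., 7 steps) is encoded as sine/cosine harmonics of the phase within the cycle and added to the input embeddings, in place of (or alongside) the standard sinusoidal positional encoding. This is only a plausible sketch of the approach described above; the actual construction in this repo may differ, and `periodic_positional_encoding`, `d_model`, and `period` are assumed names and values.

```python
import torch

# Sketch of a "periodic positional encoding": encode where each time step falls
# within a known cycle (e.g., a weekly period of 7 steps) using sine/cosine
# harmonics of that period. Assumes an even d_model.
def periodic_positional_encoding(seq_len, d_model, period=7):
    t = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)   # (seq_len, 1)
    k = torch.arange(1, d_model // 2 + 1, dtype=torch.float32)    # harmonics of the period
    angles = 2 * torch.pi * k * t / period                        # (seq_len, d_model // 2)
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)  # (seq_len, d_model)

# Typically added to the token embeddings before the Transformer encoder, e.g.:
# x = token_embedding(src) + periodic_positional_encoding(src.size(1), d_model).to(src.device)
```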

| Sign | Description |
| --- | --- |
| AUX | Auxiliary Features (e.g., Hour, Day) |
| Penc | Periodic as Positional Encoding |
| 0 | Not using |
| 1 | Using |

The images demonstrate that the Transformer with periodic positional encoding produces more accurate predictions.

| ID | Transformer W/WO Positional Encoding |
| --- | --- |
| Without | (prediction plot 5) |
| With | (prediction plot 6) |
| Model | Using Auxiliary Features | Using Periodic as Positional Encoding | MAE | SMAPE |
| --- | --- | --- | --- | --- |
| LSTM | 1 | 0 | 0.29625 | 47.51880 |
| Transformer | 1 | 0 | 0.23089 | 37.93381 |
| Transformer | 1 | 1 | 0.19829 | 34.05033 |
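For reference, MAE and SMAPE are commonly computed as sketched below. SMAPE conventions vary (some implementations scale by 100, others by 200), so this is an assumed formulation that matches the magnitude of the percentages above rather than the exact code used in this repo.

```python
import numpy as np

# Common formulations of the two metrics reported in the table above.
def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def smape(y_true, y_pred, eps=1e-8):
    # Symmetric MAPE, reported as a percentage; eps guards against division by zero.
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2 + eps
    return 100.0 * np.mean(np.abs(y_true - y_pred) / denom)
```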

## How to run the code

Please refer to the /bash folder for how to train and test the models.
