# config_neural_network_models.py
# This configuration file defines the layers that constitute each of the
# neural network models used by our MLP, RNN and LSTM models.
# The first layer is the input layer, so the expected input shape is added to it.
# Note that we can still set the number of neurons on the first layer.
# The last layer is the output layer, which outputs the predicted share price for each forecast day;
# therefore, we don't define units for this layer, as they are set from the terminal directly.
# There are several types of parameters available to configure your neural network model;
# more on these can be found at https://www.tensorflow.org/api_docs/python/tf/keras/layers
# So far the following types of layers are allowed: Dense, SimpleRNN, LSTM, and Dropout.
# This is ongoing work, so feel free to request support for more types of layers.
# From the terminal, we are able to define:
# - The number of inputs used by the model
# - The number of training epochs
# - The frequency of the training data (i.e. use every data point, or every other day, which may speed up training)
# - The preprocessing type: normalization, standardization, or none
# - The optimizer
# - The loss function
# An illustrative sketch of how these layer lists might be consumed appears at the bottom of this file.
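
# Feed-forward MLP: four fully connected ReLU layers (50, 100, 80 and 30 units)
# ahead of the linear output layer, whose units are set from the terminal.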
MultiLayer_Perceptron = [
    {'Dense': {'units': 50, 'activation': 'relu'}},
    {'Dense': {'units': 100, 'activation': 'relu'}},
    {'Dense': {'units': 80, 'activation': 'relu'}},
    {'Dense': {'units': 30, 'activation': 'relu'}},
    {'Dense': {'activation': 'linear'}},
]
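
# Simple RNN: two stacked recurrent layers ('return_sequences': True passes the
# full sequence to the next layer), dropout for regularization, a final recurrent
# layer that keeps only its last output, and the linear output layer.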
Recurrent_Neural_Network = [
    {'SimpleRNN': {'units': 100, 'activation': 'linear', 'return_sequences': True}},
    {'SimpleRNN': {'units': 50, 'activation': 'linear', 'return_sequences': True}},
    {'Dropout': {'rate': 0.2}},
    {'SimpleRNN': {'units': 21, 'activation': 'linear', 'return_sequences': False}},
    {'Dense': {'activation': 'linear'}},
]
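
# LSTM: two stacked LSTM layers with tanh activations; the second returns only
# its final output, which the linear output layer maps to the forecast.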
Long_Short_Term_Memory = [
    {'LSTM': {'units': 25, 'activation': 'tanh', 'return_sequences': True}},
    {'LSTM': {'units': 15, 'activation': 'tanh', 'return_sequences': False}},
    {'Dense': {'activation': 'linear'}},
]
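
# Illustrative sketch only (not part of the original configuration): one way a
# layer list above could be turned into a tf.keras model. Assumes TensorFlow 2.x;
# `build_model`, `input_shape` and `output_units` are hypothetical names, since in
# practice the terminal injects the input shape and the output layer's units at
# run time, as described at the top of this file.
def build_model(layer_list, input_shape, output_units):
    from tensorflow.keras import layers, models  # local import keeps the config cheap to import

    layer_types = {
        'Dense': layers.Dense,
        'SimpleRNN': layers.SimpleRNN,
        'LSTM': layers.LSTM,
        'Dropout': layers.Dropout,
    }
    model = models.Sequential()
    for idx, layer_cfg in enumerate(layer_list):
        (name, kwargs), = layer_cfg.items()
        kwargs = dict(kwargs)  # copy so the module-level lists are not mutated
        if idx == 0:
            kwargs['input_shape'] = input_shape  # first layer receives the expected input shape
        if idx == len(layer_list) - 1:
            kwargs['units'] = output_units  # output layer: units come from the terminal
        model.add(layer_types[name](**kwargs))
    return model

# Hypothetical usage: forecast 5 days ahead from 40 past prices.
# model = build_model(Long_Short_Term_Memory, input_shape=(40, 1), output_units=5)
# model.compile(optimizer='adam', loss='mse')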