GradientDescent_implementation.ipynb - Colab
Sigmoid activation function

$$\sigma(x) = \frac{1}{1 + e^{-x}}$$

Output (prediction) formula

$$\hat{y} = \sigma(w_1 x_1 + w_2 x_2 + b)$$
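A minimal NumPy sketch of these two formulas, assuming the notebook defines something equivalent elsewhere (the training cell below calls output_formula):

import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1 / (1 + np.exp(-x))

def output_formula(features, weights, bias):
    # y_hat = sigma(w . x + b)
    return sigmoid(np.dot(features, weights) + bias)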
Error function

$$\mathrm{Error}(y, \hat{y}) = -y \log(\hat{y}) - (1 - y)\log(1 - \hat{y})$$
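Likewise, a minimal sketch of this cross-entropy error; the helper name error_formula is an assumption, chosen to match output_formula above:

def error_formula(y, output):
    # Error(y, y_hat) = -y*log(y_hat) - (1 - y)*log(1 - y_hat)
    return -y * np.log(output) - (1 - y) * np.log(1 - output)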
Gradient descent update rule

$$w_i \longrightarrow w_i + \alpha (y - \hat{y})\, x_i$$

$$b \longrightarrow b + \alpha (y - \hat{y})$$
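These updates are a plain gradient descent step on the error above: for a sigmoid output the chain rule gives (a standard derivation, not spelled out in the excerpt)

$$\frac{\partial\,\mathrm{Error}}{\partial w_i} = -(y - \hat{y})\, x_i, \qquad \frac{\partial\,\mathrm{Error}}{\partial b} = -(y - \hat{y}),$$

so stepping against the gradient with learning rate $\alpha$ reproduces the two rules shown.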
import numpy as np

np.random.seed(44)

epochs = 1000
learnrate = 0.01
errors = []        # loss history (not filled in this excerpt; see the batch sketch below)

# features, targets: NumPy arrays prepared earlier in the notebook
n_records, n_features = features.shape
last_loss = None

# Initialize weights with small random values, bias at zero
weights = np.random.normal(scale=1 / n_features**.5, size=n_features)
bias = 0

for e in range(epochs):
    del_w = np.zeros(weights.shape)   # unused in this per-sample version (see the batch sketch below)
    for x, y in zip(features, targets):
        output = output_formula(x, weights, bias)   # y_hat = sigma(w . x + b)
        error = y - output                          # (y - y_hat)
        weights += learnrate * error * x            # w_i -> w_i + alpha * (y - y_hat) * x_i
        bias += learnrate * error                   # b   -> b   + alpha * (y - y_hat)
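The cell above declares errors, last_loss, and del_w but never uses them in the excerpt. A hedged sketch of what the missing bookkeeping might look like, assuming the error_formula helper sketched earlier: accumulate the updates for a whole pass in del_w, then record the mean loss once per epoch (del_b is a hypothetical name added here for the bias term).

for e in range(epochs):
    del_w = np.zeros(weights.shape)
    del_b = 0.0
    for x, y in zip(features, targets):
        output = output_formula(x, weights, bias)
        del_w += (y - output) * x              # sum of (y - y_hat) * x over the batch
        del_b += (y - output)
    weights += learnrate * del_w / n_records   # averaged batch step
    bias += learnrate * del_b / n_records

    out = output_formula(features, weights, bias)
    loss = np.mean(error_formula(targets, out))
    errors.append(loss)
    if last_loss and last_loss < loss:
        print("Epoch", e, "- loss increasing:", loss)
    last_loss = loss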
# Derivative of ReLU
def relu_derivative(x):
    return np.where(x > 0, 1, 0)

# Derivative of Tanh
def tanh_derivative(x):
    return 1 - np.tanh(x)**2
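For comparison with the two derivatives above, the sigmoid derivative fits the same pattern; this one is an added example (not part of the excerpt), reusing the sigmoid helper sketched earlier.

# Derivative of Sigmoid: sigma'(x) = sigma(x) * (1 - sigma(x))
def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1 - s)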
for e in range(epochs):
    for x, y in zip(features, targets):
        linear_combination = np.dot(x, weights) + bias
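        # Assumed completion (the excerpt cuts off here): sigmoid output and
        # the same per-sample update rule as in the earlier cell.
        output = sigmoid(linear_combination)
        weights += learnrate * (y - output) * x
        bias += learnrate * (y - output)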