Linear Regression
Regression
It is a statistical technique that relates a dependent variable to one or more independent variables.
Supervised learning flow (diagram): the training set is fed to the learning algo, which outputs a function f (the model); given an input/feature x, f produces the predicted value y^.
w and b: parameters of the model (also called coefficients or weights)
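A minimal sketch of the model in Python (the function name predict and the sample numbers are illustrative, not from the notes):

```python
def predict(x, w, b):
    """Linear model f(x) = w*x + b for a single input feature x."""
    return w * x + b

# with assumed parameters w = 2 and b = 1, the prediction for x = 3 is 7
print(predict(3, w=2, b=1))
```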
Cost Function
error for the i-th example: y^(i) - y(i)
squared error cost function:
J(w,b) = (1/(2m)) * Σ_{i=1..m} (y^(i) - y(i))^2,  where y^(i) = f(x(i))
The goal is to find w and b that minimize J(w,b), hence minimizing the error.
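A short sketch of this cost in Python, assuming NumPy arrays x and y of equal length m (the names here are illustrative):

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared error cost J(w,b) = (1/(2m)) * sum((y_hat - y)^2)."""
    m = x.shape[0]
    y_hat = w * x + b                        # predictions y^(i) for all m examples
    return np.sum((y_hat - y) ** 2) / (2 * m)
```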
Cost Function Intuition
simplified: set b = 0, so f(x) = wx and the cost becomes J(w), a function of w alone
plotting J(w) for different values of w (e.g. w = 0.5, w = -0.5) shows how the cost rises as w moves away from the best-fit value
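A quick numerical check of this intuition, with a made-up toy data set generated by y = x so the best w is 1:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])                # data from y = 1*x, so J is smallest at w = 1

for w in [-0.5, 0.0, 0.5, 1.0, 1.5]:
    y_hat = w * x                            # simplified model f(x) = wx (b = 0)
    J = np.sum((y_hat - y) ** 2) / (2 * len(x))
    print(f"w = {w:4.1f}  ->  J(w) = {J:.3f}")
```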
Gradient Descent
update rule (repeat until convergence):
w := w - alpha * dJ/dw
b := b - alpha * dJ/db
alpha (learning rate): a small positive number, 0 < alpha < 1, controlling the size of each step downhill
the derivative term gives the direction of steepest descent
if alpha is too small: gradient descent may be too slow
if alpha is too large: gradient descent may overshoot and never reach the minimum
Gradient Descent Implementation
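One possible implementation in Python for the single-feature case; the gradient formulas come from differentiating J(w,b), and the starting point, alpha, and iteration count are illustrative choices, not values from the notes:

```python
import numpy as np

def compute_gradient(x, y, w, b):
    """Partial derivatives of J(w,b) with respect to w and b."""
    m = x.shape[0]
    err = (w * x + b) - y                   # y^(i) - y(i) for all examples
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

def gradient_descent(x, y, w, b, alpha, num_iters):
    """Repeatedly step w and b downhill along the negative gradient."""
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradient(x, y, w, b)
        w = w - alpha * dj_dw               # simultaneous update: both gradients
        b = b - alpha * dj_db               # were computed before either change
    return w, b

# toy data from y = 2x + 1, so w should approach 2 and b should approach 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])
print(gradient_descent(x, y, w=0.0, b=0.0, alpha=0.05, num_iters=5000))
```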
Multi Variable Linear Regression
Gradient Descent
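With multiple features the model becomes f(x) = w·x + b, where w is now a vector with one weight per feature; a vectorized sketch of the same gradient descent loop (toy data and hyperparameters are again made up for illustration):

```python
import numpy as np

def gradient_descent_multi(X, y, w, b, alpha, num_iters):
    """Gradient descent for f(x) = X @ w + b, with X of shape (m, n)."""
    m = X.shape[0]
    for _ in range(num_iters):
        err = (X @ w + b) - y               # prediction error, shape (m,)
        dj_dw = (X.T @ err) / m             # shape (n,): one partial derivative per feature
        dj_db = np.sum(err) / m
        w = w - alpha * dj_dw
        b = b - alpha * dj_db
    return w, b

# toy data from y = 1*x1 + 2*x2 + 3, so w should approach [1, 2] and b should approach 3
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 2.0]])
y = X @ np.array([1.0, 2.0]) + 3.0
print(gradient_descent_multi(X, y, w=np.zeros(2), b=0.0, alpha=0.05, num_iters=10000))
```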