Gradient Descent for Linear Regression

Our goal is to use gradient descent to minimize the cost function of linear regression with one variable.

For this specific linear regression model, where the hypothesis function has the form \(h(x) = a_0 + a_1 x\), the cost function \(J(a_0, a_1)\) always has a bowl-shaped graph (formally, it is a convex function), so gradient descent always converges to the global minimum. This need not hold for other models.
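Concretely, the standard simultaneous update rules for this model are the following, where \(\alpha\) is the learning rate and \(m\) is the number of training examples (symbols not introduced above):

\[
a_0 := a_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left(h(x^{(i)}) - y^{(i)}\right)
\]
\[
a_1 := a_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left(h(x^{(i)}) - y^{(i)}\right) x^{(i)}
\]

Both parameters are updated at the same time on each step, repeated until convergence.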

Batch gradient descent: the name "batch" refers to the fact that each step of gradient descent uses all the training examples.
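As a minimal sketch, batch gradient descent for this one-variable model can be written as follows (the function name and parameter defaults are illustrative, not from the original notes):

```python
def batch_gradient_descent(xs, ys, alpha=0.05, iterations=5000):
    """Fit h(x) = a0 + a1*x by minimizing the squared-error cost J(a0, a1)."""
    m = len(xs)
    a0, a1 = 0.0, 0.0
    for _ in range(iterations):
        # "Batch": errors are computed over ALL m training examples.
        errors = [(a0 + a1 * x) - y for x, y in zip(xs, ys)]
        # Partial derivatives of J(a0, a1) = (1/(2m)) * sum(errors^2).
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update of both parameters.
        a0 -= alpha * grad0
        a1 -= alpha * grad1
    return a0, a1

# Example: recover y = 1 + 2x from noiseless samples.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
a0, a1 = batch_gradient_descent(xs, ys)
```

Because the cost is convex, any sufficiently small \(\alpha\) will drive the parameters toward the single global minimum; too large an \(\alpha\) can cause the updates to diverge.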

Resources: