Advanced Optimization Algorithms & Concepts

With gradient descent, we need to supply code that computes the partial derivatives of the cost function \(J(\theta)\).
There are other optimization algorithms that use the same inputs, such as conjugate gradient, BFGS, and L-BFGS.
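For gradient descent itself, the partial derivatives must be hand-coded and a learning rate \(\alpha\) chosen manually. As a small sketch (in Python rather than the course's Octave, with an illustrative linear-regression cost), the two pieces of code we must provide look like this:

```python
import numpy as np

def cost(theta, X, y):
    """Mean squared error cost J(theta) for linear regression (illustrative)."""
    residual = X @ theta - y
    return (residual @ residual) / (2 * len(y))

def gradient(theta, X, y):
    """Hand-coded partial derivatives dJ/dtheta_j, as gradient descent requires."""
    return X.T @ (X @ theta - y) / len(y)

# One gradient-descent step with a manually chosen learning rate alpha
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
theta = np.zeros(2)
alpha = 0.1
theta = theta - alpha * gradient(theta, X, y)
```

Each step moves \(\theta\) downhill, but how fast we converge depends entirely on the \(\alpha\) we picked by hand.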

These algorithms have a clever inner loop (a line search) that picks a good step size automatically, so we don't have to choose the learning rate \(\alpha\) ourselves. They also often converge much faster than gradient descent.

We don't need to implement these algorithms ourselves: high-level environments such as Octave and MATLAB already have them built in.
Choosing a good library is still important, because implementations can differ greatly in performance.
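As an illustration in Python (rather than Octave's fminunc, which the course uses), SciPy's optimizer exposes these methods; this is just a sketch with a simple convex quadratic cost:

```python
import numpy as np
from scipy.optimize import minimize

def cost(theta):
    """Simple convex cost: J(theta) = (theta_0 - 5)^2 + (theta_1 - 5)^2."""
    return (theta[0] - 5) ** 2 + (theta[1] - 5) ** 2

def gradient(theta):
    """Hand-coded partial derivatives of J, passed to the optimizer via jac."""
    return np.array([2 * (theta[0] - 5), 2 * (theta[1] - 5)])

# No learning rate needed: BFGS chooses step sizes via its own line search
result = minimize(cost, x0=np.zeros(2), jac=gradient, method="BFGS")
print(result.x)  # converges near the minimum at [5, 5]
```

Just as with the built-in Octave routines, we only supply the cost and its gradient; the library handles the step sizes and convergence checks.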

Resources:
https://www.coursera.org/learn/machine-learning/lecture/licwf/advanced-optimization
