Gradient Descent

Gradient descent is an optimization algorithm used to minimize some function by iteratively moving in the direction of steepest descent as defined by the negative of the gradient.

It is one of the most popular algorithms to perform optimization and by far the most common way to optimize neural networks.
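
Concretely, each iteration updates the parameters by stepping against the gradient: x_new = x - lr * grad_f(x), where the learning rate lr controls the step size. The following is a minimal Python sketch of that loop; the function name gradient_descent, its parameters, and the quadratic example are illustrative assumptions, not a reference implementation.

    import numpy as np

    def gradient_descent(grad_f, x0, lr=0.1, n_steps=100, tol=1e-8):
        """Minimize a function by repeatedly stepping against its gradient.

        grad_f  -- callable returning the gradient of f at a point
        x0      -- starting point
        lr      -- learning rate (step size); illustrative default
        n_steps -- maximum number of iterations
        tol     -- stop early once the gradient norm falls below this
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(n_steps):
            g = grad_f(x)
            if np.linalg.norm(g) < tol:
                break  # gradient is (numerically) zero: a stationary point
            x = x - lr * g  # move in the direction of steepest descent
        return x

    # Example: f(x) = ||x||^2 has gradient 2x and its minimum at the origin.
    print(gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0]))  # ~ [0. 0.]

With a well-chosen learning rate the iterates converge toward a local minimum; too large a step can overshoot and diverge, while too small a step makes progress slow.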