Archives for ADAM

03 Jun

Does Deep Learning Suffer From Too Many Optimizers?


“There is no single optimizer that dominates its competitors across all tasks.” Critics often call machine learning ‘glorified statistics’, and there is some merit to the argument. The fundamental function of any machine learning model is pattern recognition, which relies on convergence: the process of fitting data to the model. To that end,…

The post Does Deep Learning Suffer From Too Many Optimizers? appeared first on Analytics India Magazine.

12 Sep

A Lowdown On Alternatives To Gradient Descent Optimization Algorithms


Gradient Descent is the most common optimisation strategy used in machine learning frameworks. It is an iterative algorithm used to minimise a function to its local or global minimum. In simple words, Gradient Descent iterates over a function, adjusting its parameters until it finds the minimum. Gradient Descent is used to minimise the error by…
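The iteration described above can be sketched in a few lines of Python. This is a minimal illustration, not the article's own code; the toy objective f(x) = x² and the hyperparameters are assumptions chosen so the minimum is easy to verify.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to approach a minimum.

    grad  -- function returning the gradient at a point
    x0    -- starting parameter value
    lr    -- learning rate (step size)
    steps -- number of iterations
    """
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # move opposite the slope
    return x

# Toy example: f(x) = x**2 has gradient 2*x and its global minimum at x = 0.
minimum = gradient_descent(lambda x: 2 * x, x0=5.0)
```

With a learning rate of 0.1, each step shrinks x by a factor of 0.8, so after 100 iterations the estimate sits very close to the true minimum at 0.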

The post A Lowdown On Alternatives To Gradient Descent Optimization Algorithms appeared first on Analytics India Magazine.