Archives for SGD

03 Jun

Does Deep Learning Suffer From Too Many Optimizers?

“There is no single optimizer that dominates its competitors across all tasks.” Critics often dismiss machine learning as ‘glorified statistics’, and the argument has some merit. The fundamental function of any machine learning model is pattern recognition, which rests on convergence: the process of fitting the model to data. To that end,…

12 Sep

A Lowdown On Alternatives To Gradient Descent Optimization Algorithms

Gradient Descent is the most common optimisation strategy used in machine learning frameworks. It is an iterative algorithm used to minimise a function toward a local or global minimum. In simple words, Gradient Descent iterates over a function, adjusting its parameters until it finds the minimum. Gradient Descent is used to minimise the error by…
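
The excerpt describes the basic update rule; as a minimal sketch (the quadratic loss, starting point, and learning rate below are illustrative assumptions, not details from the article), gradient descent repeatedly steps against the gradient:

```python
# Minimal gradient descent sketch: minimise f(x) = (x - 3)**2.
# The loss, initial value, and learning rate are illustrative choices.

def grad(x):
    """Analytic gradient of f(x) = (x - 3)**2."""
    return 2.0 * (x - 3.0)

x = 0.0    # initial parameter value
lr = 0.1   # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)  # step against the gradient

print(x)  # converges toward the minimiser x = 3
```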

21 Sep

How Stochastic Gradient Descent Is Solving Optimisation Problems In Deep Learning

To a large extent, deep learning is all about solving optimisation problems. According to computer science researchers, stochastic gradient descent, better known as SGD, has become the workhorse of deep learning, which, in turn, is responsible for the remarkable progress in computer vision. Despite its apparent simplicity, SGD is little more than a variant of classical gradient descent […]
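
As a rough illustration of the variant the excerpt names (the synthetic dataset, learning rate, and epoch count below are assumptions for the example, not details from the article), SGD replaces the full-batch gradient with noisy per-sample updates:

```python
# Illustrative SGD sketch: fit y = w*x with per-sample updates
# instead of one full-batch gradient step per iteration.
import random

random.seed(0)
# Synthetic data generated from y = 2x (an assumption for this example).
data = [(x, 2.0 * x) for x in range(1, 11)]

w, lr = 0.0, 0.001
for epoch in range(50):
    random.shuffle(data)  # stochasticity: visit samples in random order
    for x, y in data:
        g = 2.0 * (w * x - y) * x  # gradient of (w*x - y)**2 w.r.t. w
        w -= lr * g                # noisy single-sample step

print(w)  # approaches the true slope 2.0
```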
