Archives for Adagrad
Gradient Descent is the most common optimisation strategy used in machine learning frameworks. It is an iterative algorithm that minimises a function towards a local or global minimum. In simple words, Gradient Descent iterates over a function, adjusting its parameters until it finds the minimum. Gradient Descent is used to minimise the error by…
The post A Lowdown On Alternatives To Gradient Descent Optimization Algorithms appeared first on Analytics India Magazine.
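The excerpt above describes gradient descent as iteratively adjusting a parameter to minimise a function. A minimal sketch of that idea (not from the article; the function, learning rate, and step count are illustrative assumptions):

```python
# Minimal gradient-descent sketch: minimise f(x) = (x - 3)^2,
# whose gradient is 2 * (x - 3). All values here are illustrative.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to move toward a minimum."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # adjust the parameter opposite the gradient
    return x

minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges toward the minimum at x = 3
```

With a suitable learning rate the iterate contracts toward x = 3 geometrically; too large a rate would overshoot and diverge, which is why alternatives and adaptive schemes (such as Adagrad, this archive's tag) exist.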
A key balancing act in machine learning is choosing an appropriate level of model complexity: if the model is too complex, it will fit the data used to construct it very well but generalise poorly to unseen data (overfitting); if the complexity is too low, the model won't capture all the information in the…
The post Why Learning Rate Is Crucial In Deep Learning appeared first on Analytics India Magazine.
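The complexity trade-off described above can be made concrete with polynomial regression; the data and degrees below are hypothetical choices for illustration, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: noisy samples from a simple underlying line y = 2x.
x_train = np.linspace(0.0, 1.0, 10)
y_train = 2 * x_train + rng.normal(scale=0.1, size=10)
x_test = np.linspace(0.0, 1.0, 50)
y_test = 2 * x_test  # noise-free targets for measuring generalisation

def errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

train1, test1 = errors(1)  # simple model
train9, test9 = errors(9)  # complex model: 10 coefficients for 10 points
print(train1, test1)
print(train9, test9)
```

The degree-9 polynomial drives the training error to nearly zero by fitting the noise, yet it typically generalises worse than the simple degree-1 model: the overfitting half of the balancing act.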