Stochastic Gradient Descent (SGD)

An iterative optimization algorithm widely used in machine learning and deep learning to find the model parameters that best fit predicted outputs to actual outputs. Unlike batch gradient descent, which computes the gradient of the loss over the entire dataset before each update, SGD estimates the gradient from a single randomly chosen sample (or a small mini-batch) and updates the parameters immediately, making each step much cheaper.
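The core parameter update is a single step against the gradient. A minimal sketch in Python, where the names `theta`, `grad`, and `lr` (learning rate) are illustrative:

```python
def sgd_step(theta, grad, lr=0.1):
    # Move the parameter opposite the gradient of the loss,
    # computed on one randomly drawn example (or mini-batch).
    return theta - lr * grad

theta = sgd_step(theta=5.0, grad=2.0, lr=0.1)  # 5.0 - 0.1 * 2.0 = 4.8
```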

Areas of application

  • Machine learning
  • Deep learning
  • Neural networks
  • Optimization algorithms

Example

For instance, when training a neural network to classify images, SGD updates the network's weights using the gradient computed on a random subset (a mini-batch) of the training images, rather than on the entire dataset.
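The same idea can be sketched end to end on a toy problem. Here is a minimal plain-Python example (no neural network, just a line y = w·x + b fit by per-sample SGD on squared error); the data and hyperparameters are illustrative:

```python
import random

def sgd_fit_line(data, lr=0.01, epochs=200, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on squared error."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)            # visit samples in random order
        for x, y in data:            # one sample per update: the "stochastic" part
            err = (w * x + b) - y    # prediction error on this sample
            w -= lr * 2 * err * x    # gradient of err**2 with respect to w
            b -= lr * 2 * err        # gradient of err**2 with respect to b
    return w, b

# Noiseless data from the line y = 2x + 1
points = [(x / 10, 2 * (x / 10) + 1) for x in range(-10, 11)]
w, b = sgd_fit_line(points)
```

Replacing the inner loop with updates over small random batches of samples gives the mini-batch variant described above.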