Batch Normalization

A method used in training artificial neural networks that normalizes the inputs to each layer — rescaling them to zero mean and unit variance over the current mini-batch, then applying a learnable scale and shift — to make the training process faster and more stable.
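The per-feature computation described above can be sketched as follows; this is a minimal NumPy illustration of the forward pass (the function name, `eps` default, and shapes are assumptions for this example, not a reference implementation):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch x of shape (batch, features).

    gamma and beta are the learnable per-feature scale and shift.
    """
    # Statistics are taken over the batch dimension (axis 0)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize to zero mean / unit variance; eps guards against division by zero
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale and shift restore representational capacity
    return gamma * x_hat + beta
```

With `gamma = 1` and `beta = 0`, the output of each feature has (approximately) zero mean and unit variance over the batch; during training, gamma and beta are updated by gradient descent like any other parameters.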

Areas of application

  • Artificial Neural Networks
  • Deep Learning Models
  • Convolutional Neural Networks
  • Recurrent Neural Networks
  • Gradient Descent Optimization
  • Image Processing
  • Data Preprocessing
  • Computer Vision

Example

In a deep learning model for image classification, batch normalization can be applied to the activations of each convolutional layer: statistics are computed per channel across the batch and spatial dimensions, which reduces internal covariate shift and improves the generalization of the model.
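For convolutional activations, the same idea applies per channel, with statistics pooled over the batch and spatial dimensions. A minimal NumPy sketch, assuming `(N, C, H, W)` layout (the function name and layout are illustrative assumptions):

```python
import numpy as np

def batch_norm_conv(x, gamma, beta, eps=1e-5):
    """Batch-normalize conv activations x of shape (N, C, H, W).

    gamma and beta hold one learnable scale/shift value per channel C.
    """
    # Per-channel statistics over batch (N) and spatial (H, W) dimensions
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Broadcast the per-channel parameters across N, H, W
    return gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)
```

Because each channel shares one mean and variance across all spatial positions, the number of learnable parameters is only 2·C regardless of the image size.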