Activation Function

An activation function in the context of an artificial neural network is a mathematical function applied to the weighted sum of a node’s inputs (plus a bias term) to produce the node’s output, which then serves as input to the next layer in the network. The primary purpose of an activation function is to introduce non-linearity into the network, enabling it to learn complex patterns and perform tasks beyond what a purely linear classifier or regressor can achieve.
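As a minimal sketch of this idea in Python (using NumPy; the function names node_output and relu, and the sample weights and bias, are illustrative assumptions, not part of any particular library):

  import numpy as np

  def relu(z):
      # ReLU activation: returns z for positive inputs, 0 otherwise
      return np.maximum(0.0, z)

  def node_output(inputs, weights, bias):
      # Weighted sum of the inputs plus a bias, passed through the activation
      z = np.dot(weights, inputs) + bias
      return relu(z)

  # Example: a single node with three inputs
  x = np.array([0.5, -1.2, 3.0])
  w = np.array([0.4, 0.7, -0.2])
  b = 0.1
  print(node_output(x, w, b))

Without the activation step, stacking such nodes in layers would still compute only a linear function of the inputs; the non-linearity is what gives depth its expressive power.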

Areas of application

  • Computer Vision
  • Natural Language Processing
  • Speech Recognition
  • Robotics

Example

Common examples of activation functions include the sigmoid, the hyperbolic tangent (tanh), and the Rectified Linear Unit (ReLU).
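These three functions could be written in Python with NumPy roughly as follows (a sketch for illustration, not tied to any particular framework’s implementation):

  import numpy as np

  def sigmoid(z):
      # Sigmoid: squashes inputs into the range (0, 1)
      return 1.0 / (1.0 + np.exp(-z))

  def tanh(z):
      # Hyperbolic tangent: squashes inputs into the range (-1, 1)
      return np.tanh(z)

  def relu(z):
      # ReLU: zero for negative inputs, identity for positive inputs
      return np.maximum(0.0, z)

  z = np.array([-2.0, 0.0, 2.0])
  print(sigmoid(z))  # approx. [0.119, 0.5, 0.881]
  print(tanh(z))     # approx. [-0.964, 0.0, 0.964]
  print(relu(z))     # [0.0, 0.0, 2.0]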