Question:

Which activation function is zero-centered and ranges between \(-1\) and \(1\)?

Hint:

Sigmoid vs Tanh: Sigmoid outputs values between \(0\) and \(1\), while Tanh outputs values between \(-1\) and \(1\) and is zero-centered, which often leads to faster convergence in neural networks.
Updated On: Mar 16, 2026
  • Sigmoid
  • ReLU
  • Tanh
  • Softmax
Verified By Collegedunia

The Correct Option is C

Solution and Explanation


Concept: Activation functions are used in neural networks to introduce non-linearity into the model. Different activation functions have different output ranges and properties. Some common activation functions include:
  • Sigmoid: Output range \( (0,1) \)
  • ReLU: Output range \( [0, \infty) \)
  • Tanh: Output range \( (-1,1) \)
  • Softmax: Output range \( (0,1) \) and used for probability distributions
The Tanh (Hyperbolic Tangent) function is zero-centered and produces outputs between \(-1\) and \(1\).
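The output ranges listed above can be checked numerically. Below is a minimal sketch using only Python's standard library; the function names (`sigmoid`, `relu`, `softmax`) are our own illustrative definitions, not from any particular framework:

```python
import math

def sigmoid(x):
    # Maps any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Maps any real input into [0, inf)
    return max(0.0, x)

def softmax(xs):
    # Maps a vector to a probability distribution: each entry in (0, 1), summing to 1
    exps = [math.exp(x - max(xs)) for x in xs]  # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

samples = [-5.0, -1.0, 0.0, 1.0, 5.0]
assert all(0 < sigmoid(x) < 1 for x in samples)          # Sigmoid: (0, 1)
assert all(relu(x) >= 0 for x in samples)                # ReLU: [0, inf)
assert all(-1 < math.tanh(x) < 1 for x in samples)       # Tanh: (-1, 1)
assert abs(sum(softmax(samples)) - 1.0) < 1e-9           # Softmax sums to 1
```

Running the checks confirms that only Tanh produces values on both sides of zero.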
Step 1: Understand the Tanh activation function.
The mathematical form of the Tanh function is: \[ \tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}} \] This function maps real-valued inputs into the range: \[ -1<\tanh(x)<1 \]
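The definition above can be verified directly against Python's built-in `math.tanh` (a small standard-library sketch, not part of the original solution):

```python
import math

def tanh_from_def(x):
    # tanh(x) = (e^x - e^-x) / (e^x + e^-x)
    ex, enx = math.exp(x), math.exp(-x)
    return (ex - enx) / (ex + enx)

for x in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    # Matches the library implementation and stays strictly inside (-1, 1)
    assert abs(tanh_from_def(x) - math.tanh(x)) < 1e-9
    assert -1 < tanh_from_def(x) < 1
```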
Step 2: Check the zero-centered property.
When \(x = 0\): \[ \tanh(0) = 0 \] More generally, Tanh is an odd function, \(\tanh(-x) = -\tanh(x)\), so its outputs are symmetric around zero. Zero-centered activations tend to produce better-balanced gradient updates, which helps neural networks learn more efficiently during training.
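The zero-centered property can be checked numerically (a quick standard-library sketch):

```python
import math

assert math.tanh(0.0) == 0.0  # centered at zero

for x in [0.1, 1.0, 2.5]:
    # Odd symmetry: negative inputs mirror positive ones around zero
    assert abs(math.tanh(-x) + math.tanh(x)) < 1e-12
```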
Step 3: Conclusion.
Since the Tanh activation function produces outputs in the range \((-1,1)\) and is centered at zero, it satisfies the condition given in the question. \[ \text{Answer: Tanh} \]