Concept:
Activation functions introduce non-linearity into neural networks; without them, a stack of layers would collapse into a single linear transformation. Different activation functions have different output ranges and properties.
Some common activation functions include:
- Sigmoid: Output range \( (0,1) \)
- ReLU: Output range \( [0, \infty) \)
- Tanh: Output range \( (-1,1) \)
- Softmax: Each output component lies in \( (0,1) \) and the components sum to \(1\), so it is used to produce probability distributions
The Tanh (Hyperbolic Tangent) function is zero-centered and produces outputs between \(-1\) and \(1\).
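The output ranges above can be verified numerically. A minimal sketch (the function names `sigmoid` and `relu` are local helpers, not from any library):

```python
import math

def sigmoid(x):
    # maps any real x into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # maps any real x into [0, inf)
    return max(0.0, x)

# compare the three activations at a few sample inputs
for x in (-2.0, 0.0, 2.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.4f}  "
          f"relu={relu(x):.4f}  tanh={math.tanh(x):.4f}")
```

Note that only tanh yields negative outputs for negative inputs, which is the zero-centered behavior discussed below.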
Step 1: Understand the Tanh activation function.
The mathematical form of the Tanh function is:
\[
\tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}
\]
This function maps real-valued inputs into the range:
\[
-1<\tanh(x)<1
\]
Step 2: Check the zero-centered property.
When \(x = 0\):
\[
\tanh(0) = 0
\]
Moreover, Tanh is an odd function, \(\tanh(-x) = -\tanh(x)\), so its outputs are balanced around zero. This avoids the systematic positive bias that activations with all-positive outputs (such as Sigmoid) introduce into gradient updates, which helps neural networks learn more efficiently during training.
Step 3: Conclusion.
Since the Tanh activation function produces outputs in the range \((-1,1)\) and is centered at zero, it satisfies the condition given in the question.
\[
\text{Answer: Tanh}
\]