Question:

Which of the following statements is/are correct about the rectified linear unit (ReLU) activation function defined as ReLU(x) = max(x, 0), where \( x \in \mathbb{R} \)?

Hint:

The ReLU function is continuous everywhere, but not differentiable at \( x = 0 \) due to the sharp corner. When using ReLU in neural networks, this is handled by techniques like subgradient descent.
  • (A) ReLU is continuous everywhere
  • (B) ReLU is differentiable everywhere
  • (C) ReLU is not differentiable at \( x = 0 \)
  • (D) ReLU(x) = ReLU(ax), for all \( a \in \mathbb{R} \)

The correct options are (A) and (C).

Solution and Explanation

The Rectified Linear Unit (ReLU) function is defined as:

\[ \text{ReLU}(x) = \max(x, 0) \]
This function returns \( x \) for positive values and 0 for non-positive values of \( x \).
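As a quick illustration of this definition (a minimal sketch, not part of the original solution; the helper name relu is chosen here for the example):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: returns x where x > 0 and 0 elsewhere."""
    return np.maximum(x, 0)

print(relu(np.array([-2.0, 0.0, 3.5])))  # [0.  0.  3.5]
```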

Step 1: Continuity of ReLU
The ReLU function is continuous everywhere. This is because:

For \( x > 0 \), ReLU behaves like the identity function, \( \text{ReLU}(x) = x \), which is continuous.
For \( x \le 0 \), ReLU is constant at 0, which is also continuous.
At \( x = 0 \), the left-hand and right-hand limits both approach 0, and the function value is 0, ensuring continuity.

Thus, ReLU is continuous everywhere, so Option (A) is correct.
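As an illustrative numerical check of this continuity argument (a sketch added here, not part of the original solution), evaluating ReLU just to the left and right of 0 shows both sides approaching \( \text{ReLU}(0) = 0 \):

```python
import numpy as np

# Approach x = 0 from both sides with shrinking step sizes.
eps = np.array([1e-1, 1e-3, 1e-6])
print(np.maximum(-eps, 0))  # left of 0:  [0. 0. 0.]
print(np.maximum(eps, 0))   # right of 0: [1.e-01 1.e-03 1.e-06], shrinking toward 0
```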

Step 2: Differentiability of ReLU
For \( x > 0 \), the derivative of ReLU is \( 1 \) (since ReLU behaves like the identity function there).
For \( x < 0 \), the derivative of ReLU is \( 0 \) (since ReLU is constant at 0 there).

However, at \( x = 0 \) the function has a sharp corner: the left-hand derivative is 0 and the right-hand derivative is 1. Because these one-sided derivatives disagree, the derivative does not exist at \( x = 0 \), so ReLU is not differentiable there.

Thus, Option (C) is correct.
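A small numerical sketch of this argument (illustrative only, not part of the original solution): the one-sided difference quotients at \( x = 0 \) disagree. In practice, deep-learning libraries commonly sidestep this by assigning a fixed subgradient value (often 0) at \( x = 0 \), which is what the hint about subgradient methods refers to.

```python
def relu(x):
    return max(x, 0.0)

h = 1e-6
left  = (relu(0.0) - relu(-h)) / h   # slope approaching from the left  -> 0.0
right = (relu(h) - relu(0.0)) / h    # slope approaching from the right -> 1.0
print(left, right)                   # 0.0 1.0 -- the one-sided slopes disagree at x = 0
```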

Step 3: Scaling Property of ReLU
The identity \( \text{ReLU}(x) = \text{ReLU}(ax) \) does not hold for all \( a \in \mathbb{R} \). Scaling the argument by a positive constant rescales the output: \( \text{ReLU}(ax) = a\,\text{ReLU}(x) \) for \( a > 0 \) (positive homogeneity), which in general differs from \( \text{ReLU}(x) \). For example, with \( a = 2 \) and \( x = 1 \), \( \text{ReLU}(2) = 2 \neq 1 = \text{ReLU}(1) \). Therefore, Option (D) is incorrect.
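The counterexample above can be verified directly (a minimal sketch; the relu helper is again illustrative, not from the original question):

```python
def relu(x):
    return max(x, 0.0)

# Counterexample to Option (D): scaling the input changes the output.
a, x = 2.0, 1.0
print(relu(a * x), relu(x))        # 2.0 1.0 -> ReLU(ax) != ReLU(x)

# What does hold is positive homogeneity: ReLU(ax) = a * ReLU(x) for a > 0.
print(relu(a * x) == a * relu(x))  # True
```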

Conclusion:
ReLU is continuous everywhere (Option A) and is not differentiable at \( x = 0 \) (Option C); Options (B) and (D) are incorrect.

Thus, the correct answer is (A) and (C).