Mastering Activation Functions: ReLU, Sigmoid, and Tanh Essentials Quiz

  1. Understanding the ReLU Function

    Which mathematical expression correctly represents the ReLU (Rectified Linear Unit) activation function?

    A. f(x) = max(0, x)
    B. f(x) = 1/(1 + e^(-x))
    C. f(x) = (e^x - e^(-x))/(e^x + e^(-x))
    D. f(x) = x^2
    E. f(x) = min(0, x)
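
    As a quick sanity check of this definition, here is a minimal Python sketch (the helper name `relu` is illustrative):

    ```python
    def relu(x: float) -> float:
        # ReLU: identity for positive inputs, zero for everything else.
        return max(0.0, x)

    print(relu(2.5))   # 2.5 (positive inputs pass through unchanged)
    print(relu(-2.5))  # 0.0 (negative inputs are clipped to zero)
    ```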
  2. Sigmoid Function Output Range

    What is the range of output values for the sigmoid activation function?

    A. (-∞, +∞)
    B. (0, 1)
    C. (-1, 1)
    D. [0, ∞)
    E. (-∞, 0]
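
    To see the range numerically, a minimal Python sketch of the sigmoid (the helper name is illustrative):

    ```python
    import math

    def sigmoid(x: float) -> float:
        # Sigmoid: f(x) = 1 / (1 + e^(-x)).
        return 1.0 / (1.0 + math.exp(-x))

    # Even for extreme inputs the output stays strictly between 0 and 1.
    for x in (-20.0, -1.0, 0.0, 1.0, 20.0):
        print(f"sigmoid({x:+.0f}) = {sigmoid(x):.9f}")
    ```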
  3. Tanh Activation Example

    If you apply the tanh activation function to the input value 0, what will the output be?

    A. 1
    B. 0
    C. -1
    D. 0.5
    E. -0.5
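
    A one-line check with Python's standard library:

    ```python
    import math

    # tanh(0) = (e^0 - e^0) / (e^0 + e^0) = 0 / 2 = 0
    print(math.tanh(0.0))  # 0.0
    ```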
  4. ReLU Behavior with Negative Inputs

    What does the ReLU activation function output when the input is a negative number, such as -3?

    A. -3
    B. 1
    C. 0
    D. -1
    E. 3
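
    A minimal sketch of the same clipping behavior applied elementwise to a toy batch (the input values are made up for illustration):

    ```python
    def relu(x: float) -> float:
        return max(0.0, x)

    # ReLU applied elementwise to a small batch of pre-activations.
    inputs = [-3.0, -0.5, 0.0, 2.0]
    print([relu(x) for x in inputs])  # [0.0, 0.0, 0.0, 2.0]
    ```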
  5. Identifying the Sigmoid Graph

    Which of the following best describes the shape of the sigmoid activation function’s graph?

    A. A V-shaped curve
    B. An S-shaped curve
    C. A straight line
    D. An oscillating wave
    E. A parabolic curve
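
    One crude way to see the shape without a plotting library is to print proportional bars (a minimal sketch; the sampling points are arbitrary):

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    # Crude text rendering: flat tail near 0, steep middle, flat tail near 1.
    for x in range(-6, 7, 2):
        bar = "#" * round(sigmoid(x) * 40)
        print(f"x = {x:+d}  {bar}")
    ```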
  6. Tanh Versus Sigmoid Output Ranges

    In contrast to the sigmoid, what output range does the tanh activation function cover?

    A. (0, 1)
    B. [1, 2]
    C. (-1, 1)
    D. [0, ∞)
    E. (-∞, ∞)
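
    A numeric check of the saturation behavior (a minimal sketch; the sample points are arbitrary):

    ```python
    import math

    # Mathematically tanh maps into the open interval (-1, 1); it saturates
    # toward ±1 for large |x| but never reaches them exactly.
    for x in (-5.0, -1.0, 0.0, 1.0, 5.0):
        print(f"tanh({x:+.0f}) = {math.tanh(x):+.7f}")
    ```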
  7. ReLU and the Vanishing Gradient Problem

    Which activation function is commonly chosen over sigmoid to help avoid the vanishing gradient problem in deep networks?

    A. Sigmoid
    B. Tanh
    C. ReLU
    D. Leaky Sigmoid
    E. Softmax
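
    To make the contrast concrete, a minimal sketch comparing the two derivatives (helper names are illustrative; sigmoid'(x) = s·(1 − s) where s = sigmoid(x), while ReLU'(x) = 1 for all x > 0):

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    def sigmoid_grad(x: float) -> float:
        s = sigmoid(x)
        return s * (1.0 - s)  # peaks at 0.25 and decays toward 0 in the tails

    def relu_grad(x: float) -> float:
        return 1.0 if x > 0 else 0.0  # a constant 1 for all positive inputs

    # The sigmoid gradient shrinks rapidly away from zero; ReLU's does not.
    for x in (0.0, 2.0, 5.0, 10.0):
        print(f"x = {x:>4}: sigmoid' = {sigmoid_grad(x):.6f}, relu' = {relu_grad(x)}")
    ```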
  8. Typical Use Case for the Sigmoid Function

    For which of the following tasks is the sigmoid activation function most often used?

    A. Text translation
    B. Multi-class classification
    C. Binary classification
    D. Regression analysis
    E. Data normalization
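
    A minimal sketch of how the sigmoid is typically used as the output of a binary classifier (the logit value and the 0.5 threshold are made up for illustration):

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    # Hypothetical logit from the final linear layer of a binary classifier.
    logit = 1.3
    p_positive = sigmoid(logit)             # read as P(class = 1), here ~0.786
    predicted = 1 if p_positive >= 0.5 else 0
    print(p_positive, predicted)
    ```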
  9. Tanh Function Centering

    What is a key advantage of the tanh activation function compared to the sigmoid function?

    A. Its output is always positive
    B. It centers data around zero
    C. It has a linear output
    D. It increases gradient size
    E. It only outputs negative values
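
    A minimal sketch demonstrating the zero-centering on a toy batch of inputs symmetric around zero (the values are made up for illustration):

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    # A toy batch of inputs symmetric around zero.
    xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]

    tanh_mean = sum(math.tanh(x) for x in xs) / len(xs)
    sigmoid_mean = sum(sigmoid(x) for x in xs) / len(xs)
    print(tanh_mean)     # ≈ 0.0 — tanh output is centered around zero
    print(sigmoid_mean)  # ≈ 0.5 — sigmoid output is biased toward positive values
    ```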
  10. Relationships Among the Functions

    Which statement about ReLU, sigmoid, and tanh activation functions is correct?

    A. All three have the same output range
    B. Only sigmoid produces negative outputs
    C. Tanh and sigmoid are both nonlinear functions
    D. ReLU always outputs negative values
    E. Tanh and ReLU are linear functions
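
    One way to check the nonlinearity claims: a linear map must satisfy f(a + b) = f(a) + f(b), and all three functions violate it (a minimal sketch; the test points are arbitrary):

    ```python
    import math

    def relu(x: float) -> float:
        return max(0.0, x)

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    # Additivity fails for every one of the three, so none of them is linear.
    a, b = 1.0, -2.0
    for name, f in (("relu", relu), ("sigmoid", sigmoid), ("tanh", math.tanh)):
        print(f"{name}: f(a+b) = {f(a + b):+.4f}, f(a) + f(b) = {f(a) + f(b):+.4f}")
    ```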