Mastering Activation Functions: ReLU, Sigmoid, and Tanh Essentials — Questions & Answers

This quiz contains 10 questions. Below is a complete reference of all questions, answer choices, and correct answers. You can use this section to review after taking the interactive quiz above.

  1. Question 1: Understanding the ReLU Function

    Which mathematical expression correctly represents the ReLU (Rectified Linear Unit) activation function?

    • A. f(x) = max(0, x)
    • B. f(x) = 1/(1 + e^(-x))
    • C. f(x) = (e^x - e^(-x))/(e^x + e^(-x))
    • D. f(x) = x^2
    • E. f(x) = min(0, x)

    Correct answer: A. f(x) = max(0, x)
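
    To verify option A concretely, here is a minimal Python sketch of ReLU using the standard definition f(x) = max(0, x):

    ```python
    # Minimal sketch of ReLU, assuming the standard definition f(x) = max(0, x).
    def relu(x: float) -> float:
        return max(0.0, x)

    print(relu(2.5))   # 2.5 (positive inputs pass through unchanged)
    print(relu(-2.5))  # 0.0 (negative inputs are clipped to zero)
    ```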

  2. Question 2: Sigmoid Function Output Range

    What is the range of output values for the sigmoid activation function?

    • A. (-∞, +∞)
    • B. (0, 1)
    • C. (-1, 1)
    • D. [0, ∞)
    • E. (-∞, 0]

    Correct answer: B. (0, 1)
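
    A quick numeric check (a minimal sketch using Python's math module) shows sigmoid squashing every real input into the open interval (0, 1):

    ```python
    import math

    def sigmoid(x: float) -> float:
        # 1 / (1 + e^(-x)) squashes any real input into (0, 1)
        return 1.0 / (1.0 + math.exp(-x))

    for x in (-10.0, 0.0, 10.0):
        print(x, sigmoid(x))  # approaches 0 and 1 but never reaches either bound
    ```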

  3. Question 3: Tanh Activation Example

    If you apply the tanh activation function to the input value 0, what will the output be?

    • A. 1
    • B. 0
    • C. -1
    • D. 0.5
    • E. -0.5

    Correct answer: B. 0
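
    Python's standard library exposes tanh directly, so the answer is easy to confirm:

    ```python
    import math

    # tanh is an odd function that passes through the origin, so tanh(0) = 0.
    print(math.tanh(0.0))  # 0.0
    ```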

  4. Question 4: ReLU Behavior with Negative Inputs

    What does the ReLU activation function output when the input is a negative number, such as -3?

    • A. -3
    • B. 1
    • C. 0
    • D. -1
    • E. 3

    Correct answer: C. 0
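
    The same clipping behavior is easy to see in vectorized form; the sketch below assumes NumPy is available:

    ```python
    import numpy as np

    x = np.array([-3.0, -1.0, 0.0, 2.0])
    print(np.maximum(0.0, x))  # [0. 0. 0. 2.] -- every negative entry becomes 0
    ```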

  5. Question 5: Identifying the Sigmoid Graph

    Which of the following best describes the shape of the sigmoid activation function’s graph?

    • A. A V-shape
    • B. S-shaped curve
    • C. A straight line
    • D. An oscillating wave
    • E. A parabolic curve

    Correct answer: B. S-shaped curve
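
    A crude ASCII plot, sampling the sigmoid at a few points, makes the S-shape visible: flat tails near 0 and 1 with a steep middle around x = 0:

    ```python
    import math

    # Each row's bar length is proportional to sigmoid(x): nearly empty for
    # large negative x, half-full at x = 0, nearly full for large positive x.
    for x in range(-6, 7, 2):
        s = 1.0 / (1.0 + math.exp(-x))
        print(f"{x:>3}: " + "#" * int(40 * s))
    ```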

  6. Question 6: Tanh Versus Sigmoid Output Ranges

    Compared to the sigmoid, which range does the tanh activation function cover?

    • A. (0, 1)
    • B. [1, 2]
    • C. (-1, 1)
    • D. [0, ∞)
    • E. (-∞, +∞)

    Correct answer: C. (-1, 1)
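
    Evaluating tanh at extreme inputs confirms the outputs stay strictly inside (-1, 1):

    ```python
    import math

    for x in (-10.0, 0.0, 10.0):
        print(x, math.tanh(x))  # values stay strictly inside (-1, 1)
    ```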

  7. Question 7: ReLU and the Vanishing Gradient Problem

    Which activation function is commonly chosen over sigmoid to help avoid the vanishing gradient problem in deep networks?

    • A. Sigmoid
    • B. Tanh
    • C. ReLU
    • D. Leaky Sigmoid
    • E. Softmax

    Correct answer: C. ReLU
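
    One way to see why: the sigmoid's derivative never exceeds 0.25, so gradients shrink as they are multiplied across many layers, while ReLU's derivative is exactly 1 for positive inputs. A minimal sketch comparing the two derivatives:

    ```python
    import math

    def sigmoid_grad(x: float) -> float:
        s = 1.0 / (1.0 + math.exp(-x))
        return s * (1.0 - s)  # peaks at 0.25 and decays toward 0 for large |x|

    def relu_grad(x: float) -> float:
        return 1.0 if x > 0 else 0.0  # constant 1 on the positive side

    for x in (0.0, 2.0, 5.0):
        print(x, sigmoid_grad(x), relu_grad(x))
    ```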

  8. Question 8: Typical Use Case for the Sigmoid Function

    For which of the following tasks is the sigmoid activation function most often used?

    • A. Text translation
    • B. Multi-class classification
    • C. Binary classification
    • D. Regression analysis
    • E. Data normalization

    Correct answer: C. Binary classification
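
    In a binary classifier, the sigmoid turns a raw model score into a probability for the positive class. The sketch below uses a hypothetical logit value of 1.3 and the common 0.5 decision threshold:

    ```python
    import math

    def sigmoid(z: float) -> float:
        return 1.0 / (1.0 + math.exp(-z))

    logit = 1.3                    # hypothetical raw score from a model
    p = sigmoid(logit)             # interpreted as P(class = 1)
    label = 1 if p >= 0.5 else 0   # common 0.5 decision threshold
    print(p, label)                # ~0.786, 1
    ```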

  9. Question 9: Tanh Function Centering

    What is a key advantage of the tanh activation function compared to the sigmoid function?

    • A. Its output is always positive
    • B. Its output is centered around zero
    • C. It has a linear output
    • D. It increases gradient size
    • E. It only outputs negative values

    Correct answer: B. Its output is centered around zero
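
    Averaging each function's outputs over inputs symmetric around zero illustrates the difference (a minimal sketch in plain Python):

    ```python
    import math

    xs = [x / 10.0 for x in range(-30, 31)]  # inputs symmetric around 0
    tanh_mean = sum(math.tanh(x) for x in xs) / len(xs)
    sig_mean = sum(1.0 / (1.0 + math.exp(-x)) for x in xs) / len(xs)
    print(tanh_mean)  # ~0.0 -- tanh outputs are centered around zero
    print(sig_mean)   # ~0.5 -- sigmoid outputs are biased positive
    ```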

  10. Question 10: Relation Between Functions

    Which statement about ReLU, sigmoid, and tanh activation functions is correct?

    • A. All three have the same output range
    • B. Only sigmoid produces negative outputs
    • C. Tanh and sigmoid are both nonlinear functions
    • D. ReLU always outputs negative values
    • E. Tanh and ReLU are linear functions

    Correct answer: C. Tanh and sigmoid are both nonlinear functions
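
    A simple linearity test, comparing f(a + b) against f(a) + f(b), shows both functions fail it, confirming they are nonlinear (ReLU is piecewise linear but also nonlinear overall):

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    a, b = 1.0, 2.0
    # A linear function would satisfy f(a + b) == f(a) + f(b); neither does.
    print(sigmoid(a + b), sigmoid(a) + sigmoid(b))        # ~0.953 vs ~1.612
    print(math.tanh(a + b), math.tanh(a) + math.tanh(b))  # ~0.995 vs ~1.726
    ```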