Boost Your Neural Networks: Quiz on ReLU, Sigmoid, and Tanh Activation Functions — Questions & Answers

This quiz contains 10 questions. Below is a complete reference of all questions, answer choices, and correct answers. You can use this section to review after taking the interactive quiz above.

  1. Question 1: Basic Concept

    Which of the following activation functions outputs values strictly between 0 and 1 for any real input?

    • ReLU
    • Sigmoid
    • Tanh
    • Softplus
    • Swish

    Correct answer: Sigmoid
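
    As a quick sanity check of this answer, here is a minimal sketch in plain Python (function name ours, standard library only) that evaluates the sigmoid 1 / (1 + e^-x) at a range of inputs and confirms the output stays strictly between 0 and 1.

    ```python
    import math

    def sigmoid(x: float) -> float:
        # Logistic sigmoid: 1 / (1 + e^-x); mathematically always in (0, 1).
        return 1.0 / (1.0 + math.exp(-x))

    for x in [-10.0, -1.0, 0.0, 1.0, 10.0]:
        y = sigmoid(x)
        assert 0.0 < y < 1.0
        print(f"sigmoid({x:6.1f}) = {y:.6f}")
    # Ranges from sigmoid( -10.0) = 0.000045 up to sigmoid(  10.0) = 0.999955,
    # never touching 0 or 1 exactly.
    ```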

  2. Question 2: Function Form

    What is the mathematical expression for the ReLU activation function?

    • max(0, x)
    • 1 / (1 + e^-x)
    • tanh(x)
    • min(1, x)
    • log(1+e^x)

    Correct answer: max(0, x)
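
    To make the formula concrete, here is a one-line ReLU in plain Python; note that it also previews Questions 7 and 10 below, since max(0, 0) = 0 and max(0, -5) = 0.

    ```python
    def relu(x: float) -> float:
        # Rectified Linear Unit: identity for positive inputs, zero otherwise.
        return max(0.0, x)

    print(relu(3.5))   # 3.5  (positive input passes through unchanged)
    print(relu(0.0))   # 0.0  (boundary case, see Question 7)
    print(relu(-5.0))  # 0.0  (negative input clipped, see Question 10)
    ```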

  3. Question 3: Output Range

    If you input large negative values into the Tanh activation function, what is the approximate output?

    • Close to 1
    • Close to 0
    • Close to -1
    • Exactly 0
    • Exactly 1

    Correct answer: Close to -1
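
    A short numerical check (plain Python) shows tanh approaching, but never mathematically reaching, -1 as the input grows more negative.

    ```python
    import math

    for x in [-1.0, -3.0, -6.0]:
        print(f"tanh({x:5.1f}) = {math.tanh(x):.6f}")
    # tanh( -1.0) = -0.761594
    # tanh( -3.0) = -0.995055
    # tanh( -6.0) = -0.999988
    ```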

  4. Question 4: Identity Comparison

    Which activation function among ReLU, Sigmoid, and Tanh is non-linear yet acts as the identity for every positive input, passing it through unchanged?

    • Sigmoid
    • ReLU
    • Tanh
    • Softmax
    • Leaky ReLU

    Correct answer: ReLU
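
    The sketch below (plain Python, helper names ours) contrasts the three functions on the same positive input: only ReLU returns it unchanged, while sigmoid and tanh both compress it.

    ```python
    import math

    def relu(x: float) -> float:
        return max(0.0, x)

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    x = 2.0
    print(relu(x) == x)          # True: ReLU is the identity for positive inputs
    print(f"{math.tanh(x):.4f}") # 0.9640: tanh compresses positives toward 1
    print(f"{sigmoid(x):.4f}")   # 0.8808: sigmoid compresses positives into (0, 1)
    ```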

  5. Question 5: Derivative Behavior

    For which input does the derivative of the Sigmoid activation function reach its maximum value?

    • x = 0
    • x = 1
    • x = -1
    • x = infinity
    • x = -infinity

    Correct answer: x = 0
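
    The derivative has the closed form sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), which peaks at x = 0 with value 0.25 and decays on both sides; the plain-Python sketch below evaluates it at a few points to confirm.

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    def sigmoid_grad(x: float) -> float:
        # Closed-form derivative: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)).
        s = sigmoid(x)
        return s * (1.0 - s)

    for x in [-2.0, -1.0, 0.0, 1.0, 2.0]:
        print(f"sigmoid'({x:4.1f}) = {sigmoid_grad(x):.4f}")
    # Peaks at 0.2500 for x = 0 and falls off symmetrically on either side.
    ```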

  6. Question 6: Scenario Application

    If you want your activation function's output to cover both negative and positive ranges symmetrically, which should you use?

    • Sigmoid
    • Tanh
    • ReLU
    • Softplus
    • Linear

    Correct answer: Tanh
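
    Tanh is an odd function, tanh(-x) = -tanh(x), so its output range (-1, 1) is symmetric about zero, unlike sigmoid's (0, 1). A quick check in plain Python:

    ```python
    import math

    for x in [0.5, 1.0, 2.0]:
        assert math.isclose(math.tanh(-x), -math.tanh(x))
    print("tanh(-x) == -tanh(x): outputs cover (-1, 1) symmetrically about 0")
    ```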

  7. Question 7: Zero Value Handling

    What value does ReLU return if the input is zero?

    • Zero
    • One
    • Negative one
    • Infinity
    • Half

    Correct answer: Zero

  8. Question 8: Activation Function Saturation

    Which activation function is prone to the vanishing gradient problem because its output saturates for large positive or negative inputs?

    • Tanh
    • ReLU
    • Leaky ReLU
    • Swish
    • Poly

    Correct answer: Tanh
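
    Saturation shows up directly in the gradient: the derivative of tanh is 1 - tanh(x)^2, which collapses toward zero as |x| grows. The illustrative sketch below (plain Python) prints the gradient at increasing input magnitudes.

    ```python
    import math

    def tanh_grad(x: float) -> float:
        # Derivative of tanh: 1 - tanh(x)^2; nearly zero once tanh saturates.
        t = math.tanh(x)
        return 1.0 - t * t

    for x in [0.0, 2.0, 5.0, 10.0]:
        print(f"tanh'({x:4.1f}) = {tanh_grad(x):.2e}")
    # tanh'( 0.0) = 1.00e+00
    # tanh'( 2.0) = 7.07e-02
    # tanh'( 5.0) = 1.82e-04
    # tanh'(10.0) = 8.24e-09  <- gradients this small barely update weights
    ```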

  9. Question 9: Probabilistic Output

    If you need an output interpreted as a probability, which activation function is most suitable at an output layer?

    • ReLU
    • Tanh
    • Sigmoid
    • Piecewise Linear
    • Exponential

    Correct answer: Sigmoid
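
    This is why sigmoid is the standard output activation for binary classification: it maps an unbounded raw score (logit) to a value in (0, 1) that can be read as a probability. A minimal sketch, with a made-up logit for illustration:

    ```python
    import math

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    logit = 1.25  # hypothetical raw score from a network's final layer
    p = sigmoid(logit)
    print(f"P(class = 1) = {p:.3f}")  # ~0.777, a valid probability
    ```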

  10. Question 10: Negative Inputs

    When using the ReLU activation function, what is the output for an input value of -5?

    • -5
    • 0
    • 5
    • 1
    • -1

    Correct answer: 0

    ReLU clips all negative inputs to zero: ReLU(-5) = max(0, -5) = 0, as demonstrated in the ReLU sketch under Question 2.