Boost Your Neural Networks: A Quiz on ReLU, Sigmoid, and Tanh Activation Functions

  1. Basic Concept

    Which of the following activation functions outputs values strictly between 0 and 1 for any real input?

    1. ReLU
    2. Sigmoid
    3. Tanh
    4. Softplus
    5. Swish
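As a quick sanity check (a minimal Python sketch, not part of the quiz), the logistic sigmoid can be written directly from its formula and probed across a range of inputs to confirm its output stays strictly inside (0, 1):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: 1 / (1 + e^-x); output lies strictly in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Even for fairly extreme inputs, the output never reaches 0 or 1 exactly
# (within the limits of double-precision floats).
for x in (-30.0, -1.0, 0.0, 1.0, 30.0):
    y = sigmoid(x)
    assert 0.0 < y < 1.0
```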
  2. Function Form

    What is the mathematical expression for the ReLU activation function?

    1. max(0, x)
    2. 1 / (1 + e^-x)
    3. tanh(x)
    4. min(1, x)
    5. log(1+e^x)
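For reference, ReLU is simple enough to state in one line of Python (a sketch for illustration):

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: max(0, x)."""
    return max(0.0, x)

print(relu(3.5))   # positive inputs pass through unchanged -> 3.5
print(relu(-2.0))  # negative inputs are clamped to zero -> 0.0
```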
  3. Output Range

    If you input large negative values into the Tanh activation function, what is the approximate output?

    1. Close to 1
    2. Close to 0
    3. Close to -1
    4. Exactly 0
    5. Exactly 1
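You can verify Tanh's limiting behavior with the standard library (a small sketch): for large negative inputs the output saturates toward -1, and for large positive inputs toward +1, without ever reaching either value exactly.

```python
import math

# tanh saturates: large negative inputs approach -1, large positive approach +1.
print(math.tanh(-10.0))  # approximately -1 (about -0.9999999958)
print(math.tanh(10.0))   # approximately +1
```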
  4. Identity Comparison

    Which activation function among ReLU, Sigmoid, and Tanh is non-linear and maps negative inputs to negative outputs, rather than squashing them into a non-negative range?

    1. Sigmoid
    2. ReLU
    3. Tanh
    4. Softmax
    5. Leaky ReLU
  5. Derivative Behavior

    For which input does the derivative of the Sigmoid activation function reach its maximum value?

    1. x = 0
    2. x = 1
    3. x = -1
    4. x = infinity
    5. x = -infinity
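The sigmoid's derivative has the closed form σ'(x) = σ(x)(1 - σ(x)), so its maximum can be checked numerically (a minimal sketch using that identity):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x: float) -> float:
    """Derivative of the sigmoid: sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# The gradient peaks at x = 0 (value 0.25) and decays symmetrically.
for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"x={x:+.1f}  grad={sigmoid_grad(x):.4f}")
```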
  6. Scenario Application

    If you want your activation function's output to cover both negative and positive ranges symmetrically, which should you use?

    1. Sigmoid
    2. Tanh
    3. ReLU
    4. Softplus
    5. Linear
  7. Zero Value Handling

    What value does ReLU return if the input is zero?

    1. Zero
    2. One
    3. Negative one
    4. Infinity
    5. Half
  8. Activation Function Saturation

    Which activation function is prone to the vanishing gradient problem because its output saturates for large positive or negative inputs?

    1. Tanh
    2. ReLU
    3. Leaky ReLU
    4. Swish
    5. Poly
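The saturation behind the vanishing gradient problem is easy to see numerically (a sketch using Tanh, whose derivative is 1 - tanh²(x)): the gradient collapses toward zero as |x| grows, so little error signal flows back through saturated units.

```python
import math

def tanh_grad(x: float) -> float:
    """Derivative of tanh: 1 - tanh(x)^2."""
    return 1.0 - math.tanh(x) ** 2

# The gradient shrinks rapidly as |x| grows -- the vanishing gradient problem.
for x in (0.0, 2.0, 5.0, 10.0):
    print(f"x={x:>4}  tanh'={tanh_grad(x):.2e}")
```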
  9. Probabilistic Output

    If you need an output interpreted as a probability, which activation function is most suitable at an output layer?

    1. ReLU
    2. Tanh
    3. Sigmoid
    4. Piecewise Linear
    5. Exponential
  10. Negative Inputs

    When using the ReLU activation function, what is the output for an input value of -5?

    1. -5
    2. 0
    3. 5
    4. 1
    5. -1