Understanding the ReLU Function
Which mathematical expression correctly represents the ReLU (Rectified Linear Unit) activation function?
- A. f(x) = max(0, x)
- B. f(x) = 1/(1 + e^(-x))
- C. f(x) = (e^x - e^(-x))/(e^x + e^(-x))
- D. f(x) = x^2
- E. f(x) = min(0, x)
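As a quick check on how ReLU behaves, here is a minimal Python sketch (the helper name `relu` is ours, not from any library):

```python
def relu(x):
    # ReLU passes positive inputs through unchanged and clamps negatives to zero
    return max(0.0, x)

print(relu(2.5))   # positive input is unchanged: 2.5
print(relu(-3.0))  # negative input is clamped: 0.0
```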
Sigmoid Function Output Range
What is the range of output values for the sigmoid activation function?
- A. (-∞, +∞)
- B. (0, 1)
- C. (-1, 1)
- D. [0, ∞)
- E. (-∞, 0]
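A small numerical sketch makes the sigmoid's range visible: even for large positive or negative inputs, the output stays strictly between 0 and 1 (the helper name `sigmoid` is ours):

```python
import math

def sigmoid(x):
    # Sigmoid squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

for x in (-10.0, 0.0, 10.0):
    print(x, sigmoid(x))  # outputs approach 0 and 1 but never reach them
```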
Tanh Activation Example
If you apply the tanh activation function to the input value 0, what will the output be?
- A. 1
- B. 0
- C. -1
- D. 0.5
- E. -0.5
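This one can be verified directly with the standard library: tanh is an odd function that passes through the origin.

```python
import math

# tanh(0) = (e^0 - e^0) / (e^0 + e^0) = 0 / 2 = 0
print(math.tanh(0.0))  # 0.0
```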
ReLU Behavior with Negative Inputs
What does the ReLU activation function output when the input is a negative number, such as -3?
- A. -3
- B. 1
- C. 0
- D. -1
- E. 3
Identifying the Sigmoid Graph
Which of the following best describes the shape of the sigmoid activation function’s graph?
- A. A V-shape
- B. S-shaped curve
- C. A straight line
- D. An oscillating wave
- E. A parabolic curve
Tanh Versus Sigmoid Output Ranges
Unlike the sigmoid, over which range do the outputs of the tanh activation function lie?
- A. (0, 1)
- B. [1, 2]
- C. (-1, 1)
- D. [0, ∞)
- E. (-∞, +∞)
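The tanh range can be probed the same way as the sigmoid's: outputs saturate toward -1 and +1 but stay strictly between them for moderate inputs.

```python
import math

# tanh outputs lie in the open interval (-1, 1) and saturate toward ±1
for x in (-5.0, -1.0, 0.0, 1.0, 5.0):
    print(x, math.tanh(x))
```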
ReLU and the Vanishing Gradient Problem
Which activation function is commonly chosen over sigmoid to help avoid the vanishing gradient problem in deep networks?
- A. Sigmoid
- B. Tanh
- C. ReLU
- D. Leaky Sigmoid
- E. Softmax
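The intuition behind this question can be checked numerically: the sigmoid's derivative, s(x)(1 - s(x)), never exceeds 0.25 and shrinks toward zero for large |x|, while ReLU's derivative is exactly 1 for any positive input. A minimal sketch (helper names are ours):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: s(x) * (1 - s(x)); peaks at 0.25 when x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: constant 1 on the positive side, 0 otherwise
    return 1.0 if x > 0 else 0.0

for x in (0.0, 2.0, 5.0):
    print(x, sigmoid_grad(x), relu_grad(x))
```

Multiplying many such small sigmoid gradients across deep layers is what drives the vanishing gradient problem; ReLU's unit gradient on positive inputs avoids that shrinkage.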
Typical Use Case for the Sigmoid Function
For which of the following tasks is the sigmoid activation function most often used?
- A. Text translation
- B. Multi-class classification
- C. Binary classification
- D. Regression analysis
- E. Data normalization
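The binary-classification use case can be sketched in a few lines: the sigmoid maps a raw model score (logit) to a probability, which is then thresholded into a 0/1 prediction. The logit value below is a hypothetical example:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

logit = 1.2                       # hypothetical raw model score
prob = sigmoid(logit)             # probability of the positive class
label = 1 if prob >= 0.5 else 0   # threshold at 0.5 for a binary decision
print(prob, label)
```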
Tanh Function Centering
What is a key advantage of the tanh activation function compared to the sigmoid function?
- A. Its output is always positive
- B. It centers data around zero
- C. It has a linear output
- D. It increases gradient size
- E. It only outputs negative values
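The zero-centering advantage can be demonstrated on inputs symmetric about zero: tanh outputs average to 0, while sigmoid outputs (which are always positive) average to about 0.5. A minimal sketch:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Inputs symmetric about zero
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]

tanh_mean = sum(math.tanh(x) for x in xs) / len(xs)  # ~0: zero-centered
sig_mean = sum(sigmoid(x) for x in xs) / len(xs)     # ~0.5: always positive
print(tanh_mean, sig_mean)
```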
Relation Between Functions
Which statement about ReLU, sigmoid, and tanh activation functions is correct?
- A. All three have the same output range
- B. Only sigmoid produces negative outputs
- C. Tanh and sigmoid are both nonlinear functions
- D. ReLU always outputs negative values
- E. Tanh and ReLU are linear functions