Perceptrons and Multi-Layer Perceptrons Made Simple: 10 Easy Questions Quiz

  1. Understanding Perceptrons

    Which of the following best describes what a single-layer perceptron does with input features?

    1. It computes a weighted sum followed by an activation function.
    2. It stores previous output values for comparison.
    3. It generates random outputs regardless of input.
    4. It multiplies the inputs without any additional processing.
    5. It only sorts the input values.
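The behavior described in the correct answer can be sketched in a few lines: a weighted sum of the inputs plus a bias, passed through a step activation. The weight and input values below are illustrative, not from any particular dataset.

```python
def step(z):
    """Step activation: returns 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs, then the activation function.
    z = sum(w * x for w, x in zip(weights, inputs))
    return step(z + bias)

print(perceptron([1.0, 0.5], [0.6, -0.4], -0.1))  # → 1, since 0.6·1.0 − 0.4·0.5 − 0.1 = 0.3 ≥ 0
```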
  2. Activation Functions

    When using the perceptron algorithm, what is the typical purpose of an activation function, such as the step function?

    1. To convert the output into a binary value like 0 or 1.
    2. To randomly shuffle the input features.
    3. To expand the number of input layers.
    4. To calculate the loss between inputs.
    5. To deactivate some neurons in the network.
  3. Learning Rule

    In a perceptron, what typically happens to the weights during the learning process when a prediction is wrong?

    1. Weights are updated to reduce future errors.
    2. Weights are deleted from the model.
    3. Weights remain unchanged.
    4. Weights are randomized every time.
    5. Weights multiply by the output value.
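The classic perceptron update rule behind the correct answer can be sketched as follows: when a prediction is wrong, each weight moves by `learning_rate * (target - prediction) * input`, which nudges future predictions toward the correct label; when the prediction is right, the error term is zero and nothing changes.

```python
def step(z):
    return 1 if z >= 0 else 0

def predict(x, w, b):
    return step(sum(wi * xi for wi, xi in zip(w, x)) + b)

def update(x, target, w, b, lr=0.1):
    # error is 0 when the prediction is correct, ±1 when it is wrong.
    error = target - predict(x, w, b)
    w = [wi + lr * error * xi for wi, xi in zip(w, x)]
    b = b + lr * error
    return w, b

# Wrong prediction on (1, 1) with target 0: weights shift downward.
print(update([1, 1], 0, [0.0, 0.0], 0.0))  # → ([-0.1, -0.1], -0.1)
```

Repeating this update over the training examples is the whole perceptron learning algorithm; for linearly separable data it is guaranteed to converge.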
  4. Limitations of Perceptrons

    Which type of problem, exemplified by the classic XOR logic gate, cannot be solved by a single-layer perceptron?

    1. Non-linearly separable problems.
    2. Linearly separable problems.
    3. Problems with only one input.
    4. Problems with numeric outputs.
    5. Problems that include negative numbers.
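The XOR limitation can be demonstrated directly: a brute-force search over a grid of weights and biases finds no single-layer perceptron that classifies all four XOR cases correctly, while the same search easily solves the linearly separable AND gate. The grid resolution here is an arbitrary choice for illustration.

```python
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def step(z):
    return 1 if z >= 0 else 0

def solvable(dataset):
    # Try every combination of weights/bias from -2.0 to 2.0 in steps of 0.5.
    grid = [k / 2 for k in range(-4, 5)]
    for w1 in grid:
        for w2 in grid:
            for b in grid:
                if all(step(w1 * x1 + w2 * x2 + b) == y
                       for (x1, x2), y in dataset):
                    return True
    return False

print(solvable(AND))  # True  (e.g. w1=1, w2=1, b=-1.5)
print(solvable(XOR))  # False (no line separates the XOR classes)
```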
  5. Introducing MLP

    Which feature distinguishes a multi-layer perceptron (MLP) from a single-layer perceptron?

    1. It contains one or more hidden layers.
    2. It only has output and input layers.
    3. It never uses an activation function.
    4. It always has fewer neurons than a perceptron.
    5. It does not require inputs to work.
  6. Backpropagation

    In a multi-layer perceptron, which algorithm is commonly used to train the network by adjusting the weights in all layers?

    1. Backpropagation
    2. Backproliferation
    3. Backprofile
    4. Forward chaining
    5. Output splitting
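A hand-rolled sketch of one backpropagation step for a tiny 2-2-1 network with sigmoid activations and squared-error loss illustrates the idea: the error gradient flows from the output layer back through the hidden layer, and a small gradient-descent step along it reduces the loss on the training example. All parameter values below are illustrative.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative parameters: hidden weights W1 (2x2), biases b1,
# output weights W2 (1x2), output bias b2.
W1 = [[0.5, -0.3], [0.2, 0.8]]
b1 = [0.1, -0.1]
W2 = [0.7, -0.4]
b2 = 0.05

x, target = [1.0, 0.0], 1.0
lr = 0.5

def forward():
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
    return h, y

h, y = forward()
loss_before = 0.5 * (y - target) ** 2

# Backward pass: chain rule from the output back to every layer.
delta_out = (y - target) * y * (1 - y)             # dLoss/dz at the output
delta_h = [delta_out * W2[j] * h[j] * (1 - h[j])   # dLoss/dz at each hidden unit
           for j in range(2)]

# Gradient-descent update for the weights and biases of all layers.
for j in range(2):
    W2[j] -= lr * delta_out * h[j]
    for i in range(2):
        W1[j][i] -= lr * delta_h[j] * x[i]
    b1[j] -= lr * delta_h[j]
b2 -= lr * delta_out

_, y_new = forward()
loss_after = 0.5 * (y_new - target) ** 2
print(loss_after < loss_before)  # True: the step reduced the error
```

Real training repeats this forward/backward cycle over many examples and epochs; libraries compute the same gradients automatically.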
  7. MLP Output

    If an MLP receives two inputs and has a single output neuron, what type of problem could it be used for?

    1. Binary classification
    2. Text summarization
    3. Image compression
    4. Signal encryption
    5. Database indexing
  8. Hidden Layers Role

    What is the main benefit of hidden layers in an MLP when learning complex patterns?

    1. They allow the network to capture non-linear relationships.
    2. They reduce the input size to zero.
    3. They prevent the network from learning.
    4. They only speed up calculations but don’t affect outputs.
    5. They make outputs random.
  9. MLP Example

    Suppose an MLP is trained to recognize cats and dogs in images. What kind of output would a two-output-neuron MLP provide for a given image?

    1. A score for each class, such as [cat: 0.8, dog: 0.2]
    2. A sequence of random numbers.
    3. Just a binary code with no meaning.
    4. An error message every time.
    5. Only the input image repeated.
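Per-class scores like those in the correct answer are commonly produced by passing the two raw output activations (logits) through a softmax, so the values are positive and sum to 1. The logit values below are hypothetical, not from a trained network.

```python
import math

def softmax(logits):
    exps = [math.exp(z) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [1.6, 0.2]              # hypothetical raw outputs for [cat, dog]
cat_score, dog_score = softmax(logits)
print(f"cat: {cat_score:.2f}, dog: {dog_score:.2f}")  # cat: 0.80, dog: 0.20
```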
  10. Choosing the Right Network

    Why would you choose a multi-layer perceptron instead of a single-layer perceptron for most real-world problems?

    1. Because MLPs can solve complex, non-linear problems that single-layer perceptrons cannot.
    2. Because MLPs always use less memory.
    3. Because single-layer perceptrons are faster for complex tasks.
    4. Because single-layer perceptrons learn non-linear patterns efficiently.
    5. Because MLPs do not require any data to function.