Model Underfitting & Overfitting Diagnosis: Practical Steps Quiz

  1. Detecting Underfitting

    If a model performs poorly on both training and validation data, what is the most likely issue?

    1. Underfitting
    2. Overfitting
    3. Data leakage
    4. Too much regularization
    5. High variance
  2. Dealing with High Validation Error

    Which of the following changes is most likely to fix a model suffering from underfitting?

    1. Increasing model complexity
    2. Increasing regularization strength
    3. Removing training data
    4. Reducing features
    5. Shuffling validation data
  3. Interpreting the Learning Curve

    After plotting a learning curve, you observe that both training and validation loss are high and close to each other. What is the best next step? (See the sketch after the options.)

    1. Choose a more complex model
    2. Decrease model depth
    3. Add dropout regularization
    4. Reduce the number of training examples
    5. Increase batch size
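
    For the learning-curve scenario above, here is a minimal sketch (assuming TensorFlow 2.x with Keras and matplotlib; the synthetic data and tiny architecture are illustrative placeholders, not part of the quiz) of how such a curve might be plotted from a training run:

      import numpy as np
      import matplotlib.pyplot as plt
      from tensorflow import keras

      # Synthetic binary-classification data (placeholder for a real dataset).
      x = np.random.rand(1000, 20)
      y = (x.sum(axis=1) > 10).astype("float32")

      # Deliberately tiny network so the learning curve is easy to read.
      model = keras.Sequential([
          keras.Input(shape=(20,)),
          keras.layers.Dense(4, activation="relu"),
          keras.layers.Dense(1, activation="sigmoid"),
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy")

      history = model.fit(x, y, validation_split=0.2, epochs=30, verbose=0)

      # Plot training vs. validation loss; two high, close curves suggest underfitting.
      plt.plot(history.history["loss"], label="training loss")
      plt.plot(history.history["val_loss"], label="validation loss")
      plt.xlabel("epoch")
      plt.ylabel("loss")
      plt.legend()
      plt.show()

    A large gap between the curves (low training loss, high validation loss) would instead point to overfitting.
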
  4. Hyperparameter Tuning

    Which of these hyperparameter changes can help reduce overfitting in a neural network? (See the sketch after the options.)

    1. Increase regularization strength
    2. Decrease dropout rate to zero
    3. Train for more epochs
    4. Remove L2 penalty
    5. Decrease data augmentation
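
    As a point of reference for this question, here is a minimal Keras sketch (assuming TensorFlow 2.x; the layer sizes, the 0.01 L2 factor, and the 0.5 dropout rate are illustrative choices) showing two hyperparameter levers commonly used against overfitting, an L2 weight penalty and dropout:

      from tensorflow import keras
      from tensorflow.keras import layers, regularizers

      model = keras.Sequential([
          keras.Input(shape=(20,)),
          # L2 penalty discourages large weights.
          layers.Dense(128, activation="relu",
                       kernel_regularizer=regularizers.l2(0.01)),
          # Dropout randomly zeroes half of the activations during training.
          layers.Dropout(0.5),
          layers.Dense(1, activation="sigmoid"),
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy")
      model.summary()
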
  5. Collecting More Data

    What practical step can generally help to reduce overfitting, especially if the model has many parameters? (See the sketch after the options.)

    1. Collecting more training data
    2. Using a smaller model
    3. Reducing learning rate
    4. Adding more batch normalization layers
    5. Increasing the number of epochs
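
    When gathering more real examples is impractical, data augmentation is a commonly used stand-in that synthesizes additional training variety. A minimal sketch (assuming TensorFlow 2.x; the input shape and layer choices are illustrative placeholders):

      from tensorflow import keras
      from tensorflow.keras import layers

      # Random transforms applied only during training; each epoch sees new variants.
      augment = keras.Sequential([
          layers.RandomFlip("horizontal"),
          layers.RandomRotation(0.1),
          layers.RandomZoom(0.1),
      ])

      model = keras.Sequential([
          keras.Input(shape=(32, 32, 3)),
          augment,
          layers.Conv2D(16, 3, activation="relu"),
          layers.GlobalAveragePooling2D(),
          layers.Dense(10, activation="softmax"),
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
      model.summary()
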
  6. Code Review: Dropout

    Review the following code snippet: model.add(Dropout(0.9)). What effect might setting dropout so high have on model training? (The snippet is shown in context after the options.)

    1. The model may underfit due to excessive random deactivation
    2. The model is likely to significantly overfit
    3. Training speed will double
    4. It guarantees state-of-the-art performance
    5. No effect; dropout is ignored in Keras
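
    For context, a minimal sketch (assuming TensorFlow 2.x; the surrounding layers are illustrative placeholders) placing the quiz's snippet in a small Sequential model:

      from tensorflow import keras
      from tensorflow.keras.layers import Dense, Dropout

      model = keras.Sequential()
      model.add(keras.Input(shape=(20,)))
      model.add(Dense(64, activation="relu"))
      # As in the snippet: 90% of activations are zeroed on every training step,
      # which can starve the network of signal and push it toward underfitting.
      model.add(Dropout(0.9))
      model.add(Dense(1, activation="sigmoid"))
      # A rate in the 0.2-0.5 range is a more common starting point, e.g. Dropout(0.3).
      model.summary()
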
  7. Regularization Options

    Which of the following is NOT a valid regularization technique to reduce overfitting?

    1. Decreasing the learning rate
    2. Dropout
    3. Early stopping
    4. L2 penalty
    5. Data augmentation
  8. Validation Loss Dynamics

    If you notice your model's validation loss starts increasing while training loss continues to decrease, what is this an indicator of? (See the sketch after the options.)

    1. Overfitting
    2. Underfitting
    3. Optimal convergence
    4. Correct batch size selected
    5. Insufficient learning rate
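
    A minimal sketch (assuming TensorFlow 2.x; the synthetic data and model are placeholder stand-ins) of monitoring this dynamic with an early-stopping callback that halts training once validation loss stops improving:

      import numpy as np
      from tensorflow import keras

      # Synthetic placeholder data.
      x = np.random.rand(500, 20)
      y = (x.sum(axis=1) > 10).astype("float32")

      model = keras.Sequential([
          keras.Input(shape=(20,)),
          keras.layers.Dense(64, activation="relu"),
          keras.layers.Dense(1, activation="sigmoid"),
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy")

      # Stop when validation loss has not improved for 5 epochs and
      # roll back to the weights from the best epoch.
      stop = keras.callbacks.EarlyStopping(monitor="val_loss", patience=5,
                                           restore_best_weights=True)
      model.fit(x, y, validation_split=0.2, epochs=100, callbacks=[stop], verbose=0)
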
  9. Addressing Underfitting with Features

    If a model is underfitting, which feature engineering strategy is most likely to help? (See the sketch after the options.)

    1. Adding more relevant features
    2. Removing useful features
    3. Applying stronger regularization
    4. Reducing feature dimension
    5. Randomly shuffling labels
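
    A minimal sketch (assuming scikit-learn; the synthetic quadratic data is illustrative) of adding more expressive features when a plain linear model underfits, here via a degree-2 polynomial expansion:

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(0)
      x = rng.uniform(-3, 3, size=(200, 1))
      y = x[:, 0] ** 2 + rng.normal(scale=0.1, size=200)  # quadratic target

      # A straight line cannot capture the curvature, so it underfits.
      underfit = LinearRegression().fit(x, y)

      # Adding a squared term gives the model the capacity to fit the curve.
      richer = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(x, y)

      print("linear R^2:    ", underfit.score(x, y))
      print("polynomial R^2:", richer.score(x, y))
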
  10. Batch Size Effects

    How could increasing the batch size impact model performance with respect to underfitting/overfitting?

    1. It often has little direct effect, but may slightly increase underfitting
    2. Always causes severe overfitting
    3. Significantly reduces model capacity
    4. Dramatically increases validation accuracy
    5. Makes regularization ineffective