Branch Out: A Beginner’s Quiz on Decision Trees, Splitting, and Gini Index — Questions & Answers

This quiz contains 10 questions. Below is a complete reference of every question, its answer choices, and the correct answer, plus a short illustrative code sketch for each concept. Use this section for review after taking the interactive quiz above.

  1. Question 1: Understanding Split Criteria

    Which metric is commonly used to determine the best split in a decision tree when performing binary classification?

    • A. Gini Index
    • B. Mean Squared Error
    • C. Support Vector
    • D. Sigmoid Loss
    • E. Fourier Transform

    Correct answer: A. Gini Index
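
    As a point of reference, scikit-learn's DecisionTreeClassifier uses the Gini index as its default split criterion for classification. A minimal sketch, assuming scikit-learn is installed (the iris dataset is just a convenient example):

    ```python
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # criterion="gini" is the default; "entropy" is the main alternative
    clf = DecisionTreeClassifier(criterion="gini", random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))  # training accuracy of the fitted tree
    ```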

  2. Question 2: Basics of Splitting

    When constructing a decision tree, which action is taken to grow the tree after evaluating a node's data?

    • A. Increasing regularization
    • B. Splitting the node based on a selected feature
    • C. Averaging node values
    • D. Pruning the entire branch
    • E. Removing the node immediately

    Correct answer: B. Splitting the node based on a selected feature
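
    To make "splitting the node" concrete, here is a tiny pure-Python sketch; the function name split_node and the toy rows are hypothetical, chosen only for illustration:

    ```python
    def split_node(rows, feature_index, threshold):
        """Partition rows into left/right children by comparing one
        feature to a threshold -- the basic tree-growing operation."""
        left = [r for r in rows if r[feature_index] <= threshold]
        right = [r for r in rows if r[feature_index] > threshold]
        return left, right

    # Toy rows: (feature_0, feature_1, label)
    rows = [(2.0, 1.0, "No"), (3.5, 0.5, "Yes"),
            (1.0, 2.0, "No"), (4.0, 1.5, "Yes")]
    left, right = split_node(rows, feature_index=0, threshold=2.5)
    print(left)   # samples with feature_0 <= 2.5
    print(right)  # samples with feature_0 >  2.5
    ```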

  3. Question 3: Gini Index Values

    What is the Gini Index of a node containing only one class, such as all 'Yes' outcomes?

    • A. 0.0
    • B. 0.25
    • C. 0.5
    • D. 1.0
    • E. 2.0

    Correct answer: A. 0.0
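
    The Gini index of a node is 1 - Σ p_k², where p_k is the proportion of class k in the node. In a pure node one proportion is 1 and the rest are 0, so the index is 1 - 1 = 0. A small sketch (the gini helper is hypothetical):

    ```python
    from collections import Counter

    def gini(labels):
        """Gini impurity: 1 minus the sum of squared class proportions."""
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    print(gini(["Yes"] * 8))           # pure node -> 0.0
    print(gini(["Yes", "No", "Yes"]))  # mixed node -> ~0.444
    ```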

  4. Question 4: Pruning Purpose

    Why is pruning applied to a decision tree after it has grown to its full depth?

    • A. To add more branches
    • B. To reduce overfitting by reducing complexity
    • C. To remove the root node
    • D. To ensure all features are used
    • E. To maximize training data size

    Correct answer: B. To reduce overfitting by reducing complexity
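
    One way to see this in practice is to compare an unrestricted tree with a cost-complexity-pruned one. A sketch assuming scikit-learn; the ccp_alpha value and dataset parameters are arbitrary choices for illustration:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # flip_y adds label noise so the unpruned tree has something to overfit
    X, y = make_classification(n_samples=500, n_features=10,
                               flip_y=0.2, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
    pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

    # Pruning typically trades a little training accuracy for better
    # accuracy on held-out data.
    print("full  :", full.score(X_tr, y_tr), full.score(X_te, y_te))
    print("pruned:", pruned.score(X_tr, y_tr), pruned.score(X_te, y_te))
    ```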

  5. Question 5: Splitting Example

    Given a dataset where splitting on 'Color' leads to two groups: all red objects in one and all blue objects in another, which property does this demonstrate?

    • A. Class purity after splitting
    • B. Random forest creation
    • C. Min-max scaling
    • D. Feature normalization
    • E. Data balancing error

    Correct answer: A. Class purity after splitting
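
    Purity here means each child node holds a single class, which drives its Gini index to zero. A toy sketch (the data and the gini helper are made up for illustration):

    ```python
    from collections import Counter

    def gini(labels):
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    # (color, class) pairs where color perfectly separates the classes
    objects = [("red", "Yes"), ("red", "Yes"), ("blue", "No"), ("blue", "No")]
    left = [label for color, label in objects if color == "red"]
    right = [label for color, label in objects if color == "blue"]
    print(gini(left), gini(right))  # 0.0 0.0 -> both children are pure
    ```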

  6. Question 6: Gini Calculation Basics

    If a split results in two nodes, each containing half 'Yes' and half 'No' values, what does this say about the Gini Index of these nodes?

    • A. The Gini Index is at its maximum
    • B. The Gini Index is zero
    • C. The Gini Index is negative
    • D. The Gini Index does not change
    • E. The Gini Index becomes infinite

    Correct answer: A. The Gini Index is at its maximum
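
    For a binary node with class proportion p, the Gini index is 1 - p² - (1-p)² = 2p(1-p), which peaks at p = 0.5 with a value of 0.5. A quick check (gini_binary is a hypothetical helper):

    ```python
    def gini_binary(p):
        """Gini impurity of a two-class node with class proportion p."""
        return 1.0 - p**2 - (1.0 - p)**2  # equivalently 2 * p * (1 - p)

    for p in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(p, gini_binary(p))
    # p = 0.5 gives the maximum, 0.5; pure nodes (p = 0 or 1) give 0.0
    ```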

  7. Question 7: Overfitting in Trees

    What is the risk of growing a decision tree without any restrictions on depth or minimum samples at nodes?

    • A. Overfitting to noise in the training data
    • B. Model always underfits
    • C. Missing variables
    • D. Increasing bias error
    • E. Improving generalization

    Correct answer: A. Overfitting to noise in the training data
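
    The gap between training and test accuracy makes this risk visible. A sketch assuming scikit-learn, with noisy synthetic data so the effect shows clearly (the depth and leaf limits are arbitrary illustration values):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=400, flip_y=0.3, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

    # No depth or minimum-sample limits: the tree can memorize the noise
    unrestricted = DecisionTreeClassifier(random_state=1).fit(X_tr, y_tr)
    restricted = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10,
                                        random_state=1).fit(X_tr, y_tr)

    for name, t in [("unrestricted", unrestricted), ("restricted", restricted)]:
        print(name, t.score(X_tr, y_tr), t.score(X_te, y_te), t.get_depth())
    ```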

  8. Question 8: Leaf Nodes

    In the context of decision trees, what typically characterizes a leaf node?

    • A. It performs further splits
    • B. It contains the final class prediction
    • C. It stores all features
    • D. It averages input data
    • E. It triggers pruning automatically

    Correct answer: B. It contains the final class prediction
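
    You can see the leaves directly by printing a fitted tree: internal nodes show split conditions, while leaf lines end in a class prediction. A sketch assuming scikit-learn:

    ```python
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    iris = load_iris()
    tree = DecisionTreeClassifier(max_depth=2, random_state=0)
    tree.fit(iris.data, iris.target)

    # Lines like "|--- class: 0" are leaves: no further splits,
    # just the final prediction for samples that reach them
    print(export_text(tree, feature_names=iris.feature_names))
    ```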

  9. Question 9: Choosing Features for Splitting

    When deciding which feature to split on at each step in building a decision tree, what is usually maximized or minimized?

    • A. The number of input features
    • B. The Gini gain or impurity decrease
    • C. The absolute feature magnitude
    • D. The feature's name alphabetically
    • E. The feature's p-value

    Correct answer: B. The Gini gain or impurity decrease
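
    Gini gain is the parent node's impurity minus the size-weighted average impurity of the children; the tree builder picks the split that maximizes it. A worked sketch (gini and gini_gain are hypothetical helpers):

    ```python
    from collections import Counter

    def gini(labels):
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def gini_gain(parent, left, right):
        """Parent impurity minus size-weighted child impurity."""
        n = len(parent)
        weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
        return gini(parent) - weighted

    parent = ["Yes"] * 5 + ["No"] * 5
    left, right = ["Yes"] * 5, ["No"] * 5  # a perfect split
    print(gini_gain(parent, left, right))  # 0.5 - 0.0 = 0.5, the best possible
    ```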

  10. Question 10: Pruning Types

    Which of the following is an example of post-pruning in decision trees?

    • A. Removing branches after the tree is fully grown
    • B. Avoiding splits during tree construction
    • C. Selecting features at random before building the tree
    • D. Normalizing data before splitting
    • E. Adding more splits after training

    Correct answer: A. Removing branches after the tree is fully grown
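
    scikit-learn implements this idea as cost-complexity pruning: grow the full tree, then collapse branches whose effective alpha falls below a chosen threshold. A sketch (the dataset choice and the [::5] subsampling are arbitrary):

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    # Grow the full tree first, then compute the alphas at which
    # branches would be removed -- the essence of post-pruning.
    full = DecisionTreeClassifier(random_state=0).fit(X, y)
    path = full.cost_complexity_pruning_path(X, y)

    # Larger alphas prune more aggressively, leaving fewer leaves
    for alpha in path.ccp_alphas[::5]:
        pruned = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0).fit(X, y)
        print(f"alpha={alpha:.4f}  leaves={pruned.get_n_leaves()}")
    ```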