Time Series Evaluation Metrics: MAPE, SMAPE, and RMSE Quiz

Challenge your understanding of key time series model evaluation metrics such as Mean Absolute Percentage Error (MAPE), Symmetric Mean Absolute Percentage Error (SMAPE), and Root Mean Square Error (RMSE). This quiz covers definitions, differences, calculation basics, and interpretation of these essential metrics in forecasting accuracy.

  1. MAPE Definition

    What does the acronym MAPE stand for when evaluating time series forecasting models?

    1. Maximum Average Percentage Error
    2. Mean Absolute Percentage Error
    3. Mean Absolute Prediction Error
    4. Median Absolute Percent Error

    Explanation: MAPE stands for Mean Absolute Percentage Error and is widely used to measure forecasting accuracy. The other options, such as Maximum Average Percentage Error or Median Absolute Percent Error, are not standard terms in time series analysis. Mean Absolute Prediction Error is close in wording but incorrect. Only 'Mean Absolute Percentage Error' correctly matches the commonly accepted definition.
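
    For reference, a minimal Python sketch of the MAPE formula described above; the function name and sample values are illustrative, not part of the quiz:

      import numpy as np

      def mape(actual, forecast):
          # Mean Absolute Percentage Error: average of |actual - forecast| / |actual|, times 100
          actual = np.asarray(actual, dtype=float)
          forecast = np.asarray(forecast, dtype=float)
          return np.mean(np.abs((actual - forecast) / actual)) * 100

      print(mape([100, 200, 300], [110, 220, 330]))  # every forecast is 10% high -> 10.0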

  2. Interpreting MAPE

    A forecasting model shows a MAPE value of 10%. What does this value indicate?

    1. The errors are symmetrically distributed around zero
    2. The forecasts are off by an average of 10% from actual values
    3. There is a 10% chance the forecast is wrong
    4. Forecasts are always 10 units off from actual values

    Explanation: MAPE expresses how far predictions deviate from actual values as a percentage, so 10% means predictions are, on average, 10% off the real values. It does not mean forecasts are always 10 units away from the actuals (option 4), nor does it say anything about errors being symmetrically distributed around zero (option 1). Option 3 confuses an average error magnitude with a probability of being wrong, which is not what MAPE measures.

  3. RMSE Basics

    Which statement best describes how Root Mean Square Error (RMSE) is calculated for time series data?

    1. Calculate the percentage difference and find the mean
    2. Add all errors together and divide by the number of errors
    3. Square each error and choose the largest
    4. Take the square root of the mean of squared forecast errors

    Explanation: RMSE is calculated by squaring each forecast error, finding their mean, and then taking the square root of that mean. The first option describes a percentage-based calculation closer to MAPE, not RMSE. The second option describes the mean error, and the third option focuses only on the largest error; neither matches how RMSE is computed.
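
    A minimal Python sketch of that calculation (square the errors, average them, take the square root); the function name and sample numbers are illustrative:

      import numpy as np

      def rmse(actual, forecast):
          # Root Mean Square Error: square each error, average, then take the square root
          errors = np.asarray(actual, dtype=float) - np.asarray(forecast, dtype=float)
          return np.sqrt(np.mean(errors ** 2))

      print(rmse([3, 5, 2], [2, 5, 4]))  # sqrt((1 + 0 + 4) / 3) ≈ 1.29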

  4. SMAPE Symmetry

    Why is SMAPE (Symmetric Mean Absolute Percentage Error) considered an improvement over MAPE?

    1. SMAPE is based on the root mean square
    2. SMAPE treats overestimates and underestimates equally
    3. SMAPE always produces lower error values
    4. SMAPE ignores small prediction errors

    Explanation: SMAPE is designed to be symmetric; it penalizes overestimates and underestimates evenly, making it more balanced than MAPE. SMAPE does not always produce lower error values (option 3), nor does it ignore small errors (option 4). Option 1 incorrectly associates SMAPE with the root mean square, which is the basis of RMSE instead.
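
    A minimal Python sketch of one common SMAPE formulation, where the denominator averages the absolute actual and predicted values; names and numbers are illustrative:

      import numpy as np

      def smape(actual, forecast):
          # Symmetric MAPE: |error| divided by the average of |actual| and |forecast|
          actual = np.asarray(actual, dtype=float)
          forecast = np.asarray(forecast, dtype=float)
          denom = (np.abs(actual) + np.abs(forecast)) / 2
          return np.mean(np.abs(actual - forecast) / denom) * 100

      # Swapping actual and forecast gives the same value, unlike plain MAPE
      print(smape([100], [120]), smape([120], [100]))  # ≈18.18 and ≈18.18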

  5. Zero Actual Values & MAPE

    What is a common problem when using MAPE with time series data that contains zero actual values?

    1. MAPE treats zero actuals as perfect predictions
    2. MAPE calculation becomes undefined due to division by zero
    3. MAPE always gives a value of zero for such data
    4. MAPE automatically switches to RMSE

    Explanation: Since MAPE divides by the actual value, a zero in the actuals leads to division by zero, making it undefined for those points. MAPE does not simply return a value of zero for such data (option 3), nor does it treat zero actuals as perfect predictions (option 1). There is also no automatic switch to RMSE (option 4); these are separate metrics.
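
    A quick Python illustration of the problem; the per-point MAPE term for a zero actual comes out as infinity (and NumPy emits a divide-by-zero warning), so the metric is unusable for that point. The sample values are illustrative:

      import numpy as np

      actual = np.array([0.0, 50.0, 100.0])     # first actual value is zero
      forecast = np.array([5.0, 55.0, 90.0])

      terms = np.abs((actual - forecast) / actual) * 100
      print(terms)  # [inf 10. 10.] -- the zero actual makes MAPE undefined there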

  6. Units in RMSE

    If you measure actual and predicted values in kilograms, what are the units of RMSE?

    1. Percentage
    2. Squared kilograms
    3. Unitless
    4. Kilograms

    Explanation: RMSE retains the original units of the data, so with values in kilograms, RMSE is also in kilograms. Percentage (option 1) pertains to metrics like MAPE or SMAPE, not RMSE. Option 3 is incorrect since RMSE is not unitless, and option 2 stops at the squaring step, forgetting that the final square root returns the units to kilograms.

  7. MAPE Calculation Example

    Given an actual value of 40 and a forecast of 44, what is the MAPE for this single forecast?

    1. 11%
    2. 4%
    3. 10%
    4. 9%

    Explanation: MAPE for a single point is the absolute error divided by the actual value, multiplied by 100. The absolute error is |44 - 40| = 4, so 4/40 = 0.1, or 10%. The other options are typical slips: about 9% comes from dividing by the forecast instead of the actual, 4% mistakes the raw error for a percentage, and 11% is simply a miscalculation.
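
    A quick check of the arithmetic in plain Python, using the values from the question:

      actual, forecast = 40, 44
      abs_error = abs(forecast - actual)            # |44 - 40| = 4
      mape_single = abs_error / actual * 100        # 4 / 40 = 0.1 -> 10%
      print(mape_single)                            # 10.0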

  8. Sensitivity to Outliers

    Which error metric is most affected by large outliers in the forecast errors?

    1. SMAPE
    2. Mean Absolute Error (MAE)
    3. MAPE
    4. RMSE

    Explanation: RMSE squares the errors before averaging, which amplifies the effect of larger errors or outliers. MAPE and MAE are less sensitive since they use absolute errors. SMAPE, though related to MAPE, also does not amplify larger errors as much as RMSE does.
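
    A small sketch contrasting MAE and RMSE on the same errors with and without one large outlier; the numbers are illustrative:

      import numpy as np

      def mae(errors):
          return np.mean(np.abs(errors))

      def rmse(errors):
          return np.sqrt(np.mean(errors ** 2))

      errors = np.array([1.0, 2.0, 1.5, 2.5])
      errors_outlier = np.append(errors, 20.0)          # add one large outlier

      print(mae(errors), rmse(errors))                  # 1.75 and ~1.84 -- similar without outliers
      print(mae(errors_outlier), rmse(errors_outlier))  # 5.4 and ~9.09 -- RMSE inflates far more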

  9. Range of SMAPE

    What is the typical value range for SMAPE expressed as a percentage?

    1. -100% to 100%
    2. 0% to 200%
    3. 0% to 100%
    4. -∞ to +∞

    Explanation: SMAPE can range from 0% (perfect prediction) to 200% (maximum disagreement), because its denominator averages the absolute actual and predicted values. A 0% to 100% range is too narrow, and unlike SMAPE, plain MAPE has no fixed upper bound at all. Negative and infinite ranges do not apply to SMAPE by definition.
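
    A quick illustration of the 200% upper bound in plain Python, using the SMAPE formulation sketched earlier; when the forecast completely misses a nonzero actual (here one of them is zero), the per-point term hits 200%:

      actual, forecast = 10.0, 0.0                  # forecast completely misses a nonzero actual
      denom = (abs(actual) + abs(forecast)) / 2     # = 5
      term = abs(actual - forecast) / denom * 100   # 10 / 5 * 100 = 200
      print(term)                                   # 200.0 -- the upper bound of SMAPE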

  10. Metric Selection

    A time series analyst needs a metric that does not depend on the scale of the data. Which metric should they choose?

    1. RMSE
    2. Sum of Squared Errors
    3. MAE
    4. MAPE

    Explanation: MAPE expresses errors as a percentage, making it scale-independent and easy to compare across datasets. RMSE and MAE both retain the data's scale, making cross-series comparisons harder. The Sum of Squared Errors is not normalized and is highly scale-dependent.
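
    A small sketch showing the same relative errors on two series at very different scales; MAPE matches across scales while RMSE does not. The helper functions and sample values are illustrative:

      import numpy as np

      def mape(a, f):
          return np.mean(np.abs((a - f) / a)) * 100

      def rmse(a, f):
          return np.sqrt(np.mean((a - f) ** 2))

      small = np.array([10.0, 20.0, 30.0])
      large = small * 1000                           # same series at a much larger scale
      small_fc, large_fc = small * 1.1, large * 1.1  # both forecasts are 10% high

      print(mape(small, small_fc), mape(large, large_fc))  # 10.0 and 10.0 -- scale-free
      print(rmse(small, small_fc), rmse(large, large_fc))  # differ by a factor of 1000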