
Chapter 16 Handling uncertainty in ADAM

So far, when we discussed forecasts from ADAM, we have assumed that the smoothing parameters and initial values are known, even though we acknowledged in Chapter 11 that they are estimated. This is the conventional assumption of ETS models from Hyndman et al. (2008), which also applies to ARIMA models. However, in reality, the parameters are never known and are always estimated in-sample. This means that the estimates of parameters will inevitably change with the sample size. This uncertainty will impact the model fit, the point forecasts, and the prediction intervals. To overcome this issue, Bergmeir et al. (2016) proposed bagging ETS: a procedure that decomposes the time series using STL (Cleveland et al., 1990), recreates many time series by bootstrapping the remainder of the decomposition, fits the best ETS model to each of the newly created series, and combines the forecasts from these models. This way (as explained by Petropoulos et al., 2018a), the parameters of the models will differ from one generated time series to another, and thus the final forecasts will handle the uncertainty about the parameters. In addition, this approach mitigates to some extent the model uncertainty discussed in Section 15.4, because models are selected automatically on each bootstrapped series. The main issues with the approach are that it is computationally expensive and that it assumes that the STL decomposition is appropriate for the time series. Furthermore, it assumes that the residuals from this decomposition do not contain any information and are independent.
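To make the idea of bagging ETS more concrete, here is a minimal sketch of the procedure in R. It uses adam() from the smooth package (employed throughout this book) in place of the original ets() implementation, and for brevity it resamples the STL remainder independently, whereas Bergmeir et al. (2016) apply a Box-Cox transformation and a moving block bootstrap. Treat it as an illustration of the principle rather than a reproduction of their method.

```r
library(smooth)

set.seed(41)
y <- AirPassengers
h <- 12
nBoot <- 20

# Decompose the logged data with STL, assuming a periodic seasonal pattern
yDecomposed <- stl(log(y), s.window="periodic")
trendSeasonal <- rowSums(yDecomposed$time.series[, c("trend", "seasonal")])
remainder <- yDecomposed$time.series[, "remainder"]

# Recreate time series by resampling the remainder, select and fit an
# ETS model to each of them via adam(), and extract the point forecasts
forecastsBoot <- sapply(1:nBoot, function(i){
  yNew <- ts(exp(trendSeasonal + sample(remainder, replace=TRUE)),
             start=start(y), frequency=frequency(y))
  forecast(adam(yNew, model="ZZZ"), h=h)$mean
})

# Combine the forecasts from the individual models
forecastBagged <- rowMeans(forecastsBoot)
```

Because each bootstrapped series leads to slightly different estimates of parameters (and potentially a different ETS model), the combined forecast reflects both the parameters' and the model uncertainty.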

In this chapter, we discuss uncertainty in ADAM, focusing specifically on the estimates of parameters. We start with a discussion of how data can be simulated from an estimated ADAM, then move to the construction of confidence intervals for the parameters, and after that show how the parameters’ uncertainty can be propagated to the states and fitted values of the model. Some parts of this chapter are based on Svetunkov and Pritularga (2023).
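As a brief preview of the first of these topics, the following sketch (assuming the smooth package) fits an ADAM to the classical AirPassengers data and then generates several new series from the estimated model via the simulate() method; the choice of ETS(M,M,M) here is made purely for illustration.

```r
library(smooth)

# Fit an ETS(M,M,M) model, implemented in ADAM
adamModel <- adam(AirPassengers, model="MMM")

# Generate 10 series from the estimated model; every simulated series is
# based on the same estimates of parameters and initial values
adamSimulated <- simulate(adamModel, nsim=10)
```

Re-estimating the model on such simulated series gives a first impression of how much the estimates of parameters can vary from one sample to another.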

References

• Bergmeir, C., Hyndman, R.J., Benítez, J.M., 2016. Bagging Exponential Smoothing Methods Using STL Decomposition and Box-Cox Transformation. International Journal of Forecasting. 32, 303–312. https://doi.org/10.1016/j.ijforecast.2015.07.002
• Cleveland, R.B., Cleveland, W.S., McRae, J.E., Terpenning, I., 1990. STL: A Seasonal-trend Decomposition Procedure Based on LOESS. Journal of Official Statistics. 6, 3–73.
• Hyndman, R.J., Koehler, A.B., Ord, J.K., Snyder, R.D., 2008. Forecasting with Exponential Smoothing: The State Space Approach. Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-71918-2
• Petropoulos, F., Hyndman, R.J., Bergmeir, C., 2018a. Exploring the Sources of Uncertainty: Why Does Bagging for Time Series Forecasting Work? European Journal of Operational Research. 268, 545–554. https://doi.org/10.1016/j.ejor.2018.01.045
• Svetunkov, I., Pritularga, K., 2023. Incorporating Parameters Uncertainty in ETS. Department of Management Science Working Paper Series. 1–19.