Assessing the Predictions of Dynamic Neural Networks
Abstract
In this paper, the estimation of prediction intervals for multi-step-ahead predictions from dynamic neural network models is described. Usually, asymptotic methods based on linearization are applied, with the potential problems of large coverage errors and overly optimistic prediction intervals. The main sources of these problems are the neglect of the network parameter uncertainties and the non-normality of the error distribution. To overcome these restrictions, bootstrap methods are used here. New formulations are introduced to apply the bootstrap to nonlinear time series models with exogenous input. An explicit model of the error process accounts for the influence of different training data densities on the empirical error distribution. A Monte Carlo study illustrates the proposed methods.
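As a rough illustration of the general idea (not the paper's actual algorithm or notation), the sketch below applies a residual bootstrap to a small NARX-type model and extracts percentile prediction intervals for a multi-step-ahead forecast. The data-generating system, the use of scikit-learn's MLPRegressor as the "network", and all names (narx_features, B, horizon) are assumptions made for illustration only.

```python
# Rough sketch only: residual-bootstrap prediction intervals for a small
# NARX-type model.  The data-generating system, the choice of
# scikit-learn's MLPRegressor as the "network", and all names below
# (narx_features, B, horizon, ...) are illustrative assumptions, not the
# paper's notation or algorithm.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N, lags = 400, 2

def narx_features(y, u, t):
    # Regressor vector [y(t-1), y(t-2), u(t-1), u(t-2)]
    return np.r_[y[t-lags:t][::-1], u[t-lags:t][::-1]]

# Synthetic data from an assumed nonlinear dynamic system with exogenous input u
u = rng.uniform(-1, 1, N)
y = np.zeros(N)
for t in range(lags, N):
    y[t] = 0.6*y[t-1] - 0.2*y[t-2] + np.tanh(u[t-1]) + 0.05*rng.standard_normal()

X = np.array([narx_features(y, u, t) for t in range(lags, N)])
Y = y[lags:]

# Point model and empirical residuals
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                     random_state=0).fit(X, Y)
resid = Y - model.predict(X)

# Residual bootstrap: refit on resampled targets (parameter uncertainty),
# then simulate multi-step-ahead paths with resampled noise (error uncertainty).
B, horizon = 50, 10                      # small B to keep the sketch fast
u_future = rng.uniform(-1, 1, horizon)   # exogenous input assumed known
paths = np.zeros((B, horizon))
for b in range(B):
    Y_b = model.predict(X) + rng.choice(resid, size=len(Y), replace=True)
    model_b = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                           random_state=b).fit(X, Y_b)
    y_sim = list(y[-lags:])
    u_sim = list(u[-lags:]) + list(u_future)
    for h in range(horizon):
        x = np.r_[y_sim[-1], y_sim[-2], u_sim[lags+h-1], u_sim[lags+h-2]]
        y_sim.append(model_b.predict(x.reshape(1, -1))[0] + rng.choice(resid))
    paths[b] = y_sim[lags:]

# Percentile prediction intervals over the bootstrap paths
lower, upper = np.percentile(paths, [2.5, 97.5], axis=0)
print(np.c_[lower, upper])
```

Because each bootstrap replicate refits the model and propagates a resampled residual at every prediction step, the resulting intervals reflect both parameter uncertainty and the empirical (possibly non-normal) error distribution, which is the motivation stated in the abstract; the exact resampling scheme used in the paper may differ.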