Bayesian Inference for Training of Long Short-Term Memory Models in Chaotic Time Series Forecasting (Conference Poster)

abstract

  • In time series forecasting, models are built from past observations of the same sequence. When a model learns from such data, no additional information is available about the amount of noise in the observations. In practice, one must work with finite, noisy datasets, which leads to uncertainty about the suitability of the model. Bayesian inference tools are well suited to this problem. A modified algorithm for training a long short-term memory (LSTM) recurrent neural network for time series forecasting is presented. The approach improves forecasts of the original series through an implementation based on minimizing the associated Kullback-Leibler Information Criterion. For comparison, a nonlinear autoregressive model implemented with a feedforward neural network is also presented. A simulation study evaluates and illustrates the results, comparing the approach with Bayesian neural-network-based algorithms on artificial chaotic time series and showing an improvement in forecasting error.
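The simulation setting described in the abstract can be sketched with a toy example: a chaotic series generated by the logistic map, forecast one step ahead by a nonlinear autoregressive (NAR) model. This is an illustrative sketch, not the poster's implementation: the map parameters, series length, and quadratic feature set are assumptions, and the actual work fits neural networks (an LSTM and a feedforward NAR) rather than a closed-form least-squares model.

```python
# Sketch (not the authors' code): one-step NAR forecasting of a chaotic
# logistic-map series, the kind of artificial benchmark the abstract describes.
# The NAR model x_{t+1} = a + b*x_t + c*x_t^2 is fit by ordinary least squares
# via the 3x3 normal equations, using only the standard library.

def logistic_map(x0, r, n):
    """Generate n points of the chaotic logistic map x_{t+1} = r*x_t*(1 - x_t)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_nar(series):
    """Least-squares fit of x_{t+1} = a + b*x_t + c*x_t^2 (normal equations)."""
    feats = [(1.0, x, x * x) for x in series[:-1]]
    targets = series[1:]
    A = [[sum(f[i] * f[j] for f in feats) for j in range(3)] for i in range(3)]
    b = [sum(f[i] * y for f, y in zip(feats, targets)) for i in range(3)]
    return solve3(A, b)

# Fit on the first 150 points, then evaluate one-step-ahead forecast error
# on the held-out tail of the series.
series = logistic_map(0.2, 4.0, 200)
a, b, c = fit_nar(series[:150])
errs = [abs((a + b * x + c * x * x) - y)
        for x, y in zip(series[150:-1], series[151:])]
print(max(errs))  # near machine precision: the map is exactly quadratic
```

Because the logistic map is exactly quadratic, this NAR model recovers the map and the one-step error is near machine precision; with observation noise added, as in the abstract's setting, the model would instead face the uncertainty that motivates the Bayesian treatment.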

publication date

  • 2019-12-05

keywords

  • Autoregressive model
  • Bayesian inference
  • Bayesian networks
  • Chaotic time series
  • Feedforward neural networks
  • Forecasting
  • Kullback-Leibler Information Criterion
  • Long short-term memory
  • Nonlinear model
  • Recurrent neural networks
  • Simulation study
  • Time series forecasting
  • Uncertainty

ISBN

  • 9783030362102

number of pages

  • 12

start page

  • 197

end page

  • 208