# Models
To date, the team's river level prediction/forecasting models have been Bayesian models; planned future models include recurrent neural network models.
## Recurrent Neural Networks: LSTM
For an in-depth understanding, study:
- Long Short-Term Memory by Sepp Hochreiter, Jürgen Schmidhuber
- Understanding LSTM by Ralf C. Staudemeyer, Eric Rothstein Morris
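The core of the Hochreiter–Schmidhuber LSTM is a gated cell state. Below is a minimal NumPy sketch of a single LSTM step with hypothetical random weights; the shapes and the `[input, forget, cell, output]` stacking order are illustrative conventions, not the team's implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One LSTM step: gates are computed jointly, then split.

    x: (d,) input; h_prev, c_prev: (n,) previous hidden/cell state.
    W: (4n, d), U: (4n, n), b: (4n,) stacked gate parameters.
    """
    z = W @ x + U @ h_prev + b
    n = h_prev.shape[0]
    i = sigmoid(z[:n])         # input gate
    f = sigmoid(z[n:2*n])      # forget gate
    g = np.tanh(z[2*n:3*n])    # candidate cell update
    o = sigmoid(z[3*n:])       # output gate
    c = f * c_prev + i * g     # cell state: forget old memory, admit new
    h = o * np.tanh(c)         # hidden state exposed to the next layer
    return h, c

# Toy run over a short sequence with hypothetical random weights.
rng = np.random.default_rng(0)
d, n = 3, 4
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for x in rng.normal(size=(5, d)):
    h, c = lstm_cell(x, h, c, W, U, b)
```

Because the output gate and `tanh` both saturate, every element of the hidden state stays inside \((-1, 1)\), which is part of what keeps LSTM training stable over long sequences.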
## Bayesian Structural Time Series + Variational Inference
A Bayesian Structural Time Series (STS) algorithm is a state space algorithm; in brief,

\[
y_{t} = \pmb{x}_{t}^{\top} \pmb{\beta}_{t} + \epsilon_{t} \qquad (1)
\]

\[
\pmb{\beta}_{t} = \mathbf{F}_{t} \pmb{\beta}_{t - 1} + \mathbf{R}_{t} \pmb{\varsigma}_{t} \qquad (2)
\]

where \(\mathbf{R}_{t}\) is a \(p \times q\) selection matrix mapping the system errors into the state, and
| symbol | description |
|---|---|
| \(y_{t}\) | \(1 \times 1\) scalar. Herein, it is a gauge's river level measure at time point \(t\). |
| \(\pmb{x}_{t}\) | \(p \times 1\). A design vector. |
| \(\pmb{\beta}_{t}\) | \(p \times 1\). A state vector. |
| \(\epsilon_{t}\) | \(1 \times 1\) scalar. An observation error, observation innovation. |
| \(\mathbf{F}_{t}\) | \(p \times p\). A transition matrix. |
| \(\pmb{\varsigma}_{t}\) | \(q \times 1\). A system error, or state innovation.1 |
Formally, Eq. 1 is the observation model, whilst Eq. 2 is the transition or state model. The latter models the transition of a state from \(t - 1\) to \(t\).
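A minimal NumPy sketch of Eqs. 1 and 2 in the simplest scalar case, assuming a local level model (\(p = q = 1\), \(\pmb{x}_{t} = 1\), \(\mathbf{F}_{t} = 1\)); the noise scales are illustrative, not fitted values:

```python
import numpy as np

rng = np.random.default_rng(42)
T = 200
sigma_eps = 0.5   # observation error scale (illustrative)
sigma_var = 0.1   # system error scale (illustrative)

beta = np.zeros(T)   # state beta_t (scalar here, p = 1)
y = np.zeros(T)      # observed river level y_t
for t in range(1, T):
    # Eq. 2 (transition): beta_t = F_t beta_{t-1} + varsigma_t, with F_t = 1
    beta[t] = beta[t - 1] + rng.normal(0.0, sigma_var)
    # Eq. 1 (observation): y_t = x_t beta_t + eps_t, with x_t = 1
    y[t] = beta[t] + rng.normal(0.0, sigma_eps)
```

The separation is the point: the latent level `beta` evolves smoothly via the transition model, while the observations `y` are the level seen through observation noise.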
A key advantage of state space modelling is that behaviours can be modelled via superimposition. The superimposition, i.e., encoding, of behaviours occurs via the components \(\pmb{x}_{t}\) & \(\mathbf{F}_{t}\). For an in-depth outline, study Bayesian Inference of State Space Models by K. Triantafyllopoulos.
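As a sketch of superimposition: a local linear trend block and a period-4 seasonal block can be stacked into one block-diagonal \(\mathbf{F}_{t}\), with \(\pmb{x}_{t}\) reading out the level plus the current seasonal effect. All numbers below are illustrative, and for simplicity system noise is added to every state element:

```python
import numpy as np

# Trend block: local linear trend (level + slope).
F_trend = np.array([[1.0, 1.0],
                    [0.0, 1.0]])
# Seasonal block: period-4 seasonality, encoded with 3 state elements.
F_seas = np.array([[-1.0, -1.0, -1.0],
                   [ 1.0,  0.0,  0.0],
                   [ 0.0,  1.0,  0.0]])

# Superimposition: F is block-diagonal, x stacks the per-block designs.
p = 5
F = np.zeros((p, p))
F[:2, :2] = F_trend
F[2:, 2:] = F_seas
x = np.array([1.0, 0.0, 1.0, 0.0, 0.0])  # reads level + current seasonal

rng = np.random.default_rng(7)
beta = np.array([10.0, 0.05, 1.0, -0.5, -0.3])  # hypothetical initial state
y = []
for t in range(12):
    beta = F @ beta + rng.normal(0.0, 0.02, size=p)  # Eq. 2 (transition)
    y.append(x @ beta + rng.normal(0.0, 0.1))        # Eq. 1 (observation)
```

Adding a further behaviour (e.g. a regression component) amounts to appending another block to `F` and the matching entries to `x`, which is exactly how TensorFlow Probability's STS components compose.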
In practice, model development is via TensorFlow Probability libraries. Visit the project's river level modelling repository; the modelling arguments are readable with or without comments/definitions.
1. For more about the structure options of \(\pmb{\varsigma}_{t}\), i.e., system errors, study Inferring causal impact using Bayesian structural time-series models, and Bayesian Inference of State Space Models.