
Models

During the team's existence, the river level prediction/forecasting models have been Bayesian models; planned future models include recurrent neural networks.


Recurrent Neural Networks: LSTM

For an in-depth understanding, study 2, 3, 4.
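The project's trained LSTM models are not reproduced here; purely as an illustration of the recurrence an LSTM cell computes, the sketch below steps a single NumPy cell over a short sequence. The shapes, random weights, and inputs are all hypothetical, not the project's configuration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM cell step: input, forget, output gates and a candidate state.

    x_t: (n_in,) input, e.g. lagged river levels; h_prev, c_prev: (n_hid,).
    W: (4*n_hid, n_in), U: (4*n_hid, n_hid), b: (4*n_hid,).
    """
    z = W @ x_t + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c_t = f * c_prev + i * g      # cell state carries long-range memory
    h_t = o * np.tanh(c_t)        # hidden state is the step's output
    return h_t, c_t

# Illustrative run with random (untrained) weights.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 5
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for x_t in rng.normal(size=(10, n_in)):   # ten time steps
    h, c = lstm_step(x_t, h, c, W, U, b)
print(h.shape)  # (5,)
```

Because the output is gated as \(h_t = o \odot \tanh(c_t)\), every element of the hidden state stays strictly inside \((-1, 1)\) regardless of how the cell state grows.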


Bayesian Structural Time Series + Variational Inference

A Bayesian Structural Time Series (STS) model is a state space model. In brief,

\[y_{t} = \pmb{x}^{T}_{t}\pmb{\beta}_{t} + \epsilon_{t} \tag{1}\]
\[\pmb{\beta}_{t} = \mathbf{F}_{t}\pmb{\beta}_{t - 1} + \pmb{\varsigma}_{t} \tag{2}\]
\[\epsilon_{t} \sim \mathcal{N}\bigl(0, \: \sigma^{2}_{t} \bigr)\]
\[\pmb{\varsigma}_{t} \sim \mathcal{N}\bigl(\mathbf{0}, \: \pmb{\mathcal{Z}}_{t}\bigr)\]

whereby

| symbol | description |
| --- | --- |
| \(y_{t}\) | \(1 \times 1\) scalar. Herein, a gauge's river level measurement at time point \(t\). |
| \(\pmb{x}_{t}\) | \(p \times 1\). A design vector. |
| \(\pmb{\beta}_{t}\) | \(p \times 1\). A state vector. |
| \(\epsilon_{t}\) | \(1 \times 1\) scalar. An observation error, or observation innovation. |
| \(\mathbf{F}_{t}\) | \(p \times p\). A transition matrix. |
| \(\pmb{\varsigma}_{t}\) | \(q \times 1\). A system error, or state innovation.1 |


Formally, Eq. 1 is the observation model, whilst Eq. 2 is the transition or state model. The latter models the transition of a state from \(t - 1\) to \(t\).
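Eqs. 1 and 2 can be simulated directly. The NumPy sketch below generates data from the simplest instance, a local level model with \(p = 1\), \(\mathbf{F}_{t} = 1\), and \(\pmb{x}_{t} = 1\); these choices, and the noise scales, are illustrative only, not the project's configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

T = 200            # number of time points
sigma_eps = 0.5    # std dev of the observation innovation epsilon_t
sigma_zeta = 0.1   # std dev of the state innovation varsigma_t

beta = np.empty(T)   # latent state (scalar here: p = 1)
y = np.empty(T)      # observed series, e.g. a river level
state = 0.0
for t in range(T):
    # Eq. 2: beta_t = F_t beta_{t-1} + varsigma_t, with F_t = 1
    state = state + rng.normal(0.0, sigma_zeta)
    beta[t] = state
    # Eq. 1: y_t = x_t^T beta_t + epsilon_t, with x_t = 1
    y[t] = beta[t] + rng.normal(0.0, sigma_eps)

# The observations scatter around the latent level with spread ~ sigma_eps
print(float(np.std(y - beta)))
```

Inference then runs in the opposite direction: given only \(y_{1:T}\), recover the latent \(\pmb{\beta}_{t}\) and the noise scales.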

A key advantage of state space modelling is the ability to model a series as a superimposition of behaviours, e.g., trend, seasonality, regression effects. The superimposition, i.e., encoding, of behaviours occurs via the components \(\pmb{x}_{t}\) & \(\mathbf{F}_{t}\). For an in-depth outline, study Bayesian Inference of State Space Models by K. Triantafyllopoulos.
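The superimposition can be made concrete: each behaviour contributes a block to \(\mathbf{F}_{t}\) and a segment of \(\pmb{x}_{t}\), and the observation sums the components' contributions. A noise-free NumPy sketch combining a local level with a period-4 seasonal component (an illustrative pairing, not the project's design):

```python
import numpy as np

# Local level component: state dimension 1
F_level = np.array([[1.0]])
x_level = np.array([1.0])

# Seasonal component with period 4: state dimension 3;
# the current effect is minus the sum of the previous three
F_seas = np.array([[-1.0, -1.0, -1.0],
                   [ 1.0,  0.0,  0.0],
                   [ 0.0,  1.0,  0.0]])
x_seas = np.array([1.0, 0.0, 0.0])

# Superimposition: block-diagonal F_t, stacked x_t
p = 1 + 3
F = np.zeros((p, p))
F[:1, :1] = F_level
F[1:, 1:] = F_seas
x = np.concatenate([x_level, x_seas])

# Noise-free propagation: y_t = x^T beta_t sums level and seasonal effect
beta = np.array([10.0, 1.0, -2.0, 1.0])  # level 10 plus a seasonal pattern
ys = []
for _ in range(8):
    ys.append(float(x @ beta))
    beta = F @ beta
print(ys)  # [11.0, 10.0, 11.0, 8.0, 11.0, 10.0, 11.0, 8.0]
```

The seasonal block cycles with period 4 and sums to zero over a cycle, so the series oscillates around the constant level of 10; with state innovations added back, both the level and the seasonal pattern would drift over time.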

In practice, model development is via the TensorFlow Probability libraries. Visit the project's river level modelling repository; the modelling arguments are readable with or without the accompanying comments/definitions.