# Optimal Smoothing Sequential Monte Carlo Proposal for Linear Gaussian State Space Model

08 August 2022

Given a linear Gaussian state space model (LGSSM), we have previously derived the optimal proposals for filtering SMC, $$p(z_1 \given x_1)$$ and $$\{p(z_t \given z_{t - 1}, x_t)\}_{t = 2}^T$$. For smoothing, we need the optimal SMC proposals to be conditioned on the future: $$p(z_1 \given x_{1:T})$$ and $$\{p(z_t \given z_{t - 1}, x_{1:T})\}_{t = 2}^T$$. How can we compute these proposals?

The first proposal is easy since it’s just the marginal smoothing distribution $$p(z_1 \given x_{1:T})$$, which is produced directly by the Kalman (Rauch–Tung–Striebel) smoother.
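As a concrete reference, here is a minimal sketch of the forward Kalman filter and backward RTS smoother that yield the marginal smoothing distributions $$\{p(z_t \given x_{1:T})\}$$. It assumes the standard LGSSM parameterization $$z_t = F z_{t-1} + q_t$$, $$x_t = H z_t + r_t$$ with $$q_t \sim \mathcal{N}(0, Q)$$, $$r_t \sim \mathcal{N}(0, R)$$ and prior $$z_1 \sim \mathcal{N}(m_0, P_0)$$; all variable names are illustrative, not from the original note.

```python
import numpy as np

def kalman_filter(F, Q, H, R, m0, P0, xs):
    """Forward pass: filtered moments p(z_t | x_{1:t}) and
    one-step predicted moments p(z_t | x_{1:t-1})."""
    ms, Ps, mps, Pps = [], [], [], []
    m, P = m0, P0
    for t, x in enumerate(xs):
        if t > 0:
            m, P = F @ m, F @ P @ F.T + Q          # predict
        mps.append(m); Pps.append(P)
        S = H @ P @ H.T + R                        # innovation covariance
        K = np.linalg.solve(S, H @ P).T            # Kalman gain P H^T S^{-1}
        m = m + K @ (x - H @ m)                    # measurement update
        P = P - K @ H @ P
        ms.append(m); Ps.append(P)
    return ms, Ps, mps, Pps

def rts_smoother(F, ms, Ps, mps, Pps):
    """Backward pass: smoothed moments p(z_t | x_{1:T})."""
    T = len(ms)
    sm, sP = [None] * T, [None] * T
    sm[-1], sP[-1] = ms[-1], Ps[-1]
    for t in range(T - 2, -1, -1):
        # smoothing gain J_t = P_t F^T (P_{t+1}^pred)^{-1}
        J = np.linalg.solve(Pps[t + 1], F @ Ps[t]).T
        sm[t] = ms[t] + J @ (sm[t + 1] - mps[t + 1])
        sP[t] = Ps[t] + J @ (sP[t + 1] - Pps[t + 1]) @ J.T
    return sm, sP
```

With these, `sm[0], sP[0]` are the mean and covariance of the first proposal $$p(z_1 \given x_{1:T})$$.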

For $$p(z_t \given z_{t - 1}, x_{1:T})$$, we reuse $$p(z_{t - 1} \given z_t, x_{1:T})$$ from the derivation of Kalman smoothing in equation 19 of this note. By Bayes’ rule, conditioning everything on $$x_{1:T}$$, \begin{align} p(z_t \given z_{t - 1}, x_{1:T}) = \frac{p(z_{t - 1} \given z_t, x_{1:T}) p(z_t \given x_{1:T})}{p(z_{t - 1} \given x_{1:T})}. \end{align} Since $$p(z_t \given x_{1:T})$$ is Gaussian, obtained from smoothing, and $$p(z_{t - 1} \given z_t, x_{1:T})$$ is a Gaussian whose mean is a linear function of $$z_t$$, we can compute the left-hand side as a conjugate Gaussian posterior with “prior” $$p(z_t \given x_{1:T})$$ and “likelihood” $$p(z_{t - 1} \given z_t, x_{1:T})$$, treating $$z_{t - 1}$$ as the observation.
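The conjugate computation above is the standard Gaussian linear-conditional update: if $$z \sim \mathcal{N}(m, P)$$ and $$y \given z \sim \mathcal{N}(A z + b, R)$$, then $$z \given y$$ is Gaussian in closed form. A sketch, assuming the backward conditional has been written as $$p(z_{t-1} \given z_t, x_{1:T}) = \mathcal{N}(z_{t-1}; A z_t + b, R)$$ with $$A$$, $$b$$, $$R$$ read off from the smoother recursions (function and variable names are illustrative):

```python
import numpy as np

def gaussian_conjugate_posterior(m, P, A, b, R, y):
    """Posterior of z given y, where z ~ N(m, P) and y | z ~ N(A z + b, R).
    Used here with z = z_t (prior = smoothed marginal) and y = z_{t-1}."""
    S = A @ P @ A.T + R                 # marginal covariance of y
    K = np.linalg.solve(S, A @ P).T     # gain P A^T S^{-1}
    mean = m + K @ (y - A @ m - b)      # condition on the "observation" y
    cov = P - K @ A @ P
    return mean, cov
```

The solve-based gain avoids forming $$S^{-1}$$ explicitly; the result agrees with the precision-form update $$\Sigma = (P^{-1} + A^\top R^{-1} A)^{-1}$$, $$\mu = \Sigma (P^{-1} m + A^\top R^{-1} (y - b))$$.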
