Moving on from the linear Gaussian SSM, we relax the assumption of linear dynamics and observations, allowing general nonlinear functions $f$ and $h$ with additive Gaussian noise:

$$x_t = f(x_{t-1}) + \epsilon_t, \qquad y_t = h(x_t) + \eta_t, \qquad \epsilon_t \sim \mathcal{N}(0, Q_t), \quad \eta_t \sim \mathcal{N}(0, R_t).$$
This model can be used, for example, to track an object undergoing nonlinear motion, or for online learning of neural network parameters cast as an SSM.
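As a concrete illustration, here is a sketch of simulating such a nonlinear SSM in JAX; the particular choices of $f$, $h$, and noise scales below are toy assumptions, not anything prescribed by cuthbert:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Toy nonlinear dynamics: a 2-D system with a sinusoidal coupling
    return jnp.array([x[0] + 0.1 * x[1], x[1] - 0.1 * jnp.sin(x[0])])

def h(x):
    # Toy nonlinear observation: a range (distance-to-origin) measurement
    return jnp.array([jnp.sqrt(x[0] ** 2 + x[1] ** 2)])

def step(x, key):
    # One step of the generative model: propagate, then observe, with Gaussian noise
    key_q, key_r = jax.random.split(key)
    x_next = f(x) + 0.01 * jax.random.normal(key_q, (2,))
    y = h(x_next) + 0.1 * jax.random.normal(key_r, (1,))
    return x_next, (x_next, y)

keys = jax.random.split(jax.random.PRNGKey(0), 100)
_, (xs, ys) = jax.lax.scan(step, jnp.array([1.0, 0.0]), keys)
# xs: (100, 2) latent trajectory, ys: (100, 1) observations
```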
## Inference
Since the dynamics and observation models are nonlinear, the filtering distributions are no longer Gaussian and exact inference is intractable, so we turn to approximate methods.
### Extended Kalman filter (EKF)

The EKF takes a first-order Taylor expansion of the nonlinear functions around the current state estimate $m$,

$$f(x) \approx f(m) + F_t (x - m), \qquad F_t = \left.\frac{\partial f}{\partial x}\right|_{x = m},$$

giving the local linear parameters for the standard Kalman filter recursions.
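The linearisation step can be sketched directly with JAX autodiff; the dynamics function, state estimate, and covariances below are illustrative assumptions, not cuthbert API:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Hypothetical nonlinear dynamics used only to illustrate the linearisation
    return jnp.array([x[0] + 0.1 * x[1], x[1] - 0.1 * jnp.sin(x[0])])

# Linearise around the current state estimate m:
#   f(x) ≈ f(m) + F (x - m), with F the Jacobian of f at m
m = jnp.array([1.0, 0.0])
F = jax.jacobian(f)(m)      # local transition matrix
b = f(m) - F @ m            # local offset

# The EKF prediction step then uses (F, b) in the standard Kalman recursions:
P = jnp.eye(2)              # current covariance estimate (assumed)
Q = 0.01 * jnp.eye(2)       # process noise covariance (assumed)
m_pred = f(m)               # predicted mean
P_pred = F @ P @ F.T + Q    # predicted covariance
```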
- Taylor linearisation (`cuthbert.gaussian.taylor`): provide the log densities $\log p(x_t \mid x_{t-1})$ and $\log p(y_t \mid x_t)$. cuthbert auto-differentiates them (`jax.hessian` + `jax.jacobian`) to extract the local linear-Gaussian approximation.
```python
from cuthbert.gaussian.taylor import build_filter

filter_obj = build_filter(
    get_init_log_density=...,      # returns log p(x_0) + linearisation point
    get_dynamics_log_density=...,  # returns log p(x_t | x_{t-1}) + linearisation points
    get_observation_func=...,      # returns log p(y_t | x_t) + linearisation point
)
```

- Moments linearisation (`cuthbert.gaussian.moments`): provide conditional mean and Cholesky covariance functions. cuthbert linearises via `jax.jacfwd`.
```python
from cuthbert.gaussian.moments import build_filter

filter_obj = build_filter(
    get_init_params=...,         # returns (m0, chol_P0)
    get_dynamics_params=...,     # returns (mean_and_chol_cov_func, linearisation_point)
    get_observation_params=...,  # returns (mean_and_chol_cov_func, linearisation_point, y)
)
```

### Unscented Kalman filter (UKF)
Instead of linearising analytically, the UKF propagates a set of deterministically chosen sigma points through the nonlinear functions and fits a Gaussian to the transformed points (the unscented transform).
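A minimal sketch of the unscented transform, using the standard scaled sigma-point construction; the parameter defaults (`alpha`, `beta`, `kappa`) are illustrative choices, and this is not cuthbert's API:

```python
import jax
import jax.numpy as jnp

def unscented_transform(mean, cov, g, alpha=1e-1, beta=2.0, kappa=0.0):
    # Generate 2n+1 sigma points, push them through g, refit a Gaussian.
    n = mean.shape[0]
    lam = alpha ** 2 * (n + kappa) - n
    L = jnp.linalg.cholesky((n + lam) * cov)
    # Sigma points: the mean, plus/minus scaled Cholesky columns
    points = jnp.concatenate(
        [mean[None], mean[None] + L.T, mean[None] - L.T], axis=0
    )  # shape (2n+1, n)
    # Weights for the mean and covariance estimates
    wm = jnp.full(2 * n + 1, 1.0 / (2 * (n + lam))).at[0].set(lam / (n + lam))
    wc = wm.at[0].add(1.0 - alpha ** 2 + beta)
    gy = jax.vmap(g)(points)
    mean_y = wm @ gy
    d = gy - mean_y
    cov_y = (wc[:, None] * d).T @ d
    return mean_y, cov_y
```

For a linear map the transform is exact, which makes a convenient sanity check when implementing it.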
### Particle filter (SMC)
For strongly nonlinear models where Gaussian approximations are poor, sequential Monte Carlo (SMC) / particle filtering can be used. This makes no Gaussian assumption on the posterior but scales poorly with state dimension. See inference methods.
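A minimal sketch of one bootstrap particle filter step, assuming additive Gaussian dynamics and observation noise with illustrative scales (this is not cuthbert's SMC API):

```python
import jax
import jax.numpy as jnp

def bootstrap_step(key, particles, y, f, h, q_scale, r_scale):
    # One step of a bootstrap particle filter:
    # propagate, weight by the observation likelihood, resample.
    n, d = particles.shape
    key_prop, key_res = jax.random.split(key)
    # Propagate each particle through the nonlinear dynamics plus noise
    particles = jax.vmap(f)(particles) + q_scale * jax.random.normal(key_prop, (n, d))
    # Weight by an isotropic Gaussian observation likelihood (assumed)
    log_w = -0.5 * jnp.sum((y - jax.vmap(h)(particles)) ** 2, axis=-1) / r_scale ** 2
    w = jax.nn.softmax(log_w)
    # Multinomial resampling back to equal weights
    idx = jax.random.choice(key_res, n, (n,), p=w)
    return particles[idx]
```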