Problem specific
This section contains a set of examples of Bayesian inference with the RxInfer
package across various probabilistic models.
All examples have been pre-generated automatically from the examples/
folder of the GitHub repository.
Problem-specific examples contain specialized models and inference procedures for various domains.
- Autoregressive Models: An example of Bayesian treatment of latent AR and ARMA models. Reference: Albert Podusenko, Message Passing-Based Inference for Time-Varying Autoregressive Models.
- Gamma Mixture Model: This example implements one of the Gamma mixture experiments outlined in https://biaslab.github.io/publication/mp-based-inference-in-gmm/.
- Gaussian Mixture: This example implements variational Bayesian inference in univariate and multivariate Gaussian mixture models under the mean-field assumption.
- Hierarchical Gaussian Filter: An example of an online inference procedure for the Hierarchical Gaussian Filter with univariate noisy observations, using the Variational Message Passing algorithm. Reference: Ismail Senoz, Online Message Passing-based Inference in the Hierarchical Gaussian Filter.
- Invertible neural networks: a tutorial: An example of variational Bayesian inference with invertible neural networks. Reference: Bart van Erp, Hybrid Inference with Invertible Neural Networks in Factor Graphs.
- Probit Model (EP): This demo illustrates expectation propagation (EP) in the context of state estimation in a linear state-space model that combines a Gaussian state-evolution model with a discrete observation model.
- RTS vs BIFM Smoothing: This example runs the BIFM Kalman smoother on a factor graph using message passing and compares it with the RTS implementation.
- Simple Nonlinear Node: In this example we create a non-conjugate model with a nonlinear link function between variables. We show how to extend the functionality of RxInfer and create a custom factor node with arbitrary message passing update rules.
- Universal Mixtures: Universal mixture modeling.