# Flow node
The flow node encodes a normalizing flow — a parameterized, invertible transformation that maps a simple base distribution (e.g., a Gaussian) into a complex, multimodal one. Because the transformation is invertible, both forward and backward messages can be computed without approximation.
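Concretely, the flow node relies on the change-of-variables identity (a standard fact about invertible transformations, stated here for orientation): if ${\bf{x}} \sim p_x$ and ${\bf{y}} = f({\bf{x}})$ for an invertible $f$, then

\[p_y({\bf{y}}) = p_x\big(f^{-1}({\bf{y}})\big)\left|\det \frac{\partial f^{-1}({\bf{y}})}{\partial {\bf{y}}}\right|\]

so the density of ${\bf{y}}$ only requires the inverse mapping and its Jacobian determinant, both of which are tractable for the invertible layers described below.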
```julia
y ~ Flow(x) where { meta = FlowMeta(compiled_model) }
```

This asserts that `y = f(x)`, where `f` is the composed invertible transformation defined by the flow model. The node type is `Deterministic`.
## Building a flow model
A flow is assembled from layers stacked inside a FlowModel. Each layer is an invertible mapping. ReactiveMP.jl provides:
| Layer | Description |
|---|---|
| `PlanarFlow` | Planar contraction/expansion along a learned direction |
| `RadialFlow` | Radial contraction/expansion around a learned center point |
| `AdditiveCouplingLayer` | Additive coupling: splits the input and transforms one half conditioned on the other |
| `PermutationLayer` | Permutes the input dimensions to mix information across coupling layers |
| `InputLayer` | Declares the input dimensionality; must be the first layer in a model |
Layers are composed into a `FlowModel` and then compiled into a `CompiledFlowModel` before use. Compilation fixes the layer sizes and randomly initializes parameters:
```julia
model = FlowModel((
    InputLayer(2),
    AdditiveCouplingLayer(PlanarFlow()),
    PermutationLayer(),
    AdditiveCouplingLayer(PlanarFlow()),
))

compiled = compile(model)          # randomly initialized parameters
# or pass your own parameter vector:
compiled = compile(model, params)
```

The compiled model is then wrapped in `FlowMeta` and attached to the node:
```julia
y ~ Flow(x) where { meta = FlowMeta(compiled) }
```

## Approximation inside the flow node
By default, `FlowMeta` uses `Linearization` for any messages that require approximation (e.g., backward messages when the inverse Jacobian is unavailable in closed form). A different approximation method can be passed as the second argument:
```julia
FlowMeta(compiled, Unscented())
```

## Learning flow parameters
See also the Flow tutorial in the RxInfer.jl documentation for a complete end-to-end example.
## ReactiveMP.PlanarFlow — Type

The `PlanarFlow` function is defined as
\[f({\bf{x}}) = {\bf{x}} + {\bf{u}} \tanh({\bf{w}}^\top {\bf{x}} + b)\]
with input and output dimension $D$. Here ${\bf{x}}\in \mathbb{R}^D$ represents the input of the function. Furthermore ${\bf{u}}\in \mathbb{R}^D$, ${\bf{w}}\in \mathbb{R}^D$ and $b\in\mathbb{R}$ represent the parameters of the function. The function contracts and expands the input space.
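As a quick numerical illustration of this definition, here is a minimal NumPy sketch of the planar-flow mathematics. This is independent of the ReactiveMP.jl implementation, and all parameter values are arbitrary illustrative choices:

```python
import numpy as np

# Planar flow: f(x) = x + u * tanh(w^T x + b), with u, w in R^D and scalar b.
def planar_flow(x, u, w, b):
    return x + u * np.tanh(w @ x + b)

# Arbitrary illustrative parameters for D = 2.
u = np.array([0.5, -0.3])
w = np.array([1.0, 2.0])
b = 0.1

x = np.array([0.2, -0.4])
y = planar_flow(x, u, w, b)

# With the tanh nonlinearity, a sufficient condition for invertibility
# is w^T u >= -1 (Rezende & Mohamed, 2015, Appendix A).
assert w @ u >= -1.0
```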
This function has been introduced in:
Rezende, Danilo, and Shakir Mohamed. "Variational inference with normalizing flows." International conference on machine learning. PMLR, 2015.
## ReactiveMP.RadialFlow — Type

The `RadialFlow` function is defined as
\[f({\bf{x}}) = {\bf{x}} + \frac{\beta({\bf{x}} - {\bf{z}}_0)}{\alpha + |{\bf{x}} - {\bf{z}}_0|}\]
with input and output dimension $D$. Here ${\bf{x}}\in \mathbb{R}^D$ represents the input of the function. Furthermore ${\bf{z}}_0\in \mathbb{R}^D$, $\alpha\in \mathbb{R}$ and $\beta\in\mathbb{R}$ represent the parameters of the function. The function contracts and expands the input space.
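The contraction/expansion behavior around the center ${\bf{z}}_0$ can be seen numerically. The sketch below is a NumPy illustration of the formula above (not the ReactiveMP.jl implementation), taking $|\cdot|$ as the Euclidean norm and using arbitrary parameter values:

```python
import numpy as np

# Radial flow: f(x) = x + beta * (x - z0) / (alpha + ||x - z0||),
# with center z0 in R^D and scalar parameters alpha > 0, beta.
def radial_flow(x, z0, alpha, beta):
    r = np.linalg.norm(x - z0)
    return x + beta * (x - z0) / (alpha + r)

z0 = np.zeros(2)
alpha, beta = 1.0, 0.5   # invertible when beta >= -alpha (Rezende & Mohamed, 2015)

x = np.array([1.0, 1.0])
y = radial_flow(x, z0, alpha, beta)

# With beta > 0 the point is pushed away from z0 (expansion around the center).
assert np.linalg.norm(y - z0) > np.linalg.norm(x - z0)
```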
This function has been introduced in:
Rezende, Danilo, and Shakir Mohamed. "Variational inference with normalizing flows." International conference on machine learning. PMLR, 2015.
## ReactiveMP.FlowModel — Type

The `FlowModel` structure is the most generic type of Flow model, in which the layers are not constrained to be of a specific type. The `FlowModel` structure contains the input dimensionality and a tuple of layers and can be constructed as `FlowModel(dim, (layer1, layer2, ...))`.
Note: this model can be specialized by constraining the types of layers. This potentially allows for more efficient specialized methods that can exploit the specifics of these layers, such as triangular Jacobian matrices.
## ReactiveMP.CompiledFlowModel — Type

The `CompiledFlowModel` structure is the most generic type of compiled Flow model, in which the layers are not constrained to be of a specific type. The `CompiledFlowModel` structure contains the input dimension and a tuple of compiled layers. Do not create a `CompiledFlowModel` manually! Instead, create a `FlowModel` first and compile it with `compile(model::FlowModel)`. This ensures that all layers/mappings are configured with the proper dimensionality and with randomly sampled parameters. Alternatively, if you would like to pass your own parameters, call `compile(model::FlowModel, params::Vector)`.
Note: this model can be specialized by constraining the types of layers. This potentially allows for more efficient specialized methods that can exploit the specifics of these layers, such as triangular Jacobian matrices.
## ReactiveMP.compile — Function

`compile(model::FlowModel)` compiles a model by setting its parameters. It randomly initializes the parameter values in the layers and flows so that the model can be used for inference.
Input arguments
- `model::FlowModel` - a model whose layer/flow dimensionalities have been initialized, but whose parameters have not been set.
Return arguments
- `::CompiledFlowModel` - a compiled model with set parameters, ready for processing data.
`compile(model::FlowModel, params::Vector)` initializes the model `model` with the parameter vector `params`.
Input arguments
- `model::FlowModel` - a model whose layer/flow dimensionalities have been initialized, but whose parameters have not been set.
- `params::Vector` - a vector of parameters with which the model should be compiled.
Return arguments
- `::CompiledFlowModel` - a compiled model with set parameters, ready for processing data.
## ReactiveMP.AdditiveCouplingLayer — Type

The additive coupling layer specifies an invertible function ${\bf{y}} = g({\bf{x}})$ with the following structure (for the mapping $g: \mathbb{R}^2 \rightarrow \mathbb{R}^2$):
\[ \begin{align} y_1 &= x_1 \\ y_2 &= x_2 + f(x_1) \end{align}\]
where $f(\cdot)$ denotes an arbitrary function with mapping $f: \mathbb{R} \rightarrow \mathbb{R}$. This function can be arbitrarily complex; non-linear functions (e.g., neural networks) are often chosen to model complex relationships. From the definition of the layer, invertibility follows directly as
\[ \begin{align} x_1 &= y_1 \\ x_2 &= y_2 - f(y_1) \end{align}\]
The current implementation only allows for the mapping $g: \mathbb{R}^2 \rightarrow \mathbb{R}^2$, although this layer can be generalized for arbitrary input dimensions.
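The forward/inverse pair above can be checked numerically. The following NumPy sketch illustrates the coupling structure itself (not the ReactiveMP.jl implementation), using `tanh` as a stand-in for the arbitrary coupling function $f$:

```python
import numpy as np

# Additive coupling (NICE): forward  y1 = x1, y2 = x2 + f(x1);
#                           inverse  x1 = y1, x2 = y2 - f(y1).
# f may be any function R -> R; invertibility never depends on f itself.
f = np.tanh  # arbitrary illustrative choice of coupling function

def coupling_forward(x):
    x1, x2 = x
    return np.array([x1, x2 + f(x1)])

def coupling_inverse(y):
    y1, y2 = y
    return np.array([y1, y2 - f(y1)])

x = np.array([0.7, -1.2])
# The inverse exactly recovers the input, regardless of how complex f is.
assert np.allclose(coupling_inverse(coupling_forward(x)), x)
```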
`AdditiveCouplingLayer(f <: AbstractCouplingFlow)` creates the layer structure with coupling function `f`.

Example

```julia
f = PlanarFlow()
layer = AdditiveCouplingLayer(f)
```

This layer structure has been introduced in:
Dinh, Laurent, David Krueger, and Yoshua Bengio. "Nice: Non-linear independent components estimation." arXiv preprint arXiv:1410.8516 (2014).
## ReactiveMP.InputLayer — Type

The input layer specifies the input dimension of a flow model.

```julia
layer = InputLayer(3)
```

## ReactiveMP.PermutationLayer — Type
The permutation layer specifies an invertible mapping ${\bf{y}} = g({\bf{x}}) = P{\bf{x}}$ where $P$ is a permutation matrix.
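Because a permutation matrix is orthogonal, the inverse mapping is simply ${\bf{x}} = P^\top {\bf{y}}$. A minimal NumPy illustration of this property (independent of the ReactiveMP.jl implementation):

```python
import numpy as np

# A permutation matrix P is orthogonal, so the inverse mapping is x = P^T y.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])  # cyclic permutation of 3 dimensions

x = np.array([1.0, 2.0, 3.0])
y = P @ x                       # forward: y = P x
assert np.allclose(P.T @ y, x)  # inverse recovers x exactly
```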
## ReactiveMP.FlowMeta — Type

The `FlowMeta` structure contains the metadata of the Flow factor node. More specifically, it contains the model of the Flow factor node. The `FlowMeta` structure can be constructed as `FlowMeta(model)`. Make sure that the flow model has been compiled.
The `FlowMeta` structure is required for the Flow factor node and can be included with the Flow node as: `y ~ Flow(x) where { meta = FlowMeta(...) }`