Messages implementation
In the message passing framework, one of the most important concepts is the message. Given a factor graph, messages are arbitrary functions that flow along the edges of the graph and hold information about the part of the graph from which they originate.
Message as a distribution
Often, a message can be represented in the form of a probability distribution, since a probability distribution can be identified with its probability density function. The representation of messages as probability distributions is not only a matter of convenience but also of performance. For example, a univariate Gaussian distribution can be parameterized with two numbers, which significantly reduces the amount of information that needs to be passed along the edges of the graph.
using StatsPlots, Distributions
plot(Normal(0.0, 1.0), label = "Univariate Gaussian distribution", fillalpha = 0.4, fill = 0)
Variational Message Passing
The message passing technique originated in exact Bayesian inference, where it is known as Belief Propagation, and is useful for finding the posterior distribution over certain parameters in a model. However, message passing can also be used to find approximate posteriors, a technique known as Variational Inference. The ReactiveMP.jl package implements Variational Message Passing, since it is more general than exact inference: the exact solution can be framed as an approximate solution subject to no constraints. Below are visual schematics of the differences between messages in Belief Propagation and Variational Inference.
Belief-Propagation (or Sum-Product) message
Belief propagation message
Variational message
Variational message with structured factorisation q(x, y)q(z) assumption
Message type
All messages are encoded with the type Message.
ReactiveMP.AbstractMessage — Type
An abstract supertype for all concrete message types.
ReactiveMP.Message — Type
Message(data, is_clamped, is_initial, addons)

An implementation of a message in the variational message passing framework.
Arguments
data::D: the data object associated with the message, which is usually a probability distribution, but can also be an arbitrary function
is_clamped::Bool: specifies if this message was the result of constant computations (e.g. clamped constants)
is_initial::Bool: specifies if this message was used for initialization
addons::A: specifies the addons of the message, which may carry extra bits of information, e.g. debug information, memory, etc.
Example
julia> distribution = Gamma(10.0, 2.0)
Distributions.Gamma{Float64}(α=10.0, θ=2.0)
julia> message = Message(distribution, false, true, nothing)
Message(Distributions.Gamma{Float64}(α=10.0, θ=2.0))
julia> mean(message)
20.0
julia> getdata(message)
Distributions.Gamma{Float64}(α=10.0, θ=2.0)
julia> is_clamped(message)
false
julia> is_initial(message)
true
From an implementation point of view, the Message structure does nothing but hold some data object and redirect most of the statistics-related functions to that data object. However, this object is used extensively in Julia's multiple dispatch. Our implementation also uses the extra is_initial and is_clamped fields to determine whether the product of two messages results in an is_initial or is_clamped posterior marginal. The final field contains the addons. These carry additional information on top of the functional form of the distribution, such as its scaling or computation history.
ReactiveMP.getdata — Method
getdata(message::Message)

Returns the data associated with the message.
ReactiveMP.is_clamped — Method
is_clamped(message::Message)

Checks if the message is clamped or not.
ReactiveMP.is_initial — Method
is_initial(message::Message)

Checks if the message is initial or not.
ReactiveMP.getaddons — Method
getaddons(message::Message)

Returns the addons associated with the message.
ReactiveMP.as_message — Function
as_message(::AbstractMessage)

A function that converts an abstract message to an instance of Message.
using ReactiveMP, BayesBase, ExponentialFamily
distribution = ExponentialFamily.NormalMeanPrecision(0.0, 1.0)
message = Message(distribution, false, true, nothing)
Message(ExponentialFamily.NormalMeanPrecision{Float64}(μ=0.0, w=1.0))

mean(message), precision(message)
(0.0, 1.0)

logpdf(message, 1.0)
-1.4189385332046727

is_clamped(message), is_initial(message)
(false, true)

Product of messages
In the message passing framework, in order to compute a posterior we must compute a normalized product of two messages. For this purpose, ReactiveMP.jl uses the multiply_messages function, which internally uses the prod function defined in BayesBase.jl with various product strategies. We refer the interested reader to the documentation of the BayesBase.jl package for more information.
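As a minimal sketch, the product of two Gaussian messages can be computed as follows. This assumes the GenericProd strategy exported by BayesBase.jl and the NormalMeanPrecision type from ExponentialFamily.jl; other strategies and distribution types follow the same pattern.

```julia
using ReactiveMP, BayesBase, ExponentialFamily

# Two Gaussian messages arriving on the same edge of the graph
left  = Message(ExponentialFamily.NormalMeanPrecision(0.0, 1.0), false, false, nothing)
right = Message(ExponentialFamily.NormalMeanPrecision(1.0, 2.0), false, false, nothing)

# `GenericProd()` is a generic product strategy from `BayesBase.jl`;
# for two Gaussians the product is again Gaussian, with the
# precisions of the two messages added together
posterior = ReactiveMP.multiply_messages(GenericProd(), left, right)

getdata(posterior)                          # the resulting distribution
is_clamped(posterior), is_initial(posterior)
```

Note that, as stated in the multiply_messages docstring below, the resulting message is not necessarily normalized.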
ReactiveMP.multiply_messages — Function
multiply_messages(prod_strategy, left::Message, right::Message)

Multiplies two messages left and right using a given product strategy prod_strategy. Returns a new message with the result of the multiplication. Note that the resulting message is not necessarily normalized.
ReactiveMP.messages_prod_fn — Function
messages_prod_fn(strategy, prod_constraint, form_constraint, form_check_strategy)

Returns a suitable prod computation function for a given strategy and constraints.
Deferred messages
ReactiveMP.DeferredMessage — Type
A special type of message, for which the actual message is not computed immediately, but is computed later on demand (potentially never). To compute and get the actual message, one needs to call the as_message function.
Message mappings
A message mapping defines how messages are transformed or mapped during the propagation process — for example, when combining multiple incoming messages or applying specific transformation rules. This structure helps organize and reuse mapping logic across different inference algorithms.
ReactiveMP.MessageMapping — Type
MessageMapping

A callable structure representing a deferred computation of a message in the variational message passing framework. It stores all contextual information necessary to compute a message later, such as variable tags, constraints, addons, and the associated factor node.
MessageMapping replaces the original lambda-based implementation to improve type stability and inference. When invoked as a function, it computes an outgoing Message from given input messages and marginals using the appropriate @rule.
See also: Message, DeferredMessage