Helper utilities
RxGP exports several utility functions used throughout the message passing rules and by end users.
Mean function application
RxGP.apply_mean_fn — Function
`apply_mean_fn(x, mf)`

Apply a scalar mean function `mf` to input `x`, handling different input types:

- If `x` is a scalar, apply `mf` directly, e.g. `mf(x) = x^2` gives `apply_mean_fn(3, mf) == 9`.
- If `x` is a length-1 vector of numbers, apply `mf` to the single element.
- If `x` is a longer vector of numbers, apply `mf` to the entire vector.
- If `x` is a vector of vectors, use `apply_mean_fn.(x, mf)` to broadcast over sub-vectors.
Applies the scalar mean function mf to an input x. Handles dispatch for PointMass, Distribution, scalar, and vector inputs.
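The dispatch behaviour described above can be sketched in plain Julia. This is a hypothetical, simplified stand-in (`apply_mean_fn_sketch` is not the exported function, and the real method also handles `PointMass` and `Distribution` inputs):

```julia
# Simplified sketch of the dispatch behaviour described above.
# Hypothetical stand-in: the actual apply_mean_fn also dispatches on
# PointMass and Distribution arguments.
apply_mean_fn_sketch(x::Real, mf) = mf(x)              # scalar: apply mf directly
apply_mean_fn_sketch(x::AbstractVector{<:Real}, mf) =
    length(x) == 1 ? mf(x[1]) : mf(x)                  # length-1 unwraps; longer vectors pass through whole
apply_mean_fn_sketch(x::AbstractVector{<:AbstractVector}, mf) =
    apply_mean_fn_sketch.(x, Ref(mf))                  # vector of vectors: broadcast over sub-vectors

mf(x) = x^2
apply_mean_fn_sketch(3, mf)      # 9
apply_mean_fn_sketch([3.0], mf)  # 9.0
```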
Mean and covariance extraction
RxGP.mean_cov_scalar_matrix — Function
`mean_cov_scalar_matrix(x)`

Return `(mean, cov)` where `mean` is a scalar and `cov` is a 1×1 matrix. Input `x` must be one-dimensional (scalar, length-1 vector, `PointMass`, or univariate distribution).
RxGP.mean_cov_vector_matrix — Function
`mean_cov_vector_matrix(x)`

Return `(mean, cov)` where `mean` is a vector and `cov` is a matrix, regardless of the dimensionality of `x`. Accepts `Real`, `AbstractVector`, `PointMass`, or any `NormalDistributionsFamily`.
These convenience functions extract (mean, cov) from various input types (PointMass, Real, distribution) and normalise the output shapes:
| Function | Returns |
|---|---|
| `mean_cov_scalar_matrix` | (scalar, 1×1 matrix) |
| `mean_cov_vector_matrix` | (vector, matrix) |
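The shape normalisation these helpers perform can be imitated for plain numeric inputs. A hedged sketch (the sketch functions are hypothetical stand-ins; the exported functions additionally accept `PointMass` and distribution arguments, and the zero covariance shown here is the natural choice for point values):

```julia
# Sketch of the shape normalisation, for plain numeric inputs only.
# Hypothetical stand-ins for the exported helpers; a point value is
# given zero covariance here as an illustrative assumption.
mean_cov_scalar_matrix_sketch(x::Real) = (x, fill(0.0, 1, 1))       # scalar mean, 1×1 matrix
mean_cov_scalar_matrix_sketch(x::AbstractVector{<:Real}) =
    (x[1], fill(0.0, 1, 1))                                         # length-1 vector unwraps to scalar

mean_cov_vector_matrix_sketch(x::Real) = ([x], fill(0.0, 1, 1))     # scalar promoted to a 1-vector
mean_cov_vector_matrix_sketch(x::AbstractVector{<:Real}) =
    (x, zeros(length(x), length(x)))                                # vector mean, square matrix

m, C = mean_cov_vector_matrix_sketch(2.5)   # ([2.5], 1×1 zero matrix)
```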
Linear algebra helpers
RxGP.jdotavx — Function
`jdotavx(a, b)`

Vectorised dot product of arrays `a` and `b` using `LoopVectorization.@turbo`.
RxGP.create_blockmatrix — Function
`create_blockmatrix(A, d, M)`

Return a `d×d` matrix of `M×M` views into the block structure of `A`.
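A minimal sketch of the block-view construction, assuming `A` is a `dM×dM` matrix partitioned into `d×d` blocks of size `M×M` (a hypothetical reimplementation, not the RxGP source):

```julia
# Sketch: build a d×d matrix of M×M views into a dM×dM matrix A.
# Hypothetical reimplementation of the behaviour described above.
function create_blockmatrix_sketch(A::AbstractMatrix, d::Int, M::Int)
    @assert size(A) == (d * M, d * M)
    [view(A, (i - 1) * M + 1:i * M, (j - 1) * M + 1:j * M) for i in 1:d, j in 1:d]
end

A = collect(reshape(1.0:16.0, 4, 4))
B = create_blockmatrix_sketch(A, 2, 2)
B[1, 1]              # top-left 2×2 block of A
B[2, 2][1, 1] = 0.0  # views write through to A itself
```

Because the blocks are views rather than copies, mutating a block mutates `A`, which avoids allocating when a rule only needs block-wise access.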
Inducing variable buffer
RxGP.BufferUniSGP — Type
`BufferUniSGP{D,M}`

Wraps an inducing-variable message `qv` together with its `UniSGPMeta`. When the product of all `N` incoming messages has been accumulated (tracked via `meta.counter`), updates the Cholesky factor `meta.Uv`.
BufferUniSGP wraps an inducing-variable message together with its meta object. When the product of all incoming messages to v has been accumulated (tracked via meta.counter), it updates the Cholesky factor of the second moment meta.Uv. This ensures that the free-energy computation remains efficient.
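The counter-triggered update pattern can be illustrated with a stripped-down analogue. Everything below is hypothetical scaffolding: the real type stores the message and a full `UniSGPMeta`, whereas this toy accumulates a second moment and refreshes its Cholesky factor only once all `N` contributions have arrived:

```julia
using LinearAlgebra

# Stripped-down illustration of the counter-triggered update pattern
# described above. Hypothetical analogue of BufferUniSGP: the expensive
# factorisation runs once per sweep instead of once per message.
mutable struct CountingBuffer
    counter::Int
    N::Int
    S::Matrix{Float64}   # accumulated second moment
    U::Union{Nothing, UpperTriangular{Float64, Matrix{Float64}}}
end

function accumulate!(buf::CountingBuffer, contribution::Matrix{Float64})
    buf.S .+= contribution
    buf.counter += 1
    if buf.counter == buf.N              # all N messages seen:
        buf.U = cholesky(Symmetric(buf.S)).U  # refresh the Cholesky factor
        buf.counter = 0                  # and reset for the next sweep
    end
    return buf
end

buf = CountingBuffer(0, 2, zeros(2, 2), nothing)
accumulate!(buf, [1.0 0.0; 0.0 1.0])   # first message: no factorisation yet
accumulate!(buf, [3.0 0.0; 0.0 3.0])   # second message: U is updated
```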
Hyperparameter optimisation
The following functions compute the negative log backward message and its gradient with respect to the kernel hyperparameters $\theta$. They are designed to be used with gradient-based optimisers (e.g. Optim.jl or Flux.jl).
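The intended calling pattern, an objective plus a matching in-place gradient, can be shown with a toy stand-in. `neg_log_toy` and `grad_toy!` below are placeholders (not the RxGP functions), and plain gradient descent stands in for an Optim.jl or Flux.jl optimiser:

```julia
# Toy illustration of the objective / in-place-gradient calling pattern.
# neg_log_toy and grad_toy! are hypothetical placeholders for
# neg_log_backwardmess_fast and grad_llh_default!.
neg_log_toy(θ) = (θ[1] - 2.0)^2 + (θ[2] + 1.0)^2

function grad_toy!(grad, θ)            # writes the gradient into grad, like grad_llh_default!
    grad[1] = 2 * (θ[1] - 2.0)
    grad[2] = 2 * (θ[2] + 1.0)
    return grad
end

θ = [0.0, 0.0]
grad = similar(θ)
for _ in 1:200                         # gradient descent as optimiser stand-in
    grad_toy!(grad, θ)
    θ .-= 0.1 .* grad
end
θ  # converges toward the minimiser [2.0, -1.0]
```

In practice one would pass the objective/gradient pair to an optimiser rather than hand-rolling the loop; the in-place gradient signature avoids allocating a fresh gradient vector per evaluation.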
RxGP.neg_log_backwardmess_fast — Function
`neg_log_backwardmess_fast(θ; y_data, x_y_data, q_v, q_w, kernel, mean_fn, Xu, ...)`

Negative log backward message toward `θ` for the univariate VSGP. Supports function-value observations (`y_data`/`x_y_data`), gradient observations (`ω_data`/`x_ω_data`), and either point or distributional inputs. Used as the M-step objective in `grad_llh_default!`.
RxGP.neg_log_backwardmess_uncertain — Function
`neg_log_backwardmess_uncertain(θ; y_data, qx, v, Uv, w, kernel, Xu, method)`

Negative log backward message toward `θ` when inputs `qx` are distributions. Uses Gauss–Hermite cubature to compute kernel expectations `Ψ0`, `Ψ1`, `Ψ2`.
RxGP.neg_log_backwardmess_msg — Function
`neg_log_backwardmess_msg(θ; in_data_, out_data_, q_v, q_W_, meta_)`

Negative log backward message toward `θ` computed via `@call_rule` on `UniSGP_dID(:θ, Marginalisation)`. Sums log-pdf contributions over all data points and batches in `in_data_`.
RxGP.grad_llh_default! — Function
`grad_llh_default!(grad, θ; ...)`

In-place gradient of `neg_log_backwardmess_fast` w.r.t. `θ` via ForwardDiff. Accepts the same keyword arguments as `neg_log_backwardmess_fast`.
RxGP.neg_log_backwardmess_multi — Function
`neg_log_backwardmess_multi(θ; y_data, qx, sumRv_Wbar, v, W, tr_W, kernel, Xu, method)`

Negative log backward message toward `θ` for the multivariate (MultiSGP) VSGP with `C = I`. Expects pre-computed `sumRv_Wbar = sum(Rv_blk .* W)` and `tr_W = tr(W)` for efficiency.
RxGP.grad_llh_multi! — Function
`grad_llh_multi!(grad, θ; ...)`

In-place gradient of `neg_log_backwardmess_multi` w.r.t. `θ` via ForwardDiff.