AbstractBayesOpt.jl
AbstractBayesOpt.jl is a Julia library for Bayesian Optimisation (BO) built around abstract types for surrogate models, acquisition functions, and domain definitions.
The library is designed to solve minimisation problems of the form:
\[\min_{x \in \mathcal{X}} f(x)\]
where $f: \mathcal{X} \to \mathbb{R}$ is the objective function, which can be expensive to evaluate, non-differentiable, or noisy. The optimisation domain $\mathcal{X} \subseteq \mathbb{R}^d$ is continuous, bounded, and possibly multi-dimensional.
The library uses BO to iteratively propose evaluation points $x$ in the domain by:
- Modeling the objective function with a surrogate model (e.g., Gaussian Process).
- Using an acquisition function to select the next query point that balances exploration and exploitation.
- Updating the surrogate with new observations and repeating until a stopping criterion is met (a toy sketch of this loop is shown below).
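To make these steps concrete, here is a small, self-contained toy sketch of such a loop. It does not use the AbstractBayesOpt.jl API; the nearest-neighbour "surrogate" and the brute-force grid search are deliberately crude stand-ins for a GP surrogate and a proper acquisition optimiser.

```julia
# Toy illustration of a BO-style loop (NOT the AbstractBayesOpt.jl API):
# a nearest-neighbour "surrogate" with a distance-based uncertainty,
# and a lower-confidence-bound rule to pick the next evaluation point.
f(x) = (x - 0.3)^2 + 0.1 * sin(20x)      # toy "expensive" objective on [0, 1]

xs = collect(0.0:0.25:1.0)               # initial design
ys = f.(xs)

for _ in 1:15
    # Surrogate stand-in: predict the nearest observation; uncertainty grows with distance.
    μ(x) = ys[argmin(abs.(xs .- x))]
    σ(x) = minimum(abs.(xs .- x))

    # Acquisition stand-in: lower confidence bound (we are minimising f).
    lcb(x) = μ(x) - 2.0 * σ(x)

    # "Optimise" the acquisition by brute force over a candidate grid,
    # picking the point with the smallest lower confidence bound.
    cand = range(0.0, 1.0; length = 201)
    x_next = cand[argmin(lcb.(cand))]

    # Evaluate the objective and augment the data.
    push!(xs, x_next)
    push!(ys, f(x_next))
end

println("best found: x = ", xs[argmin(ys)], ", f(x) = ", minimum(ys))
```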
How AbstractBayesOpt.jl fits in the Julia ecosystem
AbstractBayesOpt.jl provides a modular, abstract framework for Bayesian Optimisation in Julia. It defines three core abstractions, `AbstractSurrogate`, `AbstractAcquisition`, and `AbstractDomain`, as well as a standard optimisation loop, allowing users to plug in any surrogate model, acquisition function, or search space.
Unlike traditional BO libraries that are tied to a specific surrogate implementation (e.g. BayesianOptimization.jl using GaussianProcesses.jl), AbstractBayesOpt.jl is fully flexible. Users are free to use packages such as AbstractGPs.jl or GaussianProcesses.jl; in fact, our standard and gradient-enhanced GP implementations build on AbstractGPs.jl and KernelFunctions.jl. We also mention the Surrogates.jl package, which provides a high-level BO routine on top of its own surrogates (Kriging, gradient-enhanced Kriging, and GPs from AbstractGPs.jl).
In short, AbstractBayesOpt.jl acts as a general "glue" layer, unifying the Julia BO ecosystem under a simple and extensible interface.
Abstract Interfaces
We currently have three main abstract interfaces that work with our BO loop:
- `AbstractAcquisition`: Interface to implement for an acquisition function to be used in AbstractBayesOpt.jl.
- `AbstractDomain`: Interface to implement for the optimisation domain to be used in AbstractBayesOpt.jl.
- `AbstractSurrogate`: Interface to implement for a surrogate to be used in AbstractBayesOpt.jl.
AbstractBayesOpt.jl defines the core abstractions for building Bayesian optimisation algorithms. To add a new surrogate model, acquisition function, or domain, implement the following interfaces.
Acquisition Functions
Subtype `AbstractAcquisition` and implement the following (a sketch of a custom acquisition follows the list):
- `(acq::AbstractAcquisition)(model::AbstractSurrogate, x::AbstractVector)`: Evaluate the acquisition function at `x`. We view `x` as a set of points, and hence return a vector when querying `acq`.
- `update(acq::AbstractAcquisition, ys::AbstractVector, model::AbstractSurrogate)`: Update the acquisition state given new observations.
- `Base.copy(acq::AbstractAcquisition)`: Return a copy of the acquisition function.
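As an illustration, here is a minimal sketch of a lower-confidence-bound acquisition for minimisation written against this interface. It assumes `AbstractAcquisition`, `AbstractSurrogate`, `posterior_mean`, `posterior_var`, and `update` are accessible from AbstractBayesOpt.jl under these names; treat the exact import and method-ownership details as assumptions rather than the package's documented API.

```julia
using AbstractBayesOpt  # assumed to export the abstract types and interface functions

# Sketch of a lower-confidence-bound acquisition for minimisation.
struct LowerConfidenceBound <: AbstractAcquisition
    β::Float64   # exploration weight
end

# Evaluate the acquisition at the points `x`; returns a vector.
# Negated so that maximising the acquisition picks the point with the
# smallest lower confidence bound, i.e. the most promising minimiser.
function (acq::LowerConfidenceBound)(model::AbstractSurrogate, x::AbstractVector)
    μ = posterior_mean(model, x)
    σ² = posterior_var(model, x)
    return -(μ .- acq.β .* sqrt.(σ²))
end

# This acquisition keeps no internal state, so updating it is a no-op.
AbstractBayesOpt.update(acq::LowerConfidenceBound, ys::AbstractVector, model::AbstractSurrogate) = acq

# Plain immutable struct, so a copy is just a new instance with the same field.
Base.copy(acq::LowerConfidenceBound) = LowerConfidenceBound(acq.β)
```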
Domains
Subtype `AbstractDomain` and define the following properties, as well as a constructor:
- `lower`: The lower bounds of the domain.
- `upper`: The upper bounds of the domain.
Concrete implementations may add additional methods as needed, but these are the minimum required for compatibility with the BO loop. Note that we currently use Optim.jl to solve the acquisition function maximisation problem, so the lower and upper bounds must be compatible with its optimisation interface, which can significantly restrict the types of usable domains.
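For example, a hyper-rectangular domain could be sketched as follows. The field names follow the `lower`/`upper` requirement above; the type name `BoxDomain` and the bound validation are illustrative choices, not part of the package.

```julia
using AbstractBayesOpt  # assumed to export AbstractDomain

# Sketch of a hyper-rectangular (box-constrained) domain.
struct BoxDomain{T<:Real} <: AbstractDomain
    lower::Vector{T}   # lower bound per dimension
    upper::Vector{T}   # upper bound per dimension

    function BoxDomain(lower::Vector{T}, upper::Vector{T}) where {T<:Real}
        length(lower) == length(upper) ||
            throw(ArgumentError("lower and upper bounds must have the same length"))
        all(lower .<= upper) ||
            throw(ArgumentError("each lower bound must not exceed its upper bound"))
        return new{T}(lower, upper)
    end
end

# A 2-dimensional box [-5, 5] × [0, 1]:
domain = BoxDomain([-5.0, 0.0], [5.0, 1.0])
```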
Surrogates
Subtype `AbstractSurrogate` and implement the following (a minimal sketch follows the list):
- `update(model::AbstractSurrogate, xs::AbstractVector, ys::AbstractVector)`: Update the surrogate with new data `(xs, ys)`.
- `posterior_mean(model::AbstractSurrogate, x)`: Return the posterior mean at points `x`.
- `posterior_var(model::AbstractSurrogate, x)`: Return the posterior variance at points `x`.
- `nlml(model::AbstractSurrogate, params, xs::AbstractVector, ys::AbstractVector)`: Compute the negative log marginal likelihood given hyperparameters and data.
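The sketch below fills in this interface with a deliberately naive surrogate that predicts the sample mean and variance of the observed targets everywhere. It is only meant to show where each method fits; the qualified method names (e.g. `AbstractBayesOpt.update`) and the single noise-variance hyperparameter are assumptions made for illustration.

```julia
using AbstractBayesOpt  # assumed to export AbstractSurrogate and the interface functions
using Statistics

# Naive surrogate: predicts the sample mean/variance of the observed ys everywhere.
struct ConstantSurrogate <: AbstractSurrogate
    xs::Vector{Any}       # stored inputs (any point representation)
    ys::Vector{Float64}   # stored targets
end
ConstantSurrogate() = ConstantSurrogate(Any[], Float64[])

# Return a new surrogate augmented with the data (xs, ys).
AbstractBayesOpt.update(model::ConstantSurrogate, xs::AbstractVector, ys::AbstractVector) =
    ConstantSurrogate(vcat(model.xs, xs), vcat(model.ys, ys))

# Posterior mean and variance at the points x (the same prediction everywhere).
AbstractBayesOpt.posterior_mean(model::ConstantSurrogate, x) = fill(mean(model.ys), length(x))
AbstractBayesOpt.posterior_var(model::ConstantSurrogate, x)  = fill(var(model.ys), length(x))

# Negative log marginal likelihood of an i.i.d. Gaussian model with noise
# variance params[1] around the sample mean.
function AbstractBayesOpt.nlml(model::ConstantSurrogate, params, xs::AbstractVector, ys::AbstractVector)
    σ² = params[1]
    n = length(ys)
    return 0.5 * sum(abs2, ys .- mean(ys)) / σ² + 0.5 * n * log(2π * σ²)
end
```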
What is currently implemented?
We list below the concrete subtypes currently implemented in AbstractBayesOpt.jl.
Surrogates
- `StandardGP`: Gaussian Process surrogate model with standard mean and covariance functions.
- `GradientGP`: Gaussian Process surrogate model supporting gradient information.
Acquisition functions
- `ExpectedImprovement`: Standard expected improvement acquisition function for balancing exploration and exploitation.
- `UpperConfidenceBound`: Acquisition function using a confidence bound to guide optimisation.
- `GradientNormUCB`: Gradient-based variant of the Upper Confidence Bound acquisition function.
- `ProbabilityImprovement`: Probability of improvement acquisition function.
- `EnsembleAcquisition`: Combines multiple acquisition functions into an ensemble to leverage complementary strategies.
Domains
- `ContinuousDomain`: Represents a continuous optimisation domain, defining bounds and dimensionality for optimisation problems.