Purely Explorative Acquisition Functions for bayesopt
Hello, I am trying to use Bayesian Optimization with the goal of learning the surrogate function as accurately as possible, rather than maximizing it. Essentially, I would like to use bayesopt, but with a purely explorative acquisition function, such as LCB with a high exploration constant (or with the mean term dropped entirely), or the Integrated Variance Reduction function described (not originally) here.
Is there any way to do this? I don't see any documented way to modify the LCB function or to supply a custom acquisition function.
Answers (1)
Parag
on 29 Jan 2025
Hi, as per the documentation, 'bayesopt' only offers its built-in acquisition functions and does not expose a way to modify them or to plug in a custom one. A previous question answered by MathWorks staff confirms this limitation.
As a workaround, you can implement the Bayesian optimization loop yourself with 'fitrgp', choosing the next sample point with whatever acquisition criterion you like. For example, using LCB with an explicit exploration parameter:
% Define your objective function (e.g., a simple quadratic function)
objectiveFunction = @(x) (x - 0.3).^2 + sin(5*x);
% Initial sample points
xInitial = linspace(0, 1, 5)'; % Initial points (column vector)
yInitial = objectiveFunction(xInitial); % Evaluate the objective function
% Fit initial Gaussian Process model
gprMdl = fitrgp(xInitial, yInitial, 'BasisFunction', 'constant', 'KernelFunction', 'squaredexponential', 'Sigma', 0.1);
% Number of iterations for Bayesian Optimization
numIterations = 20;
kappa = 2.0; % Exploration parameter for LCB
% Explore the function space
for iter = 1:numIterations
% Define a fine grid over the input space
xGrid = linspace(0, 1, 1000)';
% Predict using the GP model
[predY, predSD] = predict(gprMdl, xGrid);
% Calculate LCB
lcbValues = predY - kappa * predSD;
% Choose the next point based on minimum LCB value
[~, nextIdx] = min(lcbValues);
nextPoint = xGrid(nextIdx);
% Evaluate the objective function at the new point
newY = objectiveFunction(nextPoint);
% Update the dataset
xInitial = [xInitial; nextPoint];
yInitial = [yInitial; newY];
% Refit the Gaussian Process model
gprMdl = fitrgp(xInitial, yInitial, 'BasisFunction', 'constant', 'KernelFunction', 'squaredexponential', 'Sigma', 0.1);
% Plot the current state
figure(1);
clf;
hold on;
plot(xGrid, objectiveFunction(xGrid), 'r--', 'LineWidth', 1.5); % True function
plot(xInitial, yInitial, 'bo', 'MarkerSize', 6, 'MarkerFaceColor', 'b'); % Sampled points
plot(xGrid, predY, 'b-', 'LineWidth', 1.5); % Predicted mean
fill([xGrid; flipud(xGrid)], [predY + 2*predSD; flipud(predY - 2*predSD)], 'b', 'FaceAlpha', 0.1, 'EdgeColor', 'none'); % Confidence interval
plot(xGrid, lcbValues, 'g-', 'LineWidth', 1.5); % LCB values
title(sprintf('Iteration %d', iter));
xlabel('x');
ylabel('f(x)');
legend('True Function', 'Sampled Points', 'GP Prediction', 'Confidence Interval', 'LCB');
hold off;
pause(0.5); % Pause to visualize the updates
end
% Display final sampled points and function evaluations
disp('Final Sampled Points:');
disp(xInitial);
disp('Function Evaluations:');
disp(yInitial);
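If you want pure exploration rather than LCB with a finite kappa, you can drop the mean term entirely and sample wherever the GP posterior standard deviation is largest (the limiting case of LCB as kappa grows). This is a manual sketch along the same lines as the loop above, not a bayesopt feature; it reuses the same objective and grid:

```matlab
% Purely explorative variant: ignore the predicted mean and sample at the
% point of maximum posterior uncertainty (uncertainty sampling).
objectiveFunction = @(x) (x - 0.3).^2 + sin(5*x);
xData = linspace(0, 1, 5)';             % initial design
yData = objectiveFunction(xData);
xGrid = linspace(0, 1, 1000)';          % candidate points

for iter = 1:20
    gprMdl = fitrgp(xData, yData, 'BasisFunction', 'constant', ...
        'KernelFunction', 'squaredexponential', 'Sigma', 0.1);
    [~, predSD] = predict(gprMdl, xGrid);   % only the SD is needed
    [~, nextIdx] = max(predSD);             % maximum-variance acquisition
    nextPoint = xGrid(nextIdx);
    xData = [xData; nextPoint];
    yData = [yData; objectiveFunction(nextPoint)];
end
```

Note that pure uncertainty sampling tends to cluster points near the domain boundary for stationary kernels, which is sometimes the motivation for integrated criteria instead.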
You can see the documentation for 'fitrgp' for more details.
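Regarding the Integrated Variance Reduction criterion you mention: since 'bayesopt' does not expose it, it can also be computed manually on top of a fitted 'fitrgp' model. The sketch below is an illustration under stated assumptions, not an official feature: it assumes the squared-exponential kernel from the example above, reads the fitted hyperparameters back from the model, and re-implements the kernel by hand (the names 'kfun' and 'ivr' are introduced here for illustration). It scores each candidate by the average posterior-variance reduction over the grid that observing that candidate would produce:

```matlab
% IVR sketch: assumes gprMdl was fit with 'KernelFunction','squaredexponential'
% on data (xInitial, yInitial), with candidate grid xGrid, as in the loop above.
theta  = gprMdl.KernelInformation.KernelParameters;  % [lengthScale; sigmaF]
sigmaN = gprMdl.Sigma;                               % fitted noise std
kfun   = @(A, B) theta(2)^2 * exp(-pdist2(A, B).^2 / (2*theta(1)^2));

Kxx = kfun(xInitial, xInitial) + sigmaN^2 * eye(numel(xInitial));
Kgx = kfun(xGrid, xInitial);            % grid-to-data prior covariances
% Posterior covariance over the grid: k(z,z') - k(z,X) Kxx^{-1} k(X,z')
P = kfun(xGrid, xGrid) - Kgx * (Kxx \ Kgx');
v = diag(P);                            % posterior variance at each candidate

% Observing x_i reduces the variance at z by P(z,i)^2 / (v(i) + sigmaN^2);
% averaging over z gives the integrated variance reduction for x_i.
ivr = mean(P.^2, 1)' ./ (v + sigmaN^2);
[~, nextIdx] = max(ivr);
nextPoint = xGrid(nextIdx);
```

Because this criterion integrates over the whole grid, it avoids the boundary-seeking behavior of plain maximum-variance sampling, at the cost of forming the grid-by-grid posterior covariance matrix.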