# Bayesian Optimization ionosphere example

yufan ji on 24 Jul 2020
Answered: Alan Weiss on 26 Jul 2020
How should I understand the code in the penultimate line? Thank you for any advice.
```matlab
%% https://www.mathworks.com/help/stats/bayesopt.html
clear; clc;
rng default
load ionosphere  % provides the predictors X and labels Y (351 observations)
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
vars = [num,dst];
c = cvpartition(351,'KFold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n, ...
    'Distance',char(x.dst),'NSMethod','exhaustive')); % How to understand this line??
results = bayesopt(fun,vars,'Verbose',1,'AcquisitionFunctionName','expected-improvement-plus')
```
What are 'x' and 'x.n'? Does fitcknn(...) output the validation loss? Thanks a lot.

Alan Weiss on 26 Jul 2020
The bayesopt solver passes a table of values to the objective function.
The x.n argument takes the n field of the table x, and the x.dst argument takes the dst field of the table x. You see that the earlier lines of code set up the optimizable variables n and dst. The variable x is the name given in the @(x) line for the variable of optimization, which, as I stated, is a table. You can read about these things in the Hyperparameter Optimization section of Model Building and Assessment.
The objective function fun computes the cross-validation loss of the k-nearest-neighbor classifier for given values of the optimizable parameters.
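For instance, the table that bayesopt hands to the objective on one iteration might look like this (the values 5 and 'euclidean' are hypothetical picks by the solver, not from the thread):

```matlab
% One-row table of the kind bayesopt passes to fun = @(x)... on each iteration
x = table(5, categorical({'euclidean'}), 'VariableNames', {'n','dst'});
x.n           % the integer tried for 'NumNeighbors' on this iteration
char(x.dst)   % the distance name, converted from categorical to char for fitcknn
```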
Alan Weiss
MATLAB mathematical toolbox documentation

#### 1 Comment

yufan ji on 26 Jul 2020
Thank you for your answer! But I have other questions. This is a Broad Learning System example; I want to change it into a battery-life prediction program, using Bayesian Optimization to determine NumFea, NumWin, and NumEnhan.
Before using the BO algorithm, do I need to fix these three parameters? Based on your previous answer, should I write "x.NumFea"? Method No. 2 runs, but it can't find the optimized parameters. Can you tell me what to change? Thank you very much!
```matlab
%%
clear; clc; close all;
warning off all;
format compact;
Ca_shiyan = Ca_shiyan';
Ca_shiyan = mapminmax(Ca_shiyan, 0, 1);
Ca_shiyan = Ca_shiyan';
lag = 6;
n = length(Ca_shiyan);
xx = zeros(n-lag, lag);
for i = 1 : n-lag
    xx(i, :) = Ca_shiyan(i : i+lag-1);
end
yy = Ca_shiyan(lag+1 : end);
co = 500; % starting point
train_x = xx(1:co-lag,:); % 450;294
train_y = yy(1:co-lag,:); % 450;294
% test_x = xx(co-lag+1,:);
% test_y = yy(co-lag+1,:);
test_x = xx(co-lag+1:end,:);
test_y = yy(co-lag+1:end,:);
clear i;
%%
C = 2^-30;
s = .8;
best = 5; % used for training and prediction
result = [];
NumFea = 21;
NumWin = 4;
NumEnhan = 2;
clc;
rand('state',1)
for i = 1:NumWin
    WeightFea = 2*rand(size(train_x,2)+1,NumFea)-1;
    b1 = rand(size(train_x,2)+1,NumFea);
    WF{i} = WeightFea;
end
WeightEnhan = 2*rand(NumWin*NumFea+1,NumEnhan)-1;
fprintf(1, 'Fea. No.= %d, Win. No. =%d, Enhan. No. = %d\n', NumFea, NumWin, NumEnhan);
% BO
vars = [
    optimizableVariable('NumFea',[1 50],'Type','integer')
    optimizableVariable('NumWin',[1 50],'Type','integer')
    optimizableVariable('NumEnhan',[1 200],'Type','integer')];
fun = @train;
results = bayesopt(fun, vars, 'Verbose',1, 'AcquisitionFunctionName', 'expected-improvement-plus');
% No. 1: can't work
function [test_RMSE] = train(train_x, train_y, test_x, test_y, WF, WeightEnhan, s, C, NumFea, NumWin)
[test_RMSE] = bls_train_original(train_x, train_y, test_x, test_y, WF, WeightEnhan, s, C, NumFea, NumWin);
end
% No. 2: this runs ('bls_train_original' is the training and testing program)
% function fun = train(~)
% fun = @train;
% function [test_RMSE] = train(train_x, train_y, test_x, test_y, WF, WeightEnhan, s, C, NumFea, NumWin)
% [test_RMSE] = bls_train_original(train_x, train_y, test_x, test_y, WF, WeightEnhan, s, C, NumFea, NumWin);
% end
% end
% No. 3: can't work
% function [test_RMSE] = train(train_x, train_y, test_x, test_y, WF, WeightEnhan, s, C, NumFea, NumWin)
% [test_RMSE] = bls_train_original(train_x, train_y, test_x, test_y, WF, WeightEnhan, s, C, NumFea, NumWin);
% end
```

Alan Weiss on 26 Jul 2020
I think that you need to do some reading in Model Building and Assessment. Please read about how to specify variables and objective functions. Your objective function is a function of ONE VARIABLE, typically called x. It is not a function of x.a, x.b, etc. Just x. If you need to pass extra parameters that are not optimizable variables, well, the examples show how to include them, too.
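A minimal sketch of that extra-parameter pattern applied to the code above, assuming bls_train_original with the signature shown in the thread is on the path (the helper name blsObjective is hypothetical). Note that the random weights WF and WeightEnhan have sizes that depend on NumFea, NumWin, and NumEnhan, so they are regenerated inside the objective for each trial:

```matlab
% Capture the fixed data in an anonymous function, so the objective that
% bayesopt sees is a function of the single table variable x.
fun = @(x) blsObjective(x, train_x, train_y, test_x, test_y, s, C);
results = bayesopt(fun, vars, 'Verbose', 1, ...
    'AcquisitionFunctionName', 'expected-improvement-plus');

function test_RMSE = blsObjective(x, train_x, train_y, test_x, test_y, s, C)
% Regenerate the random weights for this trial's parameter values,
% since their dimensions depend on x.NumFea, x.NumWin, and x.NumEnhan.
WF = cell(1, x.NumWin);
for i = 1:x.NumWin
    WF{i} = 2*rand(size(train_x,2)+1, x.NumFea) - 1;
end
WeightEnhan = 2*rand(x.NumWin*x.NumFea+1, x.NumEnhan) - 1;
test_RMSE = bls_train_original(train_x, train_y, test_x, test_y, ...
    WF, WeightEnhan, s, C, x.NumFea, x.NumWin);
end
```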
Alan Weiss
MATLAB mathematical toolbox documentation

R2018a
