Minimization with Linear Equality Constraints, Trust-Region Reflective Algorithm
The fmincon trust-region-reflective algorithm can minimize a nonlinear objective function subject to linear equality constraints only (no bounds or any other constraints). For example, minimize

$$f(x) = \sum_{i=1}^{n-1}\left[\left(x_i^2\right)^{x_{i+1}^2+1} + \left(x_{i+1}^2\right)^{x_i^2+1}\right]$$

subject to some linear equality constraints. This example takes $n = 1000$.
Create Problem
The browneq.mat file contains the matrices Aeq and beq, which represent the linear constraints Aeq*x = beq. The Aeq matrix has 100 rows representing 100 linear constraints (so Aeq is a 100-by-1000 matrix). Load the browneq.mat data.
load browneq.mat
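If you want to confirm the dimensions stated above, you can optionally inspect the loaded matrices; this check is not part of the original example.

% Optional check: Aeq should be 100-by-1000 and beq should be 100-by-1.
disp(size(Aeq))
disp(size(beq))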
The brownfgh helper function at the end of this example implements the objective function, including its gradient and Hessian.
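If you want to verify that the gradient code in brownfgh is consistent with the objective value it returns, one optional approach (not part of the original example) is a central finite-difference comparison at a small random test point. The dimension 10 and step size 1e-6 below are arbitrary choices for this sketch.

% Optional sanity check: compare the analytic gradient from brownfgh against
% a central finite-difference estimate at a random 10-dimensional point.
xtest = randn(10,1);
[~,gtest] = brownfgh(xtest);
h = 1e-6;
gfd = zeros(10,1);
for k = 1:10
    e = zeros(10,1);
    e(k) = h;
    gfd(k) = (brownfgh(xtest+e) - brownfgh(xtest-e))/(2*h);
end
disp(norm(gtest - gfd,Inf)/norm(gtest,Inf))   % relative difference; expect a small value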
Set Options
The trust-region-reflective algorithm requires the objective function to include a gradient. The algorithm also accepts a Hessian in the objective function. Set the options to include all of the derivative information.
options = optimoptions('fmincon','Algorithm','trust-region-reflective',...
    'SpecifyObjectiveGradient',true,'HessianFcn','objective');
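Optionally, you can also create a variant of these options with iterative display turned on to monitor the solver's progress. This addition is not required by the example, and the variable name optsIter is introduced here only for illustration.

% Optional: a variant of the options that prints one line per iteration.
optsIter = optimoptions(options,'Display','iter');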
Solve Problem
Set the initial point to –1 for odd indices and +1 for even indices.
n = 1000;
x0 = -ones(n,1);
x0(2:2:n) = 1;
The problem has no bounds, linear inequality constraints, or nonlinear constraints, so set those parameters to [].
A = [];
b = [];
lb = [];
ub = [];
nonlcon = [];
Call fmincon to solve the problem.
[x,fval,exitflag,output] = ...
fmincon(@brownfgh,x0,A,b,Aeq,beq,lb,ub,nonlcon,options);
Local minimum possible. fmincon stopped because the final change in function value relative to its initial value is less than the value of the function tolerance.
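As an aside, you can pass the same inputs to fmincon as a problem structure instead of as positional arguments. The following sketch uses the documented fmincon problem-structure fields; it is an optional alternative and is shown commented out because the example uses the positional form.

% Optional alternative (sketch): the same call using a problem structure.
problem.objective = @brownfgh;
problem.x0        = x0;
problem.Aeq       = Aeq;
problem.beq       = beq;
problem.solver    = 'fmincon';
problem.options   = options;
% [x,fval,exitflag,output] = fmincon(problem);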
Examine Solution and Solution Process
Examine the exit flag, objective function value, and constraint violation.
disp(exitflag)
3
disp(fval)
205.9313
disp(output.constrviolation)
2.2338e-13
The exitflag value of 3 indicates that fmincon stops because the change in the objective function value is less than the tolerance FunctionTolerance. The final objective function value is given by fval. Constraints are satisfied, as shown in output.constrviolation, which displays a very small number.
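You can also inspect other fields of the output structure, such as the iteration count and the first-order optimality measure. The exact values depend on your MATLAB release and run.

% Additional solver statistics reported for this run.
disp(output.iterations)
disp(output.firstorderopt)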
To calculate the constraint violation yourself, execute the following code.
norm(Aeq*x-beq,Inf)
ans = 2.2338e-13
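As a further optional check (not part of the original example), the Hessian that brownfgh returns is sparse and tridiagonal; you can visualize its sparsity pattern at the solution with spy. The variable name Hsol is introduced here only for illustration.

% Optional: visualize the tridiagonal sparsity pattern of the Hessian at x.
[~,~,Hsol] = brownfgh(x);
spy(Hsol)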
Helper Function
The following code creates the brownfgh helper function.
function [f,g,H] = brownfgh(x)
%BROWNFGH  Nonlinear minimization problem (function, its gradients
%          and Hessian).
%          Documentation example.

%   Copyright 1990-2019 The MathWorks, Inc.

% Evaluate the function.
n = length(x);
y = zeros(n,1);
i = 1:(n-1);
y(i) = (x(i).^2).^(x(i+1).^2+1) + (x(i+1).^2).^(x(i).^2+1);
f = sum(y);

% Evaluate the gradient.
if nargout > 1
    i = 1:(n-1);
    g = zeros(n,1);
    g(i) = 2*(x(i+1).^2+1).*x(i).*((x(i).^2).^(x(i+1).^2)) + ...
           2*x(i).*((x(i+1).^2).^(x(i).^2+1)).*log(x(i+1).^2);
    g(i+1) = g(i+1) + ...
             2*x(i+1).*((x(i).^2).^(x(i+1).^2+1)).*log(x(i).^2) + ...
             2*(x(i).^2+1).*x(i+1).*((x(i+1).^2).^(x(i).^2));
end

% Evaluate the (sparse, symmetric) Hessian matrix.
if nargout > 2
    v = zeros(n,1);
    i = 1:(n-1);
    v(i) = 2*(x(i+1).^2+1).*((x(i).^2).^(x(i+1).^2)) + ...
           4*(x(i+1).^2+1).*(x(i+1).^2).*(x(i).^2).*((x(i).^2).^((x(i+1).^2)-1)) + ...
           2*((x(i+1).^2).^(x(i).^2+1)).*(log(x(i+1).^2));
    v(i) = v(i) + 4*(x(i).^2).*((x(i+1).^2).^(x(i).^2+1)).*((log(x(i+1).^2)).^2);
    v(i+1) = v(i+1) + ...
             2*(x(i).^2).^(x(i+1).^2+1).*(log(x(i).^2)) + ...
             4*(x(i+1).^2).*((x(i).^2).^(x(i+1).^2+1)).*((log(x(i).^2)).^2) + ...
             2*(x(i).^2+1).*((x(i+1).^2).^(x(i).^2));
    v(i+1) = v(i+1) + 4*(x(i).^2+1).*(x(i+1).^2).*(x(i).^2).*((x(i+1).^2).^(x(i).^2-1));
    v0 = v;
    v = zeros(n-1,1);
    v(i) = 4*x(i+1).*x(i).*((x(i).^2).^(x(i+1).^2)) + ...
           4*x(i+1).*(x(i+1).^2+1).*x(i).*((x(i).^2).^(x(i+1).^2)).*log(x(i).^2);
    v(i) = v(i) + 4*x(i+1).*x(i).*((x(i+1).^2).^(x(i).^2)).*log(x(i+1).^2);
    v(i) = v(i) + 4*x(i).*((x(i+1).^2).^(x(i).^2)).*x(i+1);
    v1 = v;
    i = [(1:n)';(1:(n-1))'];
    j = [(1:n)';(2:n)'];
    s = [v0;2*v1];
    H = sparse(i,j,s,n,n);
    H = (H+H')/2;
end
end