Logistic regression in MATLAB (without Statistics and Machine Learning Toolbox)
Does anyone know of a way to perform logistic regression (similar to the "LogisticRegression" model in Python / scikit-learn) in MATLAB without the Statistics and Machine Learning Toolbox? I have base MATLAB (R2020b) without that toolbox, and I would prefer not to migrate to the Python environment or to buy this specific toolbox.
Many thanks.
3 Comments
Matt J
on 20 Aug 2025
You can use the Deep Learning Toolbox, but I guess you don't want to rely on that either?
dpb
on 20 Aug 2025
There are at least a couple of well-rated functions at the File Exchange you could look at...
Alexandre Englert
on 21 Aug 2025
Accepted Answer
More Answers (1)
the cyclist
on 20 Aug 2025
Edited: the cyclist
on 20 Aug 2025
Logistic regression is an algorithm that can definitely be programmed in base MATLAB, which is a general-purpose programming language: fitting the model amounts to a numerical optimization problem (maximizing the Bernoulli log-likelihood).
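To illustrate that optimization view before the full implementation below, here is a bare-bones sketch using plain gradient ascent on the log-likelihood. The step size and iteration count are illustrative choices, not tuned values, and the synthetic data is made up for the example:

```matlab
% Minimal sketch: logistic regression via plain gradient ascent (base MATLAB).
rng(1);
n = 200;
x = [randn(n/2,1) + 1; randn(n/2,1) - 1];
X = [ones(n,1), x];                              % intercept + one feature
y = double(rand(n,1) < 1./(1 + exp(-X*[0; 2]))); % labels from a true model [0; 2]
beta = zeros(2,1);
for k = 1:5000
    p = 1 ./ (1 + exp(-X*beta));                 % predicted probabilities
    beta = beta + (0.1/n) * (X'*(y - p));        % ascend the log-likelihood gradient
end
disp(beta)  % estimates should be roughly near the true values [0; 2]
```

This has none of the safeguards of the Newton/IRLS code below (no line search, no ridge, no stable log-likelihood), but it shows that the fit really is just iterative optimization.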
I was not willing to put in the work to solve that for you, but ChatGPT was. Here is what it came up with, along with a comparison to the output of fitglm (from the Statistics and Machine Learning Toolbox), which is what I would typically use to do a logistic regression.
Without extensive testing, I cannot vouch that this code is doing what I would expect. Frankly, I have not even checked to see if there is a stupendously easier way to do this.
% - Small default ridge for stability (lambda = 1e-6)
% - Base MATLAB implementation + optional fitglm comparison
rng default
% Generate a simple synthetic binary classification dataset (2 informative features)
n = 400;
X1 = [randn(n/2,1) + 1.0; randn(n/2,1) - 1.0];
X2 = [randn(n/2,1) - 0.5; randn(n/2,1) + 0.5];
X = [ones(n,1), X1, X2]; % prepend an intercept column to the two features
% True parameters (with intercept)
beta_true = [-0.25; 2.0; -1.5];
p = sigmoid(X*beta_true);
y = double(rand(n,1) < p);
% Fit with our scratch implementation (no toolboxes)
opts = struct('maxIter', 100, 'tol', 1e-8, 'lambda', 1e-6); % tiny ridge for numerical safety
[beta_hat, stats] = logreg_newton(X, y, opts);
% Report results
fprintf('=== Base MATLAB Logistic Regression (Newton/IRLS) ===\n');
disp(table((0:size(X,2)-1)', beta_hat, 'VariableNames', {'CoeffIndex','Estimate'}));
fprintf('Converged: %d in %d iters, final |step| = %.3e, logLik = %.6f\n', ...
stats.converged, stats.iters, stats.lastStepNorm, stats.logLik);
% Train/test split and accuracy
idx = randperm(n);
train = idx(1:round(0.7*n));
test = idx(round(0.7*n)+1:end);
phat_train = sigmoid(X(train,:)*beta_hat);
phat_test = sigmoid(X(test,:)*beta_hat);
yhat_train = phat_train >= 0.5;
yhat_test = phat_test >= 0.5;
acc_train = mean(yhat_train == y(train));
acc_test = mean(yhat_test == y(test));
fprintf('Train acc: %.2f%% | Test acc: %.2f%%\n', 100*acc_train, 100*acc_test);
% Optional: compare to fitglm
hasStatsTBX = ~isempty(ver('stats'));
if hasStatsTBX
    Xglm = X(:,2:end); % drop the explicit intercept column for fitglm
    mdl = fitglm(Xglm, y, 'Distribution', 'binomial', 'Link', 'logit', 'Intercept', true);
    beta_glm = mdl.Coefficients.Estimate;
    fprintf('\n=== fitglm Comparison ===\n');
    disp(table((0:size(X,2)-1)', beta_hat, beta_glm, beta_hat - beta_glm, ...
        'VariableNames', {'CoeffIndex','BaseMATLAB','fitglm','Diff'}));
    phat_test_glm = predict(mdl, Xglm(test,:));
    yhat_test_glm = phat_test_glm >= 0.5;
    acc_test_glm = mean(yhat_test_glm == y(test));
    fprintf('Test acc (base): %.2f%% | Test acc (fitglm): %.2f%%\n', 100*acc_test, 100*acc_test_glm);
else
    fprintf('\n[Note] Statistics and Machine Learning Toolbox not detected. Skipping fitglm comparison.\n');
end
% ------- Local functions (base MATLAB only) -------
function [beta, stats] = logreg_newton(X, y, opts)
%LOGREG_NEWTON Logistic regression via Newton-Raphson (IRLS).
    if nargin < 3, opts = struct; end
    if ~isfield(opts, 'maxIter'), opts.maxIter = 100; end
    if ~isfield(opts, 'tol'), opts.tol = 1e-8; end
    if ~isfield(opts, 'lambda'), opts.lambda = 1e-6; end % tiny ridge
    [n,p] = size(X);
    beta = zeros(p,1);
    lambda = opts.lambda;
    R = zeros(p); % L2 penalty matrix (intercept unpenalized)
    if lambda > 0
        R(2:end,2:end) = lambda*eye(p-1);
    end
    for k = 1:opts.maxIter
        eta = X*beta;
        p1 = sigmoid(eta);
        W = p1 .* (1 - p1);
        W = max(W, 1e-12); % avoid exact zeros in the weights
        g = X'*(y - p1) - R*beta;
        % Build H = X' * diag(W) * X + R without forming diag(W);
        % implicit expansion (R2016b+) replaces the older bsxfun call
        H = X'*(X .* W) + R;
        % Extra diagonal jitter for numerical stability in tough cases
        % (helps if features are nearly collinear)
        H = H + 1e-12*eye(p);
        % Solve for the Newton step
        step = H \ g;
        % Backtracking line search on the penalized log-likelihood
        t = 1.0;
        pll_prev = loglik(y, eta) - 0.5*lambda*sum(beta(2:end).^2);
        while t > 1e-8
            beta_try = beta + t*step;
            eta_try = X*beta_try;
            pll_try = loglik(y, eta_try) - 0.5*lambda*sum(beta_try(2:end).^2);
            if pll_try >= pll_prev
                beta = beta_try;
                pll_prev = pll_try;
                break;
            end
            t = t/2;
        end
        if norm(step) < opts.tol
            stats.converged = true;
            stats.iters = k;
            stats.lastStepNorm = norm(step);
            stats.logLik = pll_prev;
            return;
        end
    end
    stats.converged = false;
    stats.iters = opts.maxIter;
    stats.lastStepNorm = norm(step);
    stats.logLik = pll_prev;
end

function y = sigmoid(z)
%SIGMOID Numerically stable logistic sigmoid.
    y = zeros(size(z));
    pos = z >= 0;
    neg = ~pos;
    y(pos) = 1 ./ (1 + exp(-z(pos)));
    ez = exp(z(neg));
    y(neg) = ez ./ (1 + ez);
end

function ll = loglik(y, eta)
%LOGLIK Bernoulli log-likelihood: sum(y.*eta - log(1+exp(eta))), computed stably.
    ll = 0;
    for i = 1:numel(eta)
        t = eta(i);
        if t > 0
            ll = ll + y(i)*t - (t + log1p(exp(-t)));
        else
            ll = ll + y(i)*t - log1p(exp(t));
        end
    end
end
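To apply the fitted model to your own data, a usage sketch along these lines should work (`myX` and `myY` are placeholder names for your n-by-p predictor matrix and n-by-1 vector of 0/1 labels, not variables from the code above):

```matlab
% Usage sketch with placeholder inputs myX (predictors) and myY (0/1 labels).
Xd = [ones(size(myX,1),1), myX];  % prepend the intercept column
[b, st] = logreg_newton(Xd, myY); % defaults: 100 iters, tol 1e-8, ridge 1e-6
phat = sigmoid(Xd*b);             % fitted probabilities
yhat = phat >= 0.5;               % 0/1 predictions at the 0.5 threshold
```

Note that `logreg_newton`, `sigmoid`, and `loglik` are local functions, so in a script they must stay at the bottom of the file, after any code that calls them.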
2 Comments
Alexandre Englert
on 21 Aug 2025
I fully support your effort to program your own logistic regression code.
But from an economic point of view, the time you will have to spend understanding the theory and coding the software reliably will likely not pay off compared to the price of the toolbox.