Handling bound constraints by the Levenberg-Marquardt algorithm

Hello,
Could anyone please tell me why the Levenberg-Marquardt algorithm embedded in "lsqnonlin" does not handle bound constraints, while the other one ("trust-region-reflective") does?
There are implementations of the Levenberg-Marquardt algorithm that do accept bound constraints, so what is the principal limitation explaining why this has not been implemented in "lsqnonlin"?
Thank you!
Igor.

 Accepted Answer

Matt J on 29 Jan 2019
Edited: Matt J on 29 Jan 2019
The theory of Levenberg-Marquardt does not define a way to handle bound constraints. If, as you claim, there are modifications of classical LM that support bounds, I surmise that they involve manipulations similar in spirit to what is done in the trust-region algorithm.
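To make the idea concrete, here is a minimal, purely illustrative sketch (in Python/NumPy, not MATLAB code, and not lsqnonlin's actual algorithm) of one such "manipulation": a projected Levenberg-Marquardt iteration that simply clips each iterate back into the box [lb, ub]. The accept/reject logic and damping-parameter update of a real LM implementation are omitted for brevity.

```python
import numpy as np

def lm_step(residual, jac, p, lam):
    """One Levenberg-Marquardt step: solve (J'J + lam*I) dp = -J'r."""
    r = residual(p)
    J = jac(p)
    A = J.T @ J + lam * np.eye(len(p))
    return p + np.linalg.solve(A, -J.T @ r)

def projected_lm(residual, jac, p0, lb, ub, lam=1e-3, iters=50):
    """LM with a naive projection of each iterate onto the box [lb, ub].

    Illustrative only: a production solver would also adapt lam and
    test whether each step actually reduces the sum of squares.
    """
    p = np.clip(np.asarray(p0, dtype=float), lb, ub)
    for _ in range(iters):
        p_new = np.clip(lm_step(residual, jac, p, lam), lb, ub)
        converged = np.linalg.norm(p_new - p) < 1e-10
        p = p_new
        if converged:
            break
    return p

# Tiny linear test problem: r(p) = [p0 + p1 - 3, p0 - p1 - 1].
# Unconstrained minimizer is (2, 1); with the bound p1 <= 0.5 the
# constrained minimizer is (2, 0.5), and the projection finds it here.
residual = lambda p: np.array([p[0] + p[1] - 3.0, p[0] - p[1] - 1.0])
jac = lambda p: np.array([[1.0, 1.0], [1.0, -1.0]])

p = projected_lm(residual, jac, [0.0, 0.0],
                 lb=np.array([-np.inf, -np.inf]),
                 ub=np.array([np.inf, 0.5]))
print(p)  # approximately [2.0, 0.5]
```

Note that projecting the iterate is generally a heuristic; on this particular linear problem it happens to land on the true constrained minimizer, but trust-region methods such as the one in "lsqnonlin" handle the bounds in a more principled way.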

3 Comments

Hello Matt,
Thank you for the response!
This is consistent with what I have found on the Internet: https://www.cs.ubc.ca/grads/resources/thesis/Nov08/Shan_Shidong.pdf
As I am too far from being an expert on optimization, I accept your answer.
Thanks again!
May I also ask you another question: is there any "stable" (meaning "scientifically accepted") term for the "trust-region-reflective" method used by "lsqnonlin"? The LM algorithm is widely known and very often included in relevant books, but the combination of "trust-region" and "reflective" does not seem to have been described in classical textbooks. The references in the Help only cite papers by Coleman and Li.
Thank you,
Igor.
Trust region methods are a very large family, but I don't know of a widely accepted term for Matlab's specific implementation. Maybe you should just cite Coleman and Li.


More Answers (1)

In MATLAB R2020b, the documentation for lsqcurvefit() no longer states that "The Levenberg-Marquardt algorithm does not handle bound constraints". So the Levenberg-Marquardt algorithm now supports bound constraints, as can be seen in the release notes and documentation.

