Newton Raphson Optimization by Symbolic Math

Minimizes a target function. Derivatives are computed automatically by the software.


For a quick start, copy the files and run 'Newton_Raphson_Symbolic_Math_Example.m'
The Newton-Raphson optimization method attempts to minimize a target function by zeroing its gradient. The method is highly efficient, especially for convex or nearly convex functions, but it requires explicit expressions for the gradient vector and the Hessian matrix, and deriving these by hand can be tedious. This function simplifies the Newton-Raphson algorithm by computing the derivatives automatically using symbolic math.
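The iteration described above is the standard Newton step applied to the zeroed gradient: from the current point x_k, the next point is

    x_{k+1} = x_k - H(x_k)^{-1} * g(x_k)

where g is the gradient vector and H the Hessian matrix of the target function, and the iteration stops once the gradient is sufficiently close to zero.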
To use the function, one simply defines the target as a symbolic expression. The software computes the derivatives automatically and then executes the Newton-Raphson algorithm to find a minimum point.
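The submission's own entry point is 'Newton_Raphson_Symbolic_Math_Example.m'; the sketch below does not call the submission's function, but illustrates the same idea with only standard Symbolic Math Toolbox calls (the target function and starting point here are made-up examples):

```matlab
% Requires the Symbolic Math Toolbox.
syms x y
f = (x - 2)^2 + 3*(y + 1)^2;   % example symbolic target function
g = gradient(f, [x y]);        % gradient vector, computed symbolically
H = hessian(f, [x y]);         % Hessian matrix, computed symbolically

% Hand-rolled Newton-Raphson iterations on the zeroed gradient:
p = [0; 0];                    % starting point
for k = 1:10
    gk = double(subs(g, [x y], p.'));  % evaluate gradient at p
    Hk = double(subs(H, [x y], p.'));  % evaluate Hessian at p
    p  = p - Hk \ gk;                  % Newton step
end
% For this quadratic target, p reaches the minimizer [2; -1]
% after a single step, since the gradient is linear in (x, y).
```

In practice a convergence test on norm(gk) would replace the fixed iteration count; the point of the submission is that the gradient and Hessian above never have to be derived by hand.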

Cite As

yoash levron (2026). Newton Raphson Optimization by Symbolic Math (https://uk.mathworks.com/matlabcentral/fileexchange/53422-newton-raphson-optimization-by-symbolic-math), MATLAB Central File Exchange. Retrieved .

General Information

MATLAB Release Compatibility

  • Compatible with any release

Platform Compatibility

  • Windows
  • macOS
  • Linux
Version History

  • 1.0.0.0