Numerical optimization is the mathematical study of minimizing or maximizing an objective function subject to constraints on its variables (Nocedal and Wright, 2006).
Because AI models must generalize, understanding numerical optimization techniques is important for finding good optimal solutions to a given AI problem.
This repository covers a range of optimization techniques: univariate and multivariate problems, unconstrained and constrained optimization, and some stochastic global optimization techniques.
Univariate Optimization (code)
- Root-Finding Techniques (Bisection method, Newton's method, Regula falsi method, Secant method)
- Comparison-Based Search Techniques (Fibonacci search method, Golden section search method)
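As a preview of the univariate techniques listed above, here is a minimal Python sketch of one method from each group: bisection for root finding, and golden-section search for minimizing a unimodal function. These are illustrative implementations, not necessarily the repository's code; the function names, test functions, and tolerances are my own choices.

```python
import math

def bisection(f, a, b, tol=1e-10):
    """Find a root of f in [a, b], assuming f(a) and f(b) differ in sign."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:        # sign change in [a, m]: keep left half
            b = m
        else:                   # otherwise the root lies in [m, b]
            a, fa = m, fm
    return (a + b) / 2

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2   # 1/phi, about 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:             # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                   # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

root = bisection(lambda x: x**2 - 2, 0.0, 2.0)           # sqrt(2)
xmin = golden_section(lambda x: (x - 1.5)**2, 0.0, 4.0)  # 1.5
```

Both methods need only function evaluations (no derivatives), which is why they pair naturally: bisection compares signs, golden-section search compares values.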
Multivariate Optimization (code)
- Derivative-Free Methods for Non-Smooth Functions (Nelder-Mead method, Powell's method)
- Gradient-Based Methods (Steepest descent, Newton's method, quasi-Newton methods: SR1, BFGS)
- Conjugate Gradient Methods (linear; nonlinear: CG-FR, CG-PR, CG-HS)
- Least Squares Methods
  - Gauss-Newton method
  - Levenberg-Marquardt method
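To illustrate the derivative-free category, here is a simplified sketch of the Nelder-Mead simplex method with the standard reflection, expansion, contraction, and shrink steps (coefficients 1, 2, 1/2, 1/2). It uses only inside contraction, omitting the outside-contraction case of the full algorithm, and is not the repository's implementation.

```python
def nelder_mead(f, x0, step=0.5, tol=1e-10, max_iter=500):
    """Minimize f by a simplified Nelder-Mead simplex method."""
    n = len(x0)
    # initial simplex: x0 plus a perturbation along each coordinate axis
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += step
        simplex.append(v)
    fvals = [f(v) for v in simplex]
    for _ in range(max_iter):
        order = sorted(range(n + 1), key=lambda i: fvals[i])  # best to worst
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:
            break
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        # reflect the worst vertex through the centroid of the rest
        xr = [c + (c - w) for c, w in zip(centroid, simplex[-1])]
        fr = f(xr)
        if fr < fvals[0]:
            # best so far: try expanding further in the same direction
            xe = [c + 2 * (c - w) for c, w in zip(centroid, simplex[-1])]
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
        else:
            # contract the worst vertex toward the centroid
            xc = [c + 0.5 * (w - c) for c, w in zip(centroid, simplex[-1])]
            fc = f(xc)
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:
                # shrink every vertex toward the best one
                best = simplex[0]
                simplex = [best] + [[(b + v) / 2 for b, v in zip(best, vtx)]
                                    for vtx in simplex[1:]]
                fvals = [fvals[0]] + [f(v) for v in simplex[1:]]
    return simplex[0]

# assumed test problem: minimum at (1, 2)
best = nelder_mead(lambda v: (v[0] - 1) ** 2 + (v[1] - 2) ** 2, [0.0, 0.0])
```

Because it never evaluates a gradient, the same routine also works on non-smooth objectives, at the cost of slower convergence than the gradient-based methods below.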
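For the gradient-based family, here is a sketch of steepest descent with a backtracking (Armijo) line search. The test function and the constants c = 1e-4 and rho = 0.5 are my own illustrative choices, not values taken from the repository.

```python
import math

def steepest_descent(f, grad, x0, c=1e-4, rho=0.5, tol=1e-8, max_iter=5000):
    """Minimize f by steepest descent with Armijo backtracking."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        gnorm2 = sum(gi * gi for gi in g)     # ||grad||^2
        if math.sqrt(gnorm2) < tol:
            break
        a, fx = 1.0, f(x)
        # shrink the step until the sufficient-decrease condition holds:
        # f(x - a*g) <= f(x) - c * a * ||g||^2
        while f([xi - a * gi for xi, gi in zip(x, g)]) > fx - c * a * gnorm2:
            a *= rho
        x = [xi - a * gi for xi, gi in zip(x, g)]
    return x

# assumed test problem: quadratic with minimum at (1, -2)
xmin = steepest_descent(
    lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2,
    lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)],
    [0.0, 0.0],
)
```

Newton and quasi-Newton methods (SR1, BFGS) follow the same loop but replace the direction -g with -H⁻¹g for an exact or approximated Hessian H.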
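The linear conjugate gradient method solves Ax = b for a symmetric positive definite matrix A, converging in at most n steps in exact arithmetic. Below is a NumPy sketch; the 2x2 system is an assumed example, not data from the repository.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve A x = b for symmetric positive definite A by linear CG."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x            # residual
    p = r.copy()             # first search direction
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)         # exact minimizer along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol ** 2:
            break
        p = r + (rs_new / rs) * p     # keep directions A-conjugate
        rs = rs_new
    return x

# assumed 2x2 SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

The nonlinear variants (CG-FR, CG-PR, CG-HS) keep this structure but replace the exact alpha with a line search and differ only in the formula for the direction-update coefficient.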
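For the least-squares family, here is a Gauss-Newton sketch that fits y = a·exp(b·x) to noise-free synthetic data by solving the normal equations JᵀJ Δ = -Jᵀr at each step. The data, starting point, and tolerances are assumed for illustration.

```python
import numpy as np

def gauss_newton(residual, jacobian, beta0, tol=1e-10, max_iter=50):
    """Minimize ||residual(beta)||^2 by the Gauss-Newton method."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = residual(beta)
        J = jacobian(beta)
        # Gauss-Newton step: solve the normal equations for the update
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        beta += delta
        if np.linalg.norm(delta) < tol:
            break
    return beta

# synthetic noise-free data from a = 2, b = 0.5 (assumed example)
xdata = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
ydata = 2.0 * np.exp(0.5 * xdata)

def residual(beta):
    a, b = beta
    return a * np.exp(b * xdata) - ydata

def jacobian(beta):
    a, b = beta
    e = np.exp(b * xdata)
    return np.column_stack([e, a * xdata * e])  # d r / d a, d r / d b

beta = gauss_newton(residual, jacobian, [1.5, 0.4])  # start near the solution
```

Levenberg-Marquardt damps the same step, solving (JᵀJ + λI) Δ = -Jᵀr with an adaptively chosen λ, which makes it far more robust when the starting point is poor.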
Constrained Optimization
- I will update this section later
Global Optimization
- I will add Genetic Algorithms to this repository later
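As a preview of the planned Genetic Algorithms section, here is a minimal real-coded GA sketch using elitist truncation selection, uniform crossover, and Gaussian mutation. All hyperparameters, the box bounds, and the test objective are illustrative assumptions, not the repository's eventual implementation.

```python
import random

def genetic_algorithm(fitness, bounds, pop_size=50, generations=200,
                      mutation_rate=0.1, mutation_scale=0.1, seed=0):
    """Minimize `fitness` over a box via a simple real-coded GA
    (truncation selection, uniform crossover, Gaussian mutation)."""
    rng = random.Random(seed)
    # random initial population inside the box
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness)
        elite = scored[: pop_size // 5]          # keep the best 20% unchanged
        children = list(elite)
        while len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            # uniform crossover: each gene comes from either parent
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, mutation_scale * (hi - lo))
                    child[i] = min(max(child[i], lo), hi)  # clip to bounds
            children.append(child)
        pop = children
    return min(pop, key=fitness)

# assumed example: minimize the 2-D sphere function on [-5, 5]^2
best = genetic_algorithm(lambda v: sum(x * x for x in v), [(-5, 5)] * 2)
```

Unlike the local methods above, this population-based search needs no gradients and can escape poor local minima, at the price of many more function evaluations.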