
OptiMaizer
A modular Python optimization framework implementing classical and modern algorithms for unconstrained and constrained optimization problems. Developed for Math 562/IOE 511: Continuous Optimization Methods at the University of Michigan.

I designed and implemented a complete optimization library featuring:

- Descent methods: Gradient Descent, Newton's method, and the quasi-Newton methods BFGS, L-BFGS, and DFP
- Trust-region algorithms: Newton-CG and SR1-CG
- Constrained optimization via quadratic penalty methods
- Machine learning objectives: linear least squares and logistic regression
- Line search strategies: backtracking and the Wolfe conditions
- Numerical stability safeguards, including modified Newton with Cholesky factorization
- Comprehensive performance benchmarking tools
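To illustrate the backtracking strategy, here is a minimal Armijo backtracking line search. This is a sketch, not the framework's actual API: the function name and parameter defaults (`alpha=1.0`, `rho=0.5`, `c=1e-4`) are illustrative.

```python
import numpy as np

def backtracking(f, grad, x, p, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step size until the Armijo sufficient-decrease condition
    f(x + alpha*p) <= f(x) + c*alpha*grad(x)^T p holds."""
    fx, gx = f(x), grad(x)
    while f(x + alpha * p) > fx + c * alpha * (gx @ p):
        alpha *= rho
    return alpha

# Example: minimize f(x) = x^T x along the steepest-descent direction.
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([3.0, -4.0])
p = -grad(x)
alpha = backtracking(f, grad, x, p)
# The accepted step strictly decreases the objective.
assert f(x + alpha * p) < f(x)
```

Starting from a unit step and halving keeps the search cheap; pairing this with a curvature test yields the Wolfe conditions mentioned above.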
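The quasi-Newton methods maintain a running approximation to the inverse Hessian rather than forming second derivatives. A sketch of the standard BFGS inverse-Hessian update (the helper name `bfgs_update` is illustrative, not the library's API):

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update of the inverse Hessian approximation H, given the step
    s = x_{k+1} - x_k and the gradient change y = g_{k+1} - g_k."""
    rho = 1.0 / (y @ s)           # requires the curvature condition y^T s > 0
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# For a quadratic f(x) = 0.5 x^T A x, the gradient change along s is A s.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
H = np.eye(2)
s = np.array([1.0, 0.5])
y = A @ s
H = bfgs_update(H, s, y)
# The updated H satisfies the secant equation H y = s exactly.
assert np.allclose(H @ y, s)
```

L-BFGS applies the same update implicitly from a short history of (s, y) pairs instead of storing the dense matrix, which is what makes it practical at scale.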
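One common way to realize modified Newton with Cholesky factorization is to add a growing multiple of the identity to the Hessian until the factorization succeeds, guaranteeing a positive definite model. The sketch below assumes that scheme; the function name and the initial shift `beta` are illustrative.

```python
import numpy as np

def modified_cholesky(H, beta=1e-3, max_tries=50):
    """Return a Cholesky factor L and shift tau such that
    H + tau*I = L L^T is positive definite."""
    # Start with no shift if the diagonal already looks positive definite.
    tau = 0.0 if np.all(np.diag(H) > 0) else beta
    for _ in range(max_tries):
        try:
            L = np.linalg.cholesky(H + tau * np.eye(len(H)))
            return L, tau
        except np.linalg.LinAlgError:
            tau = max(2 * tau, beta)  # double the shift and retry
    raise RuntimeError("could not make the Hessian positive definite")

# An indefinite Hessian gets shifted until it is positive definite.
H = np.array([[1.0, 0.0], [0.0, -1.0]])
L, tau = modified_cholesky(H)
```

The returned factor can then be used to solve (H + tau*I) p = -g by two triangular solves, giving a guaranteed descent direction even where the true Hessian is indefinite.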
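The quadratic penalty approach replaces an equality-constrained problem with a sequence of unconstrained subproblems whose penalty weight grows. A minimal sketch, using plain gradient descent on each subproblem; the test problem, schedule, and step sizes are illustrative, not the framework's defaults.

```python
import numpy as np

# Minimize f(x) = x1 + x2 subject to c(x) = x1^2 + x2^2 - 1 = 0.
# Penalized objective: Q(x; mu) = f(x) + (mu/2) * c(x)^2.
f_grad = lambda x: np.array([1.0, 1.0])
c = lambda x: x @ x - 1.0
c_grad = lambda x: 2.0 * x

x = np.zeros(2)
for mu in (1.0, 10.0, 100.0, 1000.0):
    lr = 0.1 / mu  # stiffer penalty -> smaller gradient-descent step
    for _ in range(2000):
        g = f_grad(x) + mu * c(x) * c_grad(x)  # gradient of Q(x; mu)
        x -= lr * g
# As mu grows, x approaches the constrained minimizer (-1/sqrt(2), -1/sqrt(2)).
```

Warm-starting each subproblem from the previous solution is what keeps the increasingly ill-conditioned inner minimizations tractable.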