Optimization in engineering
- I created these handouts as personal notes and as teaching material for related courses. They are still under revision.
- These topics will be taught in EE508, EE509, EE510, and EE511 (the optimization course series) starting in semester 2 of academic year 2022. EE732 (Convex Optimization) will soon be discontinued and its material will be taught as part of EE511.
Optimization courses at CUEE
- EE508 (1 credit) Optimization concepts and applications (mandatory before taking any of the following courses)
- EE509 (2 credits) Introduction to optimization techniques
- EE510 (1 credit) Linear programming
- EE511 (2 credits) Optimization methods for engineering and machine learning
- EE512 (2 credits) Heuristic optimization (taught by Teerapol)
It is recommended that students have a good background in linear algebra before taking any of the above classes. We recommend that students self-check by reviewing the Math review for students (undergrad level).
Taking EE500 Linear algebra for EE along with EE508 is complementary and would further strengthen students' background.
Lecture notes
The contents are summarized from the reference textbooks and partly from the class notes of Prof. Lieven Vandenberghe.
Check out the YouTube playlist: Optimization in engineering and machine learning
I also taught this course at NIDA in 2021; see the playlist here
- Overview of optimization concepts
- standard formulation
- overview of problem types and numerical methods
- this handout is used in EE508 (optimization concepts)
- Convex optimization
- convex formulation
- problem transformation
- LP, QP, QCQP, and some structured convex problems
- part of this handout is used in EE508 and EE511
- Unconstrained optimization
- Gradient descent, Newton, quasi-Newton, and conjugate gradient methods (a minimal sketch appears after this list)
- Accelerated gradient methods for convex problems
- Momentum-accelerated gradient descent
- Mini-batch optimization
- Gradient methods in ML
- Computation graph
- Automatic differentiation and backpropagation
- Mini-batch optimization
- Issues of gradient methods in ML
- Gradient descent via a change of coordinates
- Momentum-accelerated algorithms (Adam and others); see the sketch after this list
- Constrained optimization
- Lagrange multiplier theorem
- constraint elimination
- convex constraints
- Linear programming: formulation and algorithms (a small solver example appears after this list)
- Quadratic programming: formulation and algorithms
- Optimization problems in applications (more applications will be added to this list)
- portfolio optimization
- traffic network
- regression, logistic regression
- SVM, Neural network
- Regularization techniques
- l1 and l2 regularized regression (a proximal-gradient sketch appears after this list)
- generalized l1 regularization (variants of lasso)
- Duality theory: dual problem, KKT conditions, examples
- Proximal methods
- proximal algorithms
- ADMM (a lasso example appears after this list)
- augmented Lagrangian
- Block coordinate descent
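
For the unconstrained optimization topic above, here is a minimal NumPy sketch of gradient descent and Newton's method; the quadratic test objective, step size, and tolerances are illustrative choices and are not taken from the handouts.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method: solve hess(x) d = -grad(x), then step along d."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)
        x = x + d
    return x

# Illustrative example: minimize f(x) = (1/2) x^T A x - b^T x (strictly convex quadratic)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b      # gradient of the quadratic
hess = lambda x: A              # constant Hessian
print(gradient_descent(grad, np.zeros(2)))   # approaches the minimizer A^{-1} b
print(newton(grad, hess, np.zeros(2)))       # one Newton step suffices for a quadratic
```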
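For the momentum-accelerated algorithms item, here is a compact sketch of the standard Adam update applied to mini-batch gradients; the least-squares example, batch size, and hyperparameter values are placeholders rather than recommendations from the lecture notes.

```python
import numpy as np

def adam_minibatch(grad_batch, x0, n_samples, step=0.01, beta1=0.9, beta2=0.999,
                   eps=1e-8, batch_size=32, epochs=100, seed=0):
    """Mini-batch Adam: running first/second moment estimates with bias correction."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # first moment (running mean of gradients)
    v = np.zeros_like(x)   # second moment (running mean of squared gradients)
    t = 0
    for _ in range(epochs):
        idx = rng.permutation(n_samples)
        for start in range(0, n_samples, batch_size):
            batch = idx[start:start + batch_size]
            g = grad_batch(x, batch)
            t += 1
            m = beta1 * m + (1 - beta1) * g
            v = beta2 * v + (1 - beta2) * g**2
            m_hat = m / (1 - beta1**t)      # bias-corrected moment estimates
            v_hat = v / (1 - beta2**t)
            x = x - step * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Placeholder example: mini-batch gradients of a least-squares loss on synthetic data
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = A @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.01 * rng.standard_normal(200)
grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
print(adam_minibatch(grad_batch, np.zeros(5), n_samples=200))
```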
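For the linear programming item, here is a small example that solves minimize c^T x subject to Ax <= b, x >= 0 with SciPy's linprog; the problem data are made up for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative LP: minimize c^T x  subject to  A x <= b,  x >= 0
c = np.array([-1.0, -2.0])            # objective coefficients
A = np.array([[1.0, 1.0],
              [1.0, 3.0]])            # inequality constraint matrix
b = np.array([4.0, 6.0])              # right-hand sides

res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)], method="highs")
print(res.x, res.fun)                 # optimal point and optimal value
```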
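For the l1-regularized regression and proximal methods items, here is a minimal proximal-gradient (ISTA) sketch for the lasso problem minimize (1/2)||Ax - b||^2 + lambda*||x||_1; the synthetic data and regularization weight are placeholders.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def lasso_ista(A, b, lam, max_iter=1000):
    """Proximal gradient (ISTA) for (1/2)||Ax - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    t = 1.0 / L                        # step size
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)       # gradient of the least-squares term
        x = soft_threshold(x - t * grad, t * lam)
    return x

# Synthetic example: sparse ground truth recovered with l1 regularization
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20); x_true[[2, 7, 15]] = [1.5, -2.0, 0.8]
b = A @ x_true + 0.05 * rng.standard_normal(100)
print(lasso_ista(A, b, lam=1.0).round(2))
```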
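For the ADMM item, here is a sketch of ADMM applied to the same lasso problem by splitting the least-squares term and the l1 term; the penalty parameter rho and the fixed iteration count are illustrative choices. The matrix A^T A + rho*I is factored once and reused, which is the usual design choice when the x-update is a ridge-type solve.

```python
import numpy as np

def lasso_admm(A, b, lam, rho=1.0, max_iter=200):
    """ADMM for (1/2)||Ax - b||^2 + lam*||z||_1  subject to  x - z = 0."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u: scaled dual variable
    M = A.T @ A + rho * np.eye(n)       # reused in every x-update
    Atb = A.T @ b
    for _ in range(max_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))                       # x-update (ridge solve)
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)   # z-update (soft-threshold)
        u = u + x - z                                                     # dual update
    return z

# Same kind of synthetic sparse-recovery example as in the ISTA sketch above
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20); x_true[[2, 7, 15]] = [1.5, -2.0, 0.8]
b = A @ x_true + 0.05 * rng.standard_normal(100)
print(lasso_admm(A, b, lam=1.0).round(2))
```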
References: textbooks and class notes
- Nonlinear Optimization
- D.P. Bertsekas, Nonlinear Programming, 2nd edition, Athena Scientific, 1999
- J. Nocedal and S.J. Wright, Numerical Optimization, 2nd edition, Springer, 2006
- D.G. Luenberger and Y. Ye, Linear and Nonlinear Programming, 4th edition, Springer, 2008
- I. Griva, S.G. Nash, and A. Sofer, Linear and Nonlinear Optimization, 2nd edition, SIAM, 2009
- Convex Optimization
- S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004
- G. Calafiore and L. El Ghaoui, Optimization Models, Cambridge University Press, 2014
- D.P. Bertsekas, Convex Optimization Algorithms, Athena Scientific, 2015
- D.P. Bertsekas, Convex Optimization Theory, Athena Scientific, 2009
- Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, 2004
- D. Bertsimas and J.N. Tsitsiklis, Introduction to Linear Optimization, Athena Scientific, 1997
- Optimization in Machine Learning
- J. Watt, R. Borhani, and A.K. Katsaggelos, Machine Learning Refined: Foundations, Algorithms, and Applications, 2nd edition, Cambridge University Press, 2020
- C.C. Aggarwal, Linear Algebra and Optimization for Machine Learning: A Textbook, Springer, 2020
- D. Bertsimas and J. Dunn, Machine Learning under a Modern Optimization Lens, Dynamic Ideas LLC, 2019
- S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Foundations and Trends in Machine Learning, 2011
- N. Parikh and S. Boyd, Proximal Algorithms, Foundations and Trends in Optimization, 2013
- Linear algebra with applications
- S. Boyd and L. Vandenberghe, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares, Cambridge University Press, 2018
- G. Strang, Linear Algebra and Learning from Data, Wellesley-Cambridge Press, 2019
- M.P. Deisenroth, A.A. Faisal, and C.S. Ong, Mathematics for Machine Learning, Cambridge University Press, 2020
- Statistical learning and ML
- T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edition, Springer, 2009
- T. Hastie, R. Tibshirani, and M. Wainwright, Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press, 2015
- I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning, The MIT Press, 2016
- Class notes
- Lieven Vandenberghe: EE236A, EE236B, EE236C
- Stephen Boyd: EE364a, EE364b