Optimization in engineering

  • I created these handouts as personal notes and for teaching related courses; they are still under revision.
  • These topics will be taught in EE508, EE509, EE510, and EE511 (the optimization course series) starting in semester 2 of academic year 2022. EE732 (convex optimization) will soon be discontinued and its content taught as part of EE511.

Optimization courses at CUEE

  • EE508 (1 credit) Optimization concepts and applications (mandatory before taking any of the following courses)
  • EE509 (2 credits) Introduction to optimization techniques
  • EE510 (1 credit) Linear programming
  • EE511 (2 credits) Optimization methods for engineering and machine learning
  • EE512 (2 credits) Heuristic optimization (taught by Teerapol)

It is recommended that students have a good background in linear algebra before taking any of the above classes. We recommend that students self-check by reviewing the Math review for students (undergrad level).

Taking EE 500 Linear algebra for EE along with EE508 is complementary and would strengthen your background.

Lecture videos

Students can watch lecture videos from my YouTube playlists; see the links below.

Lecture notes

The contents are summarized from the reference textbooks and partly from class notes of Prof. Lieven Vandenberghe.

Check out the YouTube playlist: Optimization in engineering and machine learning

I also taught this course at NIDA in 2021; see the playlist here

  1. Overview of optimization concepts
    • standard formulation
    • overview of problem types and numerical methods
    • this handout is used in EE508 (optimization concepts)
  2. Convex optimization
    • convex formulation
    • problem transformation
    • LP, QP, QCQP, and some structured convex problems
    • part of this handout is used in EE508 and EE511
  3. Unconstrained optimization
    • Gradient descent, Newton, quasi-Newton, and conjugate gradient methods
    • Accelerated gradient methods for convex problems
    • Momentum-accelerated gradient descent (a minimal sketch appears after this topic list)
    • Mini-batch optimization
  4. Gradient methods in ML
    • Computation graph
    • Automatic differentiation and backpropagation
    • Mini-batch optimization
    • Issues of gradient methods in ML
    • Gradient descent via a change of coordinates
    • Momentum-accelerated algorithms (ADAM and others)
  5. Constrained optimization
    • Lagrange multiplier theorem
    • constraint elimination
    • convex constraints
  6. Linear programming: formulation and algorithms
  7. Quadratic programming: formulation and algorithms
  8. Optimization problems in applications (more will be added)
    • portfolio optimization
    • traffic network
    • regression, logistic regression
    • SVM, Neural network
  9. Regularization techniques
    • l1 and l2 regularized regression
    • generalized l1 regularization (variants of lasso)
  10. Duality theory: dual problem, KKT conditions, examples
  11. Proximal methods
    • proximal algorithms (see the lasso sketch after this list)
    • ADMM
    • augmented Lagrangian
  12. Block coordinate descent (see the sketch after this list)
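
A minimal sketch of gradient descent with heavy-ball momentum (topics 3 and 4) on a toy quadratic; the quadratic, the step size, and the momentum coefficient below are illustrative assumptions, not values prescribed in the handouts:

```python
import numpy as np

def gradient_descent_momentum(grad, x0, step=0.1, beta=0.9, iters=200):
    """Heavy-ball update: x_{k+1} = x_k - step*grad(x_k) + beta*(x_k - x_{k-1})."""
    x = x_prev = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_next = x - step * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Toy problem: minimize f(x) = 0.5*x'Ax - b'x, whose gradient is Ax - b.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
print(gradient_descent_momentum(lambda x: A @ x - b, x0=np.zeros(2)))
# approaches the minimizer A^{-1} b = [1/3, 1]
```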
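
Topics 9 and 11 meet in the lasso problem, minimize 0.5*||Ax - b||^2 + lam*||x||_1, which the proximal gradient method (ISTA) handles by alternating a gradient step on the smooth term with soft-thresholding, the proximal operator of the l1 term. A minimal sketch, with synthetic data and a value of lam chosen only for the demo:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    """Proximal gradient (ISTA) for 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Demo on synthetic data with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(ista(A, b, lam=0.5))  # most entries are driven exactly to zero
```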
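
Topic 12, block coordinate descent, minimizes over one block of variables at a time while the other blocks are held fixed. A two-variable sketch on an assumed coupled quadratic, using exact minimization for each block:

```python
def block_coordinate_descent(rho=1.0, iters=50):
    """Alternating exact minimization of
    f(x, y) = 0.5*(x - 1)**2 + 0.5*(y - 2)**2 + 0.5*rho*(x - y)**2
    over x and y, one variable (block) at a time."""
    x, y = 0.0, 0.0
    for _ in range(iters):
        x = (1.0 + rho * y) / (1.0 + rho)   # argmin over x with y fixed
        y = (2.0 + rho * x) / (1.0 + rho)   # argmin over y with x fixed
    return x, y

print(block_coordinate_descent())  # converges to the joint minimizer (4/3, 5/3)
```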

References: textbooks, class notes

Nonlinear Optimization

  1. D.P. Bertsekas, Nonlinear Programming, 2nd edition, Athena Scientific, 1999
  2. J. Nocedal and S.J. Wright, Numerical Optimization, 2nd edition, Springer, 2006
  3. D.G. Luenberger and Y. Ye, Linear and Nonlinear Programming, 4th edition, Springer, 2008
  4. I. Griva, S.G. Nash, and A. Sofer, Linear and Nonlinear Optimization, 2nd edition, SIAM, 2009
Convex Optimization
  1. S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004
  2. G. Calafiore and L. El Ghaoui, Optimization Models, Cambridge University Press, 2014
  3. D.P. Bertsekas, Convex Optimization Algorithms, Athena Scientific, 2015
  4. D.P. Bertsekas, Convex Optimization Theory, Athena Scientific, 2009
  5. Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, 2004
  6. D. Bertsimas and J.N. Tsitsiklis, Introduction to Linear Optimization, Athena Scientific, 1997
Optimization in Machine Learning
  1. J. Watt, R. Borhani, and A.K. Katsaggelos, Machine Learning Refined: Foundations, Algorithms, and Applications, 2nd edition, Cambridge University Press, 2020
  2. C.C. Aggarwal, Linear Algebra and Optimization for Machine Learning: A Textbook, Springer, 2020
  3. D. Bertsimas and J. Dunn, Machine Learning under a Modern Optimization Lens, Dynamic Ideas LLC, 2019
  4. S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Foundations and Trends in Machine Learning, 2011
  5. N. Parikh and S. Boyd, Proximal Algorithms, Foundations and Trends in Optimization, 2013
Linear algebra with applications
  1. S. Boyd and L. Vandenberghe, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares, Cambridge University Press, 2018
  2. G. Strang, Linear Algebra and Learning from Data, Wellesley-Cambridge Press, 2019
  3. M.P. Deisenroth, A.A. Faisal, and C.S. Ong, Mathematics for Machine Learning, Cambridge University Press, 2020
Statistical learning and ML
  1. T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edition, Springer, 2009
  2. T. Hastie, R. Tibshirani, and M. Wainwright, Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press, 2015
  3. I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning, The MIT Press, 2016
Class notes
  1. Lieven Vandenberghe: EE236A, EE236B, EE236C
  2. Stephen Boyd: EE364a, EE364b