ECE 273

Convex Optimization and Applications

Instructor: Jun-Kun Wang

Quarter: Spring 2025

This course covers the theoretical and algorithmic foundations of optimization. Topics include convex analysis and duality theory, along with classical optimization algorithms such as Gradient Descent, Coordinate Descent, the Frank-Wolfe Method, Accelerated Methods, Mirror Descent, Stochastic Gradient Descent, and Online Gradient Descent. The class places particular emphasis on understanding the behavior and convergence rate guarantees of these algorithms, as well as the tools and techniques used to analyze them. Students will learn the basic foundations of deterministic convex optimization, stochastic optimization, online convex optimization, min-max optimization, and non-convex optimization.
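As a small illustration of the first algorithm on the list, here is a minimal sketch of fixed-step gradient descent applied to a simple convex quadratic; the function, step size, and iteration count are illustrative choices, not course material:

```python
# Illustrative sketch: fixed-step gradient descent on the convex
# quadratic f(x) = (x - 3)^2, whose unique minimizer is x* = 3.
def grad_descent(grad, x0, step=0.1, iters=100):
    """Run gradient descent with a constant step size; return the final iterate."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)  # move opposite the gradient direction
    return x

# The gradient of f(x) = (x - 3)^2 is f'(x) = 2(x - 3).
x_star = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this strongly convex example the iterates contract toward the minimizer geometrically, which is the kind of convergence rate guarantee the course analyzes in general.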

As the graduate teaching assistant, I was responsible for grading homework assignments, holding office hours to help students better understand the material, and answering questions on Piazza.