by Li He
In this seminar, we will take a detailed look at semi-definite programming (SDP) and optimization on Stiefel and Grassmann manifolds. The audience should be familiar with some basic ideas in optimization, such as gradient descent and conjugate gradient (CG) methods, as well as concepts frequently employed in this area (e.g. feasible set, convexity, and the primal and dual formulations of an optimization problem).
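As a reminder of the prerequisite ideas, here is a minimal sketch of plain gradient descent on a strictly convex quadratic; the matrix, step size, and iteration count are illustrative choices, not part of the talk material.

```python
import numpy as np

# Minimize f(x) = (1/2) x^T A x - b^T x for a symmetric positive
# definite A; the unique minimizer satisfies A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.1                     # fixed step size (assumed small enough)
for _ in range(200):
    grad = A @ x - b           # gradient of the quadratic objective
    x = x - step * grad        # descent step

# At convergence the gradient vanishes, i.e. A x = b.
print(np.allclose(A @ x, b, atol=1e-6))
```

A CG method would reach the exact minimizer of this two-dimensional problem in at most two iterations, which is one reason it is preferred for quadratic objectives.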
In this talk I will cover the following topics:
- Basic ideas behind interior point methods.
- Interior point methods for SDP.
- Common examples of SDP.
- Applications of SDP in machine learning.
- Basic concepts of Stiefel and Grassmann manifolds.
- How to design gradient-based optimization techniques on Stiefel and Grassmann manifolds.
- Applications of these algorithms in machine learning.
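To give a flavor of the manifold-optimization topic, the following is a minimal sketch of gradient ascent on the Stiefel manifold St(n, p) = {X : XᵀX = I}, using tangent-space projection and a QR retraction. The objective tr(XᵀAX), the step size, and the iteration count are illustrative assumptions, not the algorithms the talk will develop in full.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                       # symmetric matrix; maximizing
                                        # tr(X^T A X) finds its top-p eigenspace

X, _ = np.linalg.qr(rng.standard_normal((n, p)))   # random feasible start
step = 0.1
for _ in range(500):
    G = 2 * A @ X                       # Euclidean gradient of tr(X^T A X)
    sym = (X.T @ G + G.T @ X) / 2
    xi = G - X @ sym                    # projection onto the tangent space at X
    Q, R = np.linalg.qr(X + step * xi)  # ascent step + QR retraction
    X = Q * np.sign(np.diag(R))         # fix the sign ambiguity of QR

# The iterate stays feasible: X^T X = I throughout.
print(np.allclose(X.T @ X, np.eye(p), atol=1e-8))
```

The projection and retraction used here follow the standard recipe from the Edelman-Arias-Smith framework listed in the references; the talk will discuss alternatives such as geodesic updates.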
References
- Convex Optimization, by Stephen Boyd and Lieven Vandenberghe.
- Numerical Optimization (2nd ed.), by Jorge Nocedal and Stephen J. Wright.
- Semidefinite Programming, by Lieven Vandenberghe and Stephen Boyd, SIAM Review, 1996.
- The geometry of algorithms with orthogonality constraints, by Alan Edelman, Tomás A. Arias and Steven T. Smith, SIAM Journal on Matrix Analysis and Applications, 1998.