Wednesday, September 16, 2009

Semi-definite Programming and Optimization on Stiefel-Grassmann Manifolds

by Li He

In this seminar, we will take a detailed look at semi-definite programming (SDP) and optimization on Stiefel-Grassmann manifolds. The audience should be familiar with some basic ideas in optimization, such as gradient descent and conjugate gradient (CG) methods, as well as concepts frequently employed in this area (e.g. feasible sets, convexity, and the primal and dual formulations of an optimization problem).
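To give a concrete taste of the interior-point machinery before the talk, here is a minimal sketch (my own illustration, not the speaker's material; all names and parameters are hypothetical) of a log-barrier method on a toy SDP: maximize t subject to A - tI ⪰ 0, whose optimum is the smallest eigenvalue of A. The constraint is replaced by the barrier term μ·logdet(A - tI), the resulting smooth problem is solved by damped Newton steps, and μ is driven toward zero.

```python
import numpy as np

def is_positive_definite(M):
    """Feasibility check via Cholesky (succeeds iff M is positive definite)."""
    try:
        np.linalg.cholesky(M)
        return True
    except np.linalg.LinAlgError:
        return False

def smallest_eig_via_barrier(A, mu=1.0, shrink=0.2, tol=1e-6):
    """Toy SDP:  max t  s.t.  A - t*I is PSD  (optimum = lambda_min(A)).
    Maximize the barrier objective  t + mu * logdet(A - t*I)  by Newton's
    method, then shrink mu toward zero (illustrative, not production code)."""
    n = A.shape[0]
    I = np.eye(n)
    t = -np.linalg.norm(A) - 1.0          # crude bound: A - t*I is PD here
    while n * mu > tol:                   # n*mu bounds the duality gap
        for _ in range(50):               # centering steps at the current mu
            S_inv = np.linalg.inv(A - t * I)
            grad = 1.0 - mu * np.trace(S_inv)       # d/dt of the barrier objective
            hess = -mu * np.trace(S_inv @ S_inv)    # second derivative (< 0)
            step = grad / hess
            while not is_positive_definite(A - (t - step) * I):
                step *= 0.5               # damp the step to stay strictly feasible
            t -= step
            if abs(grad) < 1e-10:         # centered enough for this mu
                break
        mu *= shrink
    return t

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 1.0]])
t_star = smallest_eig_via_barrier(A)      # close to np.linalg.eigvalsh(A)[0]
```

The same pattern — barrier function, Newton centering, shrinking barrier parameter — is what scales up, with more bookkeeping, to general SDPs.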

In this talk I will cover the following topics:
  • Basic ideas of interior point methods.
  • Interior point methods for SDP.
  • Common examples of SDP.
  • Applications of SDP in machine learning.
  • Basic concepts of Stiefel-Grassmann manifolds.
  • How to design gradient-based optimization techniques on Stiefel-Grassmann manifolds.
  • Applications of these algorithms in machine learning.
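For the manifold half of the list, the following sketch shows what a gradient-based method on the Stiefel manifold St(n, p) = {X : XᵀX = I} can look like (my own illustration with hypothetical names, not the talk's algorithm): maximize tr(XᵀAX) for symmetric A, whose maximizers span the top-p eigenspace. Each step projects the Euclidean gradient onto the tangent space at X and retracts back to the manifold via a thin QR factorization.

```python
import numpy as np

def project_to_tangent(X, G):
    """Project a Euclidean gradient G onto the tangent space of St(n,p) at X
    (embedded metric): G - X * sym(X^T G)."""
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2)

def qr_retract(X):
    """Map a matrix back onto the manifold via thin QR; fix column signs so
    the retraction is well defined (R gets a nonnegative diagonal)."""
    Q, R = np.linalg.qr(X)
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0
    return Q * d

def stiefel_gradient_ascent(A, p, steps=500, lr=0.1, seed=0):
    """Maximize tr(X^T A X) over X with orthonormal columns."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X = qr_retract(rng.standard_normal((n, p)))    # random feasible start
    for _ in range(steps):
        G = 2 * A @ X                              # Euclidean gradient of tr(X^T A X)
        X = qr_retract(X + lr * project_to_tangent(X, G))
    return X

# Usage: recover the top-2 eigenspace of a small symmetric matrix.
A = np.diag([5.0, 3.0, 1.0, 0.5])
X = stiefel_gradient_ascent(A, p=2)
value = np.trace(X.T @ A @ X)                      # approaches 5 + 3 = 8
```

The key point the talk should make precise: the projection and retraction are what turn an ordinary gradient step into one that respects the orthonormality constraint, and the same template extends to CG-style methods and to the Grassmann quotient.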
