
[Academic Seminar] Exploiting Second Order Sparsity in Big Data Optimization - Prof. Toh Kim Chuan


Topic: Exploiting Second Order Sparsity in Big Data Optimization

Speaker: Prof. Toh Kim Chuan, National University of Singapore

Time: 11:00 am - 12:00 pm, Tuesday, April 30, 2019

Venue: Boardroom, Dao Yuan Building

 

Abstract:

In this talk, we shall demonstrate how second order sparsity (SOS) in important optimization problems such as sparse optimization models in machine learning, semidefinite programming, and many others can be exploited to design highly efficient algorithms. 
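As a concrete instance of the sparse optimization models mentioned above, the lasso problem can be stated as follows (a minimal formulation in our own notation, not taken from the abstract):

\[
\min_{x \in \mathbb{R}^p} \; \tfrac{1}{2}\,\|Ax - b\|_2^2 + \lambda \|x\|_1,
\]

where A is the n-by-p data matrix, b the vector of observations, and λ > 0 the regularization parameter; the ℓ1 term promotes sparse solutions x.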

The SOS property appears naturally when one applies a semismooth Newton (SSN) method to solve the subproblems in an augmented Lagrangian method (ALM) designed for certain classes of structured convex optimization problems. With in-depth analysis of the underlying generalized Jacobians and sophisticated numerical implementation, one can solve the subproblems at surprisingly low cost. For lasso problems with sparse solutions, the cost of solving a single ALM subproblem by our second order method is comparable to, or even lower than, that of a single iteration of many first order methods.
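To give a rough sense of where the second order sparsity shows up in the lasso case, here is a simplified sketch of the dual-ALM / semismooth Newton idea; the symbols σ, D, J, and r below are our own illustration, not the speaker's exact formulation. The proximal mapping of λ‖·‖₁ is the soft-thresholding operator, and its generalized Jacobian at a point u is a diagonal matrix D with D_ii = 1 if |u_i| > λ and 0 otherwise. A typical semismooth Newton system inside an ALM subproblem then takes the form

\[
\bigl(I_n + \sigma\, A D A^{\top}\bigr)\,\Delta y = r,
\qquad
A D A^{\top} = A_J A_J^{\top},
\quad
J = \{\, i : D_{ii} = 1 \,\},
\]

so only the columns of A indexed by the current support J enter the linear system. When the solution is sparse, |J| is much smaller than p and each Newton step is correspondingly cheap; this structural sparsity of the generalized Jacobian is what keeps the per-subproblem cost comparable to that of a first order iteration.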

Consequently, with the fast convergence of the SSN-based ALM, we are able to solve many challenging large-scale convex optimization problems in big data applications efficiently and robustly. For the purpose of illustration, we present a highly efficient software package called SuiteLasso for solving various well-known Lasso-type problems.

This talk is based on joint work with Xudong Li (Fudan U.) and Defeng Sun (PolyU).

 

Biography:

Dr. Toh is a Professor in the Department of Mathematics, National University of Singapore (NUS). He obtained his BSc degree in Mathematics from NUS in 1990 and his PhD degree in Applied Mathematics from Cornell University in 1996.

His current research focuses on designing efficient algorithms and software for convex programming and its applications, particularly large-scale optimization problems arising from data science/machine learning, and large-scale matrix optimization problems such as linear semidefinite programming (SDP) and convex quadratic semidefinite programming (QSDP).

He is currently an Area Editor for Mathematical Programming Computation and an Associate Editor for the SIAM Journal on Optimization, Mathematical Programming, and ACM Transactions on Mathematical Software. He has been an invited speaker at numerous conferences and workshops, including the SIAM Annual Meeting in 2010 and the International Symposium on Mathematical Programming in 2006. He received the Farkas Prize awarded by the INFORMS Optimization Society in 2017 and the triennial Beale-Orchard-Hays Prize awarded by the Mathematical Optimization Society in 2018. He was elected a Fellow of the Society for Industrial and Applied Mathematics in 2018.