
Academic Seminar | R-Local Minimum, Global Optimum, and Run-and-Inspect Method

  • 2018.06.19
  • Events
You are cordially invited to a seminar to be delivered by Prof. Wotao YIN of the University of California, Los Angeles (UCLA) from 11:00am to 12:00pm on Tuesday, June 19, 2018.


•  Topic: R-Local Minimum, Global Optimum, and Run-and-Inspect Method

•  Time & Date: 11:00am-12:00pm, Tuesday, June 19, 2018

•  Venue: Boardroom, Dao Yuan Building

•  Speaker: Prof. Wotao YIN, University of California, Los Angeles (UCLA)

Abstract: Many optimization algorithms provably converge to stationary points. When the underlying problem is nonconvex, those algorithms may get trapped at local minimizers or stagnate near saddle points. We show that, when the nonconvex objective function is the sum of a convex function and a perturbation, its "R-local minimizer" with a sufficiently large radius R is globally optimal. In other words, this class of problems is simpler than general nonconvex minimization. We propose the Run-and-Inspect Method, which adds an "inspection" step to existing algorithms to escape from local minimizers and stationary points. The "inspection" step either finds a sufficient descent or certifies that the current point is an "approximate R-local minimizer" with bounds relative to the global minimum. Simple inspection procedures are developed for nonconvex compressed sensing, clustering, and other problems. This is joint work with Yifan Chen (Tsinghua) and Yuejiao Sun (UCLA).
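The abstract describes the method only at a high level. The toy Python sketch below illustrates the run-then-inspect alternation it outlines; the choice of local solver (plain gradient descent), the random sampling of the radius-R ball, and all constants are illustrative assumptions, not the authors' actual inspection procedures.

import numpy as np

def run_and_inspect(f, grad, x0, R=1.0, step=0.02, n_samples=200,
                    descent_tol=1e-6, max_outer=50, seed=None):
    """Alternate a local 'run' phase with a sampling-based 'inspect' phase."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        # "Run": plain gradient descent until the gradient is (nearly) zero.
        for _ in range(1000):
            g = grad(x)
            if np.linalg.norm(g) < 1e-8:
                break
            x = x - step * g
        # "Inspect": sample the radius-R ball around x, looking for sufficient descent.
        fx = f(x)
        improved = False
        for _ in range(n_samples):
            d = rng.standard_normal(x.shape)
            y = x + R * rng.uniform() * d / np.linalg.norm(d)
            if f(y) < fx - descent_tol:
                x, improved = y, True   # descent found: resume the run phase from y
                break
        if not improved:
            # No lower point found among the samples: treat x as an
            # (empirical) approximate R-local minimizer and stop.
            return x
    return x

# Toy usage: a 1-D quadratic plus an oscillatory perturbation that creates local minima.
f = lambda x: float((x[0] - 2.0) ** 2 + 0.3 * np.sin(10 * x[0]))
grad = lambda x: np.array([2.0 * (x[0] - 2.0) + 3.0 * np.cos(10 * x[0])])
print(run_and_inspect(f, grad, [-3.0], R=0.5, seed=0))

In this toy setting, gradient descent started at x = -3 stalls at a spurious local minimizer created by the sine perturbation, and the inspection step's samples within radius R find a lower point from which the run phase continues toward the global minimizer near x = 2.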

Biography: Wotao Yin received his Ph.D. degree in operations research from Columbia University in 2006. He is currently a Professor in the Department of Mathematics, University of California, Los Angeles. His research interests include computational optimization and its applications in signal processing, machine learning, and other data science problems. From 2006 to 2013, he was at Rice University. He received the NSF CAREER Award in 2008, an Alfred P. Sloan Research Fellowship in 2009, and the Morningside Gold Medal in 2016, and has coauthored five papers that received best-paper-type awards. He co-invented fast algorithms for sparse optimization and has been working on optimization algorithms for nonconvex and large-scale problems.