【Academic Seminar】A stochastic semismooth Newton method for nonsmooth nonconvex optimization

  • 2018.08.23

Title: A stochastic semismooth Newton method for nonsmooth nonconvex optimization

Speaker: Andre Milzarek, Peking University

Time and Date: 14:30–15:30, Aug 28, 2018

Venue: Boardroom, Dao Yuan Building

Abstract:

In this talk, we present a globalized semismooth Newton method for solving stochastic optimization problems involving smooth nonconvex and nonsmooth convex terms in the objective function. The class of problems that can be solved within our algorithmic framework comprises a large variety of applications, such as l1-logistic regression, structured dictionary learning, and other minimization problems arising in machine learning, statistics, or image processing. We assume that only noisy gradient and Hessian information of the smooth part of the objective function is available, via calls to stochastic first- and second-order oracles. Our approach utilizes approximate second-order information and stochastic semismooth Newton steps for a prox-type fixed-point equation, representing the associated optimality conditions, to accelerate the basic stochastic proximal gradient method for convex composite programming. Inexact growth conditions are introduced to monitor the quality and acceptance of the Newton steps and to combine the two different methods. We prove that the proposed algorithm converges globally to stationary points in expectation and almost surely. Moreover, under standard assumptions, the method can be shown to locally turn into a pure semismooth Newton method, and fast local convergence can be established with high probability. Finally, we provide numerical experiments illustrating the efficiency of the stochastic semismooth Newton method.
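
To make the interplay between the two methods concrete, the following NumPy sketch illustrates the general idea on the l1-logistic regression example mentioned in the abstract: stochastic proximal gradient steps are occasionally replaced by a semismooth Newton step on the prox-type fixed-point residual F(x) = x - prox(x - t*grad f(x)), using minibatch gradient and Hessian oracles. This is a minimal sketch, not the speaker's implementation; the step size, batch size, and the simple residual-decrease acceptance test below are illustrative stand-ins for the paper's inexact growth conditions.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def stochastic_ssn(A, b, lam=0.1, t=0.5, batch=64, iters=200, seed=0):
    """Hybrid stochastic proximal gradient / semismooth Newton sketch for
    min_x (1/n) sum_i log(1 + exp(-b_i * a_i^T x)) + lam * ||x||_1,
    assuming labels b_i in {-1, +1}. Illustrative, not the paper's method."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)

    def residual(y, Ab, bb):
        # Prox fixed-point residual on the current minibatch.
        py = 1.0 / (1.0 + np.exp(bb * (Ab @ y)))
        gy = -(Ab.T @ (bb * py)) / len(bb)
        return y - soft_threshold(y - t * gy, t * lam)

    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)    # stochastic oracle
        Ab, bb = A[idx], b[idx]
        p = 1.0 / (1.0 + np.exp(bb * (Ab @ x)))           # sigmoid(-b * Ax)
        g = -(Ab.T @ (bb * p)) / batch                    # minibatch gradient
        z = x - t * g
        F = x - soft_threshold(z, t * lam)                # residual F(x)

        # Semismooth Newton trial step: one element of the generalized
        # Jacobian of F is  M = I - D (I - t H),  where H is a minibatch
        # Hessian and D is the 0/1 diagonal marking coordinates at which
        # soft-thresholding is differentiable (|z_i| > t * lam).
        w = p * (1.0 - p)                                 # logistic curvature
        H = (Ab.T * w) @ Ab / batch                       # minibatch Hessian
        D = (np.abs(z) > t * lam).astype(float)
        M = np.eye(d) - D[:, None] * (np.eye(d) - t * H)
        try:
            x_newton = x + np.linalg.solve(M, -F)
        except np.linalg.LinAlgError:
            x_newton = None

        # Accept the Newton step only if it sufficiently reduces the residual
        # (a crude stand-in for the paper's inexact growth conditions);
        # otherwise fall back to the stochastic proximal gradient step.
        if x_newton is not None and (
            np.linalg.norm(residual(x_newton, Ab, bb)) <= 0.5 * np.linalg.norm(F)
        ):
            x = x_newton
        else:
            x = soft_threshold(z, t * lam)
    return x
```

Calling stochastic_ssn(A, b) on a data matrix A with ±1 labels b returns an approximate sparse solution; the point of the hybrid scheme is that the cheap stochastic proximal gradient steps drive global progress, while the occasional accepted Newton steps provide the fast local convergence described in the abstract.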

Biography:

Andre Milzarek received his master’s degree with honours (2013) and his doctoral degree (2016) in mathematics from the Technical University of Munich, Germany, under the supervision of Michael Ulbrich. Currently, he is a postdoctoral researcher at the Beijing International Center for Mathematical Research at Peking University. His main research interests cover nonsmooth optimization, large-scale and stochastic optimization, and second-order methods and theory. From 2010 to 2012 he was supported by the Max Weber Program of the state of Bavaria, and in 2017 he received the Boya Postdoctoral Fellowship at Peking University. He has published papers in the SIAM Journal on Optimization and the SIAM Journal on Scientific Computing, and has given talks at major international conferences, including the EURO, SIAM, and AIMS conferences.