Jorge Nocedal is the Walter P. Murphy Professor of Industrial Engineering and Management Sciences at Northwestern University and Director of the Optimization Center in the IEMS Department.
He holds a Ph.D. in Mathematical Sciences from Rice University (1978) and a B.Sc. in Physics from the National University of Mexico (1974). He was awarded the George B. Dantzig Prize by the Mathematical Optimization Society and SIAM in 2012. His current research interests are in optimization and its applications in machine learning and in disciplines involving differential equations. He specializes in nonlinear optimization, both convex and nonconvex, deterministic and stochastic, with particular interest in solving ever larger optimization problems. He has contributed to the development of algorithms that scale well with the number of variables, make judicious use of second-order information, and parallelize well. The motivation for his current algorithmic and theoretical research stems from applications in image and speech recognition, recommendation systems, and search engines. Prof. Nocedal has served as editor and associate editor of some of the most prestigious journals in the field, such as Mathematical Programming and the SIAM Journal on Optimization. He is the author, with Stephen Wright, of the book Numerical Optimization, which is widely used in optimization courses worldwide.
Stochastic Gradient Methods for Machine Learning
This talk provides an overview of recent advances in stochastic gradient methods for large-scale machine learning. We begin by summarizing the main convergence results for the convex and nonconvex cases, as well as computational complexity bounds. We then consider new variants of the method that reduce the variance of the search direction. We conclude with a discussion of some key issues arising in distributed computing implementations.
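The variance-reduction idea mentioned in the abstract can be illustrated with a small sketch. The toy objective, step sizes, and the SVRG-style update below are illustrative assumptions for exposition, not the specific methods or results of the talk: plain SGD follows a single-sample gradient with a diminishing step size, while the variance-reduced variant anchors each stochastic step to a periodically recomputed full gradient.

```python
import random

# Toy problem (an assumption for illustration): minimize
#   f(w) = (1/n) * sum_i (w - a_i)^2 / 2,
# whose unique minimizer is the mean of the a_i.
random.seed(0)
a = [random.gauss(5.0, 2.0) for _ in range(1000)]
n = len(a)

def sgd(steps=5000):
    """Plain stochastic gradient: one sampled gradient per step,
    with the classical O(1/k) diminishing step size."""
    w = 0.0
    for k in range(1, steps + 1):
        i = random.randrange(n)
        grad = w - a[i]        # unbiased stochastic gradient estimate
        w -= (1.0 / k) * grad
    return w

def svrg(epochs=20, m=200, eta=0.1):
    """SVRG-style variance reduction: a full gradient at a snapshot
    point corrects each stochastic direction, so the variance of the
    search direction shrinks as the iterates approach the solution."""
    w = 0.0
    for _ in range(epochs):
        w_snap = w
        full_grad = sum(w_snap - ai for ai in a) / n  # full gradient at snapshot
        for _ in range(m):
            i = random.randrange(n)
            # Unbiased, variance-reduced direction:
            # stochastic gradient at w, minus the same-sample gradient
            # at the snapshot, plus the full snapshot gradient.
            g = (w - a[i]) - (w_snap - a[i]) + full_grad
            w -= eta * g
    return w

mean_a = sum(a) / n
print(abs(sgd() - mean_a))   # small but noise-limited error
print(abs(svrg() - mean_a))  # variance reduction permits a constant step size
```

On this quadratic the variance-reduced direction is exact, so the constant-step variant converges far more tightly than plain SGD, which is limited by gradient noise; this mirrors the convex-case complexity gap the talk surveys.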