AIML Special Presentation: Convergence and Asymptotic Optimality of the Heavy Ball Method and Its Relatives

Abstract: In this talk we first aim to shed light on the urban legend surrounding the ‘complexity lower bound’ for the heavy ball algorithm. Second, we revisit the original heavy-ball algorithm proposed by Polyak, provide conditions under which it is globally convergent, and provide step-size rules. Then, we investigate the performance of a related algorithm dubbed the Accelerated Generalised Gradient Method and see how it can be beneficial in the case of tracking the minimum of a time-varying function.
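For readers unfamiliar with the method named in the abstract, a minimal sketch of Polyak's heavy-ball iteration follows. The update rule x_{k+1} = x_k - α∇f(x_k) + β(x_k - x_{k-1}) is the standard textbook form; the step sizes and the quadratic test function below are illustrative assumptions, not the conditions or step-size rules discussed in the talk.

```python
def heavy_ball(grad_f, x0, alpha, beta, iters):
    """Polyak's heavy-ball iteration:
    x_{k+1} = x_k - alpha * grad_f(x_k) + beta * (x_k - x_{k-1}).
    Returns the final iterate. Parameters here are illustrative.
    """
    x_prev, x = x0, x0
    for _ in range(iters):
        x_next = x - alpha * grad_f(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Example: minimise f(x) = (x - 3)^2, whose gradient is 2*(x - 3)
# and whose minimiser is x* = 3. Step sizes chosen by hand for this
# problem; they are not a general rule.
grad = lambda x: 2.0 * (x - 3.0)
x_star = heavy_ball(grad, x0=0.0, alpha=0.1, beta=0.5, iters=200)
```

The momentum term β(x_k - x_{k-1}) is what distinguishes the method from plain gradient descent; the talk examines when this iteration is guaranteed to converge globally.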

Iman Shames is a Professor of Mechatronics, the Mechatronics Cluster lead, and the CIICADA Lab director at the School of Engineering, the Australian National University. Previously, he was an Associate Professor at the Department of Electrical and Electronic Engineering, the University of Melbourne from 2014 to 2020, and a Senior Lecturer and McKenzie fellow at the same department from 2012 to 2014. Before that, he was an ACCESS Postdoctoral Researcher at the ACCESS Linnaeus Centre, the KTH Royal Institute of Technology, Stockholm, Sweden. He received his B.Sc. degree in Electrical Engineering from Shiraz University in 2006, and his Ph.D. degree in engineering from the Australian National University, Canberra, Australia in 2011. His current research interests include, but are not limited to, decision making for dynamical systems under uncertainty, optimisation theory and its application in control and estimation, and mathematical systems theory of cyber-physical systems.

Professor Iman Shames speaks before the AIML Community
