Publication
INFORMS 2022
Invited talk
Robust Multi-objective Bilevel Optimization With Applications In Machine Learning
Abstract
We consider the generic min-max form of a multi-objective bilevel nonconvex optimization problem in which (i) there are n objectives at both levels, (ii) the upper-level variable is shared across all objectives at both levels, and (iii) each lower-level variable is tied to one objective at each of the upper and lower levels. Such problems arise in various machine learning applications, including representation learning and hyperparameter optimization. We propose a single-loop, two-timescale algorithm based on gradient descent-ascent, building upon recent single-objective bilevel optimization schemes. Our theoretical analysis shows that this algorithm converges to a first-order stationary point at a rate of O(√n · T^{-2/5}) for a class of nonconvex problems, where n is the number of objectives at each level and T is the number of algorithm iterations.
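To make the problem structure concrete, the following is a minimal toy sketch of a single-loop, two-timescale gradient scheme for a min-max multi-objective bilevel instance. This is not the paper's algorithm: the quadratic objectives, the step sizes, and the use of the plain partial gradient in place of the full hypergradient are all simplifying assumptions made only for illustration.

```python
import numpy as np

# Toy instance (assumed, for illustration): n objectives at each level,
# a shared upper-level variable x, and one lower-level variable Y[i]
# per objective pair, as in the problem description above.
rng = np.random.default_rng(0)
n, d = 3, 2                         # objectives per level, dimension
A = rng.standard_normal((n, d, d))  # per-objective lower-level data
c = rng.standard_normal((n, d))     # per-objective upper-level data

def lower_grad(i, x, y):
    """grad_y of g_i(x, y) = 0.5 * ||y - A_i x||^2 (strongly convex in y)."""
    return y - A[i] @ x

def upper_val(i, x, y):
    """f_i(x, y) = 0.5 * ||x||^2 + c_i . y."""
    return 0.5 * x @ x + c[i] @ y

def upper_grad_x(i, x):
    # Partial gradient of f_i in x; a real bilevel method would add an
    # implicit (hypergradient) correction term, omitted here for brevity.
    return x

x = rng.standard_normal(d)
Y = rng.standard_normal((n, d))
alpha, beta = 0.01, 0.1             # two timescales: lower level (beta) moves faster

for t in range(2000):               # single loop: both levels updated each iteration
    for i in range(n):
        Y[i] -= beta * lower_grad(i, x, Y[i])          # lower-level descent step
    worst = max(range(n), key=lambda i: upper_val(i, x, Y[i]))
    x -= alpha * upper_grad_x(worst, x)                # min-max: descend worst objective

print(np.linalg.norm(x))            # x is driven toward a stationary point
```

The two-timescale choice (beta larger than alpha) lets the lower-level variables approximately track their optima while the upper-level variable moves slowly, which is the mechanism the single-loop schemes referenced above rely on.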