Publication
ICML 2020
Workshop paper
Chi-square Information for Invariant Learning
Abstract
Invariant learning aims to train models that are robust to nuisance confounding present in the data. This is typically achieved by minimizing some measure of dependence between learned representations or predictions and the confounding factors. However, accurately estimating, and reliably minimizing, commonly used dependence measures can be challenging. A dependence measure based on the chi-square divergence has recently been found effective at enforcing fairness through learning invariant representations. We show that, with an appropriate parameterization, this choice both improves the quality of dependence estimation and simplifies its minimization. Empirically, we find that our proposal is effective for fair predictor learning and domain generalization.
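To make the chi-square dependence measure concrete, the following is a minimal sketch (not the paper's method) of the χ²-information between two discrete variables Z and C: the sum over (z, c) of (p(z,c) − p(z)p(c))² / (p(z)p(c)). It is zero exactly when Z and C are independent, which is why penalizing it pushes representations toward invariance to the confounder. The function name and the toy joint distributions are illustrative assumptions, not from the paper.

```python
import numpy as np

def chi2_information(joint):
    """Chi-square dependence between two discrete variables.

    joint: 2-D array of joint probabilities (or counts) p(z, c);
           rows index values of Z, columns index values of C.
    Returns sum_{z,c} (p(z,c) - p(z)p(c))^2 / (p(z)p(c)),
    which is 0 iff Z and C are independent.
    """
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()            # normalize counts to probabilities
    pz = joint.sum(axis=1, keepdims=True)  # marginal p(z)
    pc = joint.sum(axis=0, keepdims=True)  # marginal p(c)
    prod = pz * pc                         # product of marginals
    return float(((joint - prod) ** 2 / prod).sum())

# Toy examples (illustrative): an independent joint gives 0,
# a dependent joint gives a positive value.
indep = np.outer([0.5, 0.5], [0.3, 0.7])   # p(z,c) = p(z)p(c)
dep = np.array([[0.4, 0.1],
                [0.1, 0.4]])               # Z and C strongly correlated
```

In an invariant-learning setting, a differentiable estimate of this quantity (between learned representations and the confounder) would be added to the training loss as a penalty; the abstract's point is that the chi-square form makes that estimation and minimization easier than with other divergences.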