Conference paper
Transform regression and the Kolmogorov superposition theorem
Abstract
This paper presents a new predictive modeling algorithm that draws inspiration from the Kolmogorov superposition theorem. An initial version of the algorithm is presented that combines gradient boosting, generalized additive models, and decision-tree methods to construct models that have the same overall mathematical structure as Kolmogorov's superposition equation. Improvements to the algorithm are then presented that significantly increase its rate of convergence. The resulting algorithm, dubbed "transform regression," generates surprisingly good models compared to those produced by the underlying decision-tree method when the latter is applied directly. Transform regression is highly scalable, and a parallelized, database-embedded version of the algorithm has been implemented as part of IBM DB2 Intelligent Miner Modeling.
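The superposition equation referred to above expresses any continuous function of n variables as f(x_1, ..., x_n) = \sum_{q=0}^{2n} \Phi_q\big(\sum_{p=1}^{n} \phi_{q,p}(x_p)\big), i.e., an outer sum of univariate functions applied to sums of univariate feature transforms. The sketch below is not IBM's implementation; it is a minimal illustration, assuming depth-1 regression stumps as the one-dimensional transforms, of how gradient boosting over per-feature transforms yields a model with this additive-of-transforms structure:

```python
import numpy as np

def fit_stump(x, r):
    """Least-squares single-split (depth-1) fit of residuals r on feature x.

    Returns (left_mean, right_mean, threshold); threshold = -inf means a
    constant fit, which the search falls back to if no split helps.
    """
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    n = len(rs)
    best = (rs.mean(), rs.mean(), -np.inf)          # constant fallback
    best_err = ((rs - rs.mean()) ** 2).sum()
    csum, csq = np.cumsum(rs), np.cumsum(rs ** 2)
    total, total_sq = csum[-1], csq[-1]
    for i in range(1, n):
        if xs[i] == xs[i - 1]:                      # can't split between ties
            continue
        lmean = csum[i - 1] / i
        rmean = (total - csum[i - 1]) / (n - i)
        # SSE of each side via sum(r^2) - count * mean^2
        err = (csq[i - 1] - i * lmean ** 2) \
            + ((total_sq - csq[i - 1]) - (n - i) * rmean ** 2)
        if err < best_err:
            best_err = err
            best = (lmean, rmean, 0.5 * (xs[i] + xs[i - 1]))
    return best

def stump_predict(params, x):
    lmean, rmean, thr = params
    return np.where(x <= thr, lmean, rmean)

def transform_regression_sketch(X, y, n_stages=10, lr=0.5):
    """Boosting where each stage adds one 1-D transform per feature.

    The fitted model is a sum of univariate functions of individual
    features, mirroring the inner sums of the superposition equation.
    """
    pred = np.full(len(y), y.mean())
    model = []
    for _ in range(n_stages):
        for j in range(X.shape[1]):
            params = fit_stump(X[:, j], y - pred)   # fit current residuals
            pred = pred + lr * stump_predict(params, X[:, j])
            model.append((j, params))
    return y.mean(), model

def predict(base, model, X, lr=0.5):
    out = np.full(X.shape[0], base)
    for j, params in model:
        out += lr * stump_predict(params, X[:, j])
    return out
```

Because each stage's stumps are least-squares fits to the current residuals, the training error is non-increasing across stages; the actual transform regression algorithm replaces these toy stumps with richer univariate transforms learned by the underlying decision-tree method.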