Yale Song, Zhen Wen, et al.
IJCAI 2013
We propose a convex optimization approach to the nonparametric regression estimation problem when the underlying regression function is Lipschitz continuous. The approach minimizes the sum of empirical squared errors subject to the constraints implied by Lipschitz continuity. The resulting optimization problem has a convex objective function and linear constraints, and is therefore efficiently solvable. The estimated function computed by this technique is proven to converge to the underlying regression function uniformly and almost surely as the sample size grows to infinity, providing a very strong form of consistency. We also propose a convex optimization approach to the maximum likelihood estimation of unknown parameters in statistical models where the parameters depend continuously on some observable input variables. For a number of classical distributional forms, the objective function in the underlying optimization problem is convex and the constraints are linear, so these problems are also efficiently solvable.
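The estimator described in this abstract reduces to a quadratic program over the fitted values f_i = f(x_i): minimize the sum of (f_i - y_i)^2 subject to the pairwise linear constraints |f_i - f_j| <= L * |x_i - x_j|. Below is a minimal sketch of that formulation, assuming a one-dimensional input, a known Lipschitz constant L, and cvxpy as the modeling layer; none of these choices are prescribed by the abstract, and the function and variable names are illustrative.

```python
import numpy as np
import cvxpy as cp

def lipschitz_regression(x, y, L):
    """Fit values f_i = f(x_i) by least squares, subject to the
    Lipschitz constraints |f_i - f_j| <= L * |x_i - x_j| for all
    pairs (i, j), written as two linear inequalities per pair."""
    n = len(x)
    f = cp.Variable(n)
    constraints = []
    for i in range(n):
        for j in range(i + 1, n):
            d = L * abs(x[i] - x[j])
            constraints += [f[i] - f[j] <= d, f[j] - f[i] <= d]
    prob = cp.Problem(cp.Minimize(cp.sum_squares(f - y)), constraints)
    prob.solve()
    return f.value

# Illustrative usage on noisy samples of a Lipschitz function.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 30))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)
f_hat = lipschitz_regression(x, y, L=2 * np.pi)
```

The sketch only produces fitted values at the sample points; to evaluate at a new point one can use, for example, the McShane-type extension min_i (f_hat[i] + L * |x_new - x[i]|), which agrees with the fitted values and preserves the Lipschitz bound.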
Miao Guo, Yong Tao Pei, et al.
WCITS 2011
Robert Farrell, Rajarshi Das, et al.
AAAI-SS 2010
Fahiem Bacchus, Joseph Y. Halpern, et al.
IJCAI 1995