Publication
INFORMS 2021
Talk

Black-box Optimization for Optimizing Expensive Functions with Mixed Inputs

Abstract

We propose a deep neural network-based method for minimizing expensive black-box functions with mixed categorical-continuous inputs and linear constraints. We use a ReLU deep neural network to build a surrogate model from the historical data. To overcome the non-smoothness and poor local minima of the training problem, we train a smoothed version of the DNN with a second-order optimization method. A new sample point is then obtained by solving a linearized version of the DNN surrogate model.

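The abstract outlines a surrogate-based loop: fit a (smoothed) ReLU network to the historical evaluations, then choose the next sample from a linearized model of that surrogate. The sketch below illustrates that loop on a toy continuous problem using only NumPy and SciPy; the network size, the L-BFGS-B training step (standing in for the second-order method), the finite-difference linearization, the box bounds, and the toy objective are all illustrative assumptions, not the paper's actual formulation (which also handles categorical inputs and linear constraints).

```python
# Illustrative sketch only: fit a softplus-smoothed one-hidden-layer network
# to historical (x, f(x)) data, then propose the next sample by minimizing a
# linearization of the surrogate over box bounds. All names and sizes here
# are assumptions for the example, not the authors' implementation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def softplus(z):
    # smooth stand-in for ReLU: log(1 + exp(z)), computed stably
    return np.logaddexp(0.0, z)

def net(params, X, h=16, d=2):
    # one-hidden-layer network: softplus(X W1 + b1) w2 + b2
    W1 = params[: d * h].reshape(d, h)
    b1 = params[d * h : d * h + h]
    w2 = params[d * h + h : d * h + 2 * h]
    b2 = params[-1]
    return softplus(X @ W1 + b1) @ w2 + b2

def fit_surrogate(X, y, h=16):
    d = X.shape[1]
    n_params = d * h + 2 * h + 1
    loss = lambda p: np.mean((net(p, X, h, d) - y) ** 2)
    # L-BFGS-B (quasi-Newton) stands in here for the second-order
    # training step mentioned in the abstract
    res = minimize(loss, 0.1 * rng.standard_normal(n_params), method="L-BFGS-B")
    return res.x

def propose_next(params, x0, bounds, h=16, eps=1e-4, step=0.5):
    # Linearize the surrogate at the incumbent x0 via finite differences,
    # minimize the linear model over the box, and take a damped step toward
    # that minimizer so the new point stays near the observed data.
    d = len(x0)
    f0 = net(params, x0[None, :], h, d)[0]
    g = np.zeros(d)
    for i in range(d):
        xp = x0.copy()
        xp[i] += eps
        g[i] = (net(params, xp[None, :], h, d)[0] - f0) / eps
    lo, hi = np.array(bounds).T
    x_lin = np.where(g > 0, lo, hi)   # minimizer of the linear model on the box
    return x0 + step * (x_lin - x0)

# toy usage: the "expensive" objective is unknown to the optimizer
f = lambda x: (x[0] - 0.3) ** 2 + (x[1] + 0.5) ** 2
bounds = [(-1.0, 1.0), (-1.0, 1.0)]
X = rng.uniform(-1, 1, size=(30, 2))          # historical samples
y = np.array([f(x) for x in X])

params = fit_surrogate(X, y)
incumbent = X[np.argmin(y)]
x_new = propose_next(params, incumbent, bounds)
print("incumbent:", incumbent, "proposed next sample:", x_new)
```

In the paper's setting the linearized surrogate would be solved as a subproblem with the categorical variables and linear constraints included; the damped box step above is only a simple stand-in for that step.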