Derivative-Free Optimization via Classification

Authors

  • Yang Yu, Nanjing University
  • Hong Qian, Nanjing University
  • Yi-Qi Hu, Nanjing University

DOI:

https://doi.org/10.1609/aaai.v30i1.10289

Keywords:

randomized optimization, non-convex optimization, time complexity, evolutionary algorithm

Abstract

Many randomized heuristic derivative-free optimization methods share a framework that iteratively learns a model of promising search areas and samples solutions from that model. This paper studies a particular setting of this framework in which the model is implemented by a classification model that discriminates good solutions from bad ones. This setting admits a general theoretical characterization that identifies the factors critical to optimization. We also prove that optimization problems with local Lipschitz continuity can be solved in polynomial time by proper configurations of this framework. Guided by these critical factors, we propose the randomized coordinate shrinking classification algorithm for learning the model, yielding the RACOS algorithm for optimization in continuous and discrete domains. Experiments on test functions as well as on machine learning tasks, including spectral clustering and classification with the Ramp loss, demonstrate the effectiveness of RACOS.
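To make the sampling-and-classification framework in the abstract concrete, below is a minimal Python sketch of the loop: evaluate a batch of solutions, learn a simple "good region" classifier from the best ones, then sample the next batch mostly from that region. This is not the authors' RACOS implementation; the axis-aligned box stands in for the randomized coordinate shrinking classifier, and the function name optimize_by_classification and all parameter choices are assumptions invented for this illustration.

    # Hedged sketch of classification-based derivative-free optimization.
    # Not the paper's RACOS algorithm; a simplified stand-in for its
    # learn-a-classifier / sample-from-it iteration.
    import random

    def optimize_by_classification(f, dim, lower, upper,
                                   budget=1000, sample_size=20,
                                   positive_size=2, explore_prob=0.1):
        """Minimize f over [lower, upper]^dim via a sample-classify-sample loop."""
        def rand_in(lo, hi):
            return [random.uniform(lo[i], hi[i]) for i in range(dim)]

        lo0, hi0 = [lower] * dim, [upper] * dim
        # Initial random population of evaluated solutions.
        evals = [(f(x), x) for x in (rand_in(lo0, hi0) for _ in range(sample_size))]
        spent = len(evals)

        while spent < budget:
            evals.sort(key=lambda t: t[0])
            positives = [x for _, x in evals[:positive_size]]
            # "Classifier": tightest axis-aligned box covering the good
            # solutions, slightly inflated; it plays the role of the
            # learned model of the promising search area.
            lo = [min(p[i] for p in positives) for i in range(dim)]
            hi = [max(p[i] for p in positives) for i in range(dim)]
            width = [max(hi[i] - lo[i], 1e-9) for i in range(dim)]
            lo = [max(lower, lo[i] - 0.1 * width[i]) for i in range(dim)]
            hi = [min(upper, hi[i] + 0.1 * width[i]) for i in range(dim)]
            new = [rand_in(lo0, hi0) if random.random() < explore_prob  # explore globally
                   else rand_in(lo, hi)                                 # sample the good region
                   for _ in range(sample_size)]
            # Keep the positives, replace the rest with the new batch.
            evals = evals[:positive_size] + [(f(x), x) for x in new]
            spent += len(new)

        return min(evals, key=lambda t: t[0])

    if __name__ == "__main__":
        # Toy usage: minimize the sphere function in 5 dimensions.
        best_val, best_x = optimize_by_classification(
            lambda x: sum(v * v for v in x), dim=5, lower=-5.0, upper=5.0)
        print(best_val)

The design point the sketch illustrates is the one the abstract emphasizes: the only model learned is a discriminator between good and bad solutions, and the exploration rate and the shape of the learned region are among the factors the paper's theory identifies as critical.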

Published

2016-03-02

How to Cite

Yu, Y., Qian, H., & Hu, Y.-Q. (2016). Derivative-Free Optimization via Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10289

Section

Technical Papers: Machine Learning Methods