Alexandre TERMIER
Prof. Takashi Washio
Submodular maximization is a discrete optimization problem in which a set function is maximized. The problem appears widely in computer science, operations research, economics, and their applications. For example, feature selection, a highly important task in machine learning, is a typical submodular maximization problem. The prediction error of a learner is known to be a submodular function of the selected feature subset: the improvement in the error diminishes as the selected subset grows larger. The task of deriving the globally optimal feature set is therefore a submodular optimization problem, which is NP-hard. In this talk, we explain the framework of this problem and introduce our recent study on the global maximization of submodular functions, which uses continuous relaxation and cutting plane techniques.
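To make the framework concrete, the sketch below shows the classic greedy heuristic for cardinality-constrained monotone submodular maximization applied to a hypothetical coverage objective; it only illustrates the notion of diminishing returns and is not the continuous-relaxation and cutting-plane method presented in the talk. The function names and the toy data are assumptions for illustration.

```python
# Minimal sketch: greedy heuristic for cardinality-constrained monotone
# submodular maximization. This illustrates the general framework only;
# it is NOT the continuous-relaxation / cutting-plane method of the talk.
# The coverage objective below is a hypothetical example of a submodular
# set function (adding an element helps less as the selected set grows).

def coverage(selected, sets):
    """f(S) = number of items covered by the chosen sets (submodular)."""
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy_max(sets, k):
    """Pick k sets, each time adding the one with the largest marginal gain."""
    selected = []
    for _ in range(k):
        best, best_gain = None, 0
        for i in range(len(sets)):
            if i in selected:
                continue
            gain = coverage(selected + [i], sets) - coverage(selected, sets)
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:  # no remaining element improves the objective
            break
        selected.append(best)
    return selected

# Example: choose 2 of 4 candidate "feature groups" covering items 0..5.
sets = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}]
print(greedy_max(sets, k=2))  # -> [0, 2], covering all six items
```

The greedy rule exploits exactly the diminishing-returns property described above; for the exact global optimum, which is what the talk addresses, such heuristics are not sufficient and NP-hardness applies.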
Biography
Takashi Washio is a professor in the Department of Reasoning for Intelligence at the Institute of Scientific and Industrial Research, Osaka University. He received his BS, MS, and PhD degrees in nuclear engineering from Tohoku University, Japan. He was a visiting researcher at the Nuclear Reactor Laboratory of the Massachusetts Institute of Technology (MIT) from 1988 to 1990 and a researcher at the Mitsubishi Research Institute (MRI) from 1990 to 1996. During his stay at MRI, his research interest shifted from data analysis in nuclear engineering to general techniques of data mining and machine learning. He joined the institute at Osaka University as an associate professor in 1996 and was promoted to full professor in 2006. His current interests are theories and algorithms for identifying the data generation process behind given data, as well as machine learning techniques for high-dimensional data.