### A Robust Minimax Approach to Classification

Gert R. G. Lanckriet, Laurent El Ghaoui, Chiranjib Bhattacharyya and Michael I. Jordan
EECS Department

University of California, Berkeley

Technical Report No. UCB/CSD-02-1218

December 2002

<http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/CSD-02-1218.pdf>

When constructing a classifier, the probability of correct classification of future data points should be maximized. We consider a binary classification problem where the mean and covariance matrix of each class are assumed to be known. No further assumptions are made with respect to the class-conditional distributions. Misclassification probabilities are then controlled in a worst-case setting: that is, over all possible choices of class-conditional densities with the given mean and covariance matrix, we minimize the worst-case (maximum) probability of misclassification of future data points. For a linear decision boundary, this desideratum translates directly into a (convex) second-order cone optimization problem, with complexity similar to a support vector machine problem. The minimax problem can be interpreted geometrically as minimizing the maximum of the Mahalanobis distances to the two classes. We address the issue of robustness with respect to estimation errors (in the means and covariances of the classes) via a simple modification of the input data. We also show how to exploit Mercer kernels in this setting to obtain nonlinear decision boundaries, yielding a classifier that proves to be competitive with current methods, including support vector machines. An important feature of this method is that a worst-case bound on the probability of misclassification of future data is always obtained explicitly.
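The linear case described in the abstract can be sketched numerically. The snippet below is an illustration of the idea, not the report's implementation: it solves the equivalent convex problem min ||Σ₁^{1/2}a|| + ||Σ₂^{1/2}a|| subject to aᵀ(μ₁ − μ₂) = 1 (here via constraint elimination and a generic smooth solver rather than a dedicated second-order cone solver), and then reports the explicit worst-case misclassification bound 1/(1 + κ²), where κ is the reciprocal of the optimal objective value. The function name `mpm_linear` is hypothetical.

```python
# Minimal sketch (not the report's code) of the linear minimax classifier:
# given only class means and covariances, find a hyperplane a^T x = b that
# minimizes the worst-case probability of misclassification.
import numpy as np
from scipy.linalg import null_space
from scipy.optimize import minimize

def mpm_linear(mu1, Sigma1, mu2, Sigma2):
    """Hypothetical helper: returns (a, b, worst_case_error)."""
    S1 = np.linalg.cholesky(Sigma1)   # factor with S1 @ S1.T = Sigma1
    S2 = np.linalg.cholesky(Sigma2)
    d = mu1 - mu2

    # Equivalent convex program: min ||S1.T a|| + ||S2.T a||  s.t.  d^T a = 1.
    # Eliminate the constraint: a = a0 + F u, with d^T a0 = 1 and d^T F = 0.
    a0 = d / (d @ d)
    F = null_space(d[None, :])

    def objective(u):
        a = a0 + F @ u
        return np.linalg.norm(S1.T @ a) + np.linalg.norm(S2.T @ a)

    res = minimize(objective, np.zeros(F.shape[1]))
    a = a0 + F @ res.x

    kappa = 1.0 / res.fun                           # optimal margin parameter
    worst_case_error = 1.0 / (1.0 + kappa**2)       # explicit worst-case bound
    b = a @ mu1 - kappa * np.linalg.norm(S1.T @ a)  # both class constraints tight
    return a, b, worst_case_error

# Two well-separated classes: the worst-case error bound comes out small.
mu1, mu2 = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
Sigma = 0.2 * np.eye(2)
a, b, err = mpm_linear(mu1, Sigma, mu2, Sigma)
print(f"worst-case misclassification probability: {err:.3f}")
```

Future points are classified by the sign of aᵀx − b; the printed value is the bound that holds for every distribution matching the given moments, which is the feature the abstract highlights.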

BibTeX citation:

    @techreport{Lanckriet:CSD-02-1218,
        Author = {Lanckriet, Gert R. G. and El Ghaoui, Laurent and Bhattacharyya, Chiranjib and Jordan, Michael I.},
        Title = {A Robust Minimax Approach to Classification},
        Institution = {EECS Department, University of California, Berkeley},
        Year = {2002},
        Month = {Dec},
        URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/5337.html},
        Number = {UCB/CSD-02-1218},
        Abstract = {When constructing a classifier, the probability of correct classification of future data points should be maximized. We consider a binary classification problem where the mean and covariance matrix of each class are assumed to be known. No further assumptions are made with respect to the class-conditional distributions. Misclassification probabilities are then controlled in a worst-case setting: that is, under all possible choices of class-conditional densities with given mean and covariance matrix, we minimize the worst-case (maximum) probability of misclassification of future data points. For a linear decision boundary, this desideratum is translated in a very direct way into a (convex) second order cone optimization problem, with complexity similar to a support vector machine problem. The minimax problem can be interpreted geometrically as minimizing the maximum of the Mahalanobis distances to the two classes. We address the issue of robustness with respect to estimation errors (in the means and covariances of the classes) via a simple modification of the input data. We also show how to exploit Mercer kernels in this setting to obtain nonlinear decision boundaries, yielding a classifier which proves to be competitive with current methods, including support vector machines. An important feature of this method is that a worst-case bound on the probability of misclassification of future data is always obtained explicitly.}
    }

EndNote citation:

    %0 Report
    %A Lanckriet, Gert R. G.
    %A El Ghaoui, Laurent
    %A Bhattacharyya, Chiranjib
    %A Jordan, Michael I.
    %T A Robust Minimax Approach to Classification
    %I EECS Department, University of California, Berkeley
    %D 2002
    %@ UCB/CSD-02-1218
    %U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/5337.html
    %F Lanckriet:CSD-02-1218