Presenter: Eric Chi
University: Rice University
Program: CSGF
Year: 2010
We present a robust variant of regularized maximum likelihood methods for classification problems with high-dimensional data. L1-regularized model fitting has inspired many approaches that perform model fitting and variable selection simultaneously. When parametric models are employed, some form of regularized maximum likelihood estimation is typically used. Although maximum likelihood is asymptotically efficient under very general conditions, it is not robust. In contrast, minimizing the integrated squared error, while less efficient, is robust to a fair amount of contamination. We discuss an iterative approach to fitting logistic models under this alternative criterion with an elastic net penalty.
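For illustration only, the following is a minimal sketch of one way such a fit could be carried out; it is not the presenter's actual algorithm. It assumes that, for a Bernoulli model, the integrated squared error criterion reduces to the squared difference between the fitted probability and the 0/1 label, and it handles the elastic net penalty with a proximal-gradient (soft-thresholding) update. The function name fit_l2e_logistic, the fixed step size, and the toy data are all hypothetical choices made for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_l2e_logistic(X, y, lam=0.1, alpha=0.5, step=0.01, n_iter=5000):
    """Illustrative proximal-gradient sketch of an integrated-squared-error
    logistic fit with an elastic net penalty.  The smooth part is
    sum_i (sigmoid(x_i' beta) - y_i)^2 plus the ridge component; the L1
    component is handled by soft thresholding."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ beta)
        # gradient of sum_i (p_i - y_i)^2 plus the ridge term
        grad = X.T @ (2.0 * (p - y) * p * (1.0 - p)) + (1.0 - alpha) * lam * beta
        z = beta - step * grad
        # soft-thresholding: proximal operator of the L1 term
        beta = np.sign(z) * np.maximum(np.abs(z) - step * alpha * lam, 0.0)
    return beta

# Toy usage: a separable problem with a few contaminated (flipped) labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] - X[:, 1] + 0.3 * rng.normal(size=200) > 0).astype(float)
y[:10] = 1 - y[:10]  # mimic label contamination
print(fit_l2e_logistic(X, y))
```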