Peer reviewed, Open access
  • DC approximation approaches...
    Le Thi, H.A.; Pham Dinh, T.; Le, H.M.; Vo, X.T.

    European journal of operational research, 07/2015, Volume: 244, Issue: 1
    Journal Article

    • A unifying DC approximation of the zero-norm, covering all standard approximations, is proposed.
    • The consistency between global/local minima of the approximate and original problems is proved.
    • The equivalence between the approximate and original problems is established for some approximations.
    • Four DCA schemes are developed that cover all standard nonconvex approximation algorithms.
    • A careful empirical experiment on feature selection in SVM is performed.

    Sparse optimization refers to an optimization problem involving the zero-norm in the objective or constraints. In this paper, nonconvex approximation approaches for sparse optimization are studied from a unifying point of view within the DC (Difference of Convex functions) programming framework. Considering a common DC approximation of the zero-norm that includes all standard sparsity-inducing penalty functions, we study the consistency between global minima (resp. local minima) of the approximate and original problems. We show that, in several cases, some global minimizers (resp. local minimizers) of the approximate problem are also minimizers of the original problem. Using exact penalty techniques in DC programming, we prove stronger results for some particular approximations: the approximate problem, with suitable parameters, is equivalent to the original problem. The efficiency of several sparsity-inducing penalty functions is analyzed in detail. Four DCA (DC Algorithm) schemes are developed that cover all standard algorithms in nonconvex sparse approximation approaches as special versions; they can be viewed as an ℓ1-perturbed algorithm, a reweighted-ℓ1 algorithm, or a reweighted-ℓ2 algorithm. We thus offer a unifying nonconvex approximation approach, with solid theoretical tools as well as efficient algorithms based on DC programming and DCA, to tackle the zero-norm and sparse optimization. As an application, we implement our methods for the feature-selection problem in SVM (Support Vector Machine) and perform comparative numerical experiments on the proposed algorithms with various approximation functions.
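
To make the abstract's terminology concrete, the zero-norm problem and the approximation idea it refers to can be sketched as follows. The symbols f, λ, θ and the choice of the capped-ℓ1 penalty as the example are generic illustrations of the standard setting, not notation taken from the paper itself.

```latex
% Zero-norm (sparse) optimization in penalized form:
\min_{x \in \mathbb{R}^n} \; f(x) + \lambda\,\|x\|_0,
\qquad \|x\|_0 = \#\{\, i : x_i \neq 0 \,\}, \quad \lambda > 0 .
% Nonconvex approximation: replace the zero-norm by a separable concave penalty,
\|x\|_0 \;\approx\; \sum_{i=1}^{n} r_\theta(|x_i|),
\qquad \text{e.g. capped-}\ell_1:\quad
r_\theta(t) = \min(1, \theta t)
            = \underbrace{\theta t}_{g(t)} - \underbrace{\max(\theta t - 1,\, 0)}_{h(t)} .
% Since g and h are both convex, the approximate problem is a DC program and
% can be treated by DCA.
```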
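As a rough illustration of the reweighted-ℓ1 flavour of the DCA schemes mentioned in the abstract, applied to feature selection in a linear SVM: the sketch below uses the exponential approximation r_θ(t) = 1 − exp(−θt) of the zero-norm, whose DCA iteration reduces to a weighted ℓ1-regularized hinge-loss problem. This is not the authors' implementation; the function name reweighted_l1_svm, the cvxpy solver choice, and the parameter values lam, theta and n_iter are all illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's code) of a reweighted-l1
# DCA-style scheme for feature selection in a linear SVM.
import numpy as np
import cvxpy as cp


def reweighted_l1_svm(X, y, lam=1.0, theta=5.0, n_iter=10):
    """X: (n_samples, n_features) array, y: labels in {-1, +1}."""
    n, d = X.shape
    w_pen = np.ones(d)          # initial weights: the first step is a plain l1-SVM
    beta_val, b_val = np.zeros(d), 0.0
    for _ in range(n_iter):
        beta, b = cp.Variable(d), cp.Variable()
        hinge = cp.sum(cp.pos(1 - cp.multiply(y, X @ beta + b)))   # SVM hinge loss
        penalty = lam * cp.sum(cp.multiply(w_pen, cp.abs(beta)))   # weighted l1 term
        cp.Problem(cp.Minimize(hinge + penalty)).solve()
        beta_val, b_val = beta.value, b.value
        # DCA-style reweighting: derivative of the concave exponential penalty
        # r_theta(t) = 1 - exp(-theta * t) evaluated at |beta_i|.
        w_pen = theta * np.exp(-theta * np.abs(beta_val))
    return beta_val, b_val


# Usage sketch: coefficients driven to (near) zero indicate discarded features.
# beta, b = reweighted_l1_svm(X_train, y_train)
# selected = np.flatnonzero(np.abs(beta) > 1e-6)
```

Each iteration solves a convex weighted-ℓ1 SVM and then reduces the penalty weight on coordinates that are already large, which is the mechanism by which reweighted-ℓ1 schemes push small coefficients toward zero and thereby select features.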