Peer-reviewed · Open access
  • Class specific or shared? A...
    Wang, Yan-Jiang; Shao, Shuai; Xu, Rui; Liu, Weifeng; Liu, Bao-Di

    Signal Processing, November 2020, Volume 176
    Journal Article

    • Proposes a novel class shared dictionary learning method named label embedded dictionary learning (LEDL).
    • Proposes a novel dictionary learning framework named the cascaded dictionary learning framework (CDLF).
    • Proposes to optimize each layer of the dictionary learning task with the alternating direction method of multipliers (ADMM) and blockwise coordinate descent (BCD) algorithms.
    • The proposed LEDL and CDLF methods achieve superior performance on six benchmark datasets.

    Dictionary learning methods can be split into two categories: i) class specific dictionary learning and ii) class shared dictionary learning. The difference between the two lies in how they use discriminative information. In the first category, samples of different classes are mapped into different subspaces, which introduces redundancy among the class specific base vectors. In the second category, the samples of each specific class cannot be described accurately. In this paper, we first propose a novel class shared dictionary learning method named label embedded dictionary learning (LEDL). It improves on LC-KSVD and makes the optimal solution easier to find. We then propose a novel framework named the cascaded dictionary learning framework (CDLF), which combines class specific dictionary learning with class shared dictionary learning to describe features sufficiently and boost classification performance. Extensive experimental results on six benchmark datasets show that our methods achieve superior performance compared to several state-of-the-art classification algorithms.
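    The class-specific vs. class-shared distinction in the abstract can be illustrated with a minimal sketch. This is not the paper's LEDL/CDLF method; it is a generic reconstruction-error classifier using scikit-learn's `DictionaryLearning` on synthetic data, where each class is generated from its own low-dimensional subspace (all data and parameter choices here are illustrative assumptions):

    ```python
    # Sketch: class-specific dictionaries (classify by reconstruction error)
    # vs. one class-shared dictionary (discrimination must come from the codes).
    import numpy as np
    from sklearn.decomposition import DictionaryLearning

    rng = np.random.default_rng(0)

    def make_class(basis, n):
        # Samples lie near the row space of `basis`, plus small noise.
        coeffs = rng.normal(size=(n, basis.shape[0]))
        return coeffs @ basis + 0.01 * rng.normal(size=(n, basis.shape[1]))

    # Two classes, each spanned by 3 random directions in 20-d feature space.
    b0, b1 = rng.normal(size=(3, 20)), rng.normal(size=(3, 20))
    X0, X1 = make_class(b0, 40), make_class(b1, 40)

    # Class-specific: one small dictionary per class.
    dl_args = dict(n_components=5, transform_algorithm="omp",
                   transform_n_nonzero_coefs=3, random_state=0)
    d0 = DictionaryLearning(**dl_args).fit(X0)
    d1 = DictionaryLearning(**dl_args).fit(X1)

    def recon_err(model, x):
        # Sparse-code x against the dictionary, measure residual norm.
        code = model.transform(x.reshape(1, -1))
        return np.linalg.norm(x - code @ model.components_)

    x_test = make_class(b1, 1)[0]                      # a class-1 sample
    pred = int(recon_err(d1, x_test) < recon_err(d0, x_test))
    print("predicted class:", pred)

    # Class-shared: a single dictionary over all samples; a classifier on the
    # sparse codes would then supply the discrimination, which is where
    # label-embedding approaches such as LEDL operate.
    shared = DictionaryLearning(n_components=10, transform_algorithm="omp",
                                transform_n_nonzero_coefs=3,
                                random_state=0).fit(np.vstack([X0, X1]))
    codes = shared.transform(np.vstack([X0, X1]))
    print("shared codes shape:", codes.shape)
    ```

    The sketch makes the abstract's trade-off concrete: per-class dictionaries reconstruct their own class well but duplicate atoms across classes, while a shared dictionary avoids that redundancy at the cost of less class-faithful reconstruction.
    
    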