Kernel-Based Discriminative Stochastic Neighbor Embedding Analysis
Abstract
To improve discriminative power and address the out-of-sample problem in non-linear feature extraction, a kernel-based discriminative stochastic neighbor embedding analysis (KDSNE) method is proposed. By applying the kernel trick, the method preserves the observed information as far as possible and effectively improves the performance of dimensionality reduction. Building on DSNE, the proposed method introduces a kernel function to map the data into a high-dimensional feature space, selects joint probabilities to model the pairwise similarities of the input samples, and uses a linear projection matrix to obtain low-dimensional representations. Moreover, KDSNE adopts the Kullback-Leibler divergence to quantify the proximity of the two probability distributions and build the penalty function. KDSNE accentuates the feature differences between inter-class samples and makes the samples linearly separable, thereby improving classification performance. Experimental results on the COIL-20, ORL and Yale databases demonstrate the discriminative performance of the method.
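The pipeline the abstract describes (kernelize the inputs, project linearly in feature space, compare input- and embedding-space similarity distributions with a Kullback-Leibler divergence) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel, the Gaussian bandwidth `sigma`, and the symmetric-SNE form of the joint probabilities are all assumptions, and `A` stands for the (learned) linear projection matrix.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def joint_probabilities(D2, sigma=1.0):
    # Symmetric joint probabilities over pairs (assumed symmetric-SNE form):
    # P_ij proportional to exp(-d_ij^2 / (2 sigma^2)), with P_ii = 0,
    # normalized so that the whole matrix sums to 1.
    P = np.exp(-D2 / (2.0 * sigma**2))
    np.fill_diagonal(P, 0.0)
    return P / P.sum()

def kdsne_loss(X, A, gamma=1.0, sigma=1.0):
    # 1) Map inputs into feature space implicitly via the kernel matrix.
    K = rbf_kernel(X, gamma)
    # Feature-space squared distances: d_ij^2 = K_ii + K_jj - 2 K_ij.
    kd = np.diag(K)
    D2_high = kd[:, None] + kd[None, :] - 2.0 * K
    P = joint_probabilities(D2_high, sigma)

    # 2) Low-dimensional representations via a linear projection
    #    of the kernel matrix (A is a hypothetical projection matrix).
    Y = K @ A
    sq = np.sum(Y**2, axis=1)
    D2_low = sq[:, None] + sq[None, :] - 2.0 * Y @ Y.T
    Q = joint_probabilities(D2_low)

    # 3) Penalty: KL divergence between the two pairwise distributions.
    eps = 1e-12  # numerical floor to avoid log(0)
    mask = ~np.eye(len(X), dtype=bool)
    return float(np.sum(P[mask] * np.log((P[mask] + eps) / (Q[mask] + eps))))
```

In a full method, `A` would be optimized (e.g. by gradient descent) to minimize `kdsne_loss`, and out-of-sample points are handled naturally: a new sample only needs its kernel values against the training set to be projected.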