Citation: Wang Xiuyou, Liu Huaming, Fan Jianzhong, Xu Dongqing. Effective Covariance Discriminative Learning Algorithm[J]. Journal of Computer-Aided Design & Computer Graphics, 2019, 31(10): 1847-1857. DOI: 10.3724/SP.J.1089.2019.17455

Effective Covariance Discriminative Learning Algorithm

Abstract: In video-based image set classification, intra-class diversity is one of the main factors that limit the classification performance of image set classification algorithms. To tackle this problem, this paper presents a novel image set classification algorithm with two goals: to improve time efficiency over representative image set classification methods such as covariance discriminant learning (CDL), and to keep classification performance comparable. To compact the original covariance features, two-directional two-dimensional principal component analysis ((2D)²PCA) is first exploited for dimensionality reduction. The QR decomposition is then applied to convert each reduced covariance feature into a nonsingular upper triangular matrix and an orthonormal basis matrix, so that more discriminative feature information can be extracted. Taking the Riemannian manifold structure into account, a function is defined that maps the nonsingular upper triangular matrix back onto the symmetric positive definite (SPD) manifold spanned by the set of SPD matrices. As a result, the original sample space is transformed into a lower-dimensional, more compact SPD manifold and a Grassmann manifold, respectively. To better integrate these two types of Riemannian manifold-valued features, a new geodesic distance metric is first induced from the Stein divergence and the log-Euclidean distance (LED); a positive definite kernel is then devised to embed the features into a high-dimensional Hilbert space. Finally, kernel discriminant analysis (KDA) is used to learn discriminative subspace features. The proposed algorithm is evaluated on several benchmark video datasets, and extensive experimental results demonstrate its superiority in both classification accuracy and computation time.
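The per-set feature construction described in the abstract can be sketched in a few lines of NumPy. This is a minimal sketch under stated assumptions rather than the paper's implementation: ordinary PCA stands in for (2D)²PCA, the map from the upper triangular QR factor R to the SPD manifold is assumed to be R·Rᵀ (the abstract does not specify the function), and the function name set_features is hypothetical.

    # Hedged sketch of the per-set feature construction; see the assumptions above.
    import numpy as np

    def set_features(images, dim=20, eps=1e-6):
        """images: array of shape (n_frames, h, w); returns (spd_feature, orthonormal_basis)."""
        n = images.shape[0]
        X = images.reshape(n, -1).astype(np.float64)       # one row per frame
        X -= X.mean(axis=0, keepdims=True)                  # centre the set

        # Dimensionality reduction: project onto the leading principal directions
        # (ordinary PCA here, standing in for the (2D)^2PCA step of the paper).
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        Z = X @ Vt[:dim].T                                  # n x dim reduced features

        # Compact covariance descriptor of the image set, regularised to stay SPD.
        C = np.cov(Z, rowvar=False) + eps * np.eye(Z.shape[1])

        # QR decomposition: orthonormal basis matrix Q and nonsingular
        # upper triangular factor R, as described in the abstract.
        Q, R = np.linalg.qr(C)

        # Map R back onto the SPD manifold; R @ R.T is symmetric positive
        # definite whenever R is nonsingular (assumed mapping, see above).
        spd = R @ R.T
        return spd, Q

    # Example on a random stand-in image set of 50 frames of size 24 x 24.
    spd, basis = set_features(np.random.rand(50, 24, 24))

Because the regularised covariance is nonsingular, R is nonsingular and R·Rᵀ is guaranteed to be SPD, while Q has orthonormal columns and can be treated as a point on a Grassmann manifold.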
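The metric fusion and kernel construction can be made concrete with the standard definitions below, written in LaTeX. The Stein divergence and the log-Euclidean distance are standard; the product-of-Gaussians fusion and the parameters β and γ are illustrative assumptions, since the abstract does not state the exact combination.

    % Stein divergence between SPD matrices X_1, X_2
    \[ S(X_1, X_2) \;=\; \log\det\!\Big(\tfrac{X_1 + X_2}{2}\Big) \;-\; \tfrac{1}{2}\log\det(X_1 X_2) \]
    % Log-Euclidean distance (matrix logarithm, Frobenius norm)
    \[ d_{\mathrm{LE}}(X_1, X_2) \;=\; \big\lVert \log X_1 - \log X_2 \big\rVert_F \]
    % Assumed fused kernel on the SPD-valued features
    \[ k(X_1, X_2) \;=\; \exp\!\big(-\beta\, S(X_1, X_2)\big)\,\exp\!\big(-\gamma\, d_{\mathrm{LE}}^{2}(X_1, X_2)\big) \]

The log-Euclidean Gaussian factor is positive definite for every γ > 0, and the Stein-divergence factor is known to be positive definite for suitable values of β, so the product remains a valid kernel for KDA. A Grassmann component for the orthonormal basis matrices could additionally be incorporated through, e.g., the projection kernel ‖Q₁ᵀQ₂‖²_F, though the abstract does not specify how the two feature types are combined in the kernel.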
