A Regularized Class-Incremental Learning Method Based on Self-Supervision with Distillation Constraints

Abstract: To address catastrophic forgetting in incremental learning with neural network models, a regularized class-incremental learning method based on self-supervision with hidden-layer distillation constraints is proposed, comprising self-supervised pseudo-label prediction, hidden-layer distillation constraints, and parameter regularization. First, a regularization strategy for evaluating the importance of model parameters is derived from Bayesian and information-theoretic principles. Then, self-supervised pseudo-label prediction is used to enhance the representation ability of the model, and hidden-layer features are preserved, with Gaussian noise added to improve their generalization ability. Finally, the hidden-layer and output-layer features of historical tasks are trained with a distillation constraint together with a cross-entropy classification loss. Experimental results on the CIFAR-10 and CIFAR-100 datasets show that the proposed method performs well; on CIFAR-100 it reaches an average accuracy of 64.16% and a forgetting rate of 15.95%, effectively reducing the impact of catastrophic forgetting.
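As a rough illustration only, not the authors' released code, the combined objective outlined in the abstract might be organized as in the PyTorch-style sketch below. The model interface returning (hidden features, logits), the weighting coefficients lambda_distill and lambda_reg, the noise scale sigma, and the EWC-style quadratic form of the importance-weighted penalty (the paper derives its own Bayesian/information-theoretic importance measure) are all assumptions introduced here for illustration; the self-supervised pseudo-label branch would add a further head and loss term that is omitted.

```python
import torch
import torch.nn.functional as F

def incremental_loss(model, old_model, x, y, omega, old_params,
                     lambda_distill=1.0, lambda_reg=0.1, sigma=0.05):
    """Hypothetical combined objective for one batch (illustrative sketch only).

    model / old_model : current network and frozen previous-task network; each
        is assumed to return (hidden_features, logits).
    omega / old_params : per-parameter importance weights and a snapshot of the
        parameters after the previous task, used by the regularization term.
    """
    hidden, logits = model(x)

    # 1) Cross-entropy classification loss on the current task's labels.
    loss_ce = F.cross_entropy(logits, y)

    # 2) Hidden-layer and output-layer distillation: match the old model's
    #    features, with Gaussian noise added to the preserved hidden features.
    with torch.no_grad():
        old_hidden, old_logits = old_model(x)
        old_hidden = old_hidden + sigma * torch.randn_like(old_hidden)
    loss_distill = F.mse_loss(hidden, old_hidden) \
        + F.kl_div(F.log_softmax(logits, dim=1),
                   F.softmax(old_logits, dim=1), reduction="batchmean")

    # 3) Importance-weighted parameter regularization (EWC-style penalty,
    #    standing in for the paper's Bayesian/information-theoretic measure).
    loss_reg = sum((omega[n] * (p - old_params[n]) ** 2).sum()
                   for n, p in model.named_parameters() if n in omega)

    return loss_ce + lambda_distill * loss_distill + lambda_reg * loss_reg
```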

     
