Graduate Student Yu Liu's Paper Accepted by ACM MM Asia
Posted by: Xiaopeng Hong | 2021-09-28

A graduate-student first-author paper on incremental learning has been accepted by ACM Multimedia Asia 2021. Congratulations to Yu Liu, Xiaoyu, and Songlin, and thanks to the collaborating teachers!


Structural Knowledge Organization and Transfer for Class-Incremental Learning


  • Yu Liu (Xi'an Jiaotong University)
  • Xiaopeng Hong (Xi'an Jiaotong University)
  • Xiaoyu Tao (Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University)
  • Songlin Dong (Xi'an Jiaotong University)
  • Jingang Shi (Xi'an Jiaotong University)
  • Yihong Gong (Xi'an Jiaotong University)

Abstract:

Deep models are vulnerable to catastrophic forgetting when fine-tuned on new data, which degrades their performance on old tasks. Popular distillation-based learning methods usually neglect the relations between data samples and may eventually forget essential structural knowledge. To address these shortcomings, we propose a structural graph knowledge distillation-based incremental learning framework that preserves both the positions of samples and the relations between them. First, a memory knowledge graph (MKG) is generated to fully characterize the structural knowledge of historical tasks. Second, we develop a graph interpolation mechanism to enrich the domain of knowledge and alleviate the inter-class sample imbalance issue. Third, we introduce structural graph knowledge distillation to transfer the knowledge of historical tasks by penalizing the offset of the vertices and edge weights of the MKG on the new model during new-task learning. Comprehensive experiments on CIFAR100, ImageNet-100, and ImageNet-1000 confirm the effectiveness of the proposed method.
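The third step above, penalizing the offset of graph vertices and edge weights, can be illustrated with a toy loss. This is a minimal sketch, not the paper's actual implementation: it assumes vertices are sample embeddings and edge weights are cosine similarities between them, and the function name and exact weighting are hypothetical.

```python
import numpy as np

def structural_graph_kd_loss(old_feats, new_feats):
    """Toy structural-graph distillation loss (hypothetical illustration).

    old_feats, new_feats: (n_samples, dim) embeddings of the same memory
    samples under the old and new models. Vertices are the embeddings;
    edges are pairwise cosine similarities between them.
    """
    # Vertex term: mean squared displacement of each embedding (vertex offset).
    vertex_loss = np.mean(np.sum((new_feats - old_feats) ** 2, axis=1))

    def edge_weights(feats):
        # Unit-normalize rows, then take pairwise cosine similarities.
        normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
        return normed @ normed.T

    # Edge term: mean squared difference between the two similarity graphs.
    edge_loss = np.mean((edge_weights(new_feats) - edge_weights(old_feats)) ** 2)
    return vertex_loss + edge_loss
```

The loss is zero when the new model reproduces the old embeddings exactly and grows as either the vertex positions or the pairwise relations drift, which is the intuition behind preserving "both the positions of samples and their relations."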