Good news: Congratulations to PhD student Sun Ling, whose paper has been accepted to AAAI 2023, the lab's second AAAI acceptance within a year. Warm congratulations!
Posted by: Rao Yuan | 2022-11-20

Good news: The AAAI 2023 acceptance results have been announced. Congratulations to Sun Ling, a PhD student in our lab, whose paper "HG-SL: Jointly Learning of Global and Local User Spreading Behavior for Fake News Early Detection" has been accepted by AAAI 2023, the lab's second AAAI acceptance within a year. Warm congratulations!

 

The notification email indicates that 8,777 papers were submitted this year and 1,721 were accepted, an acceptance rate of 19.6% (compared with 15% for AAAI 2022). Being accepted again shows that information diffusion and prediction has become a new research hotspot, and that using propagation and behavioral features for early fake-news detection has become an important approach in the field of cognitive confrontation.


=================

Paper information:

HG-SL: Jointly Learning of Global and Local User Spreading Behavior for Fake News Early Detection

 

Abstract: Recently, fake news forgery technology has become increasingly sophisticated, and even the profiles of participants may be faked, which challenges the robustness and effectiveness of traditional detection methods involving text or user identity. Most propagation-based approaches mainly rely on neural networks to learn the diffusion pattern of individual news, which is insufficient to describe the differences in news spreading ability, and also ignores the valuable global connections among news and users, limiting detection performance. Therefore, we propose a joint learning model named HG-SL, which is blind to news content and user identities but capable of capturing the differences between true and fake news in the early stages of propagation through global and local user spreading behavior. Specifically, we design a Hypergraph-based Global interaction learning module to capture the global preferences of users from their co-spreading relationships, and introduce node centrality encoding to complement user influence in hypergraph learning. Moreover, the designed Self-attention-based Local context learning module first introduces spread status to highlight the propagation ability of news and users, thus providing additional signals for verifying news authenticity. Experiments on real-world datasets indicate that our HG-SL, which relies solely on user behavior, outperforms SOTA baselines that utilize multidimensional features in both fake news detection and early detection tasks.
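To make the two ingredients the abstract names more concrete, below is a minimal NumPy sketch of (a) one round of hypergraph message passing, where users connected by co-spreading hyperedges exchange information (using the standard node-to-hyperedge-to-node averaging formulation), and (b) scaled dot-product self-attention over a local spread sequence. The function names, shapes, and the specific averaging formulation are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def hypergraph_conv(X, H):
    """One round of hypergraph message passing: node -> hyperedge -> node.

    Illustrative averaging formulation X' = Dv^-1 H De^-1 H^T X (an assumption,
    not necessarily the paper's). Assumes every user belongs to at least one
    hyperedge and every hyperedge contains at least one user.

    X: (n_users, d) user feature matrix.
    H: (n_users, n_edges) binary incidence matrix of co-spreading hyperedges.
    """
    De = H.sum(axis=0)                    # hyperedge degrees, shape (n_edges,)
    Dv = H.sum(axis=1)                    # user (node) degrees, shape (n_users,)
    edge_feat = (H / De).T @ X            # average member users into each hyperedge
    return (H @ edge_feat) / Dv[:, None]  # average incident hyperedges back to users

def self_attention(S):
    """Scaled dot-product self-attention over a local spread sequence.

    S: (T, d) sequence of spreading-step representations for one news item.
    Returns the attention-weighted sequence, shape (T, d).
    """
    d = S.shape[1]
    scores = S @ S.T / np.sqrt(d)                           # pairwise similarities
    w = np.exp(scores - scores.max(axis=1, keepdims=True))  # stable softmax
    w /= w.sum(axis=1, keepdims=True)                       # rows sum to 1
    return w @ S
```

Both operators are averaging maps, so feeding them constant features returns the same constants; in a full model they would be interleaved with learned projections and nonlinearities, and the global (hypergraph) and local (attention) representations would be combined for the final true/fake classification.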