Paper Title
Federated Continual Learning for Text Classification via Selective Inter-client Transfer
Paper Authors
Paper Abstract
In this work, we combine two paradigms, Federated Learning (FL) and Continual Learning (CL), for the text classification task in the cloud-edge continuum. The objective of Federated Continual Learning (FCL) is to improve deep learning models over their lifetime at each client through relevant and efficient knowledge transfer without sharing data. Here, we address the challenge of minimizing inter-client interference during knowledge sharing, which arises from the heterogeneous tasks across clients in the FCL setup. To this end, we propose a novel framework, Federated Selective Inter-client Transfer (FedSeIT), which selectively combines model parameters of foreign clients. To further maximize knowledge transfer, we assess domain overlap and select informative tasks from the sequence of historical tasks at each foreign client while preserving privacy. Evaluating against baselines, we show improved performance: an average gain of 12.4% in text classification over a sequence of tasks using five datasets from diverse domains. To the best of our knowledge, this is the first work to apply FCL to NLP.
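The abstract's core mechanism, scoring each foreign client's historical tasks by domain overlap and selectively folding the most relevant task parameters into the local model, can be illustrated with a minimal sketch. Everything below is a hypothetical illustration rather than the paper's actual FedSeIT implementation: the cosine-similarity overlap score, the softmax task weighting, the `select_and_combine` helper, and the fixed 0.5 mixing ratio are all assumptions made for readability.

```python
# Hypothetical sketch of selective inter-client transfer. Assumes each foreign
# task exposes only (parameter vector, domain embedding); the paper's actual
# encoders, selection criterion, and aggregation details are not reproduced.
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, used here as a stand-in domain-overlap score."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def select_and_combine(local_params, local_domain, foreign_tasks, top_k=2):
    """Score foreign tasks by domain overlap with the local task, keep the
    top-k most informative ones, and mix their parameters into the local
    model with softmax-normalized weights.

    foreign_tasks: list of (params, domain_embedding) tuples from other
    clients; only these artifacts cross the network, never raw data.
    """
    scores = np.array([cosine(local_domain, d) for _, d in foreign_tasks])
    top = np.argsort(scores)[::-1][:top_k]          # most relevant tasks first
    weights = np.exp(scores[top]) / np.exp(scores[top]).sum()
    transferred = sum(w * foreign_tasks[i][0] for w, i in zip(weights, top))
    return 0.5 * local_params + 0.5 * transferred   # illustrative mixing ratio

# Toy usage: 3 foreign tasks, 4-dim parameters, 8-dim domain embeddings.
rng = np.random.default_rng(0)
local_p, local_d = rng.normal(size=4), rng.normal(size=8)
foreign = [(rng.normal(size=4), rng.normal(size=8)) for _ in range(3)]
print(select_and_combine(local_p, local_d, foreign))
```

Note that in this sketch only parameter vectors and domain embeddings are exchanged, which matches the abstract's claim that knowledge transfer happens without sharing client data.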