Paper Title

Personalized Decentralized Multi-Task Learning Over Dynamic Communication Graphs

Authors

Mortaheb, Matin, Ulukus, Sennur

Abstract

Decentralized and federated learning algorithms face data heterogeneity as one of their biggest challenges, especially when users want to learn a specific task. Even when personalized headers are concatenated to a shared network (PF-MTL), aggregating all the networks with a decentralized algorithm can result in performance degradation due to heterogeneity in the data. Our algorithm uses the exchanged gradients to automatically calculate the correlations among tasks, and dynamically adjusts the communication graph to connect mutually beneficial tasks and isolate those that may negatively impact each other. This algorithm improves learning performance and leads to faster convergence compared to the case where all clients are connected to each other regardless of their correlations. We conduct experiments on a synthetic Gaussian dataset and a large-scale celebrity attributes (CelebA) dataset. The experiment with the synthetic data illustrates that our proposed method is capable of detecting tasks that are positively and negatively correlated. Moreover, the results of the experiments with CelebA demonstrate that the proposed method may produce significantly faster training results than fully-connected networks.
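The core idea of the abstract (estimating task correlations from exchanged gradients, then rewiring the communication graph to keep only mutually beneficial links) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the cosine-similarity measure, the `threshold` parameter, and the function name `build_communication_graph` are all assumptions for the sake of the example.

```python
import numpy as np

def build_communication_graph(grads, threshold=0.0):
    """Estimate pairwise task correlations from clients' exchanged
    gradients and keep only mutually beneficial communication links.

    grads: (n_clients, dim) array, one flattened shared-network
    gradient per client. Cosine similarity between gradients is used
    here as an illustrative proxy for task correlation.
    """
    # Normalize each client's gradient vector.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    unit = grads / np.clip(norms, 1e-12, None)
    # Pairwise cosine similarity approximates task correlation.
    corr = unit @ unit.T
    # Connect clients whose tasks are positively correlated and
    # isolate negatively correlated (mutually harmful) pairs.
    adj = (corr > threshold).astype(float)
    np.fill_diagonal(adj, 0.0)  # no self-loops
    return adj, corr
```

In a decentralized round, each client would then average model updates only over the neighbors indicated by `adj`, so clients with conflicting tasks stop degrading each other's shared network.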
