Paper Title

Performance Analysis of Load Balancing Policies with Memory

Authors

Tim Hellemans, Benny Van Houdt

Abstract

Joining the shortest or least loaded queue among $d$ randomly selected queues are two fundamental load balancing policies. Under both policies, the dispatcher does not maintain any information on the queue length or load of the servers. In this paper we analyze the performance of these policies when the dispatcher has some memory available to store the IDs of some of the idle servers. We consider methods where the dispatcher discovers idle servers as well as methods where idle servers inform the dispatcher about their state. We focus on large-scale systems, and our analysis uses the cavity method. The main insight provided is that the performance measures obtained via the cavity method for a load balancing policy {\it with} memory reduce to the performance measures for the same policy {\it without} memory, provided that the arrival rate is properly scaled. Thus, we can study the performance of load balancers with memory in the same manner as load balancers without memory. In particular, this entails closed-form solutions for joining the shortest or least loaded queue among $d$ randomly selected queues with memory in case of exponential job sizes. Moreover, we obtain a simple closed-form expression for the (scaled) expected waiting time as the system tends towards instability. We present simulation results that support our belief that the approximation obtained by the cavity method becomes exact as the number of servers tends to infinity.
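To make the policy class concrete, below is a minimal Python sketch of a JSQ($d$) dispatcher with a memory of idle-server IDs. It is an illustration, not the paper's exact model: the class and method names are invented here, and it implements only one of the variants the abstract mentions (idle servers informing the dispatcher on becoming idle), with plain JSQ($d$) sampling as the fallback when the memory is empty.

```python
import random

class Dispatcher:
    """Sketch of a JSQ(d) dispatcher that remembers idle-server IDs."""

    def __init__(self, n_servers: int, d: int):
        self.queue_lengths = [0] * n_servers   # per-server queue length
        self.d = d                             # number of queues sampled
        self.idle_ids = set(range(n_servers))  # memory: servers known to be idle

    def dispatch(self) -> int:
        """Route one arriving job and return the chosen server's ID."""
        if self.idle_ids:
            # Memory hit: send the job straight to a known-idle server.
            target = self.idle_ids.pop()
        else:
            # Memory empty: fall back to plain JSQ(d) sampling.
            sampled = random.sample(range(len(self.queue_lengths)), self.d)
            target = min(sampled, key=lambda i: self.queue_lengths[i])
        self.queue_lengths[target] += 1
        return target

    def depart(self, server_id: int) -> None:
        """A job finishes; a server that becomes idle reports back."""
        self.queue_lengths[server_id] -= 1
        if self.queue_lengths[server_id] == 0:
            self.idle_ids.add(server_id)

# Tiny usage example:
dp = Dispatcher(n_servers=100, d=2)
s = dp.dispatch()  # job lands on a remembered idle server
dp.depart(s)       # server s drains and re-registers as idle
```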
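For context on the closed forms the abstract refers to: for JSQ($d$) without memory and exponential job sizes, the classical mean-field result (Vvedenskaya et al.; Mitzenmacher) gives the steady-state fraction of servers with at least $i$ jobs in closed form. The abstract's main insight suggests the with-memory analogue follows from the same formula with the arrival rate replaced by a suitably scaled rate; the exact scaling is specified in the paper body, not the abstract, so the $\tilde{\lambda}$ below is a placeholder.

```latex
% Mean-field closed form for JSQ(d) without memory, exponential job sizes,
% per-server arrival rate \lambda < 1 (Vvedenskaya et al.; Mitzenmacher):
\pi_{\geq i} = \lambda^{\frac{d^{i}-1}{d-1}}, \qquad i \geq 0.
% Per the abstract's main insight, the with-memory policy would satisfy the
% same formula with \lambda replaced by an appropriately scaled rate
% \tilde{\lambda} (the exact scaling is given in the paper, not the abstract).
```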
