Packet Forwarding with a Locally Bursty Adversary

(arXiv:2208.09522)
Published Aug 19, 2022 in cs.DC and cs.DS

Abstract

We consider packet forwarding in the adversarial queueing theory (AQT) model introduced by Borodin et al. We introduce a refinement of the AQT $(\rho, \sigma)$-bounded adversary, which we call a \emph{locally bursty adversary} (LBA). An LBA parameterizes injection patterns jointly by edge utilization and packet origin. For constant ($O(1)$) parameters, the LBA model is strictly more permissive than the $(\rho, \sigma)$ model. For example, there are injection patterns in the LBA model with constant parameters that can only be realized as $(\rho, \sigma)$-bounded injection patterns with $\rho + \sigma = \Omega(n)$ (where $n$ is the network size). We show that the LBA model (unlike the $(\rho, \sigma)$ model) is closed under packet bundling and discretization operations. Thus, the LBA model allows one to reduce the study of general (uniform) capacity networks and inhomogeneous packet sizes to unit capacity networks with homogeneous packets. On the algorithmic side, we focus on information gathering networks, i.e., networks in which all packets share a common destination and the union of packet routes forms a tree. We show that the Odd-Even Downhill (OED) forwarding protocol, described independently by Dobrev et al.\ and Patt-Shamir and Rosenbaum, achieves buffer space usage of $O(\log n)$ against all LBAs with constant parameters. OED is a local protocol, but we show that the upper bound is tight even when compared to centralized protocols. Our lower bound for the LBA model contrasts with the $(\rho, \sigma)$ model, where centralized protocols can achieve worst-case buffer space usage $O(1)$ for $\rho, \sigma = O(1)$, while the $O(\log n)$ upper bound for OED is optimal only for local protocols.
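
The abstract names the Odd-Even Downhill (OED) forwarding rule without spelling it out. Below is a minimal Python sketch of one plausible reading of that rule on a path toward a single sink, the simplest information-gathering network: in each synchronous round, a node forwards one packet toward the sink exactly when its buffer is strictly fuller than the next node's, or the two buffers are equally full with odd occupancy. This reading, along with the helper name `oed_round` and the convention that injections land after forwarding, is an assumption for illustration rather than the paper's exact formulation.

```python
# Illustrative sketch (assumed reading) of Odd-Even Downhill (OED) forwarding
# on a path toward a common sink. Node 0 is farthest from the sink; the sink
# sits past the last index and absorbs every packet sent to it.

def oed_round(buffers, injections):
    """Simulate one synchronous round on a path.

    buffers[i]    -- packets currently stored at node i.
    injections[i] -- packets the adversary injects at node i this round.
    Returns buffer occupancies after forwarding and injection.
    """
    n = len(buffers)
    # Decide all forwards simultaneously, based on pre-round occupancies.
    forwards = []
    for i in range(n):
        downhill = buffers[i + 1] if i + 1 < n else 0  # the sink is always "empty"
        send = buffers[i] > downhill or (buffers[i] == downhill and buffers[i] % 2 == 1)
        forwards.append(1 if send else 0)

    new_buffers = list(buffers)
    for i in range(n):
        new_buffers[i] -= forwards[i]          # packet leaves node i
        if i + 1 < n:
            new_buffers[i + 1] += forwards[i]  # and arrives downhill (sink absorbs otherwise)
    # In this sketch, adversarial injections land after forwarding.
    return [b + inj for b, inj in zip(new_buffers, injections)]


if __name__ == "__main__":
    # Toy run: steady injections at the node farthest from the sink.
    bufs = [0, 0, 0, 0]
    for t in range(8):
        bufs = oed_round(bufs, [1, 0, 0, 0])
        print(t, bufs)
```

The toy run only shows packets draining toward the sink under a simple injection pattern; the $O(\log n)$ buffer bound against constant-parameter LBAs is a property proved in the paper, not something this sketch demonstrates.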
