Quick Minimization of Tardy Processing Time on a Single Machine (2301.05460v2)
Abstract: We consider the problem of minimizing the total processing time of tardy jobs on a single machine. This is a classical scheduling problem, first considered by [Lawler and Moore 1969], that also generalizes the Subset Sum problem. Recently, it was shown that this problem can be solved efficiently by computing $(\max,\min)$-skewed-convolutions. The running time of the resulting algorithm is equivalent, up to logarithmic factors, to the time it takes to compute a $(\max,\min)$-skewed-convolution of two vectors of integers whose sum is $O(P)$, where $P$ is the sum of the jobs' processing times. We further improve the running time of the minimum tardy processing time computation by introducing a job "bundling" technique and achieve a $\tilde{O}\left(P^{2-1/\alpha}\right)$ running time, where $\tilde{O}\left(P^{\alpha}\right)$ is the running time of a $(\max,\min)$-skewed-convolution of vectors of size $P$. This results in a $\tilde{O}\left(P^{7/5}\right)$ time algorithm for tardy processing time minimization, an improvement over the previously known $\tilde{O}\left(P^{5/3}\right)$ time algorithm.
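For concreteness, the stated $\tilde{O}(P^{7/5})$ bound follows from plugging the convolution exponent $\alpha = 5/3$ into $2 - 1/\alpha = 2 - 3/5 = 7/5$. Below is a minimal Python sketch of the classical Lawler–Moore pseudo-polynomial dynamic program for this problem, the $O(nP)$ baseline that the convolution-based algorithms improve upon; this is not the paper's algorithm, and the function name and job representation are illustrative only.

```python
def min_tardy_processing_time(jobs):
    """Classical Lawler-Moore DP for minimizing total tardy processing time.

    jobs: list of (p_j, d_j) pairs (processing time, due date).
    Returns the minimum total processing time of tardy jobs.
    Runs in O(n * P) time, where P is the sum of all processing times.
    """
    jobs = sorted(jobs, key=lambda job: job[1])  # earliest-due-date order
    P = sum(p for p, _ in jobs)
    # feasible[t] == True  <=>  some subset of the jobs seen so far can be
    # scheduled on time (back to back, starting at time 0) with total length t.
    feasible = [False] * (P + 1)
    feasible[0] = True
    for p, d in jobs:
        # Iterate t downward so each job is used at most once (0/1 knapsack),
        # and require t <= d so the job completing last finishes by its due date.
        for t in range(min(d, P), p - 1, -1):
            if feasible[t - p]:
                feasible[t] = True
    best_on_time = max(t for t in range(P + 1) if feasible[t])
    return P - best_on_time


if __name__ == "__main__":
    # Three jobs given as (processing time, due date); the instance is illustrative.
    print(min_tardy_processing_time([(2, 3), (3, 4), (4, 6)]))  # -> 3
```

When all due dates are equal, deciding whether the answer is zero is exactly the Subset Sum problem, which is the sense in which this scheduling problem generalizes it.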