Improved Rectangular Matrix Multiplication using Powers of the Coppersmith-Winograd Tensor (1708.05622v2)
Abstract: In the past few years, successive improvements of the asymptotic complexity of square matrix multiplication have been obtained by developing novel methods to analyze the powers of the Coppersmith-Winograd tensor, a basic construction introduced thirty years ago. In this paper we show how to generalize this approach to make progress on the complexity of rectangular matrix multiplication as well, by developing a framework to analyze powers of tensors in an asymmetric way. By applying this methodology to the fourth power of the Coppersmith-Winograd tensor, we succeed in improving the complexity of rectangular matrix multiplication. Let $\alpha$ denote the maximum value such that the product of an $n\times n^\alpha$ matrix by an $n^\alpha\times n$ matrix can be computed with $O(n^{2+\epsilon})$ arithmetic operations for any $\epsilon>0$. By analyzing the fourth power of the Coppersmith-Winograd tensor using our methods, we obtain the new lower bound $\alpha>0.31389$, which improves the previous lower bound $\alpha>0.30298$ obtained five years ago by Le Gall (FOCS'12) from the analysis of the second power of the Coppersmith-Winograd tensor. More generally, we give faster algorithms computing the product of an $n\times n^k$ matrix by an $n^k\times n$ matrix for any value $k\neq 1$. (In the case $k=1$, we recover the bounds recently obtained for square matrix multiplication.) These improvements immediately lead to improvements in the complexity of a multitude of fundamental problems for which the bottleneck is rectangular matrix multiplication, such as computing the all-pairs shortest paths in directed graphs with bounded weights.
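To make the problem concrete: the classical algorithm multiplies an $n\times n^\alpha$ matrix by an $n^\alpha\times n$ matrix with $n\cdot n\cdot n^\alpha = n^{2+\alpha}$ scalar multiplications, whereas the paper's bound says that for $\alpha < 0.31389$ the product can be computed in essentially $n^2$ operations. The sketch below shows only the naive baseline being improved upon, not the paper's algorithm (the function name `rect_matmul` is our own):

```python
def rect_matmul(A, B):
    """Naive rectangular matrix product.

    A is n x m and B is m x n, given as lists of lists; the result is
    n x n. This uses n * n * m scalar multiplications, i.e. n^{2+alpha}
    when m = n^alpha -- the baseline that fast rectangular matrix
    multiplication improves on.
    """
    n, m = len(A), len(A[0])
    assert len(B) == m and all(len(row) == n for row in B)
    return [[sum(A[i][t] * B[t][j] for t in range(m))
             for j in range(n)]
            for i in range(n)]


# Example: a 2 x 3 matrix times a 3 x 2 matrix gives a 2 x 2 matrix.
A = [[1, 2, 3],
     [4, 5, 6]]
B = [[1, 0],
     [0, 1],
     [1, 1]]
C = rect_matmul(A, B)  # [[4, 5], [10, 11]]
```

Products of this shape (one dimension much smaller than the other two) are exactly the bottleneck in applications such as all-pairs shortest paths, which is why improving the exponent for $k\neq 1$ matters beyond the square case.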