
Abstract

Compared with classical block codes, efficient list decoding of rank-metric codes appears to be more difficult. Although the list decodability of random rank-metric codes and the limits of list decodability have been completely determined, little work has been done on efficiently list decoding rank-metric codes. The only known efficient list decoding of rank-metric codes $\mathcal{C}$ achieves a decoding radius up to the Singleton bound $1-R-\varepsilon$ with positive rate $R$ only when $\rho(\mathcal{C})$ is extremely small, namely $\Theta(\varepsilon^2)$, where $\rho(\mathcal{C})$ denotes the ratio of the number of rows to the number of columns of $\mathcal{C}$ \cite[STOC2013]{Guru2013}. It is commonly believed that list decoding of rank-metric codes $\mathcal{C}$ whose ratio $\rho(\mathcal{C})$ is a constant that is not small is hard. The main purpose of the present paper is to explicitly construct a class of rank-metric codes $\mathcal{C}$ with such a constant ratio $\rho(\mathcal{C})$ and to efficiently list decode these codes with decoding radius beyond $(1-R)/2$. Our key idea is to employ two-variable polynomials $f(x,y)$, where $f$ is linearized in the variable $x$ and the variable $y$ is used to "fold" the code. In other words, rows are used to correct rank errors and columns are used to "fold" the code so as to enlarge the decoding radius. Apart from this algebraic technique, we also have to prune down the list. The algebraic idea enables us to pin down the messages into a structured subspace of dimension linear in the number $n$ of columns. This "periodic" structure allows us to pre-encode the messages in order to prune down the list. More precisely, we use the subspace designs introduced in \cite[STOC2013]{Guru2013} to obtain a deterministic algorithm with a larger constant list size, and we employ the hierarchical subspace-evasive sets introduced in \cite[STOC2012]{Guru2012} to obtain a randomized algorithm with a smaller constant list size.
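As a minimal, hypothetical Python sketch (not from the paper, and not its decoding algorithm), the snippet below only illustrates two objects the abstract refers to: the rank distance between codewords viewed as $m \times n$ matrices over $\mathbb{F}_2$, and the row-to-column ratio $\rho(\mathcal{C}) = m/n$. The helper names `rank_gf2` and `rank_distance` are our own, introduced purely for illustration.

```python
# Illustration only: rank metric on m x n matrices over F_2 and the ratio
# rho(C) = m / n.  This is NOT the paper's folded construction or decoder.

import numpy as np

def rank_gf2(M: np.ndarray) -> int:
    """Rank of a 0/1 matrix over F_2, via Gaussian elimination mod 2."""
    M = M.copy() % 2
    rows, cols = M.shape
    rank, pivot_row = 0, 0
    for col in range(cols):
        # find a pivot (a 1) in this column at or below pivot_row
        pivot = next((r for r in range(pivot_row, rows) if M[r, col]), None)
        if pivot is None:
            continue
        M[[pivot_row, pivot]] = M[[pivot, pivot_row]]   # swap rows
        for r in range(rows):
            if r != pivot_row and M[r, col]:
                M[r] ^= M[pivot_row]                    # eliminate mod 2
        pivot_row += 1
        rank += 1
    return rank

def rank_distance(A: np.ndarray, B: np.ndarray) -> int:
    """Rank distance d_R(A, B) = rank(A - B); over F_2, A - B is A xor B."""
    return rank_gf2((A + B) % 2)

# Two random "codewords" viewed as m x n matrices over F_2.
m, n = 4, 8                      # rho(C) = m / n = 0.5, a constant ratio
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(m, n))
B = rng.integers(0, 2, size=(m, n))
print("rho =", m / n, " d_R(A, B) =", rank_distance(A, B))
```

In this picture, the paper's setting of a "not small constant ratio" simply means $m/n$ is a fixed constant bounded away from $0$ rather than shrinking like $\Theta(\varepsilon^2)$ as the gap to the Singleton bound shrinks.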
