Local List Recovery of High-rate Tensor Codes and Applications (1706.03383v1)

Published 11 Jun 2017 in cs.IT and math.IT

Abstract: In this work, we give the first construction of high-rate locally list-recoverable codes. List-recovery has been an extremely useful building block in coding theory, and our motivation is to use these codes as such a building block. In particular, our construction gives the first capacity-achieving locally list-decodable codes (over constant-sized alphabet); the first capacity-achieving globally list-decodable codes with a nearly-linear-time list-decoding algorithm (once more, over constant-sized alphabet); and a randomized construction of binary codes on the Gilbert-Varshamov bound that can be uniquely decoded in near-linear time, with higher rate than was previously known. Our techniques are actually quite simple, and are inspired by an approach of Gopalan, Guruswami, and Raghavendra (SIAM Journal on Computing, 2011) for list-decoding tensor codes. We show that tensor powers of (globally) list-recoverable codes are "approximately" locally list-recoverable, and that the "approximately" modifier may be removed by pre-encoding the message with a suitable locally decodable code. Instantiating this with known constructions of high-rate globally list-recoverable codes and high-rate locally decodable codes finishes the construction.
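The abstract relies on two standard coding-theory notions without stating them. The sketch below records the usual definitions of tensor-power codes and list recovery as background; it is not quoted from the paper, and the notation ($C$, $n$, $R$, $\alpha$, $\ell$, $L$) is illustrative.

```latex
% Standard background definitions (illustrative notation, not taken
% verbatim from the paper).

% Tensor power: for a linear code C over F_q of block length n and rate R,
% the tensor code C (x) C is the set of n-by-n matrices all of whose rows
% and columns are codewords of C; its rate is R^2.
\[
  C \otimes C \;=\; \bigl\{\, M \in \mathbb{F}_q^{\,n \times n} \;:\;
    \text{every row and every column of } M \text{ is in } C \,\bigr\},
  \qquad \operatorname{rate}(C \otimes C) = R^2 .
\]

% List recovery (list decoding is the special case ell = 1): C is
% (alpha, ell, L)-list-recoverable if per-coordinate candidate lists of
% size at most ell pin down at most L consistent codewords.
\[
  \forall\, S_1,\dots,S_n \subseteq \mathbb{F}_q \text{ with } |S_i| \le \ell:\quad
  \bigl|\{\, c \in C \;:\; c_i \in S_i \text{ for at least } (1-\alpha)n
    \text{ coordinates } i \,\}\bigr| \;\le\; L .
\]
```

The rate identity makes the "high-rate" requirement on the base code concrete: the $t$-th tensor power has rate $R^t$, so only a base rate close to 1 keeps the resulting tensor code high-rate.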

Citations (25)
