$d$-imbalance WOM Codes for Reduced Inter-Cell Interference in Multi-Level NVMs (1605.05281v1)

Published 17 May 2016 in cs.IT and math.IT

Abstract: In recent years, due to the spread of multi-level non-volatile memories (NVM), $q$-ary write-once memory (WOM) codes have been extensively studied. By using WOM codes, it is possible to rewrite NVMs $t$ times before erasing the cells. The use of WOM codes makes it possible to improve the performance of the storage device; however, it may also increase errors caused by inter-cell interference (ICI). This work presents WOM codes that restrict the imbalance between code symbols throughout the write sequence, hence decreasing ICI. We first specify the imbalance model as a bound $d$ on the difference between codeword levels. Then a $2$-cell code construction for general $q$ and input size is proposed. An upper bound on the write count is also derived, showing the optimality of the proposed construction. In addition to direct WOM constructions, we derive closed-form optimal write regions for codes constructed with continuous lattices. On the coding side, the proposed codes are shown to be competitive with known codes not adhering to the bounded imbalance constraint. On the memory side, we show how the codes can be deployed within flash wordlines, and quantify their BER advantage using accepted ICI models.
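
The constraint described in the abstract is simple to state: for a pair of $q$-level cells, every codeword along the write sequence must keep the level difference within $d$, while WOM writes may only raise cell levels. The Python sketch below is an illustration of that state model under these assumptions, not the paper's actual construction; the parameter names and helper functions are hypothetical.

```python
# Minimal sketch (not from the paper): a 2-cell, q-level WOM state model with a
# bounded-imbalance constraint |c1 - c2| <= d. Cell levels may only increase
# between writes (write-once property); the d-bound limits inter-cell interference.

def is_valid_state(c1: int, c2: int, q: int, d: int) -> bool:
    """A state is valid if both levels lie in {0, ..., q-1} and their imbalance is at most d."""
    return 0 <= c1 < q and 0 <= c2 < q and abs(c1 - c2) <= d

def is_legal_write(old: tuple[int, int], new: tuple[int, int], q: int, d: int) -> bool:
    """A write may only raise cell levels, and the new state must respect the imbalance bound."""
    (o1, o2), (n1, n2) = old, new
    return is_valid_state(n1, n2, q, d) and n1 >= o1 and n2 >= o2

if __name__ == "__main__":
    q, d = 4, 1  # 4-level cells, imbalance bounded by 1
    print(is_legal_write((0, 0), (1, 0), q, d))  # True: levels rise, |1 - 0| <= 1
    print(is_legal_write((1, 0), (1, 3), q, d))  # False: imbalance |1 - 3| = 2 > d
    print(is_legal_write((2, 2), (1, 2), q, d))  # False: level of cell 1 decreased
```

Counting how many writes a code can guarantee under this state model is the write-count question the paper bounds and matches with its $2$-cell construction.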

Citations (8)
