
$d$-imbalance WOM Codes for Reduced Inter-Cell Interference in Multi-Level NVMs (1605.05281v1)

Published 17 May 2016 in cs.IT and math.IT

Abstract: In recent years, due to the spread of multi-level non-volatile memories (NVM), $q$-ary write-once memory (WOM) codes have been extensively studied. WOM codes make it possible to rewrite NVMs $t$ times before erasing the cells. While WOM codes improve the performance of the storage device, they may also increase errors caused by inter-cell interference (ICI). This work presents WOM codes that restrict the imbalance between code symbols throughout the write sequence, thereby decreasing ICI. We first specify the imbalance model as a bound $d$ on the difference between codeword levels. Then a $2$-cell code construction for general $q$ and input size is proposed. An upper bound on the write count is also derived, showing the optimality of the proposed construction. In addition to direct WOM constructions, we derive closed-form optimal write regions for codes constructed with continuous lattices. On the coding side, the proposed codes are shown to be competitive with known codes that do not adhere to the bounded-imbalance constraint. On the memory side, we show how the codes can be deployed within flash wordlines and quantify their BER advantage using accepted ICI models.
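The abstract's two constraints can be illustrated with a small sketch: in a WOM, cell levels may only increase between erasures, and the $d$-imbalance model additionally requires the two cells of a codeword to differ by at most $d$. The snippet below (illustrative only; the function names are mine, and the longest-chain count is just a crude upper bound on reachable states, not the paper's write-count bound or its construction) enumerates the valid 2-cell states and measures how the imbalance bound shrinks the longest write-once trajectory.

```python
from itertools import product

def valid_states(q, d):
    """All 2-cell level pairs (c1, c2), levels in 0..q-1, whose
    imbalance |c1 - c2| is at most d (the d-imbalance model)."""
    return [(a, b) for a, b in product(range(q), repeat=2) if abs(a - b) <= d]

def longest_chain(states):
    """Length of the longest sequence of distinct valid states in which
    cell levels never decrease (the write-once constraint).  This
    upper-bounds how many states any rewriting scheme can pass through."""
    order = sorted(states, key=lambda s: s[0] + s[1])
    best = {}
    for s in order:
        best[s] = 1 + max(
            (best[p] for p in best if p[0] <= s[0] and p[1] <= s[1] and p != s),
            default=0,
        )
    return max(best.values())

# q = 4 levels: forbidding any imbalance (d = 0) leaves only the four
# diagonal states, while d = 1 already allows the full zigzag chain
# from (0, 0) up to (3, 3).
print(longest_chain(valid_states(4, 0)))  # 4
print(longest_chain(valid_states(4, 1)))  # 7
```

For two cells the chain length is capped at $2(q-1)+1$ (each write raises the total level by at least one), and the sketch shows that even $d=1$ suffices to reach that cap, while $d=0$ collapses the chain to the diagonal.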

Citations (8)