
Finite-Alphabet MMSE Equalization for All-Digital Massive MU-MIMO mmWave Communication (2009.02747v1)

Published 6 Sep 2020 in cs.IT, eess.SP, and math.IT

Abstract: We propose finite-alphabet equalization, a new paradigm that restricts the entries of the spatial equalization matrix to low-resolution numbers, enabling high-throughput, low-power, and low-cost hardware equalizers. To minimize the performance loss of this paradigm, we introduce FAME, short for finite-alphabet minimum mean-square error (MMSE) equalization, which is able to significantly outperform a naive quantization of the linear MMSE matrix. We develop efficient algorithms to approximately solve the NP-hard FAME problem and showcase that near-optimal performance can be achieved with equalization coefficients quantized to only 1-3 bits for massive multi-user multiple-input multiple-output (MU-MIMO) millimeter-wave (mmWave) systems. We provide very-large-scale integration (VLSI) results that demonstrate a reduction in equalization power and area by factors of at least 3.9x and 5.8x, respectively.
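To make the setting concrete, the sketch below computes the standard infinite-precision linear MMSE equalizer and then applies the naive per-row quantization baseline that the abstract says FAME outperforms. This is not the paper's FAME algorithm (which directly optimizes the low-resolution matrix and is NP-hard); the matrix sizes, noise level, and the per-row scaling scheme here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
B, U = 32, 8          # basestation antennas, single-antenna users (illustrative)
N0 = 0.1              # noise variance (assumed)

# Random i.i.d. complex Gaussian channel (a stand-in for a mmWave channel model)
H = (rng.standard_normal((B, U)) + 1j * rng.standard_normal((B, U))) / np.sqrt(2)

# Infinite-precision linear MMSE equalizer: W = (H^H H + N0 I)^{-1} H^H
W = np.linalg.solve(H.conj().T @ H + N0 * np.eye(U), H.conj().T)


def quantize_rows(W, bits):
    """Naive baseline: quantize real and imaginary parts of each row to a
    `bits`-bit uniform grid with one real scale factor per row. FAME instead
    jointly optimizes the scale and the low-resolution entries."""
    levels = 2 ** (bits - 1)
    Wq = np.empty_like(W)
    for u in range(W.shape[0]):
        row = W[u]
        scale = np.max(np.abs(np.r_[row.real, row.imag])) / levels
        qr = np.clip(np.round(row.real / scale), -levels, levels - 1)
        qi = np.clip(np.round(row.imag / scale), -levels, levels - 1)
        Wq[u] = scale * (qr + 1j * qi)
    return Wq


Wq = quantize_rows(W, bits=3)  # 3-bit coefficients, as in the paper's range
rel_err = np.linalg.norm(W - Wq) / np.linalg.norm(W)
```

Even this crude baseline keeps the equalization matrix close to its infinite-precision version at 3 bits; the paper's point is that the residual gap of naive quantization can be largely closed by solving the FAME problem instead.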

Citations (25)
