Downlink Precoding for Massive MIMO Systems Exploiting Virtual Channel Model Sparsity (1706.03294v3)

Published 11 Jun 2017 in cs.IT and math.IT

Abstract: In this paper, the problem of designing a forward link linear precoder for Massive Multiple-Input Multiple-Output (MIMO) systems in conjunction with Quadrature Amplitude Modulation (QAM) is addressed. First, we employ a novel and efficient methodology that allows for a sparse representation of multiple users and groups in a fashion similar to Joint Spatial Division and Multiplexing. Then, the method is generalized to include Orthogonal Frequency Division Multiplexing (OFDM) for frequency selective channels, resulting in Combined Frequency and Spatial Division and Multiplexing, a configuration that offers high flexibility in Massive MIMO systems. A challenge in such system design is to accommodate finite alphabet inputs, especially with larger constellation sizes such as $M\geq 16$. The proposed methodology is then applied jointly with the complexity-reducing Per-Group Processing (PGP) technique, on a per user group basis, in conjunction with QAM modulation, and evaluated in simulations for constellation sizes up to $M=64$. We show by numerical results that the developed precoders offer significantly better performance than configurations with no precoder or with a plain beamformer, for constellation sizes $M\geq 16$.
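The virtual channel model the abstract exploits represents the channel in the beamspace (DFT/angular) domain, where each user's energy concentrates in a few angular bins. The following is an illustrative sketch of that sparsity, not the paper's actual precoder design; the antenna count, number of paths, and on-grid path angles are assumptions chosen to make the sparsity exact.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's method): a Massive MIMO
# channel viewed in the beamspace (DFT) domain concentrates each user's
# energy in a few angular bins, yielding a sparse virtual representation.

rng = np.random.default_rng(0)

M = 64          # base-station antennas (uniform linear array) -- assumed
K = 4           # single-antenna users -- assumed
n_paths = 3     # few propagation paths per user -> angular sparsity -- assumed

# Unitary DFT matrix; its columns act as steering vectors on a uniform
# angular grid (on-grid paths are assumed for exact sparsity)
A = np.fft.fft(np.eye(M)) / np.sqrt(M)

# Build each user's channel as a sum of a few discrete-angle paths
H = np.zeros((M, K), dtype=complex)
for k in range(K):
    bins = rng.choice(M, size=n_paths, replace=False)
    gains = (rng.standard_normal(n_paths)
             + 1j * rng.standard_normal(n_paths)) / np.sqrt(2)
    H[:, k] = A[:, bins] @ gains

# Virtual (beamspace) representation: H_v = A^H H is sparse by construction
H_v = A.conj().T @ H
support = np.sum(np.abs(H_v) > 1e-6, axis=0)  # nonzero angular bins per user
print(support)  # each user occupies only n_paths of the M angular bins
```

In practice path angles fall off the DFT grid and the virtual channel is only approximately sparse, but the dominant-bin structure is what enables grouping users by angular support, as in the JSDM-style approach the abstract describes.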

Citations (16)