
Precoder Design and Power Allocation for Downlink MIMO-NOMA via Simultaneous Triangularization (2006.04581v2)

Published 8 Jun 2020 in cs.IT and math.IT

Abstract: In this paper, we consider the downlink precoder design for two-user power-domain multiple-input multiple-output (MIMO) non-orthogonal multiple access (NOMA) systems. The proposed precoding scheme is based on simultaneous triangularization and decomposes the MIMO-NOMA channels of the two users into multiple single-input single-output NOMA channels, assuming low-complexity self-interference cancellation at the users. In contrast to the precoding schemes based on simultaneous diagonalization (SD), the proposed scheme avoids inverting the MIMO channels of the users, thereby enhancing the ergodic rate performance. Furthermore, we develop a power allocation algorithm based on the convex-concave procedure, and exploit it to obtain the ergodic achievable rate region of the proposed MIMO-NOMA scheme. Our results illustrate that the proposed scheme outperforms baseline precoding schemes based on SD and orthogonal multiple access for a wide range of user rates and performs close to the dirty paper coding upper bound. The ergodic rate region can further be improved by utilizing a hybrid scheme based on time sharing between the proposed MIMO-NOMA scheme and point-to-point MIMO.
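
The triangular structure at the heart of the scheme can be illustrated with a short NumPy sketch. This is a minimal toy example under stated assumptions, not the authors' optimized design: the i.i.d. Rayleigh channels, the random unitary precoder `P`, and the helper `ql()` are illustrative placeholders (the paper chooses the precoder to maximize rate). The sketch only shows one common way to realize simultaneous triangularization, via per-user unitary receive filters that make one user's effective channel lower triangular and the other's upper triangular, so the diagonal entries form parallel SISO NOMA subchannels.

```python
import numpy as np

def ql(A):
    """QL decomposition A = Q @ L (Q unitary, L lower triangular),
    obtained by double-flipping a standard QR decomposition."""
    J = np.flipud(np.eye(A.shape[0]))      # exchange (reversal) matrix
    Qb, Rb = np.linalg.qr(J @ A @ J)       # QR of the row/column-flipped matrix
    return J @ Qb @ J, J @ Rb @ J          # J @ Rb @ J is lower triangular

rng = np.random.default_rng(0)
N = 4                                      # antennas at the BS and at each user

# Hypothetical i.i.d. Rayleigh-fading channels for the two users
H1 = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
H2 = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)

# Placeholder unitary precoder; the paper optimizes this choice, here we take
# a random unitary purely to expose the triangular structure.
P, _ = np.linalg.qr(rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))

# Per-user unitary receive filters triangularize both effective channels at once
Q1, L1 = ql(H1 @ P)                        # user 1: lower-triangular channel
Q2, R2 = np.linalg.qr(H2 @ P)              # user 2: upper-triangular channel

E1 = Q1.conj().T @ H1 @ P                  # equals L1
E2 = Q2.conj().T @ H2 @ P                  # equals R2
assert np.allclose(E1, np.tril(E1)) and np.allclose(E2, np.triu(E2))

# The diagonal pairs (E1[i, i], E2[i, i]) act as parallel SISO NOMA channels;
# the triangular off-diagonals are self-interference that each user can cancel
# stream by stream with low complexity. No channel inversion is required, in
# contrast to simultaneous-diagonalization precoding.
print(np.round(np.abs(np.diag(E1)), 3), np.round(np.abs(np.diag(E2)), 3))
```

Allocating power across the resulting SISO subchannels is a separate step; the paper handles it with a convex-concave procedure, which is not reproduced in this sketch.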

Citations (6)