
The Capacity of Less Noisy Cognitive Interference Channels (1206.1948v2)

Published 9 Jun 2012 in cs.IT and math.IT

Abstract: The fundamental limits of the cognitive interference channel (CIC) with two transmitter-receiver pairs have been under exploration for several years. In this paper, we study the discrete memoryless cognitive interference channel (DM-CIC), in which the cognitive transmitter non-causally knows the message of the primary transmitter. The capacity of this channel is not known in general; it is known only in some special cases. Inspired by the concept of the less noisy broadcast channel (BC), we introduce the notion of the less noisy cognitive interference channel. Unlike the BC, due to the inherent asymmetry of the cognitive channel, two different less noisy orderings are distinguishable; these are named the primary-less-noisy and cognitive-less-noisy channels. We derive the capacity region for the latter case by introducing inner and outer bounds on the capacity of the DM-CIC and showing that these bounds coincide for the cognitive-less-noisy channel. Having established the capacity region, we prove that superposition coding is the optimal encoding technique.
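The less noisy ordering the abstract refers to can be sketched with the standard broadcast-channel definition (Körner–Marton). Note this is an illustration only: the notation $Y_p$, $Y_c$ for the primary and cognitive receivers is ours, and the paper's exact conditions for the two-input cognitive channel may differ in detail.

```latex
% Standard less-noisy ordering (broadcast-channel form), adapted as a sketch.
% Receiver Y_c is said to be less noisy than Y_p if, for every auxiliary
% random variable U with U -> X -> (Y_p, Y_c) a Markov chain,
\[
  I(U; Y_c) \;\ge\; I(U; Y_p) .
\]
% By analogy with superposition coding for the degraded BC (cloud centers
% carry the primary message, satellite codewords the cognitive message),
% the achievable region takes the generic form
\[
  R_p \le I(U; Y_p), \qquad R_p + R_c \le I(X; Y_c),
\]
% for some joint distribution p(u, x). The paper shows that bounds of this
% kind coincide in the cognitive-less-noisy case, giving the capacity region.
```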

Citations (5)


Authors (1)
