A Mathematical Theory of Semantic Communication: Overview

(2401.14160)
Published Jan 25, 2024 in cs.IT and math.IT

Abstract

Semantic communication initiates a new direction for future communication. In this paper, we aim to establish a systematic framework of semantic information theory (SIT). First, we propose a semantic communication model and define the synonymous mapping to capture the critical relationship between semantic information and syntactic information. Based on this core concept, we introduce measures of semantic information, such as semantic entropy $H_s(\tilde{U})$, up/down semantic mutual information $I^s(\tilde{X};\tilde{Y})$ ($I_s(\tilde{X};\tilde{Y})$), semantic capacity $C_s=\max_{p(x)}I^s(\tilde{X};\tilde{Y})$, and the semantic rate-distortion function $R_s(D)=\min_{p(\hat{x}|x):\mathbb{E}d_s(\tilde{x},\hat{\tilde{x}})\leq D}I_s(\tilde{X};\hat{\tilde{X}})$. Furthermore, we prove three coding theorems of SIT: the semantic source coding theorem, the semantic channel coding theorem, and the semantic rate-distortion coding theorem. We find that synonymous mapping extends the limits of classic information theory, that is, $H_s(\tilde{U})\leq H(U)$, $C_s\geq C$, and $R_s(D)\leq R(D)$. Together, these results constitute the basis of semantic information theory. In summary, the theoretic framework proposed in this paper is a natural extension of classic information theory and may reveal significant performance potential for future communication.
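The inequality $H_s(\tilde{U})\leq H(U)$ can be illustrated numerically: a synonymous mapping partitions the syntactic alphabet into groups of symbols that share a meaning, and the semantic entropy is the entropy of the induced distribution over those groups. The sketch below uses a hypothetical four-symbol source and an illustrative partition (neither is from the paper) to show that merging synonyms never increases entropy.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical syntactic source U: four messages with their probabilities.
p = {"car": 0.4, "automobile": 0.1, "lake": 0.3, "pond": 0.2}

# Illustrative synonymous mapping: partition syntactic symbols into
# semantic (synonymous) sets; each set carries one meaning.
synonymous_sets = [["car", "automobile"], ["lake", "pond"]]

H = entropy(p.values())  # syntactic entropy H(U)
# Semantic entropy H_s: entropy of the aggregated set probabilities.
Hs = entropy(sum(p[u] for u in s) for s in synonymous_sets)

assert Hs <= H  # grouping synonyms can only reduce (or preserve) entropy
print(f"H(U) = {H:.3f} bits, H_s = {Hs:.3f} bits")
```

Intuitively, a receiver that only needs to recover the meaning (which synonymous set the symbol belongs to) faces a coarser, lower-entropy source, which is why semantic source coding can compress below the classical limit $H(U)$.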
