
Fundamental Limitation of Semantic Communications: Neural Estimation for Rate-Distortion

Authors:
Dongxu Li, Jianhao Huang, Chuan Huang, Xiaoqi Qin, Han Zhang, Ping Zhang
Keywords:
Computer Science; Information Theory (cs.IT); Machine Learning (cs.LG); Signal Processing (eess.SP)
Journal:
--
Date:
2024-01-02
Abstract
This paper studies the fundamental limit of semantic communications over the discrete memoryless channel. We consider the scenario of transmitting a semantic source consisting of an observation state and its corresponding semantic state, both of which are recovered at the receiver. To derive this performance limit, we adopt the semantic rate-distortion function (SRDF) to study the relationship among the minimum compression rate, observation distortion, semantic distortion, and channel capacity. For the case where the semantic source distribution is unknown and only a set of source samples is available, we propose a neural-network-based method that leverages generative networks to learn the semantic source distribution. Furthermore, for the special case where the semantic state is a deterministic function of the observation, we design a cascade neural network to estimate the SRDF. For the case where the semantic source distribution is perfectly known, we propose a general Blahut-Arimoto algorithm to effectively compute the SRDF. Finally, experimental results validate the proposed algorithms on an ideal Gaussian semantic source and on several practical datasets.
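
As background for the general Blahut-Arimoto algorithm mentioned in the abstract, the sketch below shows the classical Blahut-Arimoto iteration for the ordinary rate-distortion function of a discrete memoryless source with a single distortion constraint. The paper's SRDF version additionally couples an observation-distortion and a semantic-distortion constraint, which is not reproduced here; the function name, variable names, and the binary Hamming example are illustrative assumptions, not taken from the paper.

import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=500):
    # Classical Blahut-Arimoto iteration (single distortion constraint),
    # given as a generic background sketch, not the paper's SRDF algorithm.
    # p_x:  (n,) source pmf
    # dist: (n, m) distortion matrix d(x, x_hat)
    # beta: Lagrange multiplier trading rate against distortion
    n, m = dist.shape
    q_y = np.full(m, 1.0 / m)                       # reproduction marginal q(x_hat)
    for _ in range(n_iter):
        w = q_y[None, :] * np.exp(-beta * dist)     # unnormalized q(x_hat | x)
        q_xy = w / w.sum(axis=1, keepdims=True)     # conditional q(x_hat | x)
        q_y = p_x @ q_xy                            # re-estimate the marginal
    # One (rate, distortion) point on the R(D) curve for this value of beta
    rate = np.sum(p_x[:, None] * q_xy * np.log2(q_xy / q_y[None, :]))
    distortion = np.sum(p_x[:, None] * q_xy * dist)
    return rate, distortion

# Illustrative usage: uniform binary source with Hamming distortion
p_x = np.array([0.5, 0.5])
d = 1.0 - np.eye(2)
print(blahut_arimoto(p_x, d, beta=3.0))

Sweeping beta traces out the rate-distortion curve; the paper's generalized algorithm plays the analogous role for the SRDF, with the observation and semantic distortions handled jointly.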
PDF: Fundamental Limitation of Semantic Communications: Neural Estimation for Rate-Distortion.pdf