
Joint Data and Semantics Lossy Compression: Nonasymptotic Converse Bounds and Second-Order Asymptotics

Author:
Huiyuan Yang, Yuxuan Shi, Shuo Shao, Xiaojun Yuan
Keywords:
Computer Science, Information Theory (cs.IT)
Journal:
--
Date:
2024-02-04
Abstract
This paper studies the joint data and semantics lossy compression problem, i.e., an extension of the hidden lossy source coding problem that entails recovering both the hidden and observable sources. We aim to study the nonasymptotic and second-order properties of this problem, especially the converse aspect. Specifically, we begin by deriving general nonasymptotic converse bounds valid for general sources and distortion measures, utilizing properties of distortion-tilted information. Subsequently, a second-order converse bound is derived under the standard block coding setting through asymptotic analysis of the nonasymptotic bounds. This bound is tight since it coincides with a known second-order achievability bound. We then examine the case of erased fair coin flips (EFCF), providing its specific nonasymptotic achievability and converse bounds. Numerical results under the EFCF case demonstrate that our second-order asymptotic approximation effectively approximates the optimum rate at given blocklengths.
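As a concrete illustration of the erased fair coin flips (EFCF) setting mentioned above, the following sketch samples the hidden fair-coin source together with its erased observation. The function name, the erasure symbol `'e'`, and the erasure probability parameter `p_e` are illustrative assumptions for this sketch, not notation taken from the paper.

```python
import random

def sample_efcf(n, p_e, seed=None):
    """Sample n symbols of an erased fair coin flips (EFCF) source.

    The hidden source S is i.i.d. Bernoulli(1/2); the observable source X
    equals S with probability 1 - p_e and the erasure symbol 'e' with
    probability p_e.  (Symbol conventions here are illustrative.)
    """
    rng = random.Random(seed)
    hidden = [rng.randint(0, 1) for _ in range(n)]
    observed = [s if rng.random() > p_e else 'e' for s in hidden]
    return hidden, observed

# A joint data-and-semantics code for this source must reconstruct both
# the hidden sequence and the (partly erased) observed sequence.
hidden, observed = sample_efcf(10, p_e=0.3, seed=1)
```

Under the block coding setting studied in the paper, a length-`n` realization of this pair plays the role of one source block; the compressor sees `observed` while distortion is measured on reconstructions of both sequences.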
PDF: Joint Data and Semantics Lossy Compression: Nonasymptotic Converse Bounds and Second-Order Asymptotics.pdf