
Approaching Maximum Likelihood Decoding Performance via Reshuffling ORBGRAND

Authors:
Li Wan, Wenyi Zhang
Keywords:
Computer Science, Information Theory, Information Theory (cs.IT)
Journal:
--
Date:
2024-01-29
Abstract
Guessing random additive noise decoding (GRAND) is a recently proposed decoding paradigm particularly suitable for codes with short length and high rate. Among its variants, ordered reliability bits GRAND (ORBGRAND) exploits soft information in a simple and effective fashion to schedule its queries, thereby allowing efficient hardware implementation. Compared with maximum likelihood (ML) decoding, however, ORBGRAND still exhibits a noticeable performance gap in terms of block error rate (BLER). In order to improve the performance of ORBGRAND while retaining its amenability to hardware implementation, a new variant of ORBGRAND termed RS-ORBGRAND is proposed, whose basic idea is to reshuffle the queries of ORBGRAND so that the expected number of queries is minimized. Numerical simulations show that RS-ORBGRAND yields noticeable gains over ORBGRAND and its existing variants, and is only 0.1 dB away from ML decoding, for BLER as low as $10^{-6}$.
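For context, the following is a minimal Python sketch of the plain ORBGRAND query loop that the abstract refers to: test error patterns are generated in order of increasing logistic weight over the reliability-sorted bit positions and applied to the hard-decision word until a candidate with zero syndrome is found. The function name, the brute-force pattern generator, and the hard-decision convention (positive LLR maps to bit 0) are illustrative assumptions, not the paper's implementation, and the reshuffling step that defines RS-ORBGRAND is not shown.

```python
import itertools
import numpy as np


def orbgrand_decode(llr, H, max_queries=100000):
    """Illustrative ORBGRAND-style query loop (sketch, not the paper's code).

    llr : per-bit log-likelihood ratios (soft channel output), 1D array
    H   : binary parity-check matrix of the code, shape (n - k, n)
    Returns the first candidate word with zero syndrome, or None if the
    query budget is exhausted.
    """
    llr = np.asarray(llr, dtype=float)
    y = (llr < 0).astype(int)          # hard-decision word (assumes +LLR -> bit 0)
    order = np.argsort(np.abs(llr))    # bit indices, least reliable first
    n = len(y)

    def patterns_by_logistic_weight():
        # Enumerate sets of reliability ranks (1-based) in order of increasing
        # logistic weight, i.e. the sum of the ranks of the flipped positions.
        # Brute-force enumeration, for illustration on short codes only.
        yield ()
        max_weight = n * (n + 1) // 2
        for w in range(1, max_weight + 1):
            for k in range(1, n + 1):
                if k * (k + 1) // 2 > w:
                    break              # smallest possible sum of k ranks exceeds w
                for combo in itertools.combinations(range(1, n + 1), k):
                    if sum(combo) == w:
                        yield combo

    for q, ranks in enumerate(patterns_by_logistic_weight()):
        if q >= max_queries:
            return None
        cand = y.copy()
        for r in ranks:
            cand[order[r - 1]] ^= 1    # flip the r-th least reliable bit
        if not np.any(H.dot(cand) % 2):  # zero syndrome => valid codeword
            return cand
    return None
```

In this sketch, H could be the parity-check matrix of any short block code; RS-ORBGRAND, as described in the abstract, would replace the fixed logistic-weight schedule above with a reshuffled query order chosen to minimize the expected number of queries.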
PDF: Approaching Maximum Likelihood Decoding Performance via Reshuffling ORBGRAND.pdf