
Exact Thresholds for Noisy Non-Adaptive Group Testing

Authors:
Junren Chen, Jonathan Scarlett
Keywords:
Computer Science, Information Theory, Information Theory (cs.IT), Discrete Mathematics (cs.DM), Probability (math.PR), Statistics Theory (math.ST)
journal:
--
date:
2024-01-10
Abstract
In recent years, the mathematical limits and algorithmic bounds for probabilistic group testing have become increasingly well-understood, with exact asymptotic thresholds now known in general scaling regimes for the noiseless setting. In the noisy setting, where each test outcome is flipped with constant probability, there have been similar developments, but the overall understanding has lagged significantly behind the noiseless setting. In this paper, we substantially narrow this gap by deriving exact asymptotic thresholds for the noisy setting under two widely-studied random test designs: i.i.d. Bernoulli and near-constant tests-per-item. These thresholds are established by combining components of an existing information-theoretic threshold decoder with a novel analysis of maximum-likelihood decoding (upper bounds), and by deriving a novel set of impossibility results via the analysis of certain failure events for optimal maximum-likelihood decoding (lower bounds). Our results show that existing algorithmic upper bounds for the noisy setting are strictly suboptimal, and they leave open the interesting question of whether our thresholds can be attained using computationally efficient algorithms.
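To make the setting concrete, the following is a minimal sketch (not the paper's method) of noisy non-adaptive group testing with an i.i.d. Bernoulli design: each item is included in each test independently with some probability, the noiseless outcome of a test is the OR of its included defective items, and each outcome is flipped with a constant probability. Decoding here is brute-force maximum likelihood over all size-k subsets, which under symmetric noise with flip probability below 1/2 reduces to minimizing the number of disagreements with the observed outcomes. All parameter values (n, k, t, p, rho) are illustrative choices, not values from the paper.

```python
import itertools
import random

random.seed(0)

n, k = 8, 2    # number of items and defectives (tiny illustrative instance)
t = 14         # number of non-adaptive tests
p = 0.3        # Bernoulli test-inclusion probability (illustrative)
rho = 0.1      # constant probability that a test outcome is flipped

defective = set(random.sample(range(n), k))

# i.i.d. Bernoulli design: item j participates in test i w.p. p, independently.
X = [[random.random() < p for _ in range(n)] for _ in range(t)]

def noiseless_outcome(row, S):
    """OR of the included items in candidate defective set S."""
    return any(row[j] for j in S)

# Observed outcomes: each noiseless outcome is flipped independently w.p. rho.
y = [noiseless_outcome(X[i], defective) ^ (random.random() < rho)
     for i in range(t)]

def disagreements(S):
    """Number of tests whose predicted outcome under S differs from y."""
    return sum(noiseless_outcome(X[i], S) != y[i] for i in range(t))

# Brute-force maximum-likelihood decoding: for symmetric noise with
# rho < 1/2, the likelihood rho^d * (1-rho)^(t-d) is maximized by the
# k-subset with the fewest disagreements d.
estimate = min(itertools.combinations(range(n), k), key=disagreements)
print(sorted(defective), sorted(estimate))
```

At realistic scales the brute-force search over k-subsets is intractable, which is exactly why the gap between information-theoretic thresholds and efficient algorithms, highlighted in the abstract, is the interesting open question.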