
On Decentralized Linearly Separable Computation With the Minimum Computation Cost

Authors:
Haoning Chen, Minquan Cheng, Zhenhao Huang, Youlong Wu
Keywords:
Computer Science, Information Theory, Information Theory (cs.IT)
Journal:
--
Date:
2024-01-29
Abstract
The distributed linearly separable computation problem finds extensive applications across domains such as distributed gradient coding, distributed linear transform, real-time rendering, etc. In this paper, we investigate this problem in a fully decentralized scenario, where $\mathsf{N}$ workers collaboratively perform the computation task without a central master. Each worker aims to compute a linearly separable computation that can be manifested as $\mathsf{K}_{\mathrm{c}}$ linear combinations of $\mathsf{K}$ messages, where each message is a function of a distinct dataset. We require that each worker successfully fulfill the task based on the transmissions from any $\mathsf{N}_{\mathrm{r}}$ workers, such that the system can tolerate any $\mathsf{N}-\mathsf{N}_{\mathrm{r}}$ stragglers. We focus on the scenario where the computation cost (the number of uncoded datasets assigned to each worker) is minimum, and aim to minimize the communication cost (the number of symbols the fastest $\mathsf{N}_{\mathrm{r}}$ workers transmit). We propose a novel distributed computing scheme that is optimal under the widely used cyclic data assignment. Interestingly, we demonstrate that the side information at each worker is ineffective in reducing the communication cost when $\mathsf{K}_{\mathrm{c}}\leq {\mathsf{K}}\mathsf{N}_{\mathrm{r}}/{\mathsf{N}}$, while it helps reduce the communication cost as $\mathsf{K}_{\mathrm{c}}$ increases.
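To make the setup concrete, the sketch below illustrates a cyclic data assignment of the kind the abstract refers to: $\mathsf{K}$ datasets are split into $\mathsf{N}$ groups, and each of the $\mathsf{N}$ workers stores $\mathsf{N}-\mathsf{N}_{\mathrm{r}}+1$ consecutive groups (cyclically), so that any $\mathsf{N}_{\mathrm{r}}$ workers jointly cover all datasets. This is a hypothetical illustration of the assignment pattern only, not the paper's coding scheme; the assumption that $\mathsf{K}$ is divisible by $\mathsf{N}$ and the function name `cyclic_assignment` are ours.

```python
# Hypothetical sketch of a cyclic data assignment (assumption: K divisible by N).
# Each worker stores N - N_r + 1 consecutive groups of K/N datasets, so any
# N_r workers together hold every dataset and N - N_r stragglers are tolerated.

def cyclic_assignment(K, N, N_r):
    """Return, for each of the N workers, the sorted list of assigned dataset indices."""
    assert K % N == 0, "sketch assumes K divisible by N for a uniform assignment"
    q = K // N                              # datasets per group
    groups = [list(range(i * q, (i + 1) * q)) for i in range(N)]
    span = N - N_r + 1                      # consecutive groups stored per worker
    assignment = []
    for n in range(N):
        datasets = []
        for j in range(span):
            datasets.extend(groups[(n + j) % N])
        assignment.append(sorted(datasets))
    return assignment

# Example: K = 6 datasets, N = 3 workers, N_r = 2 (tolerates 1 straggler).
for w, d in enumerate(cyclic_assignment(6, 3, 2)):
    print(f"worker {w}: datasets {d}")
```

Each dataset is replicated across exactly $\mathsf{N}-\mathsf{N}_{\mathrm{r}}+1$ workers, which is why the transmissions of any $\mathsf{N}_{\mathrm{r}}$ workers suffice for recovery.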