Course/Seminar
Record
Location
France
Language:
English
Credits
Derya Malak (Speaker)
Terms of use
Standard intellectual property law
DOI: 10.60527/7w1p-2590
Cite this resource:
Derya Malak. FOUNDS. (2024, January 26). FOUNDS Seminar Series Talk 04 - Distributed Computation over Networks. [Video]. Canal-U. https://doi.org/10.60527/7w1p-2590. (Accessed July 19, 2024)

FOUNDS Seminar Series Talk 04 - Distributed Computation over Networks

Recorded: January 26, 2024 - Published online: March 2, 2024
Description

Talk Abstract: Large-scale distributed computing systems, such as MapReduce, Spark, or distributed deep networks, are critical for parallelizing the execution of computational tasks. Nevertheless, a tension between computation and communication complexity lies at the heart of distributed computing. There has recently been a substantial effort to address this problem for a class of functions, such as distributed matrix multiplication, distributed gradient coding, and linearly separable functions. The optimal cost has been achieved under some constraints, based mainly on ideas of linear separability of the tasks and linear space intersections. Motivated by the same challenge, we propose a novel distributed computing framework where a master seeks to compute an arbitrary function of distributed datasets in an asymptotically lossless manner. Our approach exploits the notion of characteristic graphs, which were used by Shannon, Körner, and Witsenhausen to derive rate lower bounds for computation, and later by Alon-Orlitsky, Orlitsky-Roche, Doshi-Shah-Médard, and Feizi-Médard to resolve several well-known distributed coding and communication problems, enabling lower communication complexity even for (a) correlated data, (b) a broad class of functions, and (c) well-known topologies. The novelty of our approach lies in accurately capturing the communication-computation cost tradeoff by melding the notions of characteristic graphs and distributed placement, to provide a natural generalization of distributed linear function computation, thus elevating distributed gradient coding and distributed linear transforms to the realm of distributed computing of arbitrary functions.
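To make the characteristic-graph idea mentioned in the abstract concrete, the following is a minimal, hypothetical Python sketch (not from the talk): given a source alphabet, side-information alphabet, joint support, and a target function, it builds Körner's characteristic graph (two source symbols are connected when some jointly possible side information makes the function values differ) and greedily colors it. Any proper coloring yields a valid zero-error encoding, so the number of colors, rather than the alphabet size, drives the communication cost.

```python
from itertools import combinations

def characteristic_graph(xs, ys, support, f):
    """Connect x1 and x2 when some y is jointly possible with both
    (i.e., (x1, y) and (x2, y) are in the support) and f disagrees."""
    edges = set()
    for x1, x2 in combinations(xs, 2):
        if any((x1, y) in support and (x2, y) in support
               and f(x1, y) != f(x2, y) for y in ys):
            edges.add((x1, x2))
    return edges

def greedy_coloring(xs, edges):
    """A proper coloring of the characteristic graph is a valid code:
    symbols sharing a color are never confusable at the decoder."""
    adj = {x: set() for x in xs}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    color = {}
    for x in xs:
        used = {color[n] for n in adj[x] if n in color}
        color[x] = next(c for c in range(len(xs)) if c not in used)
    return color

# Toy example: compute the parity f(x, y) = (x + y) mod 2 with x in
# {0,1,2,3}, y in {0,1}, and full joint support.
xs, ys = range(4), range(2)
support = {(x, y) for x in xs for y in ys}
f = lambda x, y: (x + y) % 2

edges = characteristic_graph(xs, ys, support, f)
coloring = greedy_coloring(xs, edges)
# Symbols with equal parity are indistinguishable to the decoder, so
# 2 colors (1 bit) suffice instead of log2(4) = 2 bits.
print(coloring)
```

In this toy case the graph is complete bipartite between even and odd symbols, so the coloring collapses the 4-letter alphabet to 2 codewords; the framework in the talk generalizes this to networked, multi-source settings.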

Speaker