
In-Network Split Inference with Named Data Networking under Lossy Edge Connectivity

Amadeo, Marica; Molinaro, Antonella
2025-01-01

Abstract

In-Network Computing (INC) is emerging as a key enabler of Sixth-Generation (6G) systems, allowing programmable network nodes to provide not only connectivity, but also storage and processing across the cloud-to-edge continuum. Machine learning (ML) tasks, particularly Deep Neural Network (DNN) inference, stand to benefit significantly from this shift. Under the Split Inference (SI) paradigm, different layers of a DNN can be distributed across multiple in-network nodes that cooperate with the end-device requesting inference. In this work, we explore the potential of Named Data Networking (NDN) as an enabler for in-network SI. We demonstrate how NDN's native features, such as in-network caching and routing-by-name, can reduce inference delays and improve robustness under lossy edge connectivity, compared to traditional host-centric networking. Simulation results validate the effectiveness of NDN-based in-network SI, highlighting its potential to enable resilient and efficient ML services in future 6G environments.
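The SI paradigm described in the abstract can be illustrated with a minimal sketch: a DNN modeled as a chain of layer functions, cut at a split point so that the end-device computes the head and an in-network node computes the tail. This is a toy illustration only; the layer shapes, the split point, and the "device"/"node" roles are assumptions for exposition, not details taken from the paper.

```python
# Toy Split Inference (SI) sketch: a DNN as a chain of layer functions,
# partitioned between an end-device and an in-network node.

def relu(x):
    # Element-wise rectified linear unit.
    return [max(0.0, v) for v in x]

def dense(weights):
    # Fully connected layer: y_j = sum_i weights[j][i] * x[i].
    def layer(x):
        return [sum(w * v for w, v in zip(row, x)) for row in weights]
    return layer

# Illustrative 3-layer DNN (weights chosen arbitrarily for the example).
layers = [
    dense([[1.0, -1.0], [0.5, 0.5]]),
    relu,
    dense([[1.0, 1.0]]),
]

def run(layer_subset, x):
    # Execute a contiguous slice of the DNN on one node.
    for layer in layer_subset:
        x = layer(x)
    return x

split = 2  # the device runs layers [0, split); the in-network node runs the rest

x = [2.0, 1.0]
intermediate = run(layers[:split], x)       # computed on the end-device
result = run(layers[split:], intermediate)  # computed on an in-network node

# Distributed execution matches fully local inference.
assert result == run(layers, x)
print(result)
```

In an NDN deployment, the `intermediate` tensor would travel as a named Data packet, so a cached copy at any on-path node can satisfy retransmitted requests after a loss, which is the robustness property the paper evaluates.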

Use this identifier to cite or link to this document: https://hdl.handle.net/11570/3350497