Radio and Computation Resource Management in Unmanned Vehicles with Edge Computing
Giuseppe Baruffa; Luca Rugini; Fabrizio Frescura; Paolo Banelli
2025
Abstract
This work deals with resource management for fleets of unmanned vehicles (UVs), both ground-based and aerial, which offload computation tasks to remote services. The UVs are equipped with onboard sensors (cameras, etc.), have scarce computation resources, and exploit multiple networks with different radio-access technologies to request additional computation resources during their mission. The UVs are thus able to offload CPU-intensive tasks, such as object detection, to computational nodes located at the edge or in the cloud. The aim is to minimize the average latency, while also taking throughput and handover rate into account. We consider that the UVs have to manage their resources without cooperation or the help of a control server. To this end, we propose and compare decentralized algorithms, including some based on reinforcement learning techniques.
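As a minimal illustration of the kind of decentralized, learning-based network selection the abstract mentions (not the authors' actual algorithm), the sketch below shows an epsilon-greedy bandit in which each UV independently learns which radio-access network yields the lowest observed offloading latency. The class name, number of networks, and latency values are all hypothetical.

```python
import random


class DecentralizedRATSelector:
    """Epsilon-greedy bandit: a single UV independently learns which
    radio-access network minimizes its observed offloading latency,
    without cooperation or a central control server."""

    def __init__(self, n_networks, epsilon=0.1, seed=0):
        self.n = n_networks
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = [0] * n_networks         # times each network was chosen
        self.avg_latency = [0.0] * n_networks  # running mean latency per network

    def select(self):
        # Explore with probability epsilon (or until every network is tried),
        # otherwise exploit the network with the lowest mean latency so far.
        if self.rng.random() < self.epsilon or 0 in self.counts:
            return self.rng.randrange(self.n)
        return min(range(self.n), key=lambda k: self.avg_latency[k])

    def update(self, network, latency):
        # Incremental running-mean update of the observed latency.
        self.counts[network] += 1
        self.avg_latency[network] += (
            latency - self.avg_latency[network]
        ) / self.counts[network]


# Toy simulation with three hypothetical networks; network 1 has the
# lowest true mean offloading latency (values in ms, made up).
true_means = [120.0, 40.0, 80.0]
agent = DecentralizedRATSelector(n_networks=3, epsilon=0.1, seed=42)
env_rng = random.Random(7)
for _ in range(2000):
    k = agent.select()
    agent.update(k, env_rng.gauss(true_means[k], 5.0))
best = min(range(3), key=lambda k: agent.avg_latency[k])
```

Because each UV runs its own instance of such a learner on local observations only, this fits the no-cooperation, no-control-server setting described above; the paper's actual algorithms may differ substantially.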


