
Open-Source Serverless for Edge Computing: A Tutorial

Benedetti P. (Investigation); Femminella M. (Conceptualization); Reali G. (Writing – Original Draft Preparation)

2023

Abstract

Serverless computing is a recent deployment model for cloud, edge, and fog computing platforms whose ultimate goal is to reduce costs and enhance scalability without additional deployment overhead. The main implementation of this model is Functions-as-a-Service (FaaS): developers deploy modular, typically event-driven functions on the platform without having to manage the underlying infrastructure. Moreover, in the so-called warm-start mode, the FaaS containers hosting the application are kept up and running after initialization, giving users the impression of high availability. Conversely, in a cold-start scenario, those containers are deleted when no application requests are received within a certain time window, in order to save resources. This focus on resource efficiency and flexibility can make the serverless approach particularly convenient for edge computing applications, in which the hosting nodes are devices and machines with limited resources, geographically distributed in proximity to the users. In this paper, we explore the available solutions for deploying a serverless application in an edge computing scenario, with a focus on open-source tools and IoT data.
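
As a concrete illustration of the FaaS model described in the abstract, the following is a minimal sketch of an event-driven handler written in the style of the OpenFaaS python3 template, one of the open-source platforms in this space. The handle(req) signature follows the OpenFaaS template convention; the IoT payload schema, field names, and alert threshold are hypothetical and used only to show how per-event processing and warm/cold container reuse fit together.

```python
import json
import time

# Module-level state is initialized once per container, i.e. on a cold start.
# Warm invocations reuse this context, which is why warm starts are cheaper.
_CONTAINER_STARTED_AT = time.time()

def handle(req):
    """Event-driven handler in the style of the OpenFaaS python3 template.

    `req` is the raw request body; here we assume a JSON-encoded IoT reading
    such as {"sensor_id": "s1", "temperature": 21.5} (hypothetical schema).
    """
    try:
        reading = json.loads(req) if req else {}
    except json.JSONDecodeError:
        return json.dumps({"error": "payload is not valid JSON"})

    # Trivial per-event processing: flag readings above a (hypothetical) threshold.
    alert = reading.get("temperature", 0) > 30.0

    return json.dumps({
        "sensor_id": reading.get("sensor_id"),
        "alert": alert,
        # Container age helps distinguish warm invocations from cold starts.
        "container_age_s": round(time.time() - _CONTAINER_STARTED_AT, 3),
    })
```

In a deployment like this, the platform decides whether to route an incoming event to an already-running container (warm start) or to spin up a new one (cold start); the function code itself stays the same in both cases.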


Use this identifier to cite or link to this document: https://hdl.handle.net/11391/1553503