
Urban greenery mapping using object-based classification and multi-sensor data fusion in Google Earth Engine

Vizzari, Marco (Methodology); Antonielli, Francesco (Software); Bonciarelli, Livia (Data Curation); Grohmann, David (Validation); Menconi, Maria Elena (Supervision)

2025

Abstract

This study presents a novel object-based classification approach using the Google Earth Engine (GEE) platform, explicitly designed for urban tree areas. By integrating high-resolution orthophotos, LiDAR, PlanetScope, Sentinel-2, and Sentinel-1 data, we aimed to enhance classification accuracy through comprehensive multi-sensor data fusion. The object-based approach included SNIC (Simple Non-Iterative Clustering) object identification and GLCM (Gray Level Co-occurrence Matrix) textural analysis in GEE using the orthophotos. The methodology was developed and systematically assessed through twenty-two different Random Forest (RF) classifications of single- and multi-sensor datasets in two representative Italian urban environments, Perugia and Bologna. For the Perugia area, we identified Olea europaea, Quercus ilex, Tilia, Pinus, and Cupressus, while for the Bologna area, we differentiated Fraxinus, Acer, Celtis, Tilia, and Platanus. The results demonstrated significant improvements in overall accuracy, spatial accuracy, and F-scores with the object-based fusion of diverse data sources, highlighting the substantial benefits of combining spectral, spatial, and height information, and achieving an overall accuracy of up to 92% and average F-scores of up to 91%. Specifically, integrating orthophotos and LiDAR data provided robust initial segmentation and feature extraction, while including PlanetScope and Sentinel multispectral information further refined classification performance. Integrating only RGB orthophotos with multispectral data at the object level achieved promising results, offering perspectives for high-resolution urban tree mapping using broadly available data. The proposed approach, developed in GEE, provides a scalable and efficient framework for urban planners and environmental managers, supporting urban forest monitoring and ecosystem services modeling.
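To illustrate the kind of workflow the abstract describes, the following is a minimal sketch of an object-based SNIC + GLCM + Random Forest pipeline in the GEE Python API. The asset IDs, band names, class property, and parameter values are illustrative assumptions, not the authors' actual configuration or data.

```python
# Minimal object-based classification sketch in the GEE Python API.
# Asset IDs, band names, and parameters are assumptions for illustration.
import ee

ee.Initialize()

# Hypothetical inputs: an RGB orthophoto mosaic and a training FeatureCollection
# with an integer 'species' class property.
ortho = ee.Image('users/example/orthophoto_rgb')                # assumed asset ID
training = ee.FeatureCollection('users/example/tree_samples')   # assumed asset ID

# 1. SNIC segmentation on the orthophoto to delineate crown-level objects.
snic = ee.Algorithms.Image.Segmentation.SNIC(
    image=ortho,
    size=10,            # seed spacing in pixels (assumed)
    compactness=1,
    connectivity=8,
)
objects = snic.select('clusters')

# 2. GLCM texture on an 8-bit grayscale version of the orthophoto.
gray = ortho.reduce(ee.Reducer.mean()).toUint8()
glcm = gray.glcmTexture(size=3)

# 3. Stack spectral and textural bands and aggregate them per SNIC object.
stack = ortho.addBands(glcm).addBands(objects)
object_means = stack.reduceConnectedComponents(
    reducer=ee.Reducer.mean(), labelBand='clusters', maxSize=1024)

# 4. Sample the per-object features at training points and fit a Random Forest.
samples = object_means.sampleRegions(
    collection=training, properties=['species'], scale=0.2)
rf = ee.Classifier.smileRandomForest(numberOfTrees=200).train(
    features=samples, classProperty='species',
    inputProperties=object_means.bandNames())

# 5. Classify the per-object feature image.
classified = object_means.classify(rf)
```

In the study, additional predictors (LiDAR-derived heights, PlanetScope, Sentinel-1/2 bands and indices) would be appended to the band stack before the per-object aggregation step, which is what the abstract refers to as object-level multi-sensor fusion.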

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11391/1597775
Citations
  • Scopus: 5
  • Web of Science: 5