
Crowd Detection in Aerial Images Using Spatial Graphs and Fully-Convolutional Neural Networks

Castellano, Giovanna;Castiello, Ciro;Mencar, Corrado;Vessio, Gennaro
2020-01-01

Abstract

Unmanned aerial vehicles (UAVs), also known as drones, are increasingly populating our skies. This raises relevant issues for both legislators and researchers. While regulatory plans often take precautionary approaches, imposing restrictive conditions of use for the sake of public safety, applied research is exploring novel strategies to develop autonomous vehicles endowed with trustworthy operating mechanisms. The challenge is to let drones fly over even populated areas while maintaining steady control of the situation on the ground, thus enabling safe landing with no harm to people. This can be achieved by employing on-board cameras and embedded GPUs, which allow computer vision applications to run in real time. In this paper, we introduce a crowd detection method for safe drone landing. The pivotal points of our proposal stem from the computational limitations imposed by the hardware resources currently available on UAVs. Accordingly, our method is based on a lightweight, fully-convolutional neural network scheme that combines efficient computation with effectiveness. We propose a two-loss model in which a classification task (distinguishing crowded from non-crowded scenes) is supported by a regression task (aimed at better capturing the agglomeration tendency of the people in the scene). The latter is realized by constructing a spatial graph for each analysed image and evaluating the corresponding clustering coefficient. As a further element, our model is endowed with the capability to produce class activation heatmaps, which contribute to the semantic enrichment of flight maps. We tested our model on a large dataset of aerial images and observed that it compares favorably with other approaches proposed in the literature.
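As a rough illustration of the regression target mentioned above, the following sketch builds a spatial graph by connecting every pair of points (e.g. detected person positions) lying within a distance threshold, and computes the average clustering coefficient, which quantifies how strongly the points agglomerate. The function name, the `radius` parameter, and the graph construction rule are illustrative assumptions; the paper's actual construction may differ.

```python
import math
from itertools import combinations

def spatial_clustering_coefficient(points, radius):
    """Average clustering coefficient of the graph that connects
    every pair of 2-D points lying within `radius` of each other.
    Nodes with fewer than two neighbours contribute 0, as is customary."""
    n = len(points)
    # Adjacency sets: edge (i, j) iff Euclidean distance <= radius
    adj = [set() for _ in range(n)]
    for i, j in combinations(range(n), 2):
        if math.dist(points[i], points[j]) <= radius:
            adj[i].add(j)
            adj[j].add(i)
    coeffs = []
    for i in range(n):
        k = len(adj[i])
        if k < 2:
            coeffs.append(0.0)
            continue
        # Count edges among the neighbours of node i
        links = sum(1 for u, v in combinations(adj[i], 2) if v in adj[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / n if n else 0.0

# Three mutually close points form a triangle (coefficient 1 each);
# an isolated far-away point contributes 0, so the average is 0.75.
print(spatial_clustering_coefficient([(0, 0), (1, 0), (0, 1), (10, 10)], 2.0))
```

A high value indicates tightly grouped people (a crowd), a low value scattered individuals, which is why it can serve as a regression signal complementing the crowded/non-crowded classification.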

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/266196

Citations
  • Scopus: 34
  • Web of Science: 26