
Milano pilot update

6 October 2021

ENEA, AMAT and Capgemini developed a new smart camera system that works by means of edge computing. Conventional computer vision solutions first transfer all video images to the cloud, where the material is efficiently analysed in a dedicated cloud-based environment. The new system developed by the Milan Living Lab instead makes use of edge computing: all algorithms run locally on a secure device attached directly to the camera. Consequently, only aggregated results are transmitted, such as the number of people in the image and the position of each pedestrian within it. Once these numbers are derived, the images are deleted immediately and no backup is kept. This inherently strengthens the protection of citizens' privacy and limits data requirements.
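To illustrate the edge-processing idea, the sketch below (not the project's actual code) runs a stock OpenCV people detector on the device, transmits only the aggregated count and foot positions, and discards each frame immediately afterwards. The detector, camera index and JSON payload are illustrative assumptions.

```python
# Minimal sketch, assuming an OpenCV HOG people detector and a local camera.
# Frames are analysed on the device; only aggregate numbers leave it, and the
# raw image is discarded right after detection.
import json
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def process_stream(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Run detection on the device itself (edge computing).
            boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
            # Aggregate: keep only the count and the foot point of each box.
            feet = [[int(x + w / 2), int(y + h)] for (x, y, w, h) in boxes]
            payload = json.dumps({"count": len(boxes), "feet_px": feet})
            print(payload)  # in practice: sent to the central system
            del frame       # the raw image is never stored or uploaded

    finally:
        cap.release()

if __name__ == "__main__":
    process_stream()
```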

Moreover, instead of only identifying the location of pedestrians, the algorithms also transform their coordinates from pixel coordinates (e.g. feet at [600px, 534px]) to real-world coordinates (e.g. feet at [3.10m, 6.78m]). This allows the algorithm to deduce the distance between any two individuals. For crowd management purposes this information is very useful, as it lets operators assess in real time whether people are still able to keep 1.5 metres apart.
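A pixel-to-metre conversion of this kind is typically done with a ground-plane homography. The sketch below illustrates the idea under that assumption; the matrix H is a made-up placeholder (a real deployment would calibrate it per camera from reference points on the ground), and the 1.5 m threshold follows the distancing rule mentioned above.

```python
# Minimal sketch, assuming a calibrated 3x3 homography H that maps pixel
# coordinates on the ground plane to metres. The values in H are placeholders.
import itertools
import numpy as np

H = np.array([
    [0.004, 0.001, -1.2],
    [0.000, 0.009, -2.5],
    [0.000, 0.001,  1.0],
])

def pixel_to_ground(foot_px: tuple[float, float]) -> np.ndarray:
    """Map a foot point [u, v] in pixels to [x, y] in metres on the ground."""
    u, v = foot_px
    x, y, w = H @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])

def close_pairs(feet_px: list[tuple[float, float]], limit_m: float = 1.5):
    """Return index pairs of people closer than limit_m on the ground plane."""
    points = [pixel_to_ground(p) for p in feet_px]
    return [
        (i, j)
        for (i, a), (j, b) in itertools.combinations(enumerate(points), 2)
        if np.linalg.norm(a - b) < limit_m
    ]

# Example: the foot point from the article, plus a second pedestrian.
print(pixel_to_ground((600, 534)))            # ground coordinates in metres
print(close_pairs([(600, 534), (640, 560)]))  # pairs violating the 1.5 m rule
```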

Milano Centrale - camera installation
Visualisation of the algorithm identifying and transposing coordinates of people in the image.