Human Detection from 4D Radar Data in Low-Visibility Field Conditions
2024 (English). Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]
Autonomous driving technology is increasingly being used on public roads and in industrial settings such as mines. While it is essential to detect pedestrians, vehicles, or other obstacles, adverse field conditions negatively affect the performance of classical sensors such as cameras or lidars. Radar, on the other hand, is a promising modality that is less affected by, e.g., dust, smoke, water mist or fog. In particular, modern 4D imaging radars provide target responses across the range, vertical angle, horizontal angle and Doppler velocity dimensions. We propose TMVA4D, a CNN architecture that leverages this 4D radar modality for semantic segmentation. The CNN is trained to distinguish between the background and person classes based on a series of 2D projections of the 4D radar data that include the elevation, azimuth, range, and Doppler velocity dimensions. We also outline the process of compiling a novel dataset consisting of data collected in industrial settings with a car-mounted 4D radar and describe how the ground-truth labels were generated from reference thermal images. Using TMVA4D on this dataset, we achieve an mIoU score of 78.2% and an mDice score of 86.1%, evaluated on the two classes background and person.
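The reported mIoU and mDice scores are macro-averages of the per-class Intersection-over-Union and Dice coefficients over the two classes, background and person. A minimal sketch of how such scores are computed from predicted and ground-truth label maps (the function name and two-class encoding here are illustrative, not taken from the paper):

```python
import numpy as np

def miou_mdice(pred, gt, num_classes=2):
    """Macro-averaged mean IoU and mean Dice over segmentation classes.

    pred, gt: integer label maps of identical shape,
    e.g. 0 = background, 1 = person (illustrative encoding).
    """
    ious, dices = [], []
    for c in range(num_classes):
        p = pred == c          # predicted mask for class c
        g = gt == c            # ground-truth mask for class c
        inter = np.logical_and(p, g).sum()
        union = np.logical_or(p, g).sum()
        if union == 0:
            continue           # class absent in both maps; skip it
        ious.append(inter / union)
        dices.append(2 * inter / (p.sum() + g.sum()))
    return float(np.mean(ious)), float(np.mean(dices))
```

For binary segmentation the Dice score of a class equals 2·IoU/(1+IoU), so mDice is always at least as high as mIoU, consistent with the 86.1% versus 78.2% figures above.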
Place, publisher, year, edition, pages
2024.
Keywords [en]
Automotive Radar, 4D Radar, Human Detection, Semantic Segmentation, Convolutional Neural Network, Deep Learning
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:oru:diva-118150, OAI: oai:DiVA.org:oru-118150, DiVA id: diva2:1925679
Conference
Radar in Robotics: Resilience from Signal to Navigation - Full-Day Workshop at 2024 IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan, May 13-17, 2024
Available from: 2025-01-09, Created: 2025-01-09, Last updated: 2025-02-09, Bibliographically approved