Örebro University Publications (oru.se)
Human Detection from 4D Radar Data in Low-Visibility Field Conditions
School of Science and Technology, Örebro University, Örebro, Sweden.
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0002-2744-0132
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0001-8393-9969
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0001-8658-2985
2024 (English). Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

Autonomous driving technology is increasingly being used on public roads and in industrial settings such as mines. While it is essential to detect pedestrians, vehicles, or other obstacles, adverse field conditions negatively affect the performance of classical sensors such as cameras or lidars. Radar, on the other hand, is a promising modality that is less affected by, e.g., dust, smoke, water mist or fog. In particular, modern 4D imaging radars provide target responses across the range, vertical angle, horizontal angle and Doppler velocity dimensions. We propose TMVA4D, a CNN architecture that leverages this 4D radar modality for semantic segmentation. The CNN is trained to distinguish between the background and person classes based on a series of 2D projections of the 4D radar data that include the elevation, azimuth, range, and Doppler velocity dimensions. We also outline the process of compiling a novel dataset consisting of data collected in industrial settings with a car-mounted 4D radar and describe how the ground-truth labels were generated from reference thermal images. Using TMVA4D on this dataset, we achieve an mIoU score of 78.2% and an mDice score of 86.1%, evaluated on the two classes background and person.
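For context on the reported scores: mIoU and mDice are class-averaged overlap metrics between predicted and ground-truth segmentation masks. The following is a minimal illustrative sketch (not code from the paper) of how these two metrics are conventionally computed for the two classes, background and person; the helper name `miou_mdice` and the toy label maps are invented for the example.

```python
import numpy as np

def miou_mdice(pred, gt, num_classes=2):
    """Mean IoU and mean Dice over classes for integer label maps.

    pred, gt: arrays of per-pixel class indices (0 = background, 1 = person).
    Empty classes (absent from both maps) score 1.0 by convention here.
    """
    ious, dices = [], []
    for c in range(num_classes):
        p = (pred == c)
        g = (gt == c)
        inter = np.logical_and(p, g).sum()   # pixels labelled c in both maps
        union = np.logical_or(p, g).sum()    # pixels labelled c in either map
        total = p.sum() + g.sum()
        ious.append(inter / union if union else 1.0)
        dices.append(2 * inter / total if total else 1.0)
    return float(np.mean(ious)), float(np.mean(dices))

# Toy example: a 4x4 label map where the prediction slightly
# over-segments the "person" region.
gt = np.zeros((4, 4), dtype=int)
gt[1:3, 1:3] = 1          # ground-truth person: 4 pixels
pred = np.zeros((4, 4), dtype=int)
pred[1:3, 1:4] = 1        # predicted person: 6 pixels

miou, mdice = miou_mdice(pred, gt)
# person:     IoU = 4/6, Dice = 8/10
# background: IoU = 10/12, Dice = 20/22
```

Dice weights the intersection against the summed mask sizes rather than the union, so for the same prediction it is never lower than IoU; this matches the paper's pattern of mDice (86.1%) exceeding mIoU (78.2%).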

Place, publisher, year, edition, pages
2024.
Keywords [en]
Automotive Radar, 4D Radar, Human Detection, Semantic Segmentation, Convolutional Neural Network, Deep Learning
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:oru:diva-118150
OAI: oai:DiVA.org:oru-118150
DiVA, id: diva2:1925679
Conference
Radar in Robotics: Resilience from Signal to Navigation - Full-Day Workshop at 2024 IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan, May 13-17, 2024
Available from: 2025-01-09. Created: 2025-01-09. Last updated: 2025-02-09. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

arXiv

Authority records

Kotlyar, Oleksandr; Kubelka, Vladimír; Magnusson, Martin
