Örebro University Publications (oru.se)
Lidar-Level Localization With Radar? The CFEAR Approach to Accurate, Fast, and Robust Large-Scale Radar Odometry in Diverse Environments
Örebro University, School of Science and Technology (Mobile Robotics and Olfaction Lab, AASS Research Center). ORCID iD: 0000-0003-2504-2488
Örebro University, School of Science and Technology (Mobile Robotics and Olfaction Lab, AASS Research Center). ORCID iD: 0000-0001-8658-2985
Örebro University, Örebro, Sweden; Computer Engineering Department, University of Baghdad, Baghdad, Iraq (Mobile Robotics and Olfaction Lab, AASS Research Center). ORCID iD: 0000-0001-6868-2210
Örebro University, School of Science and Technology (Mobile Robotics and Olfaction Lab, AASS Research Center). ORCID iD: 0000-0003-0217-9326
2023 (English). In: IEEE Transactions on Robotics, ISSN 1552-3098, E-ISSN 1941-0468, Vol. 39, no. 2, pp. 1476-1495. Article in journal (Refereed). Published.
Abstract [en]

This article presents an accurate, highly efficient, and learning-free method for large-scale odometry estimation using spinning radar, empirically found to generalize well across very diverse environments—outdoors, from urban to woodland, and indoors in warehouses and mines—without changing parameters. Our method integrates motion compensation within a sweep with one-to-many scan registration that minimizes distances between nearby oriented surface points and mitigates outliers with a robust loss function. Extending our previous approach conservative filtering for efficient and accurate radar odometry (CFEAR), we present an in-depth investigation on a wider range of datasets, quantifying the importance of filtering, resolution, registration cost and loss functions, keyframe history, and motion compensation. We present a new solving strategy and configuration that overcomes previous issues with sparsity and bias, and improves our state-of-the-art by 38%, thus, surprisingly, outperforming radar simultaneous localization and mapping (SLAM) and approaching lidar SLAM. The most accurate configuration achieves 1.09% error at 5 Hz on the Oxford benchmark, and the fastest achieves 1.79% error at 160 Hz.
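The abstract describes registration that minimizes distances between nearby oriented surface points under a robust loss. The following is a minimal illustrative sketch of that idea, not the authors' CFEAR implementation: a 2D point-to-line Gauss-Newton registration with Huber (IRLS) weighting, with index-aligned correspondences assumed for brevity where the real pipeline uses one-to-many association over multiple keyframes.

```python
# Illustrative sketch only: robust point-to-line registration in 2D.
# All function names and parameters here are assumptions for illustration.
import numpy as np

def huber_weight(r, k=0.5):
    """IRLS weight for the Huber loss: 1 for |r| <= k, k/|r| outside."""
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))

def register(src, dst, dst_normals, iters=20):
    """Estimate (theta, tx, ty) aligning src to dst by minimizing robust
    point-to-line distances. src, dst, dst_normals are (N, 2) arrays;
    correspondences are assumed index-aligned for simplicity."""
    theta, t = 0.0, np.zeros(2)
    for _ in range(iters):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        p = src @ R.T + t                        # transformed source points
        r = np.sum((p - dst) * dst_normals, 1)   # signed point-to-line residuals
        w = huber_weight(r)                      # robust down-weighting of outliers
        # Jacobian of each residual wrt (theta, tx, ty)
        dp_dtheta = src @ np.array([[-s, -c], [c, -s]]).T
        J = np.column_stack([np.sum(dp_dtheta * dst_normals, 1), dst_normals])
        H = J.T @ (w[:, None] * J)               # weighted Gauss-Newton system
        g = J.T @ (w * r)
        delta = np.linalg.solve(H + 1e-9 * np.eye(3), -g)
        theta += delta[0]
        t += delta[1:]
    return theta, t
```

Because each residual is projected onto the surface normal, points on a wall constrain motion only perpendicular to that wall, which is what makes oriented surface points tolerant to the sparse, uneven sampling of a spinning radar.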

Place, publisher, year, edition, pages
IEEE, 2023. Vol. 39, no. 2, pp. 1476-1495
Keywords [en]
Radar, Sensors, Spinning, Azimuth, Simultaneous localization and mapping, Estimation, Location awareness, Localization, radar odometry, range sensing, SLAM
National Category
Computer Sciences; Computer Vision and Robotics (Autonomous Systems); Robotics
Research subject
Computer and Systems Science; Computer Science
Identifiers
URN: urn:nbn:se:oru:diva-103116
DOI: 10.1109/tro.2022.3221302
ISI: 000912778500001
Scopus ID: 2-s2.0-85144032264
OAI: oai:DiVA.org:oru-103116
DiVA id: diva2:1727222
Available from: 2023-01-16. Created: 2023-01-16. Last updated: 2023-10-18.
In thesis
1. Robust large-scale mapping and localization: Combining robust sensing and introspection
2023 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

The presence of autonomous systems is rapidly increasing in society and industry. To achieve successful, efficient, and safe deployment, these systems must rely on highly robust localization. They need to localize accurately and efficiently in real time, under adverse environmental conditions, and in diverse, previously unseen environments.

This thesis focuses on investigating methods to achieve robust large-scale localization and mapping, incorporating robustness at multiple stages. Specifically, the research explores methods with sensory robustness, utilizing radar, which exhibits tolerance to harsh weather, dust, and variations in lighting conditions. Furthermore, the thesis presents methods with algorithmic robustness, which prevent failures by incorporating introspective awareness of localization quality. This thesis aims to answer the following research questions:

1. How can radar data be efficiently filtered and represented for robust radar odometry?
2. How can accurate and robust odometry be achieved with radar?
3. How can localization quality be assessed and leveraged for robust detection of localization failures?
4. How can self-awareness of localization quality be utilized to enhance the robustness of a localization system?

While addressing these research questions, this thesis makes the following contributions to large-scale localization and mapping: A method for robust and efficient radar processing and state-of-the-art odometry estimation, and a method for self-assessment of localization quality and failure detection in lidar and radar localization. Self-assessment of localization quality is integrated into robust systems for large-scale Simultaneous Localization And Mapping, and rapid global localization in prior maps. These systems leverage self-assessment of localization quality to improve performance and prevent failures in loop closure and global localization, and consequently achieve safe robot localization.

The methods presented in this thesis were evaluated through comparative assessments on public benchmarks and on real-world data collected in various industrial scenarios. These evaluations serve to validate the effectiveness and reliability of the proposed approaches. As a result, this research represents a significant advancement toward achieving highly robust localization capabilities with broad applicability.
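The thesis describes self-assessment of localization quality used to detect failures before they propagate into loop closure or global localization. As a minimal illustration of that introspective idea (this is an assumed toy check, not the thesis's actual assessment method), one can score an alignment by the fraction of registered points that land close to the reference map and flag a possible failure when the score drops:

```python
# Toy introspection sketch; alignment_quality, is_localized, and all
# thresholds are illustrative assumptions, not the thesis's method.
import numpy as np

def alignment_quality(aligned, ref, inlier_dist=0.3):
    """Fraction of aligned points whose nearest reference point lies
    within inlier_dist (brute-force nearest neighbour for clarity)."""
    d = np.linalg.norm(aligned[:, None, :] - ref[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) < inlier_dist))

def is_localized(aligned, ref, threshold=0.8):
    """Flag the pose estimate as trustworthy only when most points agree
    with the prior map; otherwise a failure handler should take over."""
    return alignment_quality(aligned, ref) >= threshold
```

The value of such a check is less in the score itself than in how it is used: a SLAM or global-localization system can reject candidate loop closures, or refuse to commit to a map pose, whenever the introspective score signals a likely misalignment.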

Place, publisher, year, edition, pages
Örebro: Örebro University, 2023. p. 72
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 100
Keywords
SLAM, Localization, Robustness, Radar
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-107548
ISBN: 9789175295244
Public defence
2023-10-31, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:00 (English)
Available from: 2023-08-15. Created: 2023-08-15. Last updated: 2024-01-19. Bibliographically approved.

Open Access in DiVA

Lidar-level localization with radar? The CFEAR approach to accurate, fast and robust large-scale radar odometry in diverse environments (14299 kB), 207 downloads
File information
File name: FULLTEXT01.pdf
File size: 14299 kB
Checksum (SHA-512): 0375a2fa68cc6bcaee5e062484d83e2855659d6a3dbb7ad8ea9ca0ff54e9fe5d948602cfe73c28b0202357a9ff901bcd1c213ad40ff739f1087e450d6e888c03
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus
Free full text in arXiv

Authority records

Adolfsson, Daniel; Magnusson, Martin; Lilienthal, Achim; Andreasson, Henrik

Search in DiVA

By author/editor
Adolfsson, Daniel; Magnusson, Martin; Alhashimi, Anas; Lilienthal, Achim; Andreasson, Henrik
By organisation
School of Science and Technology
In the same journal
IEEE Transactions on Robotics
Computer Sciences; Computer Vision and Robotics (Autonomous Systems); Robotics

Search outside of DiVA

Google
Google Scholar
Total: 207 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 510 hits