Örebro University Publications
Publications (10 of 16)
Alhashimi, A., Adolfsson, D., Andreasson, H., Lilienthal, A. & Magnusson, M. (2024). BFAR: improving radar odometry estimation using a bounded false alarm rate detector. Autonomous Robots, 48(8), Article ID 29.
BFAR: improving radar odometry estimation using a bounded false alarm rate detector
2024 (English). In: Autonomous Robots, ISSN 0929-5593, E-ISSN 1573-7527, Vol. 48, no. 8, article id 29. Article in journal (Refereed). Published.
Abstract [en]

This work introduces a novel detector, bounded false-alarm rate (BFAR), for distinguishing true detections from noise in radar data, leading to improved accuracy in radar odometry estimation. Scanning frequency-modulated continuous wave (FMCW) radars can serve as valuable tools for localization and mapping under low-visibility conditions. However, they tend to yield a higher level of noise than the more commonly employed lidars, introducing additional challenges to the detection process. We propose a new radar target detector called BFAR, which applies an affine transformation to the estimated noise level, in contrast to the classical constant false-alarm rate (CFAR) detector. This transformation employs learned parameters that minimize the error in odometry estimation. Conceptually, BFAR can be viewed as an optimized blend of CFAR and fixed-level thresholding designed to minimize odometry estimation error. The strength of this approach lies in its simplicity: when the scale parameter of the affine transformation is kept fixed, only a single parameter needs to be learned from a training dataset. Compared to ad hoc detectors, BFAR has the advantage of a specified upper bound on the false-alarm probability, and it handles noise better than CFAR. Repeatability tests show that BFAR yields highly repeatable detections with minimal redundancy. We have conducted simulations comparing the detection and false-alarm probabilities of BFAR with those of three baselines under non-homogeneous noise and varying target sizes. The results show that BFAR outperforms the other detectors. Moreover, we apply BFAR to the use case of radar odometry, adapting a recent odometry pipeline by replacing its original conservative filtering with BFAR. In this way, we reduce the translation/rotation odometry errors per 100 m from 1.3%/0.4° to 1.12%/0.38°, and from 1.62%/0.57° to 1.21%/0.32°, improving translation error by 14.2% and 25% on the public Oxford and MulRan datasets, respectively.
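As a rough illustration of the idea in the abstract, the sketch below contrasts classical cell-averaging CFAR, which thresholds each cell at a scaled estimate of the local noise level (a * Z), with BFAR, which applies an affine transformation (a * Z + b). The window sizes and parameter values here are illustrative placeholders, not the learned values from the paper.

```python
import numpy as np

def bfar_detect(power, num_train=20, num_guard=4, a=1.0, b=0.1):
    """Bounded false-alarm rate (BFAR) detection along one radar azimuth.

    Cell-averaging CFAR thresholds each cell under test at a scaled
    estimate of the local noise level (threshold = a * Z).  BFAR instead
    applies an affine transformation (threshold = a * Z + b), blending
    CFAR with fixed-level thresholding.  A minimal sketch with
    illustrative parameters, not the paper's implementation.
    """
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(n):
        # Training cells on both sides of the cell under test,
        # excluding guard cells adjacent to it.
        lo = max(0, i - num_guard - num_train)
        hi = min(n, i + num_guard + num_train + 1)
        train = np.concatenate([power[lo:max(0, i - num_guard)],
                                power[min(n, i + num_guard + 1):hi]])
        if train.size == 0:
            continue
        z = train.mean()                        # estimated local noise level
        detections[i] = power[i] > a * z + b    # affine threshold (BFAR)
    return detections
```

With b = 0 and a set by a desired false-alarm probability, this reduces to ordinary CA-CFAR; with a = 0 it is fixed-level thresholding.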

Place, publisher, year, edition, pages
Springer, 2024
Keywords
Radar, CFAR, Odometry, FMCW
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-117575 (URN); 10.1007/s10514-024-10176-2 (DOI); 001358908800001 (); 2-s2.0-85209565335 (Scopus ID)
Funder
Örebro University
Available from: 2024-12-05. Created: 2024-12-05. Last updated: 2025-02-07. Bibliographically approved.
Molina, S., Mannucci, A., Magnusson, M., Adolfsson, D., Andreasson, H., Hamad, M., . . . Lilienthal, A. J. (2024). The ILIAD Safety Stack: Human-Aware Infrastructure-Free Navigation of Industrial Mobile Robots. IEEE robotics & automation magazine, 31(3), 48-59
The ILIAD Safety Stack: Human-Aware Infrastructure-Free Navigation of Industrial Mobile Robots
2024 (English). In: IEEE robotics & automation magazine, ISSN 1070-9932, E-ISSN 1558-223X, Vol. 31, no. 3, p. 48-59. Article in journal (Refereed). Published.
Abstract [en]

Current intralogistics services require keeping up with e-commerce demands, reducing delivery times and waste, and increasing overall flexibility. As a consequence, the use of automated guided vehicles (AGVs) and, more recently, autonomous mobile robots (AMRs) for logistics operations is steadily increasing.

Place, publisher, year, edition, pages
IEEE, 2024
Keywords
Robots, Safety, Navigation, Mobile robots, Human-robot interaction, Hidden Markov models, Trajectory
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-108145 (URN); 10.1109/MRA.2023.3296983 (DOI); 001051249900001 (); 2-s2.0-85167792783 (Scopus ID)
Funder
EU, Horizon 2020, 732737
Available from: 2023-09-14. Created: 2023-09-14. Last updated: 2025-02-07. Bibliographically approved.
Hilger, M., Kubelka, V., Adolfsson, D., Andreasson, H. & Lilienthal, A. (2024). Towards introspective loop closure in 4D radar SLAM. Paper presented at Radar in Robotics: Resilience from Signal to Navigation - Full-Day Workshop at 2024 IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan, May 13-17, 2024.
Towards introspective loop closure in 4D radar SLAM
2024 (English). Conference paper, Poster (with or without abstract) (Refereed).
Abstract [en]

Imaging radar is an emerging sensor modality in the context of simultaneous localization and mapping (SLAM), especially suitable for vision-obstructed environments. This article investigates the use of 4D imaging radars for SLAM and analyzes the challenges in robust loop closure. Previous work indicates that 4D radars, together with inertial measurements, offer ample information for accurate odometry estimation. However, the narrow field of view, limited resolution, and sparse and noisy measurements render loop closure a significantly more challenging problem. Our work builds on previous work - TBV SLAM - which was proposed for robust loop closure with 360° spinning radars. This article highlights and addresses challenges inherited from a directional 4D radar, such as sparsity, noise, and reduced field of view, and discusses why the common definition of a loop closure is unsuitable. By combining multiple quality measures for accurate loop closure detection adapted to 4D radar data, significant results in trajectory estimation are achieved; the absolute trajectory error is as low as 0.46 m over a distance of 1.8 km, with consistent operation over multiple environments.

National Category
Robotics and automation
Identifiers
urn:nbn:se:oru:diva-114189 (URN)
Conference
Radar in Robotics: Resilience from Signal to Navigation - Full-Day Workshop at 2024 IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan, May 13-17, 2024
Funder
EU, Horizon 2020, 858101
Available from: 2024-06-12. Created: 2024-06-12. Last updated: 2025-02-09. Bibliographically approved.
Adolfsson, D., Magnusson, M., Alhashimi, A., Lilienthal, A. & Andreasson, H. (2023). Lidar-Level Localization With Radar? The CFEAR Approach to Accurate, Fast, and Robust Large-Scale Radar Odometry in Diverse Environments. IEEE Transactions on robotics, 39(2), 1476-1495
Lidar-Level Localization With Radar? The CFEAR Approach to Accurate, Fast, and Robust Large-Scale Radar Odometry in Diverse Environments
2023 (English). In: IEEE Transactions on robotics, ISSN 1552-3098, E-ISSN 1941-0468, Vol. 39, no. 2, p. 1476-1495. Article in journal (Refereed). Published.
Abstract [en]

This article presents an accurate, highly efficient, and learning-free method for large-scale odometry estimation using spinning radar, empirically found to generalize well across very diverse environments—outdoors, from urban to woodland, and indoors in warehouses and mines—without changing parameters. Our method integrates motion compensation within a sweep with one-to-many scan registration that minimizes distances between nearby oriented surface points and mitigates outliers with a robust loss function. Extending our previous approach, conservative filtering for efficient and accurate radar odometry (CFEAR), we present an in-depth investigation on a wider range of datasets, quantifying the importance of filtering, resolution, registration cost and loss functions, keyframe history, and motion compensation. We present a new solving strategy and configuration that overcomes previous issues with sparsity and bias, and improves on our previous state of the art by 38%, thus, surprisingly, outperforming radar simultaneous localization and mapping (SLAM) and approaching lidar SLAM. The most accurate configuration achieves 1.09% error at 5 Hz on the Oxford benchmark, and the fastest achieves 1.79% error at 160 Hz.
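The k-strongest filtering step mentioned in the abstract can be sketched as follows. The array layout, parameter names, and default values are assumptions for illustration, not the actual CFEAR implementation.

```python
import numpy as np

def k_strongest_filter(scan, k=12, z_min=60.0):
    """Keep the k strongest returns per azimuth (a CFEAR-style filter).

    `scan` is a 2-D polar radar image of shape
    (num_azimuths, num_range_bins) holding return intensities.  For each
    azimuth, the k bins with the highest intensity above a minimum power
    z_min are kept; all other bins are zeroed.  The surviving returns
    would then be converted to Cartesian points and reduced to oriented
    surface points for scan matching.
    """
    filtered = np.zeros_like(scan)
    for a in range(scan.shape[0]):
        row = scan[a]
        # Indices of the k largest intensities in this azimuth.
        idx = np.argpartition(row, -k)[-k:]
        # Discard those below the minimum power threshold.
        keep = idx[row[idx] > z_min]
        filtered[a, keep] = row[keep]
    return filtered
```

The appeal of this filter is that it is parameter-light and yields a bounded, sparse set of points per scan, which keeps the subsequent registration cheap.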

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
Radar, Sensors, Spinning, Azimuth, Simultaneous localization and mapping, Estimation, Location awareness, Localization, radar odometry, range sensing, SLAM
National Category
Computer Sciences; Computer graphics and computer vision; Robotics and automation
Research subject
Computer and Systems Science; Computer Science
Identifiers
urn:nbn:se:oru:diva-103116 (URN); 10.1109/tro.2022.3221302 (DOI); 000912778500001 (); 2-s2.0-85144032264 (Scopus ID)
Available from: 2023-01-16. Created: 2023-01-16. Last updated: 2025-02-05. Bibliographically approved.
Adolfsson, D. (2023). Robust large-scale mapping and localization: Combining robust sensing and introspection. (Doctoral dissertation). Örebro: Örebro University
Robust large-scale mapping and localization: Combining robust sensing and introspection
2023 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

The presence of autonomous systems is rapidly increasing in society and industry. To achieve successful, efficient, and safe deployment, autonomous systems must be navigated by means of highly robust localization systems. Additionally, these systems need to localize accurately and efficiently in real time under adverse environmental conditions, and in considerably diverse, previously unseen environments.

This thesis focuses on investigating methods to achieve robust large-scale localization and mapping, incorporating robustness at multiple stages. Specifically, the research explores methods with sensory robustness, utilizing radar, which exhibits tolerance to harsh weather, dust, and variations in lighting conditions. Furthermore, the thesis presents methods with algorithmic robustness, which prevent failures by incorporating introspective awareness of localization quality. This thesis aims to answer the following research questions:

How can radar data be efficiently filtered and represented for robust radar odometry?
How can accurate and robust odometry be achieved with radar?
How can localization quality be assessed and leveraged for robust detection of localization failures?
How can self-awareness of localization quality be utilized to enhance the robustness of a localization system?

While addressing these research questions, this thesis makes the following contributions to large-scale localization and mapping: A method for robust and efficient radar processing and state-of-the-art odometry estimation, and a method for self-assessment of localization quality and failure detection in lidar and radar localization. Self-assessment of localization quality is integrated into robust systems for large-scale Simultaneous Localization And Mapping, and rapid global localization in prior maps. These systems leverage self-assessment of localization quality to improve performance and prevent failures in loop closure and global localization, and consequently achieve safe robot localization.

The methods presented in this thesis were evaluated through comparative assessments of public benchmarks and real-world data collected from various industrial scenarios. These evaluations serve to validate the effectiveness and reliability of the proposed approaches. As a result, this research represents a significant advancement toward achieving highly robust localization capabilities with broad applicability.

Place, publisher, year, edition, pages
Örebro: Örebro University, 2023. p. 72
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 100
Keywords
SLAM, Localization, Robustness, Radar
National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-107548 (URN); 9789175295244 (ISBN)
Public defence
2023-10-31, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:00 (English)
Available from: 2023-08-15. Created: 2023-08-15. Last updated: 2024-01-19. Bibliographically approved.
Adolfsson, D., Karlsson, M., Kubelka, V., Magnusson, M. & Andreasson, H. (2023). TBV Radar SLAM - Trust but Verify Loop Candidates. IEEE Robotics and Automation Letters, 8(6), 3613-3620
TBV Radar SLAM - Trust but Verify Loop Candidates
2023 (English). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 8, no. 6, p. 3613-3620. Article in journal (Refereed). Published.
Abstract [en]

Robust SLAM in large-scale environments requires fault resilience and awareness at multiple stages, from sensing and odometry estimation to loop closure. In this work, we present TBV (Trust But Verify) Radar SLAM, a method for radar SLAM that introspectively verifies loop closure candidates. TBV Radar SLAM achieves a high correct-loop-retrieval rate by combining multiple place-recognition techniques: tightly coupled place similarity and odometry uncertainty search, creating loop descriptors from origin-shifted scans, and delaying loop selection until after verification. Robustness to false constraints is achieved by carefully verifying and selecting the most likely ones from multiple loop constraints. Importantly, the verification and selection are carried out after registration, when additional sources of loop evidence can easily be computed. We integrate our loop retrieval and verification method with a robust odometry pipeline within a pose graph framework. In evaluations on public benchmarks, we found that TBV Radar SLAM achieves 65% lower error than the previous state of the art. We also show that it generalizes across environments without needing to change any parameters. We provide the open-source implementation at https://github.com/dan11003/tbv_slam_public
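The "verify, then select" idea described above can be sketched as follows. The quality-measure names, the linear scoring, and the threshold are hypothetical placeholders; the actual combination of loop evidence used in the paper may differ.

```python
def select_loop_candidate(candidates, weights, threshold=0.5):
    """Trust-but-verify loop selection (illustrative sketch).

    Each candidate is a dict of quality measures computed *after*
    registration (e.g. alignment quality, place-similarity score,
    odometry consistency).  Rather than accepting the first retrieval,
    all candidates are scored by a weighted combination of measures,
    and the best one is accepted only if it clears a threshold.
    """
    best, best_score = None, float("-inf")
    for cand in candidates:
        # Weighted combination of post-registration quality measures.
        score = sum(weights[m] * cand[m] for m in weights)
        if score > best_score:
            best, best_score = cand, score
    # Delay acceptance until after verification: a weak best candidate
    # is rejected rather than added as a false loop constraint.
    return best if best_score >= threshold else None
```

The key design point is ordering: evidence that requires a registered pair (such as alignment quality) is only available after registration, so selection is deferred until then.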

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
SLAM, localization, radar, introspection
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-106249 (URN); 10.1109/LRA.2023.3268040 (DOI); 000981889200013 (); 2-s2.0-85153499426 (Scopus ID)
Funder
Vinnova, 2021-04714, 2019-05878
Available from: 2023-06-13. Created: 2023-06-13. Last updated: 2025-02-07. Bibliographically approved.
Adolfsson, D., Castellano-Quero, M., Magnusson, M., Lilienthal, A. J. & Andreasson, H. (2022). CorAl: Introspection for robust radar and lidar perception in diverse environments using differential entropy. Robotics and Autonomous Systems, 155, Article ID 104136.
CorAl: Introspection for robust radar and lidar perception in diverse environments using differential entropy
2022 (English). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 155, article id 104136. Article in journal (Refereed). Published.
Abstract [en]

Robust perception is an essential component to enable long-term operation of mobile robots. It depends on failure resilience through reliable sensor data and pre-processing, as well as failure awareness through introspection, for example the ability to self-assess localization performance. This paper presents CorAl: a principled, intuitive, and generalizable method to measure the quality of alignment between pairs of point clouds, which learns to detect alignment errors in a self-supervised manner. CorAl compares the differential entropy in the point clouds separately with the entropy in their union to account for entropy inherent to the scene. By making use of dual entropy measurements, we obtain a quality metric that is highly sensitive to small alignment errors and still generalizes well to unseen environments. In this work, we extend our previous work on lidar-only CorAl to radar data by proposing a two-step filtering technique that produces high-quality point clouds from noisy radar scans. Thus, we target robust perception in two ways: by introducing a method that introspectively assesses alignment quality, and by applying it to an inherently robust sensor modality. We show that our filtering technique combined with CorAl can be applied to the problem of alignment classification, and that it detects small alignment errors in urban settings with up to 98% accuracy, and up to 96% when trained only in a different environment. Our lidar and radar experiments demonstrate that CorAl outperforms previous methods both on the ETH lidar benchmark, which includes several indoor and outdoor environments, and on the large-scale Oxford and MulRan radar data sets for urban traffic scenarios. The results also demonstrate that CorAl generalizes very well across substantially different environments without the need for retraining.

Place, publisher, year, edition, pages
Elsevier, 2022
Keywords
Radar, Introspection, Localization
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-100756 (URN); 10.1016/j.robot.2022.104136 (DOI); 000833416900001 (); 2-s2.0-85132693467 (Scopus ID)
Funder
Knowledge Foundation; European Commission, 101017274; Vinnova, 2019-05878
Available from: 2022-08-24. Created: 2022-08-24. Last updated: 2025-02-07. Bibliographically approved.
Alhashimi, A., Adolfsson, D., Magnusson, M., Andreasson, H. & Lilienthal, A. (2021). BFAR – Bounded False Alarm Rate detector for improved radar odometry estimation. Paper presented at ICRA.
BFAR – Bounded False Alarm Rate detector for improved radar odometry estimation
2021 (English). Conference paper, Published paper (Refereed).
Abstract [en]

This paper presents a new detector for filtering noise from true detections in radar data, which improves the state of the art in radar odometry. Scanning Frequency-Modulated Continuous Wave (FMCW) radars can be useful for localisation and mapping in low visibility, but return considerably more noise than the (more commonly used) lidar, which makes the detection task more challenging. Our Bounded False-Alarm Rate (BFAR) detector differs from the classical Constant False-Alarm Rate (CFAR) detector in that it applies an affine transformation to the estimated noise level, after which the parameters that minimize the estimation error can be learned. BFAR is an optimized combination of CFAR and fixed-level thresholding. Only a single parameter needs to be learned from a training dataset. We apply BFAR to the use case of radar odometry, and adapt a state-of-the-art odometry pipeline (CFEAR), replacing its original conservative filtering with BFAR. In this way we reduce the state-of-the-art translation/rotation odometry errors from 1.76%/0.5° per 100 m to 1.55%/0.46° per 100 m; an improvement of 12.5%.

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-108800 (URN)
Conference
ICRA
Funder
Knowledge Foundation
Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2024-01-02. Bibliographically approved.
Adolfsson, D., Magnusson, M., Alhashimi, A., Lilienthal, A. & Andreasson, H. (2021). CFEAR Radarodometry - Conservative Filtering for Efficient and Accurate Radar Odometry. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021). Paper presented at IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), Prague, Czech Republic, (Online Conference), September 27 - October 1, 2021 (pp. 5462-5469). IEEE
CFEAR Radarodometry - Conservative Filtering for Efficient and Accurate Radar Odometry
2021 (English). In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), IEEE, 2021, p. 5462-5469. Conference paper, Published paper (Refereed).
Abstract [en]

This paper presents CFEAR Radarodometry, an accurate, highly efficient, and learning-free method for large-scale radar odometry estimation. By using a filtering technique that keeps the k strongest returns per azimuth, and by additionally filtering the radar data in Cartesian space, we are able to compute a sparse set of oriented surface points for efficient and accurate scan matching. Registration is carried out by minimizing a point-to-line metric, and robustness to outliers is achieved using a Huber loss. We further reduce drift by jointly registering the latest scan to a history of keyframes, and found that our odometry method generalizes to different sensor models and datasets without changing a single parameter. We evaluate our method in three widely different environments and demonstrate an improvement over the spatially cross-validated state of the art, with an overall translation error of 1.76% on a public urban radar odometry benchmark, running at 55 Hz on a single laptop CPU thread.

Place, publisher, year, edition, pages
IEEE, 2021
Series
IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858, E-ISSN 2153-0866
Keywords
Localization, SLAM, Mapping, Radar
National Category
Computer graphics and computer vision
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-94463 (URN); 10.1109/IROS51168.2021.9636253 (DOI); 000755125504051 (); 9781665417143 (ISBN); 9781665417150 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), Prague, Czech Republic, (Online Conference), September 27 - October 1, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737
Available from: 2021-09-20. Created: 2021-09-20. Last updated: 2025-02-07. Bibliographically approved.
Adolfsson, D., Magnusson, M., Liao, Q., Lilienthal, A. & Andreasson, H. (2021). CorAl – Are the point clouds Correctly Aligned? In: 10th European Conference on Mobile Robots (ECMR 2021). Paper presented at 10th European Conference on Mobile Robots (ECMR 2021), Bonn, Germany, (Online Conference), August 31 - September 3, 2021. IEEE, Vol. 10
CorAl – Are the point clouds Correctly Aligned?
2021 (English). In: 10th European Conference on Mobile Robots (ECMR 2021), IEEE, 2021, Vol. 10. Conference paper, Published paper (Refereed).
Abstract [en]

In robotics perception, numerous tasks rely on point cloud registration. However, there is currently no method that can reliably detect misaligned point clouds automatically and without environment-specific parameters. We propose "CorAl", an alignment quality measure and alignment classifier for point cloud pairs, which makes it possible to introspectively assess the performance of registration. CorAl compares the joint and the separate entropy of the two point clouds. The separate entropy provides a measure of the entropy that can be expected to be inherent to the environment. If the point clouds are properly aligned, the joint entropy should therefore not be substantially higher. Computing the expected entropy makes the method sensitive also to small alignment errors, which are particularly hard to detect, and applicable in a range of different environments. We found that CorAl is able to detect small alignment errors in previously unseen environments with an accuracy of 95%, a substantial improvement over previous methods.

Place, publisher, year, edition, pages
IEEE, 2021
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-94464 (URN); 10.1109/ECMR50962.2021.9568846 (DOI); 000810510000059 ()
Conference
10th European Conference on Mobile Robots (ECMR 2021), Bonn, Germany, (Online Conference), August 31 - September 3, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737, 101017274
Available from: 2021-09-22. Created: 2021-09-22. Last updated: 2025-02-07. Bibliographically approved.
ORCID iD: orcid.org/0000-0003-2504-2488