Örebro University Publications
Publications (10 of 14)
Adolfsson, D., Magnusson, M., Alhashimi, A., Lilienthal, A. & Andreasson, H. (2023). Lidar-Level Localization With Radar? The CFEAR Approach to Accurate, Fast, and Robust Large-Scale Radar Odometry in Diverse Environments. IEEE Transactions on Robotics, 39(2), 1476-1495.
2023 (English). In: IEEE Transactions on Robotics, ISSN 1552-3098, E-ISSN 1941-0468, Vol. 39, no. 2, p. 1476-1495. Article in journal (Refereed), Published
Abstract [en]

This article presents an accurate, highly efficient, and learning-free method for large-scale odometry estimation using spinning radar, empirically found to generalize well across very diverse environments—outdoors, from urban to woodland, and indoors in warehouses and mines—without changing parameters. Our method integrates motion compensation within a sweep with one-to-many scan registration that minimizes distances between nearby oriented surface points and mitigates outliers with a robust loss function. Extending our previous approach conservative filtering for efficient and accurate radar odometry (CFEAR), we present an in-depth investigation on a wider range of datasets, quantifying the importance of filtering, resolution, registration cost and loss functions, keyframe history, and motion compensation. We present a new solving strategy and configuration that overcomes previous issues with sparsity and bias, and improves our state-of-the-art by 38%, thus, surprisingly, outperforming radar simultaneous localization and mapping (SLAM) and approaching lidar SLAM. The most accurate configuration achieves 1.09% error at 5 Hz on the Oxford benchmark, and the fastest achieves 1.79% error at 160 Hz.
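
As a rough illustration of the registration step described above, here is a minimal, hypothetical Python sketch (not the authors' implementation) of minimizing point-to-line distances between oriented surface points under a robust Huber loss with scipy. Correspondences are assumed to be given, and the one-to-many keyframe registration and in-sweep motion compensation are omitted; the function names and the f_scale value are illustrative.

import numpy as np
from scipy.optimize import least_squares

def se2(x):
    # 2D pose x = [tx, ty, theta] -> rotation matrix and translation vector
    c, s = np.cos(x[2]), np.sin(x[2])
    return np.array([[c, -s], [s, c]]), x[:2]

def residuals(x, src, tgt, tgt_normals):
    # Point-to-line residual: signed distance along the target surface normal
    R, t = se2(x)
    return np.sum(((src @ R.T + t) - tgt) * tgt_normals, axis=1)

def register(src, tgt, tgt_normals, x0=np.zeros(3)):
    # The Huber loss down-weights outlier correspondences
    sol = least_squares(residuals, x0, args=(src, tgt, tgt_normals),
                        loss='huber', f_scale=0.1)
    return sol.x  # estimated [tx, ty, theta]

In the paper, each new scan is registered jointly against a history of keyframes (one-to-many), which further reduces drift compared to registering against a single scan.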

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
Radar, Sensors, Spinning, Azimuth, Simultaneous localization and mapping, Estimation, Location awareness, Localization, radar odometry, range sensing, SLAM
National Category
Computer Sciences; Computer Vision and Robotics (Autonomous Systems); Robotics
Research subject
Computer and Systems Science; Computer Science
Identifiers
urn:nbn:se:oru:diva-103116 (URN); 10.1109/tro.2022.3221302 (DOI); 000912778500001 (); 2-s2.0-85144032264 (Scopus ID)
Available from: 2023-01-16. Created: 2023-01-16. Last updated: 2023-10-18.
Adolfsson, D. (2023). Robust large-scale mapping and localization: Combining robust sensing and introspection. (Doctoral dissertation). Örebro: Örebro University
2023 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The presence of autonomous systems is rapidly increasing in society and industry. To achieve successful, efficient, and safe deployment, autonomous systems must be navigated by highly robust localization systems. Additionally, these systems need to localize accurately and efficiently in real time, under adverse environmental conditions, and in highly diverse, previously unseen environments.

This thesis focuses on investigating methods to achieve robust large-scale localization and mapping, incorporating robustness at multiple stages. Specifically, the research explores methods with sensory robustness, utilizing radar, which exhibits tolerance to harsh weather, dust, and variations in lighting conditions. Furthermore, the thesis presents methods with algorithmic robustness, which prevent failures by incorporating introspective awareness of localization quality. This thesis aims to answer the following research questions:

How can radar data be efficiently filtered and represented for robust radar odometry?
How can accurate and robust odometry be achieved with radar?
How can localization quality be assessed and leveraged for robust detection of localization failures?
How can self-awareness of localization quality be utilized to enhance the robustness of a localization system?

While addressing these research questions, this thesis makes the following contributions to large-scale localization and mapping: A method for robust and efficient radar processing and state-of-the-art odometry estimation, and a method for self-assessment of localization quality and failure detection in lidar and radar localization. Self-assessment of localization quality is integrated into robust systems for large-scale Simultaneous Localization And Mapping, and rapid global localization in prior maps. These systems leverage self-assessment of localization quality to improve performance and prevent failures in loop closure and global localization, and consequently achieve safe robot localization.

The methods presented in this thesis were evaluated through comparative assessments on public benchmarks and on real-world data collected in various industrial scenarios. These evaluations serve to validate the effectiveness and reliability of the proposed approaches. As a result, this research represents a significant advancement toward highly robust localization capabilities with broad applicability.

Place, publisher, year, edition, pages
Örebro: Örebro University, 2023. p. 72
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 100
Keywords
SLAM, Localization, Robustness, Radar
National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-107548 (URN); 9789175295244 (ISBN)
Public defence
2023-10-31, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:00 (English)
Available from: 2023-08-15. Created: 2023-08-15. Last updated: 2024-01-19. Bibliographically approved.
Adolfsson, D., Karlsson, M., Kubelka, V., Magnusson, M. & Andreasson, H. (2023). TBV Radar SLAM - Trust but Verify Loop Candidates. IEEE Robotics and Automation Letters, 8(6), 3613-3620
2023 (English). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 8, no. 6, p. 3613-3620. Article in journal (Refereed), Published
Abstract [en]

Robust SLAM in large-scale environments requires fault resilience and awareness at multiple stages, from sensing and odometry estimation to loop closure. In this work, we present TBV (Trust But Verify) Radar SLAM, a method for radar SLAM that introspectively verifies loop closure candidates. TBV Radar SLAM achieves a high correct-loop-retrieval rate by combining multiple place-recognition techniques: tightly coupled place similarity and odometry uncertainty search, creating loop descriptors from origin-shifted scans, and delaying loop selection until after verification. Robustness to false constraints is achieved by carefully verifying and selecting the most likely constraints from multiple candidates. Importantly, verification and selection are carried out after registration, when additional sources of loop evidence can easily be computed. We integrate our loop retrieval and verification method with a robust odometry pipeline within a pose graph framework. In evaluations on public benchmarks, TBV Radar SLAM achieves 65% lower error than the previous state of the art. We also show that it generalizes across environments without needing to change any parameters. The open-source implementation is available at https://github.com/dan11003/tbv_slam_public.
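
The "trust but verify" idea can be outlined in a short sketch. The following hypothetical Python (not the released implementation) shows the delayed-selection logic only: register and verify stand in for the paper's scan registration and post-registration loop-evidence scoring, and tau is an assumed acceptance threshold.

def select_loop_closure(candidates, register, verify, tau):
    """Delay loop selection until after registration-based verification:
    register every candidate pair, score it with loop evidence computed
    after alignment, and keep only the best candidate above the threshold."""
    best = None
    for query_scan, map_scan in candidates:
        pose = register(query_scan, map_scan)       # align the candidate pair
        score = verify(query_scan, map_scan, pose)  # evidence available only after registration
        if score >= tau and (best is None or score > best[2]):
            best = (query_scan, map_scan, score, pose)
    return best  # None means no verified loop; no constraint is added to the pose graph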

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
SLAM, localization, radar, introspection
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-106249 (URN); 10.1109/LRA.2023.3268040 (DOI); 000981889200013 (); 2-s2.0-85153499426 (Scopus ID)
Funder
Vinnova, 2021-04714, 2019-05878
Available from: 2023-06-13. Created: 2023-06-13. Last updated: 2024-01-17. Bibliographically approved.
Molina, S., Mannucci, A., Magnusson, M., Adolfsson, D., Andreasson, H., Hamad, M., . . . Lilienthal, A. J. (2023). The ILIAD Safety Stack: Human-Aware Infrastructure-Free Navigation of Industrial Mobile Robots. IEEE Robotics & Automation Magazine.
2023 (English). In: IEEE Robotics & Automation Magazine, ISSN 1070-9932, E-ISSN 1558-223X. Article in journal (Refereed), Epub ahead of print
Abstract [en]

Current intralogistics services require keeping up with e-commerce demands, reducing delivery times and waste, and increasing overall flexibility. As a consequence, the use of automated guided vehicles (AGVs) and, more recently, autonomous mobile robots (AMRs) for logistics operations is steadily increasing.

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
Robots, Safety, Navigation, Mobile robots, Human-robot interaction, Hidden Markov models, Trajectory
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-108145 (URN); 10.1109/MRA.2023.3296983 (DOI); 001051249900001 ()
Funder
EU, Horizon 2020, 732737
Available from: 2023-09-14. Created: 2023-09-14. Last updated: 2024-01-02. Bibliographically approved.
Adolfsson, D., Castellano-Quero, M., Magnusson, M., Lilienthal, A. J. & Andreasson, H. (2022). CorAl: Introspection for robust radar and lidar perception in diverse environments using differential entropy. Robotics and Autonomous Systems, 155, Article ID 104136.
2022 (English). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 155, article id 104136. Article in journal (Refereed), Published
Abstract [en]

Robust perception is an essential component for enabling long-term operation of mobile robots. It depends on failure resilience, through reliable sensor data and pre-processing, as well as failure awareness, through introspection, for example, the ability to self-assess localization performance. This paper presents CorAl: a principled, intuitive, and generalizable method to measure the quality of alignment between pairs of point clouds, which learns to detect alignment errors in a self-supervised manner. CorAl compares the differential entropy in the point clouds separately with the entropy in their union, to account for entropy inherent to the scene. By making use of dual entropy measurements, we obtain a quality metric that is highly sensitive to small alignment errors and still generalizes well to unseen environments. In this work, we extend our previous work on lidar-only CorAl to radar data by proposing a two-step filtering technique that produces high-quality point clouds from noisy radar scans. Thus, we target robust perception in two ways: by introducing a method that introspectively assesses alignment quality, and by applying it to an inherently robust sensor modality. We show that our filtering technique combined with CorAl can be applied to the problem of alignment classification, and that it detects small alignment errors in urban settings with up to 98% accuracy, and up to 96% accuracy when trained only in a different environment. Our lidar and radar experiments demonstrate that CorAl outperforms previous methods both on the ETH lidar benchmark, which includes several indoor and outdoor environments, and on the large-scale Oxford and MulRan radar data sets for urban traffic scenarios. The results also demonstrate that CorAl generalizes very well across substantially different environments without the need for retraining.
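
The dual entropy measurement can be sketched in a few lines. Below is a minimal, hypothetical Python version (not the paper's implementation): each point contributes the differential entropy of a Gaussian fit to its local neighborhood, h = (1/2) ln((2*pi*e)^d det(Sigma)), and the score compares the joint cloud against the separate clouds. The neighborhood radius is an assumed parameter, and the paper's multi-scale analysis, learned classifier, and radar filtering are omitted.

import numpy as np
from scipy.spatial import cKDTree

def mean_local_entropy(points, radius):
    # Differential entropy of a Gaussian fit to each point's neighborhood
    tree = cKDTree(points)
    d = points.shape[1]
    entropies = []
    for p in points:
        idx = tree.query_ball_point(p, radius)
        if len(idx) <= d:
            continue  # too few neighbors for a full-rank covariance
        det = np.linalg.det(np.cov(points[idx].T))
        if det > 0:
            entropies.append(0.5 * np.log((2 * np.pi * np.e) ** d * det))
    return np.mean(entropies)

def coral_score(cloud_a, cloud_b, radius=0.5):
    # The joint entropy should not exceed the separate entropy by much
    # when the two clouds are correctly aligned
    h_joint = mean_local_entropy(np.vstack([cloud_a, cloud_b]), radius)
    h_sep = 0.5 * (mean_local_entropy(cloud_a, radius) +
                   mean_local_entropy(cloud_b, radius))
    return h_joint - h_sep  # a larger difference suggests misalignment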

Place, publisher, year, edition, pages
Elsevier, 2022
Keywords
Radar, Introspection, Localization
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-100756 (URN); 10.1016/j.robot.2022.104136 (DOI); 000833416900001 (); 2-s2.0-85132693467 (Scopus ID)
Funder
Knowledge Foundation; European Commission, 101017274; Vinnova, 2019-05878
Available from: 2022-08-24. Created: 2022-08-24. Last updated: 2024-01-02. Bibliographically approved.
Alhashimi, A., Adolfsson, D., Magnusson, M., Andreasson, H. & Lilienthal, A. (2021). BFAR – Bounded False Alarm Rate detector for improved radar odometry estimation. Paper presented at ICRA.
2021 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents a new detector for filtering noise from true detections in radar data, which improves the state of the art in radar odometry. Scanning Frequency-Modulated Continuous Wave (FMCW) radars can be useful for localisation and mapping in low visibility, but return much more noise than the (more commonly used) lidar, which makes the detection task more challenging. Our Bounded False-Alarm Rate (BFAR) detector differs from the classical Constant False-Alarm Rate (CFAR) detector in that it applies an affine transformation to the estimated noise level, after which the parameters that minimize the estimation error can be learned. BFAR is an optimized combination of CFAR and fixed-level thresholding. Only a single parameter needs to be learned from a training dataset. We apply BFAR to the use case of radar odometry, and adapt a state-of-the-art odometry pipeline (CFEAR), replacing its original conservative filtering with BFAR. In this way, we reduce the state-of-the-art translation/rotation odometry errors from 1.76%/0.5°/100 m to 1.55%/0.46°/100 m, an improvement of 12.5%.
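
As a rough sketch of the idea (hypothetical Python, not the authors' code): a cell-averaging CFAR detector thresholds each cell at a multiple a*z of the locally estimated noise level z, whereas BFAR thresholds at the affine transformation a*z + b. The window, guard, a, and b values below are illustrative; in the paper the parameters are learned from training data.

import numpy as np

def bfar_detect(power, window=20, guard=2, a=1.2, b=0.05):
    # Detections along one azimuth of radar power returns.
    # CFAR:        power[i] > a * z
    # Fixed level: power[i] > b
    # BFAR:        power[i] > a * z + b  (affine transform of the noise estimate z)
    n = len(power)
    keep = np.zeros(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        cells = np.r_[power[lo:max(lo, i - guard)],
                      power[min(hi, i + guard + 1):hi]]  # exclude guard cells
        if cells.size:
            keep[i] = power[i] > a * cells.mean() + b
    return keep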

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-108800 (URN)
Conference
ICRA
Funder
Knowledge Foundation
Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2024-01-02. Bibliographically approved.
Adolfsson, D., Magnusson, M., Alhashimi, A., Lilienthal, A. & Andreasson, H. (2021). CFEAR Radarodometry - Conservative Filtering for Efficient and Accurate Radar Odometry. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021). Paper presented at IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), Prague, Czech Republic (Online Conference), September 27 - October 1, 2021 (pp. 5462-5469). IEEE.
2021 (English). In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), IEEE, 2021, p. 5462-5469. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents CFEAR Radarodometry, an accurate, highly efficient, and learning-free method for large-scale radar odometry estimation. By using a filtering technique that keeps the k strongest returns per azimuth, and by additionally filtering the radar data in Cartesian space, we are able to compute a sparse set of oriented surface points for efficient and accurate scan matching. Registration is carried out by minimizing a point-to-line metric, and robustness to outliers is achieved using a Huber loss. We were able to further reduce drift by jointly registering the latest scan to a history of keyframes, and found that our odometry method generalizes to different sensor models and datasets without changing a single parameter. We evaluate our method in three widely different environments and demonstrate an improvement over the spatially cross-validated state of the art, with an overall translation error of 1.76% on a public urban radar odometry benchmark, running at 55 Hz on a single laptop CPU thread.
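
The k-strongest filtering step can be sketched as follows (hypothetical Python, not the authors' code). The values of k and the minimum-power floor z_min are illustrative, and the additional Cartesian-space filtering mentioned above is omitted.

import numpy as np

def k_strongest_filter(polar, k=12, z_min=60.0):
    # polar: intensity image with one row per azimuth, one column per range bin.
    # Keep, per azimuth, the k strongest returns that exceed the power floor.
    keep = np.zeros(polar.shape, dtype=bool)
    for a, row in enumerate(polar):
        idx = np.argsort(row)[-k:]      # indices of the k strongest range bins
        idx = idx[row[idx] > z_min]     # discard returns below the floor
        keep[a, idx] = True
    return keep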

Place, publisher, year, edition, pages
IEEE, 2021
Series
IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858, E-ISSN 2153-0866
Keywords
Localization, SLAM, Mapping, Radar
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-94463 (URN); 10.1109/IROS51168.2021.9636253 (DOI); 000755125504051 (); 9781665417143 (ISBN); 9781665417150 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), Prague, Czech Republic, (Online Conference), September 27 - October 1, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737
Available from: 2021-09-20. Created: 2021-09-20. Last updated: 2024-01-02. Bibliographically approved.
Adolfsson, D., Magnusson, M., Liao, Q., Lilienthal, A. & Andreasson, H. (2021). CorAl – Are the point clouds Correctly Aligned? In: 10th European Conference on Mobile Robots (ECMR 2021). Paper presented at 10th European Conference on Mobile Robots (ECMR 2021), Bonn, Germany (Online Conference), August 31 - September 3, 2021. IEEE, Vol. 10.
2021 (English). In: 10th European Conference on Mobile Robots (ECMR 2021), IEEE, 2021, Vol. 10. Conference paper, Published paper (Refereed)
Abstract [en]

In robotics perception, numerous tasks rely on point cloud registration. However, there is currently no method that can reliably detect misaligned point clouds automatically and without environment-specific parameters. We propose CorAl, an alignment quality measure and alignment classifier for point cloud pairs, which makes it possible to introspectively assess registration performance. CorAl compares the joint and the separate entropy of the two point clouds. The separate entropy provides a measure of the entropy that can be expected to be inherent to the environment; the joint entropy should therefore not be substantially higher if the point clouds are properly aligned. Computing the expected entropy makes the method sensitive also to small alignment errors, which are particularly hard to detect, and applicable in a range of different environments. We found that CorAl is able to detect small alignment errors in previously unseen environments with an accuracy of 95%, a substantial improvement over previous methods.
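
A sketch of how the self-supervised training data could be generated (hypothetical Python, not the paper's code): pairs aligned with ground truth serve as positives, and copies perturbed by small random 2D transforms serve as negatives. The perturbation magnitudes are assumed values, chosen small because small errors are the hardest cases to detect.

import numpy as np

def make_alignment_labels(scan_pairs, max_offset=0.2, max_angle=0.05, seed=0):
    # scan_pairs: list of (cloud_a, cloud_b) arrays already aligned to ground truth
    rng = np.random.default_rng(seed)
    samples = []
    for a, b in scan_pairs:
        samples.append((a, b, 1))                   # aligned pair -> positive label
        th = rng.uniform(-max_angle, max_angle)
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        t = rng.uniform(-max_offset, max_offset, size=2)
        samples.append((a, b @ R.T + t, 0))         # perturbed pair -> negative label
    return samples

An alignment classifier can then be trained to separate the two classes using the entropy-based quality measure as input.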

Place, publisher, year, edition, pages
IEEE, 2021
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-94464 (URN); 10.1109/ECMR50962.2021.9568846 (DOI); 000810510000059 ()
Conference
10th European Conference on Mobile Robots (ECMR 2021), Bonn, Germany, (Online Conference), August 31 - September 3, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737, 101017274
Available from: 2021-09-22. Created: 2021-09-22. Last updated: 2024-01-02. Bibliographically approved.
Zhou, Z., Zhao, C., Adolfsson, D., Su, S., Gao, Y., Duckett, T. & Sun, L. (2021). NDT-Transformer: Large-Scale 3D Point Cloud Localisation using the Normal Distribution Transform Representation. In: 2021 IEEE International Conference on Robotics and Automation (ICRA). Paper presented at 2021 IEEE International Conference on Robotics and Automation (ICRA 2021), Xi'an, China, May 30 - June 5, 2021. IEEE.
2021 (English). In: 2021 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021. Conference paper, Published paper (Refereed)
Abstract [en]

3D point cloud-based place recognition is in high demand for autonomous driving in GPS-challenged environments and serves as an essential component (i.e., loop-closure detection) in lidar-based SLAM systems. This paper proposes a novel approach, named NDT-Transformer, for real-time and large-scale place recognition using 3D point clouds. Specifically, a 3D Normal Distribution Transform (NDT) representation is employed to condense the raw, dense 3D point cloud into probabilistic distributions (NDT cells) that provide a geometrical shape description. A novel NDT-Transformer network then learns a global descriptor from a set of 3D NDT cell representations. Benefiting from the NDT representation and the NDT-Transformer network, the learned global descriptors are enriched with both geometrical and contextual information. Finally, descriptors are retrieved from a database for place recognition. Compared to state-of-the-art methods, the proposed approach achieves an improvement of 7.52% in average top-1 recall and 2.73% in average top-1% recall on the Oxford RobotCar benchmark.
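
A minimal sketch of the NDT condensation step (hypothetical Python, not the paper's pipeline): points are binned into voxels, and each sufficiently populated voxel is summarized by the mean and covariance of its points. The cell_size and min_pts values are assumptions, and the transformer network that consumes the cells is omitted.

import numpy as np

def ndt_cells(points, cell_size=1.0, min_pts=5):
    # Bin 3D points into voxels and fit a Gaussian (mean, covariance) per voxel
    buckets = {}
    for key, p in zip(map(tuple, np.floor(points / cell_size).astype(int)), points):
        buckets.setdefault(key, []).append(p)
    cells = []
    for pts in buckets.values():
        pts = np.asarray(pts)
        if len(pts) >= min_pts:  # enough points for a stable covariance estimate
            cells.append((pts.mean(axis=0), np.cov(pts.T)))
    return cells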

Place, publisher, year, edition, pages
IEEE, 2021
Series
IEEE International Conference on Robotics and Automation (ICRA), ISSN 1050-4729, E-ISSN 2577-087X
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-96652 (URN); 10.1109/ICRA48506.2021.9560932 (DOI); 000765738804041 (); 2-s2.0-85124680724 (Scopus ID); 9781728190778 (ISBN); 9781728190785 (ISBN)
Conference
2021 IEEE International Conference on Robotics and Automation (ICRA 2021), Xi'an, China, May 30 - June 5, 2021
Funder
EU, Horizon 2020, 732737
Note
Funding agencies: UK Research & Innovation (UKRI); Engineering & Physical Sciences Research Council (EPSRC) EP/R026092/1; Royal Society of London; European Commission RGS202432
Available from: 2022-01-24. Created: 2022-01-24. Last updated: 2024-01-02. Bibliographically approved.
Adolfsson, D., Magnusson, M., Alhashimi, A., Lilienthal, A. & Andreasson, H. (2021). Oriented surface points for efficient and accurate radar odometry. Paper presented at Radar Perception for All-Weather Autonomy - Half-Day Workshop at 2021 IEEE International Conference on Robotics and Automation (ICRA 2021), Xi'an, China, May 30 - June 5, 2021.
2021 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents an efficient and accurate radar odometry pipeline for large-scale localization. We propose a radar filter that keeps only the strongest reflections per azimuth that exceed the expected noise level. The filtered radar data are used to incrementally estimate odometry by registering the current scan with a nearby keyframe. By modeling local surfaces, we are able to register scans by minimizing a point-to-line metric and to accurately estimate odometry from sparse point sets, hence improving efficiency. Specifically, we found that a point-to-line metric yields significant improvements compared to a point-to-point metric when matching sparse sets of surface points. Preliminary results from an urban odometry benchmark show that our odometry pipeline is accurate and efficient compared to existing methods, with an overall translation error of 2.05%, down from 2.78% for the previously best published method, running at 12.5 ms per frame without the need for environment-specific training.
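
The local surface modeling can be sketched as follows (hypothetical Python, not the authors' code): for each filtered point, a normal is taken as the smallest-eigenvalue direction of the neighborhood covariance, yielding an oriented surface point. The neighborhood radius and minimum neighbor count are assumed values.

import numpy as np
from scipy.spatial import cKDTree

def oriented_surface_points(points, radius=3.0, min_neighbors=6):
    # 2D points from a filtered radar scan -> list of (surface point, unit normal)
    tree = cKDTree(points)
    surface = []
    for p in points:
        idx = tree.query_ball_point(p, radius)
        if len(idx) < min_neighbors:
            continue
        nbrs = points[idx]
        w, v = np.linalg.eigh(np.cov(nbrs.T))  # eigenvalues in ascending order
        surface.append((nbrs.mean(axis=0), v[:, 0]))  # normal = weakest direction
    return surface

Registration then minimizes the point-to-line metric between such oriented points, as in the sketch after the first abstract above.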

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-108799 (URN)
Conference
Radar Perception for All-Weather Autonomy - Half-Day Workshop at 2021 IEEE International Conference on Robotics and Automation (ICRA 2021), Xi'an, China, May 30 - June 5, 2021
Funder
Knowledge FoundationEU, Horizon 2020, 732737
Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2024-01-02. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-2504-2488