Örebro University Publications (oru.se)
Robust large-scale mapping and localization: Combining robust sensing and introspection
Örebro University, School of Science and Technology. ORCID iD: 0000-0003-2504-2488
2023 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The presence of autonomous systems in society and industry is rapidly increasing. To deploy autonomous systems successfully, efficiently, and safely, they must be navigated by highly robust localization systems. Additionally, these systems need to localize accurately and efficiently in real time, under adverse environmental conditions, and in diverse, previously unseen environments.

This thesis focuses on investigating methods to achieve robust large-scale localization and mapping, incorporating robustness at multiple stages. Specifically, the research explores methods with sensory robustness, utilizing radar, which exhibits tolerance to harsh weather, dust, and variations in lighting conditions. Furthermore, the thesis presents methods with algorithmic robustness, which prevent failures by incorporating introspective awareness of localization quality. This thesis aims to answer the following research questions:

  • How can radar data be efficiently filtered and represented for robust radar odometry?
  • How can accurate and robust odometry be achieved with radar?
  • How can localization quality be assessed and leveraged for robust detection of localization failures?
  • How can self-awareness of localization quality be utilized to enhance the robustness of a localization system?

While addressing these research questions, this thesis makes the following contributions to large-scale localization and mapping: a method for robust and efficient radar processing and state-of-the-art odometry estimation, and a method for self-assessment of localization quality and failure detection in lidar and radar localization. Self-assessment of localization quality is integrated into robust systems for large-scale Simultaneous Localization And Mapping (SLAM) and for rapid global localization in prior maps. These systems leverage self-assessment of localization quality to improve performance and prevent failures in loop closure and global localization, and consequently achieve safe robot localization.

The methods presented in this thesis were evaluated through comparative assessments on public benchmarks and on real-world data collected from various industrial scenarios. These evaluations validate the effectiveness and reliability of the proposed approaches. As a result, this research represents a significant step toward highly robust localization with broad applicability.

Place, publisher, year, edition, pages
Örebro: Örebro University, 2023, p. 72
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 100
Keywords [en]
SLAM, Localization, Robustness, Radar
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-107548
ISBN: 9789175295244 (print)
OAI: oai:DiVA.org:oru-107548
DiVA, id: diva2:1787996
Public defence
2023-10-31, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:00 (English)
Available from: 2023-08-15. Created: 2023-08-15. Last updated: 2024-01-19. Bibliographically approved
List of papers
1. Oriented surface points for efficient and accurate radar odometry
2021 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents an efficient and accurate radar odometry pipeline for large-scale localization. We propose a radar filter that keeps only the strongest reflections per azimuth that exceed the expected noise level. The filtered radar data is used to incrementally estimate odometry by registering the current scan with a nearby keyframe. By modeling local surfaces, we were able to register scans by minimizing a point-to-line metric and accurately estimate odometry from sparse point sets, hence improving efficiency. Specifically, we found that a point-to-line metric yields significant improvements compared to a point-to-point metric when matching sparse sets of surface points. Preliminary results from an urban odometry benchmark show that our odometry pipeline is accurate and efficient compared to existing methods, with an overall translation error of 2.05%, down from 2.78% for the previously best published method, running at 12.5 ms per frame without the need for environment-specific training.
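The per-azimuth filtering step described above can be sketched as follows. This is an illustrative NumPy toy, not the paper's implementation; `k` and the noise floor `z_min` are placeholder parameters.

```python
import numpy as np

def k_strongest_filter(scan, k=12, z_min=60.0):
    """Keep up to the k strongest returns per azimuth, discarding any
    that do not exceed the expected noise level z_min.

    scan: (num_azimuths, num_range_bins) array of return intensities.
    Returns a boolean mask over scan marking the kept detections.
    """
    mask = np.zeros(scan.shape, dtype=bool)
    for a in range(scan.shape[0]):
        row = scan[a]
        # indices of the k largest intensities in this azimuth
        top = np.argpartition(row, -k)[-k:] if len(row) > k else np.arange(len(row))
        keep = top[row[top] > z_min]  # drop returns at or below the noise floor
        mask[a, keep] = True
    return mask
```

The mask can then be used to extract a sparse point set for registration; keeping only a handful of strong returns per azimuth is what makes the downstream scan matching cheap.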

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-108799 (URN)
Conference
Radar Perception for All-Weather Autonomy - Half-Day Workshop at 2021 IEEE International Conference on Robotics and Automation (ICRA 2021), Xi'an, China, May 30 - June 5, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737
Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2024-01-02. Bibliographically approved
2. CFEAR Radarodometry - Conservative Filtering for Efficient and Accurate Radar Odometry
2021 (English). In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), IEEE, 2021, p. 5462-5469. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents the accurate, highly efficient, and learning-free method CFEAR Radarodometry for large-scale radar odometry estimation. By using a filtering technique that keeps the k strongest returns per azimuth, and by additionally filtering the radar data in Cartesian space, we are able to compute a sparse set of oriented surface points for efficient and accurate scan matching. Registration is carried out by minimizing a point-to-line metric, and robustness to outliers is achieved using a Huber loss. We were able to further reduce drift by jointly registering the latest scan to a history of keyframes, and found that our odometry method generalizes to different sensor models and datasets without changing a single parameter. We evaluate our method in three widely different environments and demonstrate an improvement over the spatially cross-validated state of the art, with an overall translation error of 1.76% in a public urban radar odometry benchmark, running at 55 Hz on a single laptop CPU thread.
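The robustified point-to-line metric mentioned above can be illustrated with a minimal sketch. Function names and the Huber threshold `delta` are illustrative assumptions, not CFEAR's actual code.

```python
import numpy as np

def huber(r, delta=0.1):
    """Huber loss: quadratic near zero, linear for large residuals,
    which caps the influence of outlier correspondences."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

def point_to_line_cost(src, tgt, normals, delta=0.1):
    """Robust point-to-line cost: each source point is paired with a
    target surface point and its normal; the residual is the offset
    projected onto the normal (i.e. distance to the local line)."""
    r = np.sum((src - tgt) * normals, axis=1)
    return float(np.sum(huber(r, delta)))
```

Note that an offset tangential to the surface costs nothing, which is exactly why point-to-line metrics converge better than point-to-point on sparse surface points: sliding along a wall is free, only leaving it is penalized.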

Place, publisher, year, edition, pages
IEEE, 2021
Series
IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858, E-ISSN 2153-0866
Keywords
Localization, SLAM, Mapping, Radar
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-94463 (URN)
10.1109/IROS51168.2021.9636253 (DOI)
000755125504051 ()
9781665417143 (ISBN)
9781665417150 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021), Prague, Czech Republic, (Online Conference), September 27 - October 1, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737
Available from: 2021-09-20. Created: 2021-09-20. Last updated: 2024-01-02. Bibliographically approved
3. Lidar-Level Localization With Radar? The CFEAR Approach to Accurate, Fast, and Robust Large-Scale Radar Odometry in Diverse Environments
2023 (English). In: IEEE Transactions on Robotics, ISSN 1552-3098, E-ISSN 1941-0468, Vol. 39, no. 2, p. 1476-1495. Article in journal (Refereed). Published
Abstract [en]

This article presents an accurate, highly efficient, and learning-free method for large-scale odometry estimation using spinning radar, empirically found to generalize well across very diverse environments—outdoors, from urban to woodland, and indoors in warehouses and mines—without changing parameters. Our method integrates motion compensation within a sweep with one-to-many scan registration that minimizes distances between nearby oriented surface points and mitigates outliers with a robust loss function. Extending our previous approach conservative filtering for efficient and accurate radar odometry (CFEAR), we present an in-depth investigation on a wider range of datasets, quantifying the importance of filtering, resolution, registration cost and loss functions, keyframe history, and motion compensation. We present a new solving strategy and configuration that overcomes previous issues with sparsity and bias, and improves our state-of-the-art by 38%, thus, surprisingly, outperforming radar simultaneous localization and mapping (SLAM) and approaching lidar SLAM. The most accurate configuration achieves 1.09% error at 5 Hz on the Oxford benchmark, and the fastest achieves 1.79% error at 160 Hz.

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
Radar, Sensors, Spinning, Azimuth, Simultaneous localization and mapping, Estimation, Location awareness, Localization, radar odometry, range sensing, SLAM
National Category
Computer Sciences; Computer Vision and Robotics (Autonomous Systems); Robotics
Research subject
Computer and Systems Science; Computer Science
Identifiers
urn:nbn:se:oru:diva-103116 (URN)
10.1109/tro.2022.3221302 (DOI)
000912778500001 ()
2-s2.0-85144032264 (Scopus ID)
Available from: 2023-01-16. Created: 2023-01-16. Last updated: 2023-10-18
4. BFAR – Bounded False Alarm Rate detector for improved radar odometry estimation
2021 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents a new detector for filtering noise from true detections in radar data, which improves the state of the art in radar odometry. Scanning Frequency-Modulated Continuous Wave (FMCW) radars can be useful for localisation and mapping in low visibility, but return a lot of noise compared to (more commonly used) lidar, which makes the detection task more challenging. Our Bounded False-Alarm Rate (BFAR) detector differs from the classical Constant False-Alarm Rate (CFAR) detector in that it applies an affine transformation to the estimated noise level, after which the parameters that minimize the estimation error can be learned. BFAR is an optimized combination of CFAR and fixed-level thresholding. Only a single parameter needs to be learned from a training dataset. We apply BFAR to the use case of radar odometry, and adapt a state-of-the-art odometry pipeline (CFEAR), replacing its original conservative filtering with BFAR. In this way we reduce the state-of-the-art translation/rotation odometry errors from 1.76%/0.5°/100 m to 1.55%/0.46°/100 m; an improvement of 12.5%.
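The relation between CFAR and BFAR can be sketched in a 1-D toy example. The window sizes and the parameters `a` and `b` below are made-up placeholders (in BFAR they would be learned), and the cell-averaging noise estimator is a deliberately simple stand-in.

```python
import numpy as np

def cfar_noise_estimate(row, i, guard=2, window=8):
    """Average intensity in a window around cell i, excluding guard
    cells next to i (a simple cell-averaging CFAR noise estimate)."""
    lo = row[max(0, i - guard - window): max(0, i - guard)]
    hi = row[i + guard + 1: i + guard + 1 + window]
    neigh = np.concatenate([lo, hi])
    return neigh.mean() if neigh.size else 0.0

def bfar_detect(row, a=1.2, b=20.0, guard=2, window=8):
    """BFAR-style detection: cell i fires if row[i] > a * Z_i + b,
    where Z_i is the local noise estimate. With b = 0 this reduces to
    CFAR; with a = 0 it reduces to fixed-level thresholding."""
    return np.array([row[i] > a * cfar_noise_estimate(row, i, guard, window) + b
                     for i in range(len(row))])
```

The affine form `a * Z + b` is what makes BFAR a blend of the two classical schemes: the threshold still tracks the local noise floor, but is bounded from below by the fixed offset.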

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-108800 (URN)
Conference
ICRA
Funder
Knowledge Foundation
Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2024-01-02. Bibliographically approved
5. CorAl – Are the point clouds Correctly Aligned?
2021 (English). In: 10th European Conference on Mobile Robots (ECMR 2021), IEEE, 2021, Vol. 10. Conference paper, Published paper (Refereed)
Abstract [en]

In robotics perception, numerous tasks rely on point cloud registration. However, there is currently no method that can automatically and reliably detect misaligned point clouds without environment-specific parameters. We propose "CorAl", an alignment quality measure and alignment classifier for point cloud pairs, which enables introspective assessment of registration performance. CorAl compares the joint and the separate entropy of the two point clouds. The separate entropy provides a measure of the entropy inherent to the environment; the joint entropy should therefore not be substantially higher if the point clouds are properly aligned. Computing the expected entropy makes the method sensitive also to small alignment errors, which are particularly hard to detect, and applicable in a range of different environments. We found that CorAl is able to detect small alignment errors in previously unseen environments with an accuracy of 95%, a substantial improvement over previous methods.
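The joint-versus-separate entropy idea can be illustrated with a much-simplified 2-D sketch. The neighbourhood radius, regularization, and averaging below are illustrative choices, not CorAl's exact formulation.

```python
import numpy as np

def local_entropy(cloud, query, radius=1.0):
    """Mean differential entropy of Gaussians fitted to the radius-
    neighbourhood of each query point (2-D case)."""
    ents = []
    for q in query:
        neigh = cloud[np.linalg.norm(cloud - q, axis=1) < radius]
        if len(neigh) < 3:
            continue  # too few points for a covariance estimate
        cov = np.cov(neigh.T) + 1e-9 * np.eye(2)  # regularize degeneracy
        ents.append(0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov)))
    return float(np.mean(ents))

def coral_score(cloud_a, cloud_b, radius=1.0):
    """Joint minus average separate entropy: near zero when the clouds
    are aligned, larger when misalignment inflates the joint spread."""
    joint = np.vstack([cloud_a, cloud_b])
    sep = 0.5 * (local_entropy(cloud_a, cloud_a, radius)
                 + local_entropy(cloud_b, cloud_b, radius))
    return local_entropy(joint, joint, radius) - sep
```

Subtracting the separate entropy is the key step: a cluttered scene is entropic on its own, so only the *excess* entropy of the union signals misalignment.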

Place, publisher, year, edition, pages
IEEE, 2021
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-94464 (URN)
10.1109/ECMR50962.2021.9568846 (DOI)
000810510000059 ()
Conference
10th European Conference on Mobile Robots (ECMR 2021), Bonn, Germany, (Online Conference), August 31 - September 3, 2021
Funder
Knowledge Foundation; EU, Horizon 2020, 732737, 101017274
Available from: 2021-09-22. Created: 2021-09-22. Last updated: 2024-01-02. Bibliographically approved
6. CorAl: Introspection for robust radar and lidar perception in diverse environments using differential entropy
2022 (English). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 155, article id 104136. Article in journal (Refereed). Published
Abstract [en]

Robust perception is an essential component to enable long-term operation of mobile robots. It depends on failure resilience through reliable sensor data and pre-processing, as well as failure awareness through introspection, for example the ability to self-assess localization performance. This paper presents CorAl: a principled, intuitive, and generalizable method to measure the quality of alignment between pairs of point clouds, which learns to detect alignment errors in a self-supervised manner. CorAl compares the differential entropy in the point clouds separately with the entropy in their union to account for entropy inherent to the scene. By making use of dual entropy measurements, we obtain a quality metric that is highly sensitive to small alignment errors and still generalizes well to unseen environments. In this work, we extend our previous work on lidar-only CorAl to radar data by proposing a two-step filtering technique that produces high-quality point clouds from noisy radar scans. Thus, we target robust perception in two ways: by introducing a method that introspectively assesses alignment quality, and by applying it to an inherently robust sensor modality. We show that our filtering technique combined with CorAl can be applied to the problem of alignment classification, and that it detects small alignment errors in urban settings with up to 98% accuracy, and with up to 96% accuracy when trained only in a different environment. Our lidar and radar experiments demonstrate that CorAl outperforms previous methods both on the ETH lidar benchmark, which includes several indoor and outdoor environments, and on the large-scale Oxford and MulRan radar datasets for urban traffic scenarios. The results also demonstrate that CorAl generalizes very well across substantially different environments without the need for retraining.

Place, publisher, year, edition, pages
Elsevier, 2022
Keywords
Radar, Introspection, Localization
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-100756 (URN)
10.1016/j.robot.2022.104136 (DOI)
000833416900001 ()
2-s2.0-85132693467 (Scopus ID)
Funder
Knowledge Foundation; European Commission, 101017274; Vinnova, 2019-05878
Available from: 2022-08-24. Created: 2022-08-24. Last updated: 2024-01-02. Bibliographically approved
7. TBV Radar SLAM - Trust but Verify Loop Candidates
2023 (English). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 8, no. 6, p. 3613-3620. Article in journal (Refereed). Published
Abstract [en]

Robust SLAM in large-scale environments requires fault resilience and awareness at multiple stages, from sensing and odometry estimation to loop closure. In this work, we present TBV (Trust But Verify) Radar SLAM, a method for radar SLAM that introspectively verifies loop closure candidates. TBV Radar SLAM achieves a high correct-loop-retrieval rate by combining multiple place-recognition techniques: tightly coupled place similarity and odometry uncertainty search, creating loop descriptors from origin-shifted scans, and delaying loop selection until after verification. Robustness to false constraints is achieved by carefully verifying and selecting the most likely ones from multiple loop constraints. Importantly, the verification and selection are carried out after registration, when additional sources of loop evidence can easily be computed. We integrate our loop retrieval and verification method with a robust odometry pipeline within a pose graph framework. In evaluations on public benchmarks, we found that TBV Radar SLAM achieves 65% lower error than the previous state of the art. We also show that it generalizes across environments without needing to change any parameters. We provide the open-source implementation at https://github.com/dan11003/tbv_slam_public
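The "verify after registration, then select" pattern might be sketched as follows. This is a generic stand-in: the evidence function and threshold are hypothetical, not TBV's actual scoring.

```python
def select_loop(candidates, verify, min_evidence=0.5):
    """Trust but verify: score every registered loop candidate with a
    combined-evidence function, keep those above the threshold, and only
    then commit to the best one (or to none at all)."""
    scored = [(verify(c), c) for c in candidates]
    passed = [sc for sc in scored if sc[0] >= min_evidence]
    return max(passed, key=lambda sc: sc[0])[1] if passed else None
```

Deferring the decision until after registration is the design point: alignment-quality evidence is only available once the candidate has actually been registered, so filtering early would discard exactly the information needed to reject false loops.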

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
SLAM, localization, radar, introspection
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-106249 (URN)
10.1109/LRA.2023.3268040 (DOI)
000981889200013 ()
2-s2.0-85153499426 (Scopus ID)
Funder
Vinnova, 2021-04714, 2019-05878
Available from: 2023-06-13. Created: 2023-06-13. Last updated: 2024-01-17. Bibliographically approved
8. Localising Faster: Efficient and precise lidar-based robot localisation in large-scale environments
2020 (English). In: 2020 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2020, p. 4386-4392. Conference paper, Published paper (Refereed)
Abstract [en]

This paper proposes a novel approach for global localisation of mobile robots in large-scale environments. Our method combines learning-based and filtering-based localisation to localise the robot efficiently and precisely, by seeding Monte Carlo Localisation (MCL) with a deep-learned distribution. In particular, a fast localisation system rapidly estimates the 6-DOF pose through a deep probabilistic model (Gaussian Process Regression with a deep kernel), then a precise recursive estimator refines the estimated robot pose according to the geometric alignment. More importantly, the Gaussian method (i.e. deep probabilistic localisation) and the non-Gaussian method (i.e. MCL) can be integrated naturally via importance sampling. Consequently, the two systems can be integrated seamlessly and mutually benefit from each other. To verify the proposed framework, we provide a case study in large-scale localisation with a 3D lidar sensor. Our experiments on the Michigan NCLT long-term dataset show that the proposed method is able to localise the robot in 1.94 s on average (median of 0.8 s) with a precision of 0.75 m in a large-scale environment of approximately 0.5 km².
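Seeding MCL from a learned pose belief and then reweighting by a measurement likelihood can be sketched as below. This is a minimal toy assuming the learned belief is a single Gaussian; the function names and the likelihood are illustrative, not the paper's model.

```python
import numpy as np

def seed_particles(mean, cov, n, rng):
    """Seed MCL by sampling the particle set directly from a learned
    Gaussian pose belief; weights start uniform because the samples are
    drawn from the proposal distribution itself."""
    particles = rng.multivariate_normal(mean, cov, size=n)
    return particles, np.full(n, 1.0 / n)

def reweight(particles, weights, likelihood):
    """Importance-sampling update: scale each particle's weight by the
    measurement likelihood of its pose, then renormalize."""
    w = weights * np.array([likelihood(p) for p in particles])
    return w / w.sum()
```

The importance-sampling step is what lets the Gaussian (learned) and non-Gaussian (particle) representations coexist: the learned belief acts as the proposal, and geometric evidence corrects it through the weights.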

Place, publisher, year, edition, pages
IEEE, 2020
Series
IEEE International Conference on Robotics and Automation (ICRA), ISSN 1050-4729, E-ISSN 2577-087X
Keywords
Gaussian processes, learning (artificial intelligence), mobile robots, Monte Carlo methods, neural nets, optical radar, path planning, recursive estimation, robot vision, SLAM (robots), precise lidar-based robot localisation, large-scale environments, global localisation, Monte Carlo Localisation, MCL, fast localisation system, deep-probabilistic model, Gaussian process regression, deep kernel, precise recursive estimator, Gaussian method, deep probabilistic localisation, large-scale localisation, largescale environment, time 0.8 s, size 0.75 m, Robots, Neural networks, Three-dimensional displays, Laser radar, Kernel
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-88030 (URN)
10.1109/ICRA40945.2020.9196708 (DOI)
000712319503010 ()
2-s2.0-85092712554 (Scopus ID)
978-1-7281-7396-2 (ISBN)
978-1-7281-7395-5 (ISBN)
Conference
2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, May 31 - August 31, 2020
Funder
EU, Horizon 2020, 732737
Note
Funding agency: UK Research & Innovation (UKRI), Engineering & Physical Sciences Research Council (EPSRC) EP/M019918/1
Available from: 2021-01-31. Created: 2021-01-31. Last updated: 2024-01-02. Bibliographically approved
9. NDT-Transformer: Large-Scale 3D Point Cloud Localisation using the Normal Distribution Transform Representation
2021 (English). In: 2021 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021. Conference paper, Published paper (Refereed)
Abstract [en]

3D point cloud-based place recognition is highly demanded by autonomous driving in GPS-challenged environments and serves as an essential component (i.e. loop-closure detection) in lidar-based SLAM systems. This paper proposes a novel approach, named NDT-Transformer, for real-time and large-scale place recognition using 3D point clouds. Specifically, a 3D Normal Distribution Transform (NDT) representation is employed to condense the raw, dense 3D point cloud into probabilistic distributions (NDT cells) that provide a geometrical shape description. A novel NDT-Transformer network then learns a global descriptor from a set of 3D NDT cell representations. Benefiting from the NDT representation and the NDT-Transformer network, the learned global descriptors are enriched with both geometrical and contextual information. Finally, place recognition is achieved by retrieving the descriptors that best match a query against the database. Compared to state-of-the-art methods, the proposed approach achieves an improvement of 7.52% on average top-1 recall and 2.73% on average top-1% recall on the Oxford RobotCar benchmark.
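Condensing a cloud into NDT cells, as described above, can be sketched as one Gaussian per occupied voxel. The voxel size and minimum point count below are illustrative defaults, not the paper's settings.

```python
import numpy as np

def ndt_cells(points, cell_size=1.0, min_points=3):
    """Condense a point cloud into NDT cells: one Gaussian (mean,
    covariance) per occupied voxel, giving a compact probabilistic
    description of the local surface shape."""
    keys = np.floor(points / cell_size).astype(int)
    cells = {}
    for key in {tuple(k) for k in keys}:
        pts = points[np.all(keys == np.array(key), axis=1)]
        if len(pts) >= min_points:  # too few points give a degenerate fit
            cells[key] = (pts.mean(axis=0), np.cov(pts.T))
    return cells
```

A dense scan of millions of points typically collapses to a few thousand cells, which is what makes feeding the representation into a descriptor network tractable.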

Place, publisher, year, edition, pages
IEEE, 2021
Series
IEEE International Conference on Robotics and Automation (ICRA), ISSN 1050-4729, E-ISSN 2577-087X
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-96652 (URN)
10.1109/ICRA48506.2021.9560932 (DOI)
000765738804041 ()
2-s2.0-85124680724 (Scopus ID)
9781728190778 (ISBN)
9781728190785 (ISBN)
Conference
2021 IEEE International Conference on Robotics and Automation (ICRA 2021), Xi'an, China, May 30 - June 5, 2021
Funder
EU, Horizon 2020, 732737
Note
Funding agencies: UK Research & Innovation (UKRI); Engineering & Physical Sciences Research Council (EPSRC) EP/R026092/1; Royal Society of London; European Commission RGS202432
Available from: 2022-01-24. Created: 2022-01-24. Last updated: 2024-01-02. Bibliographically approved
10. Improving Localisation Accuracy using Submaps in warehouses
2018 (English). Conference paper, Oral presentation with published abstract (Other academic)
Abstract [en]

This paper presents a method for localisation in hybrid metric-topological maps built using only local information, that is, only measurements captured by the robot when it was in a nearby location. The motivation is that observations are typically range- and viewpoint-dependent, and that a discrete map representation might not be able to explain the full structure within a voxel. The localisation system selects a submap based on how frequently, and from where, each submap was updated. This allows the system to select the most descriptive submap, thereby improving localisation and increasing performance by up to 40%.
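A toy version of frequency- and location-aware submap selection might look like the following. The scoring rule here is a made-up stand-in for the paper's criterion, and the data layout (a dict of update poses per submap) is purely illustrative.

```python
import numpy as np

def select_submap(pose, submaps):
    """Pick the submap whose recorded update poses are both frequent
    and close to the robot's current pose: nearby updates suggest the
    submap describes the environment from a similar viewpoint."""
    def score(update_poses):
        d = np.linalg.norm(np.asarray(update_poses) - pose, axis=1)
        return float(np.sum(np.exp(-d)))  # each update votes; nearer counts more
    return max(submaps, key=lambda name: score(submaps[name]))
```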

National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-71844 (URN)
Conference
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Workshop on Robotics for Logistics in Warehouses and Environments Shared with Humans, Madrid, Spain, October 5, 2018
Projects
Iliad
Available from: 2019-01-28. Created: 2019-01-28. Last updated: 2024-01-02. Bibliographically approved
11. A Submap per Perspective: Selecting Subsets for SuPer Mapping that Afford Superior Localization Quality
2019 (English). In: 2019 European Conference on Mobile Robots (ECMR), IEEE, 2019. Conference paper, Published paper (Refereed)
Abstract [en]

This paper targets high-precision robot localization. We address a general problem of voxel-based map representations: the expressiveness of the map is fundamentally limited by its resolution, since integrating measurements taken from different perspectives introduces imprecision and thus reduces localization accuracy. We propose SuPer maps that contain one Submap per Perspective, representing a particular view of the environment. For localization, a robot then selects the submap that best explains the environment from its perspective. We propose SuPer mapping as an offline refinement step between initial SLAM and deploying autonomous robots for navigation. We evaluate the proposed method on simulated and real-world data representing an important industrial use case with high accuracy requirements in a repetitive environment. Our results demonstrate significantly improved localization accuracy: up to 46% better compared to localization in global maps, and up to 25% better compared to alternative submapping approaches.

Place, publisher, year, edition, pages
IEEE, 2019
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-79739 (URN)
10.1109/ECMR.2019.8870941 (DOI)
000558081900037 ()
2-s2.0-85074443858 (Scopus ID)
978-1-7281-3605-9 (ISBN)
Conference
European Conference on Mobile Robotics (ECMR), Prague, Czech Republic, September 4-6, 2019
Funder
EU, Horizon 2020, 732737; Knowledge Foundation
Available from: 2020-02-03. Created: 2020-02-03. Last updated: 2024-01-02. Bibliographically approved
12. Incorporating Ego-motion Uncertainty Estimates in Range Data Registration
2017 (English). In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 1389-1395. Conference paper, Published paper (Refereed)
Abstract [en]

Local scan registration approaches commonly only utilize ego-motion estimates (e.g. odometry) as an initial pose guess in an iterative alignment procedure. This paper describes a new method to incorporate ego-motion estimates, including uncertainty, into the objective function of a registration algorithm. The proposed approach is particularly suited for feature-poor and self-similar environments, which typically present challenges to current state-of-the-art registration algorithms. Experimental evaluation shows significant improvements in accuracy when using data acquired by Automatic Guided Vehicles (AGVs) in industrial production and warehouse environments.
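Folding an ego-motion prior into the registration objective can be sketched as a Mahalanobis penalty added to the usual residual sum. This is a generic formulation under assumed names, not the paper's exact objective.

```python
import numpy as np

def registration_cost(pose, residuals_fn, odom_mean, odom_cov):
    """Scan-matching cost plus an ego-motion prior: the usual sum of
    squared point residuals, plus the Mahalanobis distance of the pose
    from the odometry estimate, weighted by the odometry uncertainty."""
    r = residuals_fn(pose)
    d = pose - odom_mean
    prior = float(d @ np.linalg.inv(odom_cov) @ d)
    return float(np.sum(r ** 2)) + prior
```

In a self-similar aisle, the point residuals alone may be nearly flat along the aisle direction; the prior term then anchors the solution near the odometry estimate instead of letting it drift to any equally-well-matching pose.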

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2017
Series
Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, ISSN 2153-0858, E-ISSN 2153-0866
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-62803 (URN)
10.1109/IROS.2017.8202318 (DOI)
000426978201108 ()
2-s2.0-85041958720 (Scopus ID)
978-1-5386-2682-5 (ISBN)
978-1-5386-2683-2 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2017), Vancouver, Canada, September 24–28, 2017
Projects
Semantic Robots, ILIAD
Funder
Knowledge Foundation; EU, Horizon 2020, 732737
Available from: 2017-11-24. Created: 2017-11-24. Last updated: 2024-01-02. Bibliographically approved

Open Access in DiVA

Cover (540 kB), 43 downloads
File name: COVER01.pdf. File size: 540 kB. Checksum (SHA-512): 3a5e6636a3ca44c8e21285b7d10b94b0581510467527561843aa2f89f9090c5ed1f90a1de7d285da7e61455badb3fc913169e83420fd66528f4f63ce970e0036
Type: cover. Mimetype: application/pdf

Robust large-scale mapping and localization: Combining robust sensing and introspection (11933 kB), 7 downloads
File name: FULLTEXT01.pdf. File size: 11933 kB. Checksum (SHA-512): 18ba31e598aa9f3cd516b007afbf9dff4efb95bf0afc727e57328db2a6965ca2302d43b54e2d91f9baa9291d6c79f7f29277498c79b198ed3d9f157bf80617c2
Type: fulltext. Mimetype: application/pdf

Spikblad (95 kB), 23 downloads
File name: SPIKBLAD01.pdf. File size: 95 kB. Checksum (SHA-512): 54b023b27378065a6760ec0af814988f15dd038d9b84d8aec00cd4275c56ebd66081e29b398b28338cf360d6e844e26af7d06a526b2a4d2deff844a274091449
Type: spikblad. Mimetype: application/pdf

Authority records

Adolfsson, Daniel

Search in DiVA

By author/editor
Adolfsson, Daniel
By organisation
School of Science and Technology
Computer Sciences

Search outside of DiVA

Google, Google Scholar
Total: 7 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
