Publications (10 of 68)
Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R. & Lilienthal, A. J. (2020). Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction. Robotics and Computer-Integrated Manufacturing, 61, Article ID 101830.
Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction
2020 (English). In: Robotics and Computer-Integrated Manufacturing, ISSN 0736-5845, E-ISSN 1879-2537, Vol. 61, article id 101830. Article in journal (Refereed). Published.
Abstract [en]

Safety, legibility and efficiency are essential for autonomous mobile robots that interact with humans. A key factor in this respect is bi-directional communication of navigation intent, which we focus on in this article with a particular view on industrial logistics applications. In the direction robot-to-human, we study how a robot can communicate its navigation intent using Spatial Augmented Reality (SAR) such that humans can intuitively understand the robot's intention and feel safe in the vicinity of robots. We conducted experiments with an autonomous forklift that projects various patterns on the shared floor space to convey its navigation intentions. We analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift, and carried out stimulated recall interviews (SRI) in order to identify desirable features for projection of robot intentions. In the direction human-to-robot, we argue that robots in human co-habited environments need human-aware task and motion planning to support safety and efficiency, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, we investigate the possibility of human-to-robot implicit intention transference solely from eye gaze data, and evaluate how the observed eye gaze patterns of the participants relate to their navigation decisions. We again analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift for clues that could reveal direction intent. Our analysis shows that people primarily gazed at the side of the robot that they ultimately decided to pass by. We discuss the implications of these results and relate them to a control approach that uses human gaze for early obstacle avoidance.

Place, publisher, year, edition, pages
Elsevier, 2020
Keywords
Human-robot interaction (HRI), Mobile robots, Intention communication, Eye-tracking, Intention recognition, Spatial augmented reality, Stimulated recall interview, Obstacle avoidance, Safety, Logistics
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-78358 (URN)
10.1016/j.rcim.2019.101830 (DOI)
000496834800002 (ISI)
2-s2.0-85070732550 (Scopus ID)
Note

Funding Agencies:

KKS SIDUS project AIR: "Action and Intention Recognition in Human Interaction with Autonomous Systems", 20140220

H2020 project ILIAD: "Intra-Logistics with Integrated Automatic Deployment: Safe and Scalable Fleets in Shared Spaces", 732737

Available from: 2019-12-03. Created: 2019-12-03. Last updated: 2020-02-06. Bibliographically approved.
Adolfsson, D., Lowry, S., Magnusson, M., Lilienthal, A. J. & Andreasson, H. (2019). A Submap per Perspective: Selecting Subsets for SuPer Mapping that Afford Superior Localization Quality. In: 2019 European Conference on Mobile Robots (ECMR). Paper presented at the European Conference on Mobile Robots (ECMR), Prague, Czech Republic, September 4-6, 2019. IEEE.
A Submap per Perspective: Selecting Subsets for SuPer Mapping that Afford Superior Localization Quality
2019 (English). In: 2019 European Conference on Mobile Robots (ECMR), IEEE, 2019. Conference paper, Published paper (Refereed).
Abstract [en]

This paper targets high-precision robot localization. We address a general problem of voxel-based map representations: the expressiveness of the map is fundamentally limited by its resolution, since integrating measurements taken from different perspectives introduces imprecision and thus reduces localization accuracy. We propose SuPer maps, which contain one Submap per Perspective, each representing a particular view of the environment. For localization, a robot then selects the submap that best explains the environment from its perspective. We propose SuPer mapping as an offline refinement step between initial SLAM and deploying autonomous robots for navigation. We evaluate the proposed method on simulated and real-world data representing an important use case: an industrial scenario with high accuracy requirements in a repetitive environment. Our results demonstrate significantly improved localization accuracy, up to 46% better compared to localization in global maps, and up to 25% better compared to alternative submapping approaches.
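To make the submap-selection idea concrete, here is a minimal Python sketch (not the paper's implementation; the function names, brute-force nearest-neighbour search, and 0.2 m radius are illustrative assumptions):

```python
# Minimal sketch of per-perspective submap selection (not the authors' code).
# Idea from the abstract: keep several submaps per region, each built from one
# viewpoint, and localize against the one that best explains the current scan.
import numpy as np

def explanation_score(scan: np.ndarray, submap: np.ndarray, radius: float = 0.2) -> float:
    """Fraction of scan points (N, 3) with a submap point (M, 3) within `radius`."""
    dists = np.linalg.norm(scan[:, None, :] - submap[None, :, :], axis=-1)
    return float(np.mean(dists.min(axis=1) < radius))

def select_submap(scan: np.ndarray, submaps: list) -> int:
    """Return the index of the submap that best explains the current scan."""
    return int(np.argmax([explanation_score(scan, s) for s in submaps]))
```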

Place, publisher, year, edition, pages
IEEE, 2019
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-79739 (URN)
10.1109/ECMR.2019.8870941 (DOI)
2-s2.0-85074443858 (Scopus ID)
978-1-7281-3605-9 (ISBN)
Conference
European Conference on Mobile Robots (ECMR), Prague, Czech Republic, September 4-6, 2019
Funder
EU, Horizon 2020, 732737
Available from: 2020-02-03. Created: 2020-02-03. Last updated: 2020-02-14. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Schindler, M. & Lilienthal, A. J. (2019). Implicit intention transference using eye-tracking glasses for improved safety in human-robot interaction. Paper presented at the International Conference on Social Robotics - Quality of Interaction in Socially Assistive Robots Workshop, Madrid, Spain, November 26-29, 2019.
Implicit intention transference using eye-tracking glasses for improved safety in human-robot interaction
2019 (English). Conference paper, Oral presentation only (Refereed).
Abstract [en]

Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. We propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, an implicit intention transference system was developed and implemented. The robot was given access to human eye gaze data and responded to it in real time through spatial augmented reality projections on the shared floor space; the robot could also adapt its path. This allows proactive safety approaches in HRI, for example by attempting to get the human's attention when they are in the vicinity of a moving robot. A study was conducted with workers at an industrial warehouse. The time taken to understand the behavior of the system was recorded. Electrodermal activity and pupil diameter were recorded to measure the increase in stress and cognitive load while interacting with an autonomous system, using these measurements as a proxy to quantify trust in autonomous systems.

Keywords
Human-robot interaction, intention communication, eye tracking, spatial augmented reality, electrodermal activity, stress, cognitive load
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-79736 (URN)
Conference
International Conference on Social Robotics - Quality of Interaction in Socially Assistive Robots Workshop, Madrid, Spain, November 26th-29th, 2019
Projects
ILIAD
Available from: 2020-02-03. Created: 2020-02-03. Last updated: 2020-02-14. Bibliographically approved.
Della Corte, B., Andreasson, H., Stoyanov, T. & Grisetti, G. (2019). Unified Motion-Based Calibration of Mobile Multi-Sensor Platforms With Time Delay Estimation. IEEE Robotics and Automation Letters, 4(2), 902-909
Unified Motion-Based Calibration of Mobile Multi-Sensor Platforms With Time Delay Estimation
2019 (English). In: IEEE Robotics and Automation Letters, ISSN 2377-3766, E-ISSN 1949-3045, Vol. 4, no. 2, p. 902-909. Article in journal (Refereed). Published.
Abstract [en]

The ability to maintain and continuously update the geometric calibration parameters of a mobile platform is a key functionality for every robotic system. These parameters include the intrinsic kinematic parameters of the platform, the extrinsic parameters of the sensors mounted on it, and their time delays. In this letter, we present a unified pipeline for motion-based calibration of mobile platforms equipped with multiple heterogeneous sensors. We formulate a unified optimization problem to concurrently estimate the platform kinematic parameters, the sensors' extrinsic parameters, and their time delays. We analyze the influence of the trajectory followed by the robot on the accuracy of the estimate. Our framework automatically selects appropriate trajectories to maximize the information gathered and to obtain more accurate parameter estimates. In combination with that, our pipeline monitors the evolution of the parameters in long-term operation to detect possible changes in the parameter set. The experiments conducted on real data show smooth convergence along with the ability to detect changes in parameter values. We release an open-source version of our framework to the community.
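A rough 2D sketch of what such a joint extrinsic-plus-delay estimation can look like (illustrative only; the released framework differs, and `sensor_pose_at`, an interpolated sensor trajectory, is a hypothetical helper):

```python
# Unknowns: sensor mounting pose (x, y, th) and a time delay dt, constrained
# by the hand-eye relation A_i X = X B_i(dt) over consecutive relative motions.
import numpy as np
from scipy.optimize import least_squares

def se2(x: float, y: float, th: float) -> np.ndarray:
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def residuals(params, odom, sensor_pose_at):
    """odom: list of (timestamp, 3x3 pose); sensor_pose_at(t): interpolated
    3x3 sensor pose at time t (hypothetical helper)."""
    x, y, th, dt = params
    X = se2(x, y, th)
    res = []
    for (t0, T0), (t1, T1) in zip(odom, odom[1:]):
        A = np.linalg.inv(T0) @ T1                                   # platform motion
        B = np.linalg.inv(sensor_pose_at(t0 + dt)) @ sensor_pose_at(t1 + dt)
        res.append((A @ X - X @ B)[:2, :].ravel())                   # 2x3 residual block
    return np.concatenate(res)

# Usage with hypothetical data:
# sol = least_squares(residuals, x0=[0.0, 0.0, 0.0, 0.0], args=(odom, sensor_pose_at))
```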

Place, publisher, year, edition, pages
IEEE, 2019
Keywords
Calibration and Identification
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-72756 (URN)
10.1109/LRA.2019.2892992 (DOI)
000458182100012 (ISI)
Note

Funding Agency:

Semantic Robots Research Profile - Swedish Knowledge Foundation (KKS) 

Available from: 2019-02-25. Created: 2019-02-25. Last updated: 2019-02-25. Bibliographically approved.
Pecora, F., Andreasson, H., Mansouri, M. & Petkov, V. (2018). A Loosely-Coupled Approach for Multi-Robot Coordination, Motion Planning and Control. In: Mathijs de Weerdt, Sven Koenig, Gabriele Röger, Matthijs Spaan (Eds.), Proceedings of the International Conference on Automated Planning and Scheduling. Paper presented at the International Conference on Automated Planning and Scheduling (ICAPS 2018), Delft, The Netherlands, June 24-29, 2018 (pp. 485-493). Delft, The Netherlands: AAAI Press, Vol. 2018-June, Article ID 139850.
A Loosely-Coupled Approach for Multi-Robot Coordination, Motion Planning and Control
2018 (English). In: Proceedings of the International Conference on Automated Planning and Scheduling / [ed] Mathijs de Weerdt, Sven Koenig, Gabriele Röger, Matthijs Spaan. Delft, The Netherlands: AAAI Press, 2018, Vol. 2018-June, p. 485-493, article id 139850. Conference paper, Published paper (Refereed).
Abstract [en]

Deploying fleets of autonomous robots in real-world applications requires addressing three problems: motion planning, coordination, and control. Application-specific features of the environment and robots often narrow down the possible motion planning and control methods that can be used. This paper proposes a lightweight coordination method that implements a high-level controller for a fleet of potentially heterogeneous robots. Very few assumptions are made on robot controllers, which are required only to be able to accept set point updates and to report their current state. The approach can be used with any motion planning method for computing kinematically-feasible paths. Coordination uses heuristics to update priorities while robots are in motion, and a simple model of robot dynamics to guarantee dynamic feasibility. The approach avoids a priori discretization of the environment or of robot paths, allowing robots to “follow each other” through critical sections. We validate the method formally and experimentally with different motion planners and robot controllers, in simulation and with real robots.
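The coordination loop can be pictured with a small sketch (a simplification under stated assumptions, not the published implementation; `RobotState`, the critical-section encoding, and the priority callback are hypothetical):

```python
# Minimal sketch of the coordination idea: the coordinator only sends each
# robot the furthest path index it may reach and reads back the robot's
# current index; motion planning and low-level control remain external.
from dataclasses import dataclass

@dataclass
class RobotState:
    name: str
    path: list        # precomputed, kinematically feasible path (set points)
    index: int        # last reported progress along the path

def furthest_allowed(robot, robots, critical_sections, has_priority):
    """Furthest path index `robot` may reach without entering a critical
    section that a higher-priority robot has not yet cleared.

    `critical_sections`: list of dicts mapping robot name -> (start, end)
    path-index intervals; `has_priority(a, b)`: heuristic callback that may
    be re-evaluated while robots are in motion.
    """
    limit = len(robot.path) - 1
    for cs in critical_sections:
        if robot.name not in cs:
            continue
        start, _ = cs[robot.name]
        for other in robots:
            if other is robot or other.name not in cs:
                continue
            _, other_end = cs[other.name]
            # Stop before our entry while a prioritized robot has not yet
            # cleared its own interval of the shared section.
            if has_priority(other, robot) and other.index <= other_end:
                limit = min(limit, start - 1)
    return limit
```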

Place, publisher, year, edition, pages
Delft, The Netherlands: AAAI Press, 2018
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-64721 (URN)
000492986200059 (ISI)
2-s2.0-85054990876 (Scopus ID)
Conference
International Conference on Automated Planning and Scheduling (ICAPS 2018), Delft, The Netherlands, June 24-29, 2018
Projects
Semantic Robots; ILIAD
Funder
Knowledge Foundation, 20140033; EU, Horizon 2020, 732737; Vinnova
Available from: 2018-01-31. Created: 2018-01-31. Last updated: 2019-11-12. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R. & Lilienthal, A. (2018). Accessing your navigation plans! Human-Robot Intention Transfer using Eye-Tracking Glasses. In: Case K. & Thorvald P. (Eds.), Advances in Manufacturing Technology XXXII: Proceedings of the 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, September 11–13, 2018, University of Skövde, Sweden. Paper presented at the 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, University of Skövde, Sweden, September 11–13, 2018 (pp. 253-258). Amsterdam, Netherlands: IOS Press.
Accessing your navigation plans! Human-Robot Intention Transfer using Eye-Tracking Glasses
2018 (English). In: Advances in Manufacturing Technology XXXII: Proceedings of the 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, September 11–13, 2018, University of Skövde, Sweden / [ed] Case K. & Thorvald P. Amsterdam, Netherlands: IOS Press, 2018, p. 253-258. Conference paper, Published paper (Refereed).
Abstract [en]

Robots in human co-habited environments need human-aware task and motion planning, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. This paper investigates the possibility of human-to-robot implicit intention transference solely from eye gaze data. We present experiments in which humans wearing eye-tracking glasses encountered a small forklift truck under various conditions. We evaluate how the observed eye gaze patterns of the participants related to their navigation decisions. Our analysis shows that people primarily gazed at the side of the robot that they ultimately decided to pass by. We discuss the implications of these results and relate them to a control approach that uses human eye gaze for early obstacle avoidance.
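As an illustration of how the reported gaze-side relation could be exploited, a minimal sketch (hypothetical names; the paper reports the statistical relation, not this classifier):

```python
# Classify which side a person will pass the robot on, from where their
# gaze falls relative to the robot's heading. Illustrative sketch only.
import numpy as np

def predict_pass_side(gaze_points: np.ndarray, robot_pos: np.ndarray,
                      robot_heading: np.ndarray) -> str:
    """gaze_points: (N, 2) gaze hits on the floor plane in world coordinates."""
    rel = gaze_points - robot_pos
    # 2D cross product sign: > 0 means the gaze sample lies left of the heading.
    side = np.sign(robot_heading[0] * rel[:, 1] - robot_heading[1] * rel[:, 0])
    return "left" if side.sum() > 0 else "right"
```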

Place, publisher, year, edition, pages
Amsterdam, Netherlands: IOS Press, 2018
Series
Advances in Transdisciplinary Engineering, ISSN 2352-751X, E-ISSN 2352-7528 ; 8
Keywords
Human-Robot Interaction (HRI), Eye-tracking, Eye-Tracking Glasses, Navigation Intent, Implicit Intention Transference, Obstacle avoidance
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-70706 (URN)
10.3233/978-1-61499-902-7-253 (DOI)
000462212700041 (ISI)
2-s2.0-85057390000 (Scopus ID)
978-1-61499-901-0 (ISBN)
978-1-61499-902-7 (ISBN)
Conference
16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, University of Skövde, Sweden, September 11–13, 2018
Projects
Action and Intention Recognition (AIR); ILIAD
Available from: 2018-12-12. Created: 2018-12-12. Last updated: 2019-04-04. Bibliographically approved.
Adolfsson, D., Lowry, S. & Andreasson, H. (2018). Improving Localisation Accuracy using Submaps in warehouses. Paper presented at the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Workshop on Robotics for Logistics in Warehouses and Environments Shared with Humans, Madrid, Spain, October 5, 2018.
Improving Localisation Accuracy using Submaps in warehouses
2018 (English). Conference paper, Oral presentation with published abstract (Other academic).
Abstract [en]

This paper presents a method for localisation in hybrid metric-topological maps built using only local information, that is, only measurements that were captured by the robot when it was in a nearby location. The motivation is that observations are typically range- and viewpoint-dependent, and that a discrete map representation might not be able to explain the full structure within a voxel. The localisation system selects a submap based on how frequently, and from where, each submap was updated. This allows the system to select the most descriptive submap, thereby improving the localisation and increasing performance by up to 40%.
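A compact sketch of this selection heuristic (illustrative; `update_poses`, the 5 m radius, and the structure of `submaps` are assumptions, not the paper's actual data structures):

```python
# Prefer the submap that was updated most often from poses near the robot's
# current position estimate, i.e. the most descriptive map for this viewpoint.
import numpy as np

def select_submap_by_updates(current_xy: np.ndarray, submaps, radius: float = 5.0) -> int:
    """Each submap is assumed to expose `update_poses`, a (K, 2) array of
    positions the submap was observed from (hypothetical structure)."""
    scores = [int(np.sum(np.linalg.norm(s.update_poses - current_xy, axis=1) < radius))
              for s in submaps]
    return int(np.argmax(scores))
```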

National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-71844 (URN)
Conference
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Workshop on Robotics for Logistics in Warehouses and Environments Shared with Humans, Madrid, Spain, October 5, 2018
Projects
ILIAD
Available from: 2019-01-28. Created: 2019-01-28. Last updated: 2019-01-28. Bibliographically approved.
Lowry, S. & Andreasson, H. (2018). Lightweight, Viewpoint-Invariant Visual Place Recognition in Changing Environments. IEEE Robotics and Automation Letters, 3(2), 957-964
Lightweight, Viewpoint-Invariant Visual Place Recognition in Changing Environments
2018 (English). In: IEEE Robotics and Automation Letters, ISSN 2377-3766, E-ISSN 1949-3045, Vol. 3, no. 2, p. 957-964. Article in journal (Refereed). Published.
Abstract [en]

This paper presents a viewpoint-invariant place recognition algorithm which is robust to changing environments while requiring only a small memory footprint. It demonstrates that condition-invariant local features can be combined with Vectors of Locally Aggregated Descriptors (VLAD) to reduce high-dimensional representations of images to compact binary signatures while retaining place matching capability across visually dissimilar conditions. This system provides a speed-up of two orders of magnitude over direct feature matching, and outperforms a bag-of-visual-words approach with near-identical computation speed and memory footprint. The experimental results show that single-image place matching from non-aligned images can be achieved in visually changing environments with as few as 256 bits (32 bytes) per image.
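A minimal sketch of matching with compact binary signatures (illustrative; the paper aggregates condition-invariant local features with VLAD before binarization, and the median-threshold binarization below is an assumed stand-in for its actual reduction):

```python
# 256-bit (32-byte) place signatures matched by Hamming distance.
import numpy as np

def binarize(descriptor: np.ndarray) -> np.ndarray:
    """Reduce a real-valued descriptor to a 32-byte (256-bit) signature."""
    bits = (descriptor[:256] > np.median(descriptor[:256])).astype(np.uint8)
    return np.packbits(bits)

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two packed signatures."""
    return int(np.unpackbits(a ^ b).sum())

def match_place(query: np.ndarray, database: list) -> int:
    """Index of the database signature with the smallest Hamming distance."""
    return int(np.argmin([hamming(query, d) for d in database]))
```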

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2018
Keywords
Visual-based navigation, recognition, localization
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-64652 (URN)
10.1109/LRA.2018.2793308 (DOI)
000424646100015 (ISI)
Note

Funding Agency:

Semantic Robots Research Profile - Swedish Knowledge Foundation

Available from: 2018-01-30. Created: 2018-01-30. Last updated: 2018-02-28. Bibliographically approved.
Lowry, S. & Andreasson, H. (2018). LOGOS: Local geometric support for high-outlier spatial verification. Paper presented at the IEEE International Conference on Robotics and Automation (ICRA 2018), Brisbane, Australia, May 21-25, 2018 (pp. 7262-7269). IEEE Computer Society.
LOGOS: Local geometric support for high-outlier spatial verification
2018 (English). Conference paper, Published paper (Refereed).
Abstract [en]

This paper presents LOGOS, a method of spatial verification for visual localization that is robust in the presence of a high proportion of outliers. LOGOS uses scale and orientation information from local neighbourhoods of features to determine which points are likely to be inliers. The inlier points can be used for secondary localization verification and pose estimation. LOGOS is demonstrated on a number of benchmark localization datasets and outperforms RANSAC as a method of outlier removal and localization verification in scenarios that require robustness to many outliers.
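A sketch of the neighbourhood-consistency idea as read from the abstract (not the authors' algorithm; `k`, the tolerances, and the majority vote are assumptions):

```python
# Keep a correspondence as an inlier if matches in its local neighbourhood
# agree with it on relative orientation and scale change.
import numpy as np

def inlier_mask(kp1: np.ndarray, kp2: np.ndarray, k: int = 10,
                angle_tol: float = 0.3, scale_tol: float = 0.3) -> np.ndarray:
    """kp1, kp2: (N, 4) matched keypoints as (x, y, angle, scale)."""
    d_angle = (kp2[:, 2] - kp1[:, 2] + np.pi) % (2 * np.pi) - np.pi
    d_scale = np.log(kp2[:, 3] / kp1[:, 3])
    keep = np.zeros(len(kp1), dtype=bool)
    for i in range(len(kp1)):
        dist = np.linalg.norm(kp1[:, :2] - kp1[i, :2], axis=1)
        nn = np.argsort(dist)[1:k + 1]            # k nearest neighbours in image 1
        agree = (np.abs(d_angle[nn] - d_angle[i]) < angle_tol) & \
                (np.abs(d_scale[nn] - d_scale[i]) < scale_tol)
        keep[i] = agree.mean() > 0.5              # majority of neighbours must agree
    return keep
```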

Place, publisher, year, edition, pages
IEEE Computer Society, 2018
National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-68446 (URN)
000446394505077 (ISI)
Conference
IEEE International Conference on Robotics and Automation (ICRA 2018), Brisbane, Australia, May 21-25, 2018
Note

Funding Agency:

Semantic Robots Research Profile - Swedish Knowledge Foundation (KKS)

Available from: 2018-08-13. Created: 2018-08-13. Last updated: 2018-10-22. Bibliographically approved.
Andreasson, H., Adolfsson, D., Stoyanov, T., Magnusson, M. & Lilienthal, A. (2017). Incorporating Ego-motion Uncertainty Estimates in Range Data Registration. In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Paper presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2017), Vancouver, Canada, September 24–28, 2017 (pp. 1389-1395). Institute of Electrical and Electronics Engineers (IEEE).
Incorporating Ego-motion Uncertainty Estimates in Range Data Registration
2017 (English). In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 1389-1395. Conference paper, Published paper (Refereed).
Abstract [en]

Local scan registration approaches commonly only utilize ego-motion estimates (e.g. odometry) as an initial pose guess in an iterative alignment procedure. This paper describes a new method to incorporate ego-motion estimates, including uncertainty, into the objective function of a registration algorithm. The proposed approach is particularly suited for feature-poor and self-similar environments, which typically present challenges to current state-of-the-art registration algorithms. Experimental evaluation shows significant improvements in accuracy when using data acquired by Automatic Guided Vehicles (AGVs) in industrial production and warehouse environments.
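A plausible shape for such an objective, written as a sketch rather than the paper's exact formulation: a standard registration cost over map points m_i and scan points s_i, plus a Mahalanobis penalty tying the estimated pose T to the ego-motion prior T_odo with covariance Sigma_odo (the symbol ⊖ denotes the pose difference on the pose manifold):

```latex
% Sketch of an ego-motion-regularized registration objective (illustrative).
\hat{T} \;=\; \arg\min_{T} \; \sum_{i} \left\lVert m_i - T\, s_i \right\rVert^{2}
\;+\; \left( T \ominus T_{\mathrm{odo}} \right)^{\!\top} \Sigma_{\mathrm{odo}}^{-1}
      \left( T \ominus T_{\mathrm{odo}} \right)
```

Intuitively, in feature-poor or self-similar environments the first term is weakly constrained, so the uncertainty-weighted prior keeps the estimate from drifting to a wrong but locally optimal alignment.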

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2017
Series
Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, ISSN 2153-0858, E-ISSN 2153-0866
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-62803 (URN)
10.1109/IROS.2017.8202318 (DOI)
000426978201108 (ISI)
2-s2.0-85041958720 (Scopus ID)
978-1-5386-2682-5 (ISBN)
978-1-5386-2683-2 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2017), Vancouver, Canada, September 24–28, 2017
Projects
Semantic Robots; ILIAD
Funder
Knowledge Foundation; EU, Horizon 2020, 732737
Available from: 2017-11-24. Created: 2017-11-24. Last updated: 2018-04-09. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0002-2953-1564