oru.se Publications
Publications (10 of 229)
Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R. & Lilienthal, A. (2020). Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction. Robotics and Computer-Integrated Manufacturing, 61, Article ID 101830.
Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction
2020 (English) In: Robotics and Computer-Integrated Manufacturing, ISSN 0736-5845, E-ISSN 1879-2537, Vol. 61, article id 101830. Article in journal (Refereed), Published
Abstract [en]

Safety, legibility and efficiency are essential for autonomous mobile robots that interact with humans. A key factor in this respect is bi-directional communication of navigation intent, which we focus on in this article with a particular view toward industrial logistics applications. In the direction robot-to-human, we study how a robot can communicate its navigation intent using Spatial Augmented Reality (SAR) such that humans can intuitively understand the robot's intention and feel safe in the vicinity of robots. We conducted experiments with an autonomous forklift that projects various patterns on the shared floor space to convey its navigation intentions. We analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift and carried out stimulated recall interviews (SRI) in order to identify desirable features for projection of robot intentions. In the direction human-to-robot, we argue that robots in human co-habited environments need human-aware task and motion planning to support safety and efficiency, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, we investigate the possibility of human-to-robot implicit intention transference solely from eye gaze data and evaluate how the observed eye gaze patterns of the participants relate to their navigation decisions. We again analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift for clues that could reveal directional intent. Our analysis shows that people primarily gazed at the side of the robot that they ultimately decided to pass. We discuss the implications of these results and relate them to a control approach that uses human gaze for early obstacle avoidance.
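The article's gaze analysis is experimental rather than code-based, but the core observation (people tend to fixate the side of the robot they later pass on) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the function name, the floor-plane gaze points, and the majority-vote decision rule are not taken from the paper.

```python
import numpy as np

def infer_passing_side(gaze_points_xy, robot_pos_xy, robot_heading_rad):
    """Guess which side of the robot (left or right of its heading axis) a
    person intends to pass on, from gaze points projected onto the floor plane.

    Illustrative heuristic only: count gaze samples on each side of the
    robot's heading axis and return the majority side.
    """
    heading = np.array([np.cos(robot_heading_rad), np.sin(robot_heading_rad)])
    rel = np.asarray(gaze_points_xy, dtype=float) - np.asarray(robot_pos_xy, dtype=float)
    # 2D cross-product sign: positive means the gaze point lies to the robot's left
    side = np.sign(heading[0] * rel[:, 1] - heading[1] * rel[:, 0])
    return "left" if side.sum() > 0 else "right"

# Synthetic gaze samples clustered to the right of a robot heading along +x
gaze = np.array([[2.0, -0.5], [2.5, -0.4], [3.0, -0.6], [2.2, 0.1]])
print(infer_passing_side(gaze, robot_pos_xy=[0.0, 0.0], robot_heading_rad=0.0))  # "right"
```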

Place, publisher, year, edition, pages
Elsevier, 2020
Keywords
Human-robot interaction (HRI), Mobile robots, Intention communication, Eye-tracking, Intention recognition, Spatial augmented reality, Stimulated recall interview, Obstacle avoidance, Safety, Logistics
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-78358 (URN), 10.1016/j.rcim.2019.101830 (DOI), 000496834800002 (), 2-s2.0-85070732550 (Scopus ID)
Note

Funding Agencies:

KKS SIDUS project AIR: "Action and Intention Recognition in Human Interaction with Autonomous Systems"  20140220

H2020 project ILIAD: "Intra-Logistics with Integrated Automatic Deployment: Safe and Scalable Fleets in Shared Spaces"  732737

Available from: 2019-12-03 Created: 2019-12-03 Last updated: 2019-12-03. Bibliographically approved
Mielle, M., Magnusson, M. & Lilienthal, A. (2019). A comparative analysis of radar and lidar sensing for localization and mapping. Paper presented at 9th European Conference on Mobile Robots (ECMR 2019), Prague, Czech Republic, September 4-6, 2019. IEEE
A comparative analysis of radar and lidar sensing for localization and mapping
2019 (English) Conference paper, Published paper (Refereed)
Abstract [en]

Lidars and cameras are the sensors most commonly used for Simultaneous Localization And Mapping (SLAM). However, they are not effective in certain scenarios, e.g. when fire and smoke are present in the environment. While radars are much less affected by such conditions, radar and lidar have rarely been compared in terms of the achievable SLAM accuracy. We present a principled comparison of the accuracy of a novel radar sensor against that of a Velodyne lidar, for localization and mapping.

We evaluate the performance of both sensors by calculating the displacement in position and orientation relative to a ground-truth reference positioning system, over three experiments in an indoor lab environment. We used two different SLAM algorithms and found that the mean displacement in position when using the radar sensor was less than 0.037 m, compared to 0.011 m for the lidar. We show that, while producing slightly less accurate maps than a lidar, the radar can accurately perform SLAM and build a map of the environment, even including details such as corners and small walls.
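The displacement-based evaluation described above can be sketched in a few lines; the sketch below is not the authors' evaluation code, and the pose format, alignment and synchronisation of the two trajectories are assumed to be handled beforehand.

```python
import numpy as np

def mean_pose_displacement(estimated, ground_truth):
    """Mean translational and rotational displacement between two aligned,
    time-synchronised 2D pose sequences given as (N, 3) arrays of (x, y, yaw).

    Returns (mean position error in metres, mean absolute yaw error in radians).
    """
    est = np.asarray(estimated, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    pos_err = np.linalg.norm(est[:, :2] - gt[:, :2], axis=1)
    # Wrap yaw differences into [-pi, pi] before averaging their magnitude
    yaw_err = np.angle(np.exp(1j * (est[:, 2] - gt[:, 2])))
    return pos_err.mean(), np.abs(yaw_err).mean()

# Toy example: an estimated trajectory offset slightly from ground truth
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.1], [2.0, 0.0, 0.2]])
est = gt + np.array([[0.02, -0.01, 0.01]] * 3)
print(mean_pose_displacement(est, gt))
```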

Place, publisher, year, edition, pages
IEEE, 2019
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-76976 (URN)
Conference
9th European Conference on Mobile Robots (ECMR 2019), Prague, Czech Republic, September 4-6, 2019
Available from: 2019-10-02 Created: 2019-10-02 Last updated: 2019-10-02. Bibliographically approved
Hüllmann, D., Neumann, P. P., Monroy, J. & Lilienthal, A. (2019). A Realistic Remote Gas Sensor Model for Three-Dimensional Olfaction Simulations. In: ISOCS/IEEE International Symposium on Olfaction and Electronic Nose (ISOEN). Paper presented at 2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), Fukuoka, Japan, May 26-29, 2019 (pp. 1-3). IEEE
A Realistic Remote Gas Sensor Model for Three-Dimensional Olfaction Simulations
2019 (English) In: ISOCS/IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), IEEE, 2019, p. 1-3. Conference paper, Published paper (Refereed)
Abstract [en]

Remote gas sensors such as those based on Tunable Diode Laser Absorption Spectroscopy (TDLAS) enable mobile robots to scan huge areas for gas concentrations in a reasonable time and are therefore well suited for tasks such as gas emission surveillance and environmental monitoring. A further advantage of remote sensors is that the gas distribution is not disturbed by the sensing platform itself if the measurements are carried out from a sufficient distance, which is particularly interesting when a rotary-wing platform is used. Since it is not possible to obtain ground truth measurements of gas distributions, simulations are used to develop and evaluate suitable olfaction algorithms. For this purpose, several models of in-situ gas sensors have been developed, but models of remote gas sensors are missing. In this paper, we present two novel 3D ray-tracer-based TDLAS sensor models. While the first model simplifies the laser beam as a line, the second model takes the conical shape of the beam into account. Using a simulated gas plume, we compare the line model with the cone model in terms of accuracy and computational cost and show that the results generated by the cone model can differ significantly from those of the line model.
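To make the difference between the two beam models concrete, the sketch below integrates a synthetic Gaussian plume along a single ray (line model) and averages over several rays spread across a small opening angle (cone model). The plume, geometry, and sampling scheme are assumptions for illustration, not the authors' ray tracer.

```python
import numpy as np

def plume_concentration(p):
    """Synthetic Gaussian gas plume centred at (5, 0, 1) m (illustrative only)."""
    centre = np.array([5.0, 0.0, 1.0])
    return np.exp(-np.sum((p - centre) ** 2) / (2 * 0.5 ** 2))

def integrate_ray(origin, direction, length=10.0, steps=200):
    """Approximate the path integral of concentration along one ray."""
    ts = np.linspace(0.0, length, steps)
    ds = length / (steps - 1)
    return sum(plume_concentration(origin + t * direction) for t in ts) * ds

def line_model(origin, direction):
    return integrate_ray(origin, np.asarray(direction, dtype=float))

def cone_model(origin, direction, half_angle=0.01, n_rays=16):
    """Average path integrals over rays tilted within the beam's cone
    (assumes the beam axis is not parallel to the z-axis)."""
    d = np.asarray(direction, dtype=float)
    u = np.cross(d, [0.0, 0.0, 1.0])
    u = u / np.linalg.norm(u)
    v = np.cross(d, u)
    vals = []
    for phi in np.linspace(0.0, 2 * np.pi, n_rays, endpoint=False):
        ray = d + np.tan(half_angle) * (np.cos(phi) * u + np.sin(phi) * v)
        vals.append(integrate_ray(origin, ray / np.linalg.norm(ray)))
    return np.mean(vals)

origin = np.array([0.0, 0.0, 1.0])
direction = np.array([1.0, 0.0, 0.0])
print(line_model(origin, direction), cone_model(origin, direction))
```

With a narrow plume the two values can differ noticeably, which is the kind of discrepancy the paper quantifies.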

Place, publisher, year, edition, pages
IEEE, 2019
Keywords
gas detector, remote gas sensor, sensor modelling, TDLAS, gas dispersion simulation
National Category
Remote Sensing; Robotics
Identifiers
urn:nbn:se:oru:diva-76220 (URN), 10.1109/ISOEN.2019.8823330 (DOI), 978-1-5386-8327-9 (ISBN), 978-1-5386-8328-6 (ISBN)
Conference
2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), Fukuoka, Japan, May 26-29, 2019
Available from: 2019-09-11 Created: 2019-09-11 Last updated: 2019-09-11. Bibliographically approved
Neumann, P. P., Hüllmann, D., Krentel, D., Kluge, M., Dzierliński, M., Lilienthal, A. & Bartholmai, M. (2019). Aerial-based gas tomography: from single beams to complex gas distributions. European Journal of Remote Sensing, 1-15
Aerial-based gas tomography: from single beams to complex gas distributions
2019 (English) In: European Journal of Remote Sensing, ISSN 2279-7254, p. 1-15. Article in journal (Refereed), Epub ahead of print
Abstract [en]

In this paper, we present and validate the concept of an autonomous aerial robot to reconstruct tomographic 2D slices of gas plumes in outdoor environments. Our platform, the so-called Unmanned Aerial Vehicle for Remote Gas Sensing (UAV-REGAS), combines a lightweight Tunable Diode Laser Absorption Spectroscopy (TDLAS) gas sensor with a 3-axis aerial stabilization gimbal for aiming at a versatile octocopter. While the TDLAS sensor provides integral gas concentration measurements, it does not measure the distance traveled by the laser diode's beam nor the distribution of gas along the optical path. Thus, we complement the set-up with a laser rangefinder and apply principles of Computed Tomography (CT) to create a model of the spatial gas distribution from a set of integral concentration measurements. To allow for a fundamental ground truth evaluation of the applied gas tomography algorithm, we set up a unique outdoor test environment based on two 3D ultrasonic anemometers and a distributed array of 10 infrared gas transmitters. We present results showing its performance characteristics and 2D plume reconstruction capabilities under realistic conditions. The proposed system can be deployed in scenarios that cannot be addressed by currently available robots and thus constitutes a significant step forward for the field of Mobile Robot Olfaction (MRO).
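The tomographic step can be illustrated with a small regularised least-squares reconstruction that relates beam path integrals to a gridded concentration map. This is only a rough sketch of the general CT idea under made-up geometry and regularisation; it is not the system's actual reconstruction algorithm.

```python
import numpy as np

def beam_matrix(beams, grid_size, cell=1.0, samples=200):
    """Build a matrix A where A[i, j] is the path length of beam i in grid cell j.

    beams     : list of ((x0, y0), (x1, y1)) beam start/end points in metres.
    grid_size : (rows, cols) of the reconstruction grid, each cell `cell` metres wide.
    """
    rows, cols = grid_size
    A = np.zeros((len(beams), rows * cols))
    for i, (p0, p1) in enumerate(beams):
        p0, p1 = np.asarray(p0, dtype=float), np.asarray(p1, dtype=float)
        ds = np.linalg.norm(p1 - p0) / samples
        for t in np.linspace(0.0, 1.0, samples, endpoint=False):
            x, y = p0 + t * (p1 - p0)
            r, c = int(y // cell), int(x // cell)
            if 0 <= r < rows and 0 <= c < cols:
                A[i, r * cols + c] += ds
    return A

# Toy scenario: 5 x 5 m area, one "gas blob", horizontal and vertical beams
rows = cols = 5
true_map = np.zeros((rows, cols))
true_map[2, 3] = 1.0
beams = [((0, y + 0.5), (5, y + 0.5)) for y in range(5)] + \
        [((x + 0.5, 0), (x + 0.5, 5)) for x in range(5)]
A = beam_matrix(beams, (rows, cols))
measurements = A @ true_map.ravel()

# Tikhonov-regularised least squares (illustrative reconstruction only)
lam = 1e-3
x = np.linalg.solve(A.T @ A + lam * np.eye(rows * cols), A.T @ measurements)
print(np.round(x.reshape(rows, cols), 2))
```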

Place, publisher, year, edition, pages
London: Taylor & Francis, 2019
Keywords
Aerial robot olfaction, mobile robot olfaction, gas tomography, TDLAS, plume
National Category
Remote Sensing; Occupational Health and Environmental Health; Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-76009 (URN), 10.1080/22797254.2019.1640078 (DOI), 000490523700001 ()
Note

Funding Agencies:

German Federal Ministry for Economic Affairs and Energy (BMWi) within the ZIM program  KF2201091HM4

BAM 

Available from: 2019-09-02 Created: 2019-09-02 Last updated: 2019-11-15. Bibliographically approved
Wiedemann, T., Lilienthal, A. & Shutin, D. (2019). Analysis of Model Mismatch Effects for a Model-based Gas Source Localization Strategy Incorporating Advection Knowledge. Sensors, 19(3), Article ID 520.
Analysis of Model Mismatch Effects for a Model-based Gas Source Localization Strategy Incorporating Advection Knowledge
2019 (English) In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 19, no 3, article id 520. Article in journal (Refereed), Published
Abstract [en]

In disaster scenarios, where toxic material is leaking, gas source localization is a common but also dangerous task. To reduce threats to human operators, we propose an intelligent sampling strategy that enables a multi-robot system to autonomously localize unknown gas sources based on gas concentration measurements. This paper discusses a probabilistic, model-based approach for incorporating physical process knowledge into the sampling strategy. We model the spatial and temporal dynamics of the gas dispersion with a partial differential equation that accounts for diffusion and advection effects. We consider the exact number of sources as unknown, but assume that gas sources are sparsely distributed. To incorporate the sparsity assumption, we make use of sparse Bayesian learning techniques. Probabilistic modeling can account for possible model mismatch effects that otherwise can undermine the performance of deterministic methods. In the paper we evaluate the proposed gas source localization strategy in simulations using synthetic data. Compared to real-world experiments, a simulated environment provides us with ground truth data and the reproducibility necessary to get a deeper insight into the proposed strategy. The investigation shows that (i) the probabilistic model can compensate for imperfect modeling; (ii) the sparsity assumption significantly accelerates the source localization; and (iii) a priori advection knowledge is advantageous for source localization, although it only needs to meet a certain level of accuracy. These findings will help to parameterize the proposed algorithm in future real-world applications.
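The paper's exact model is not reproduced in the abstract, but an advection-diffusion equation with a sparsely distributed source term generally takes the following form; the symbols and the delta-function source parameterisation below are generic textbook notation, not necessarily the formulation used in the paper.

```latex
% c: gas concentration, D: diffusion coefficient, v: advection (wind) field,
% s: source term assumed non-zero at only a few unknown locations x_k.
\[
  \frac{\partial c(\mathbf{x}, t)}{\partial t}
  = D \,\nabla^{2} c(\mathbf{x}, t)
  - \mathbf{v}(\mathbf{x}) \cdot \nabla c(\mathbf{x}, t)
  + s(\mathbf{x}),
  \qquad
  s(\mathbf{x}) = \sum_{k=1}^{K} a_{k}\, \delta(\mathbf{x} - \mathbf{x}_{k}).
\]
```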

Place, publisher, year, edition, pages
Basel, Switzerland: MDPI, 2019
Keywords
Robotic exploration, gas source localization, mobile robot olfaction, sparse Bayesian learning, multi-agent system, advection-diffusion model
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-71964 (URN), 10.3390/s19030520 (DOI), 000459941200083 (), 30691174 (PubMedID), 2-s2.0-85060572534 (Scopus ID)
Projects
SmokeBot (EC H2020, 645101)
Note

Funding Agencies:

European Commission  645101 

Valles Marineris Explorer initiative of DLR (German Aerospace Center) Space Administration 

Available from: 2019-01-31 Created: 2019-01-31 Last updated: 2019-06-19. Bibliographically approved
Schindler, M. & Lilienthal, A. J. (2019). Domain-specific interpretation of eye tracking data: towards a refined use of the eye-mind hypothesis for the field of geometry. Educational Studies in Mathematics, 101(1), 123-139
Domain-specific interpretation of eye tracking data: towards a refined use of the eye-mind hypothesis for the field of geometry
2019 (English) In: Educational Studies in Mathematics, ISSN 0013-1954, E-ISSN 1573-0816, Vol. 101, no 1, p. 123-139. Article in journal (Refereed), Published
Abstract [en]

Eye tracking is becoming increasingly popular in mathematics education research. Studies predominantly rely on the so-called eye-mind hypothesis (EMH), which posits that what persons fixate on closely relates to what they process. Given that the EMH was developed in reading research, we see the risk that implicit assumptions are tacitly adopted in mathematics even though they may not apply in this domain. This article investigates to what extent the EMH applies in mathematics - geometry in particular - and aims to lift the discussion of what inferences can be validly made from eye-tracking data. We use a case study to investigate the need for a refinement of the use of the EMH. In a stimulated recall interview, a student described his original thoughts while perusing a gaze-overlaid video recorded as he was working on a geometry problem. Our findings contribute to a better understanding of when and how the EMH applies in the subdomain of geometry. In particular, we identify patterns of eye movements that provide valuable information on students' geometry problem solving: certain patterns where the eye fixates on what the student is processing and others where the EMH does not hold. Identifying such patterns may contribute to an interpretation theory for students' eye movements in geometry - exemplifying a domain-specific theory that may reduce the ambiguity and uncertainty inherent in eye-tracking data analysis.

Place, publisher, year, edition, pages
Springer, 2019
Keywords
Eye tracking, Eye movements, Eye-mind hypothesis, Geometry
National Category
Educational Sciences
Identifiers
urn:nbn:se:oru:diva-73868 (URN), 10.1007/s10649-019-9878-z (DOI), 000463669800009 (), 2-s2.0-85061182709 (Scopus ID)
Available from: 2019-04-23 Created: 2019-04-23 Last updated: 2019-04-23. Bibliographically approved
Xing, Y., Vincent, T. A., Fan, H., Schaffernicht, E., Hernandez Bennetts, V., Lilienthal, A., . . . Gardner, J. W. (2019). FireNose on Mobile Robot in Harsh Environments. IEEE Sensors Journal
FireNose on Mobile Robot in Harsh Environments
2019 (English) In: IEEE Sensors Journal, ISSN 1530-437X, E-ISSN 1558-1748. Article in journal (Refereed), Epub ahead of print
Abstract [en]

In this work, we present a novel multi-sensor unit, a.k.a. FireNose, to detect and discriminate both known and unknown gases in uncontrolled conditions and to aid firefighters under harsh conditions. The unit includes three metal oxide (MOX) gas sensors with CMOS micro heaters, a plasmonic enhanced non-dispersive infrared (NDIR) sensor optimized for the detection of CO2, a commercial temperature and humidity sensor, and a flow sensor. We developed custom film coatings for the MOX sensors (SnO2, WO3 and NiO), which greatly improved the gas sensitivity, response time and lifetime of the miniature devices. Our proposed system exhibits promising performance for gas sensing in harsh environments, in terms of power consumption (∼ 35 mW at 350°C per MOX sensor), response time (<10 s), robustness and physical size. The sensing unit was evaluated with gas plumes both in a laboratory setup on a gas testing rig and on board a mobile robot operating indoors. These high-sensitivity, high-bandwidth sensors, together with online unsupervised gas discrimination algorithms, are able to detect gases and generate their spatial distribution maps. In the robotic experiments, the resulting gas distribution maps corresponded well to the actual location of the sources. Therefore, we verified the unit's ability to differentiate gases and generate gas maps in real-world experiments.
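The abstract does not specify the mapping algorithm. As a loose illustration of how point measurements from such a unit can be turned into a spatial gas distribution map, the sketch below uses a simple kernel-weighted average on a grid; this is an assumed, generic method, not the discrimination or mapping pipeline used with FireNose.

```python
import numpy as np

def kernel_gas_map(positions, concentrations, grid_x, grid_y, sigma=0.5):
    """Kernel-weighted mean concentration map from point measurements.

    positions      : (N, 2) measurement locations in metres
    concentrations : (N,) sensor readings at those locations
    grid_x, grid_y : 1D arrays defining the map grid
    sigma          : Gaussian kernel width in metres
    """
    pos = np.asarray(positions, dtype=float)
    conc = np.asarray(concentrations, dtype=float)
    gx, gy = np.meshgrid(grid_x, grid_y)
    cells = np.stack([gx.ravel(), gy.ravel()], axis=1)
    # Gaussian weight between every grid cell and every measurement point
    d2 = ((cells[:, None, :] - pos[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    est = (w @ conc) / np.maximum(w.sum(axis=1), 1e-9)
    return est.reshape(gx.shape)

# Toy example: readings with higher concentration near (2, 2)
pos = np.array([[0.5, 0.5], [2.0, 2.0], [2.2, 1.8], [3.5, 0.5]])
conc = np.array([0.1, 0.9, 0.8, 0.2])
gas_map = kernel_gas_map(pos, conc, np.linspace(0, 4, 9), np.linspace(0, 4, 9))
print(np.round(gas_map, 2))
```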

Place, publisher, year, edition, pages
IEEE, 2019
Keywords
FireNose, mobile robot, MOX sensor, gas map, harsh environments
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-77784 (URN), 10.1109/JSEN.2019.2939039 (DOI)
Funder
EU, Horizon 2020
Available from: 2019-11-06 Created: 2019-11-06 Last updated: 2019-11-06. Bibliographically approved
Hüllmann, D., Neumann, P. P. & Lilienthal, A. (2019). Gas Dispersion Fluid Mechanics Simulation for Large Outdoor Environments. In: 36th Danubia Adria Symposium on Advances in Experimental Mechanics: Extended Abstracts. Paper presented at 36th Danubia Adria Symposium on Advances in Experimental Mechanics, Plzeň, Czech Republic, 24–27 September 2019 (pp. 49-50). Pilsen, Czech Republic: Danubia-Adria Symposium on Advances in Experimental Mechanics
Gas Dispersion Fluid Mechanics Simulation for Large Outdoor Environments
2019 (English) In: 36th Danubia Adria Symposium on Advances in Experimental Mechanics: Extended Abstracts, Pilsen, Czech Republic: Danubia-Adria Symposium on Advances in Experimental Mechanics, 2019, p. 49-50. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

The development of algorithms for mapping gas distributions and localising gas sources is a challenging task, because gas dispersion is a highly dynamic process and it is impossible to capture ground truth data. Fluid-mechanical simulations are a suitable way to support the development of these algorithms. Several tools for gas dispersion simulation have been developed, but they are not suitable for simulations of large outdoor environments. In this paper, we present a concept of how an existing simulator can be extended to handle both indoor and large outdoor scenarios.

Place, publisher, year, edition, pages
Pilsen, Czech Republic: Danubia-Adria Symposium on Advances in Experimental Mechanics, 2019
Keywords
Gas dispersion simulation, CFD, gas tomography
National Category
Robotics; Remote Sensing; Fluid Mechanics and Acoustics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-77198 (URN), 978-80-261-0876-4 (ISBN)
Conference
36th Danubia Adria Symposium on Advances in Experimental Mechanics, Plzeň, Czech Republic, 24–27 September 2019
Available from: 2019-10-11 Created: 2019-10-11 Last updated: 2019-10-15. Bibliographically approved
Wiedemann, T., Shutin, D. & Lilienthal, A. (2019). Model-based gas source localization strategy for a cooperative multi-robot system-A probabilistic approach and experimental validation incorporating physical knowledge and model uncertainties. Robotics and Autonomous Systems, 118, 66-79
Model-based gas source localization strategy for a cooperative multi-robot system-A probabilistic approach and experimental validation incorporating physical knowledge and model uncertainties
2019 (English) In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 118, p. 66-79. Article in journal (Refereed), Published
Abstract [en]

Sampling gas distributions with robotic platforms in order to find gas sources is an appealing approach to alleviate threats to a human operator. Different sampling strategies for robotic gas exploration exist. In this paper, we investigate the benefit that can be obtained by incorporating physical knowledge about the gas dispersion when exploring a gas diffusion process with a multi-robot system. The physical behavior of the diffusion process is modeled using a Partial Differential Equation (PDE), which is integrated into the exploration strategy. It is assumed that the diffusion process is driven by only a few spatial sources at unknown locations with unknown intensity. The objective of the exploration strategy is to guide the robots to informative measurement locations and, by means of concentration measurements, estimate the source parameters, in particular their number, locations and magnitudes. To this end, we propose a probabilistic approach towards PDE identification under sparsity constraints using factor graphs and a message passing algorithm. Moreover, message passing schemes permit an efficient distributed implementation of the algorithm, which makes it suitable for a multi-robot system. We designed an experimental setup that allows us to evaluate the performance of the exploration strategy in hardware-in-the-loop experiments as well as in experiments with real ethanol gas under laboratory conditions. The results indicate that the proposed exploration approach accelerates the identification of the source parameters and outperforms systematic sampling.
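The factor-graph and message-passing machinery is not reproduced here, but the underlying idea of recovering sparsely distributed sources from concentration measurements through a discretised PDE can be sketched with a simple L1-regularised least-squares fit. The 1D steady-state discretisation, the use of scikit-learn's Lasso, and all parameter values below are assumptions for illustration, not the paper's method.

```python
import numpy as np
from sklearn.linear_model import Lasso

# 1D steady-state diffusion on n cells: D * L @ c = -s, with L the discrete
# Laplacian (Dirichlet boundaries, unit cell size), so c = A @ s with A = -(D L)^-1.
n, D = 50, 1.0
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
A = -np.linalg.inv(D * L)

# Ground truth: two sparse sources at known cells (for the synthetic test only)
s_true = np.zeros(n)
s_true[12], s_true[37] = 1.0, 0.5
c_true = A @ s_true

# Concentrations are observed only at a few "robot" locations, with noise
rng = np.random.default_rng(0)
obs_idx = rng.choice(n, size=15, replace=False)
y = c_true[obs_idx] + 0.01 * rng.standard_normal(len(obs_idx))

# Sparsity-promoting, non-negative estimate of the source vector
model = Lasso(alpha=1e-3, positive=True, fit_intercept=False, max_iter=50000)
model.fit(A[obs_idx, :], y)
print("estimated source cells:", np.nonzero(model.coef_ > 0.05)[0])
```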

Place, publisher, year, edition, pages
Elsevier, 2019
Keywords
Robotic exploration, Gas source localization, Multi-agent-system, Partial differential equation, Mobile robot olfaction, Sparse Bayesian learning, Factor graph, Message passing
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-75365 (URN), 10.1016/j.robot.2019.03.014 (DOI), 000474324100006 (), 2-s2.0-85065544153 (Scopus ID)
Funder
EU, European Research Council, 645101
Available from: 2019-07-29 Created: 2019-07-29 Last updated: 2019-07-29. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-0217-9326
