Örebro University Publications (oru.se)
Chadalavada, Ravi Teja (ORCID iD: orcid.org/0000-0002-8380-4113)
Publications (10 of 13)
Molina, S., Mannucci, A., Magnusson, M., Adolfsson, D., Andreasson, H., Hamad, M., . . . Lilienthal, A. J. (2024). The ILIAD Safety Stack: Human-Aware Infrastructure-Free Navigation of Industrial Mobile Robots. IEEE robotics & automation magazine, 31(3), 48-59
The ILIAD Safety Stack: Human-Aware Infrastructure-Free Navigation of Industrial Mobile Robots
2024 (English). In: IEEE robotics & automation magazine, ISSN 1070-9932, E-ISSN 1558-223X, Vol. 31, no. 3, p. 48-59. Article in journal (Refereed). Published.
Abstract [en]

Current intralogistics services require keeping up with e-commerce demands, reducing delivery times and waste, and increasing overall flexibility. As a consequence, the use of automated guided vehicles (AGVs) and, more recently, autonomous mobile robots (AMRs) for logistics operations is steadily increasing.

Place, publisher, year, edition, pages
IEEE, 2024
Keywords
Robots, Safety, Navigation, Mobile robots, Human-robot interaction, Hidden Markov models, Trajectory
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-108145 (URN); 10.1109/MRA.2023.3296983 (DOI); 001051249900001 (ISI); 2-s2.0-85167792783 (Scopus ID)
Funder
EU, Horizon 2020, 732737
Available from: 2023-09-14. Created: 2023-09-14. Last updated: 2025-02-07. Bibliographically approved.
Schreiter, T., Morillo-Mendez, L., Chadalavada, R. T., Rudenko, A., Billing, E., Magnusson, M., . . . Lilienthal, A. J. (2023). Advantages of Multimodal versus Verbal-Only Robot-to-Human Communication with an Anthropomorphic Robotic Mock Driver. In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN): Proceedings. Paper presented at 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Busan, South Korea, August 28-31, 2023 (pp. 293-300). IEEE
Advantages of Multimodal versus Verbal-Only Robot-to-Human Communication with an Anthropomorphic Robotic Mock Driver
2023 (English). In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN): Proceedings, IEEE, 2023, p. 293-300. Conference paper, Published paper (Refereed).
Abstract [en]

Robots are increasingly used in shared environments with humans, making effective communication a necessity for successful human-robot interaction. In our work, we study a crucial component: active communication of robot intent. Here, we present an anthropomorphic solution where a humanoid robot communicates the intent of its host robot acting as an "Anthropomorphic Robotic Mock Driver" (ARMoD). We evaluate this approach in two experiments in which participants work alongside a mobile robot on various tasks, while the ARMoD communicates a need for human attention, when required, or gives instructions to collaborate on a joint task. The experiments feature two interaction styles of the ARMoD: a verbal-only mode using only speech and a multimodal mode, additionally including robotic gaze and pointing gestures to support communication and register intent in space. Our results show that the multimodal interaction style, including head movements and eye gaze as well as pointing gestures, leads to more natural fixation behavior. Participants naturally identified and fixated longer on the areas relevant for intent communication, and reacted faster to instructions in collaborative tasks. Our research further indicates that the ARMoD intent communication improves engagement and social interaction with mobile robots in workplace settings.

Place, publisher, year, edition, pages
IEEE, 2023
Series
IEEE RO-MAN, ISSN 1944-9445, E-ISSN 1944-9437
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-110873 (URN); 10.1109/RO-MAN57019.2023.10309629 (DOI); 001108678600042 (ISI); 9798350336702 (ISBN); 9798350336719 (ISBN)
Conference
32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Busan, South Korea, August 28-31, 2023
Funder
EU, Horizon 2020, 101017274 (DARKO)
Available from: 2024-01-22. Created: 2024-01-22. Last updated: 2025-02-07. Bibliographically approved.
Schreiter, T., Morillo-Mendez, L., Chadalavada, R. T., Rudenko, A., Billing, E. A. & Lilienthal, A. J. (2022). The Effect of Anthropomorphism on Trust in an Industrial Human-Robot Interaction. In: SCRITA Workshop Proceedings (arXiv:2208.11090). Paper presented at 31st IEEE International Conference on Robot & Human Interactive Communication, Naples, Italy, August 29 - September 2, 2022.
The Effect of Anthropomorphism on Trust in an Industrial Human-Robot Interaction
2022 (English). In: SCRITA Workshop Proceedings (arXiv:2208.11090), 2022. Conference paper, Published paper (Refereed).
Abstract [en]

Robots are increasingly deployed in spaces shared with humans, including home settings and industrial environments. In these environments, the interaction between humans and robots (HRI) is crucial for safety, legibility, and efficiency. A key factor in HRI is trust, which modulates the acceptance of the system. Anthropomorphism has been shown to modulate trust development in a robot, but robots in industrial environments are not usually anthropomorphic. We designed a simple interaction in an industrial environment in which an anthropomorphic robotic mock driver (ARMoD) simulates driving an autonomous guided vehicle (AGV). The task consisted of a human crossing paths with the AGV, with or without the ARMoD mounted on top, in a narrow corridor. The human and the system needed to negotiate trajectories when crossing paths, meaning that the human had to attend to the trajectory of the robot to avoid a collision. Reported trust scores increased significantly in the condition where the ARMoD was present, showing that the presence of an anthropomorphic robot is enough to modulate trust, even in interactions as limited as the one presented here.

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-102773 (URN); 10.48550/arXiv.2208.14637 (DOI)
Conference
31st IEEE International Conference on Robot & Human Interactive Communication, Naples, Italy, August 29 - September 2, 2022
Projects
DARKO
Funder
EU, Horizon 2020, 101017274; 754285
Available from: 2022-12-19. Created: 2022-12-19. Last updated: 2022-12-20. Bibliographically approved.
Rudenko, A., Kucner, T. P., Swaminathan, C. S., Chadalavada, R. T., Arras, K. O. & Lilienthal, A. (2020). Benchmarking Human Motion Prediction Methods. Paper presented at HRI 2020, Workshop on Test Methods and Metrics for Effective HRI in Real World Human-Robot Teams, Cambridge, UK (conference cancelled).
Benchmarking Human Motion Prediction Methods
2020 (English). Conference paper, Oral presentation only (Other academic).
Abstract [en]

In this extended abstract we present a novel dataset for benchmarking motion prediction algorithms. We describe our approach to data collection which generates diverse and accurate human motion in a controlled weakly-scripted setup. We also give insights for building a universal benchmark for motion prediction.
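The abstract does not name specific metrics, but motion prediction benchmarks are commonly scored with average and final displacement errors (ADE/FDE). Below is a minimal sketch of these standard metrics, offered as illustrative context rather than as the paper's own evaluation code:

```python
import numpy as np

def ade_fde(predicted, ground_truth):
    """Average and final displacement errors between one predicted
    trajectory and the ground truth, both arrays of shape (T, 2)."""
    errors = np.linalg.norm(predicted - ground_truth, axis=1)
    return errors.mean(), errors[-1]

# Toy example: a straight-line prediction against a gently curving path.
t = np.linspace(0.0, 1.0, 20)
truth = np.stack([t, 0.5 * t**2], axis=1)   # curved ground-truth track
pred = np.stack([t, np.zeros_like(t)], axis=1)  # constant-velocity guess
ade, fde = ade_fde(pred, truth)
print(f"ADE = {ade:.3f} m, FDE = {fde:.3f} m")
```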

Keywords
human motion prediction, benchmarking, datasets
National Category
Robotics and automation
Identifiers
urn:nbn:se:oru:diva-89169 (URN)
Conference
HRI 2020, Workshop on Test Methods and Metrics for Effective HRI in Real World Human-Robot Teams, Cambridge, UK (conference cancelled)
Projects
ILIAD
Available from: 2021-02-01. Created: 2021-02-01. Last updated: 2025-02-09. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R. & Lilienthal, A. J. (2020). Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction. Robotics and Computer-Integrated Manufacturing, 61, Article ID 101830.
Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction
2020 (English). In: Robotics and Computer-Integrated Manufacturing, ISSN 0736-5845, E-ISSN 1879-2537, Vol. 61, article id 101830. Article in journal (Refereed). Published.
Abstract [en]

Safety, legibility and efficiency are essential for autonomous mobile robots that interact with humans. A key factor in this respect is bi-directional communication of navigation intent, which we focus on in this article with a particular view on industrial logistic applications. In the direction robot-to-human, we study how a robot can communicate its navigation intent using Spatial Augmented Reality (SAR) such that humans can intuitively understand the robot's intention and feel safe in the vicinity of robots. We conducted experiments with an autonomous forklift that projects various patterns on the shared floor space to convey its navigation intentions. We analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift and carried out stimulated recall interviews (SRI) in order to identify desirable features for projection of robot intentions. In the direction human-to-robot, we argue that robots in human co-habited environments need human-aware task and motion planning to support safety and efficiency, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, we investigate the possibility of human-to-robot implicit intention transference solely from eye gaze data and evaluate how the observed eye gaze patterns of the participants relate to their navigation decisions. We again analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift for clues that could reveal direction intent. Our analysis shows that people primarily gazed on that side of the robot they ultimately decided to pass by. We discuss implications of these results and relate to a control approach that uses human gaze for early obstacle avoidance.
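The reported finding that people gaze mostly at the side of the robot they later pass on suggests a simple decision rule. A minimal sketch of such a classifier follows, assuming gaze points expressed as x-coordinates relative to the robot's centerline; this illustrates the idea and is not the authors' implementation:

```python
import numpy as np

def infer_passing_side(gaze_x, robot_center_x, margin=0.05):
    """Classify the intended passing side from gaze-point x-coordinates.
    Returns 'left', 'right', or 'undecided' when evidence is balanced.
    The margin threshold is an illustrative assumption."""
    gaze_x = np.asarray(gaze_x, dtype=float)
    left_frac = np.mean(gaze_x < robot_center_x)
    if left_frac > 0.5 + margin:
        return "left"
    if left_frac < 0.5 - margin:
        return "right"
    return "undecided"

# Example: most fixations fall left of the robot's centerline -> 'left'.
print(infer_passing_side([0.1, 0.2, 0.15, 0.6, 0.1], robot_center_x=0.5))
```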

Place, publisher, year, edition, pages
Elsevier, 2020
Keywords
Human-robot interaction (HRI), Mobile robots, Intention communication, Eye-tracking, Intention recognition, Spatial augmented reality, Stimulated recall interview, Obstacle avoidance, Safety, Logistics
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-78358 (URN); 10.1016/j.rcim.2019.101830 (DOI); 000496834800002 (ISI); 2-s2.0-85070732550 (Scopus ID)
Note
Funding Agencies:
KKS SIDUS project AIR: "Action and Intention Recognition in Human Interaction with Autonomous Systems", 20140220
H2020 project ILIAD: "Intra-Logistics with Integrated Automatic Deployment: Safe and Scalable Fleets in Shared Spaces", 732737
Available from: 2019-12-03. Created: 2019-12-03. Last updated: 2025-02-07. Bibliographically approved.
Palm, R., Chadalavada, R. T. & Lilienthal, A. (2019). Fuzzy Modeling, Control and Prediction in Human-Robot Systems. In: Juan Julian Merelo, Fernando Melício, José M. Cadenas, António Dourado, Kurosh Madani, António Ruano, Joaquim Filipe (Ed.), Computational Intelligence: International Joint Conference, IJCCI 2016, Porto, Portugal, November 9–11, 2016, Revised Selected Papers (pp. 149-177). Switzerland: Springer Publishing Company
Fuzzy Modeling, Control and Prediction in Human-Robot Systems
2019 (English). In: Computational Intelligence: International Joint Conference, IJCCI 2016, Porto, Portugal, November 9–11, 2016, Revised Selected Papers / [ed] Juan Julian Merelo, Fernando Melício, José M. Cadenas, António Dourado, Kurosh Madani, António Ruano, Joaquim Filipe, Switzerland: Springer Publishing Company, 2019, p. 149-177. Chapter in book (Refereed).
Abstract [en]

A safe and synchronized interaction between human agents and robots in shared areas requires both long distance prediction of their motions and an appropriate control policy for short distance reaction. In this connection, recognition of mutual intentions in the prediction phase is crucial to improve the performance of short distance control. We suggest an approach for short distance control in which the expected human movements relative to the robot are summarized in a so-called “compass dial” from which fuzzy control rules for the robot's reactions are derived. To predict possible collisions between robot and human at the earliest possible time, the travel times to predicted human-robot intersections are calculated and fed into a hybrid controller for collision avoidance. By applying the method of velocity obstacles, the relation between a change in the robot's motion direction and its velocity during an interaction is optimized, and a combination with fuzzy expert rules is used for safe obstacle avoidance. For a prediction of human intentions to move to certain goals, pedestrian tracks are modeled by fuzzy clustering, and trajectories of human and robot agents are extrapolated to avoid collisions at intersections. Examples with both simulated and real data show the applicability of the presented methods and the high performance of the results.
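The collision-avoidance step described above relies on travel times to predicted human-robot intersections under extrapolated motion. A minimal sketch of that computation under a constant-velocity assumption follows; it is an illustration consistent with the abstract, not the chapter's actual code:

```python
import numpy as np

def time_to_closest_approach(p_r, v_r, p_h, v_h):
    """Time at which a robot (p_r, v_r) and a human (p_h, v_h), both
    extrapolated with constant velocity, are closest; clipped to the
    future. Returns (time, distance at that time)."""
    dp = np.asarray(p_h, float) - np.asarray(p_r, float)
    dv = np.asarray(v_h, float) - np.asarray(v_r, float)
    denom = dv @ dv
    if denom < 1e-9:                      # parallel or equal velocities
        return 0.0, float(np.linalg.norm(dp))
    t = max(0.0, -(dp @ dv) / denom)      # closest approach, not in the past
    d = float(np.linalg.norm(dp + t * dv))
    return t, d

t, d = time_to_closest_approach([0, 0], [1.0, 0], [5.0, 4.0], [0, -1.0])
print(f"closest approach in {t:.2f} s at {d:.2f} m")
# A hybrid controller could switch to an avoidance rule set when d falls
# below a safety radius and t lies within the planning horizon.
```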

Place, publisher, year, edition, pages
Switzerland: Springer Publishing Company, 2019
Series
Studies in Computational Intelligence, ISSN 1860-949X, E-ISSN 1860-9503 ; 792
Keywords
Fuzzy control, Fuzzy modeling, Prediction, Human-robot interaction, Human intentions, Obstacle avoidance, Velocity obstacles
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-79743 (URN); 10.1007/978-3-319-99283-9 (DOI); 978-3-319-99282-2 (ISBN); 978-3-319-99283-9 (ISBN)
Funder
Knowledge Foundation, 20140220
Available from: 2020-02-03. Created: 2020-02-03. Last updated: 2020-02-05. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Schindler, M. & Lilienthal, A. J. (2019). Implicit intention transference using eye-tracking glasses for improved safety in human-robot interaction. Paper presented at the International Conference on Social Robotics, Quality of Interaction in Socially Assistive Robots Workshop, Madrid, Spain, November 26-29, 2019.
Implicit intention transference using eye-tracking glasses for improved safety in human-robot interaction
2019 (English). Conference paper, Published paper (Refereed).
Abstract [en]

Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. We propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, an implicit intention transference system was developed and implemented. The robot was given access to human eye gaze data and responded to it in real time through spatial augmented reality projections on the shared floor space; the robot could also adapt its path. This allows proactive safety approaches in HRI, for example by attempting to get the human's attention when they are in the vicinity of a moving robot. A study was conducted with workers at an industrial warehouse. The time taken to understand the behavior of the system was recorded. Electrodermal activity and pupil diameter were recorded to measure the increase in stress and cognitive load while interacting with an autonomous system, using these measurements as a proxy to quantify trust in autonomous systems.
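As a rough illustration of the proactive-safety idea described above, the sketch below implements one hypothetical decision step: warn and slow down when a nearby person has not gazed at the robot. All thresholds, geometry, and the action labels are assumptions made for illustration, not the system described in the paper:

```python
import math

SAFETY_RADIUS = 3.0   # metres; illustrative threshold, not from the paper
GAZE_CONE_DEG = 15.0  # gaze within this bearing of the robot counts as "looking"

def gaze_on_robot(person_pos, gaze_dir_deg, robot_pos, cone=GAZE_CONE_DEG):
    """True if the person's gaze direction points at the robot."""
    bearing = math.degrees(math.atan2(robot_pos[1] - person_pos[1],
                                      robot_pos[0] - person_pos[0]))
    diff = (gaze_dir_deg - bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(diff) < cone

def control_action(person_pos, gaze_dir_deg, robot_pos):
    """One step of a proactive-safety policy: project an attention cue and
    slow down when a nearby person has not looked at the robot."""
    dist = math.dist(person_pos, robot_pos)
    if dist < SAFETY_RADIUS and not gaze_on_robot(person_pos, gaze_dir_deg, robot_pos):
        return "project_attention_cue_and_slow_down"
    return "proceed"

# Person 2 m away, gazing away from the robot -> warn and slow down.
print(control_action((2.0, 0.0), 90.0, (0.0, 0.0)))
```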

Keywords
Human-robot interaction, intention communication, eye tracking, spatial augmented reality, electrodermal activity, stress, cognitive load
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-79736 (URN)
Conference
International Conference on Social Robotics - Quality of Interaction in Socially Assistive Robots Workshop, Madrid, Spain, November 26th-29th, 2019
Projects
ILIAD
Available from: 2020-02-03. Created: 2020-02-03. Last updated: 2020-04-24. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R. & Lilienthal, A. (2018). Accessing your navigation plans! Human-Robot Intention Transfer using Eye-Tracking Glasses. In: Case K. & Thorvald P. (Ed.), Advances in Manufacturing Technology XXXII: Proceedings of the 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, September 11–13, 2018, University of Skövde, Sweden. Paper presented at 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, University of Skövde, Sweden, September 11–13, 2018 (pp. 253-258). Amsterdam, Netherlands: IOS Press
Accessing your navigation plans! Human-Robot Intention Transfer using Eye-Tracking Glasses
2018 (English). In: Advances in Manufacturing Technology XXXII: Proceedings of the 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, September 11–13, 2018, University of Skövde, Sweden / [ed] Case K. & Thorvald P., Amsterdam, Netherlands: IOS Press, 2018, p. 253-258. Conference paper, Published paper (Refereed).
Abstract [en]

Robots in human co-habited environments need human-aware task and motion planning, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. This paper investigates the possibility of human-to-robot implicit intention transference solely from eye gaze data. We present experiments in which humans wearing eye-tracking glasses encountered a small forklift truck under various conditions. We evaluate how the observed eye gaze patterns of the participants related to their navigation decisions. Our analysis shows that people primarily gazed on the side of the robot they ultimately decided to pass by. We discuss implications of these results and relate them to a control approach that uses human eye gaze for early obstacle avoidance.

Place, publisher, year, edition, pages
Amsterdam, Netherlands: IOS Press, 2018
Series
Advances in Transdisciplinary Engineering, ISSN 2352-751X, E-ISSN 2352-7528 ; 8
Keywords
Human-Robot Interaction (HRI), Eye-tracking, Eye-Tracking Glasses, Navigation Intent, Implicit Intention Transference, Obstacle avoidance.
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-70706 (URN); 10.3233/978-1-61499-902-7-253 (DOI); 000462212700041 (ISI); 2-s2.0-85057390000 (Scopus ID); 978-1-61499-901-0 (ISBN); 978-1-61499-902-7 (ISBN)
Conference
16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, University of Skövde, Sweden, September 11–13, 2018
Projects
Action and Intention Recognition (AIR); ILIAD
Available from: 2018-12-12. Created: 2018-12-12. Last updated: 2019-04-04. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Krug, R. & Lilienthal, A. (2016). Empirical evaluation of human trust in an expressive mobile robot. In: Proceedings of RSS Workshop "Social Trust in Autonomous Robots 2016". Paper presented at RSS Workshop "Social Trust in Autonomous Robots 2016", June 19, 2016.
Empirical evaluation of human trust in an expressive mobile robot
2016 (English). In: Proceedings of RSS Workshop "Social Trust in Autonomous Robots 2016", 2016. Conference paper, Published paper (Refereed).
Abstract [en]

A mobile robot communicating its intentions using Spatial Augmented Reality (SAR) on the shared floor space makes humans feel safer and more comfortable around the robot. Our previous work [1] and several other works established this fact. We built upon that work by adding adaptable information and control to the SAR module. We conducted an empirical study of how a mobile robot builds trust in humans by communicating its intentions, and we present a novel way of evaluating that trust. We show experimentally that adaptation in the SAR module leads to more natural interaction, and the new evaluation system helped us discover that comfort levels in human-robot interactions approached those of human-human interactions.

Keywords
Human robot interaction, hri, mobile robot, trust, evaluation
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-55259 (URN)
Conference
RSS Workshop "Social Trust in Autonomous Robots 2016", June 19, 2016
Available from: 2017-02-02. Created: 2017-02-02. Last updated: 2018-03-14. Bibliographically approved.
Palm, R., Chadalavada, R. & Lilienthal, A. (2016). Fuzzy Modeling and Control for Intention Recognition in Human-Robot Systems. In: Proceedings of the 8th International Joint Conference on Computational Intelligence (IJCCI 2016). Paper presented at 8th International Conference on Computational Intelligence IJCCI 2016, FCTA, Porto, Portugal, November 9-11, 2016 (pp. 67-74). Setúbal, Portugal: SciTePress, Vol. 2
Fuzzy Modeling and Control for Intention Recognition in Human-Robot Systems
2016 (English). In: Proceedings of the 8th International Joint Conference on Computational Intelligence (IJCCI 2016), Setúbal, Portugal: SciTePress, 2016, Vol. 2, p. 67-74. Conference paper, Published paper (Refereed).
Abstract [en]

The recognition of human intentions from trajectories in the framework of human-robot interaction is a challenging field of research. In this paper, some control problems of human-robot interaction in shared work spaces, where agents may compete or cooperate, are addressed, and the time schedule of the information flow is discussed. The expected human movements relative to the robot are summarized in a so-called "compass dial" from which fuzzy control rules for the robot's reactions are derived. To avoid collisions between robot and human as early as possible, the computation of collision times at predicted human-robot intersections is discussed and a switching controller for collision avoidance is proposed. In the context of recognizing human intentions to move to certain goals, pedestrian tracks are modeled by fuzzy clustering, lanes preferred by human agents are identified, and the identification of degrees of membership of a pedestrian track to specific lanes is discussed. Computations based on simulated and experimental data show the applicability of the methods presented.
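The lane-membership step lends itself to a short numeric illustration. Below is a minimal sketch of the standard fuzzy c-means membership formula applied to lane prototypes; it illustrates the kind of degrees-of-membership computation the abstract mentions, not necessarily the authors' exact formulation:

```python
import numpy as np

def fcm_memberships(point, prototypes, m=2.0):
    """Fuzzy c-means membership degrees of one observation with respect
    to a set of cluster prototypes (standard FCM formula, fuzzifier m > 1):
    u_i = 1 / sum_j (d_i / d_j)^(2 / (m - 1))."""
    d = np.linalg.norm(np.asarray(prototypes, float) - np.asarray(point, float), axis=1)
    d = np.maximum(d, 1e-12)  # guard against an exact hit on a prototype
    ratios = (d[:, None] / d[None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratios.sum(axis=1)

# Two lanes represented by prototype points; a pedestrian position closer
# to lane 0 receives the higher membership degree (approx. [0.9, 0.1]).
lanes = [(0.0, 0.0), (4.0, 0.0)]
print(fcm_memberships((1.0, 0.0), lanes))
```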

Place, publisher, year, edition, pages
Setúbal, Portugal: SciTePress, 2016
Keywords
Fuzzy control, Fuzzy modeling, Human-Robot interaction, human intentions
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-53710 (URN); 10.5220/0006015400670074 (DOI); 000393153800005 (ISI); 2-s2.0-85006466619 (Scopus ID); 978-989-758-201-1 (ISBN)
Conference
8th International Conference on Computational Intelligence IJCCI 2016, FCTA, Porto, Portugal, November 9-11, 2016
Projects
Action and Intention Recognition in Human Interaction with Autonomous Systems
Note
Funding Agency:
AIR-project, Action and Intention Recognition in Human Interaction with Autonomous Systems
Available from: 2016-12-01. Created: 2016-12-01. Last updated: 2018-01-13. Bibliographically approved.