Örebro University Publications (oru.se)
Human Gaze and Head Rotation during Navigation, Exploration and Object Manipulation in Shared Environments with Robots
Örebro University, School of Science and Technology; TU Munich, Germany. Centre for Applied Autonomous Sensor Systems (AASS). ORCID iD: 0000-0002-9387-2312
Robert Bosch GmbH, Corporate Research, Stuttgart, Germany.
Örebro University, School of Science and Technology. Centre for Applied Autonomous Sensor Systems (AASS). ORCID iD: 0000-0001-8658-2985
Örebro University, School of Science and Technology. Centre for Applied Autonomous Sensor Systems (AASS). ORCID iD: 0000-0003-0217-9326
2024 (English). In: 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (ROMAN), IEEE Computer Society, 2024, p. 1258-1265. Conference paper, Published paper (Refereed)
Abstract [en]

The human gaze is an important cue to signal intention, attention, distraction, and the regions of interest in the immediate surroundings. Gaze tracking can transform how robots perceive, understand, and react to people, enabling new modes of robot control, interaction, and collaboration. In this paper, we use gaze tracking data from a rich dataset of human motion (THÖR-MAGNI) to investigate the coordination between gaze direction and head rotation of humans engaged in various indoor activities involving navigation, interaction with objects, and collaboration with a mobile robot. In particular, we study the spread and central bias of fixations in diverse activities and examine the correlation between gaze direction and head rotation. We introduce various human motion metrics to enhance the understanding of gaze behavior in dynamic interactions. Finally, we apply semantic object labeling to decompose the gaze distribution into activity-relevant regions.

Place, publisher, year, edition, pages
IEEE Computer Society, 2024. p. 1258-1265
Keywords [en]
Adversarial machine learning, Behavioral research, Human engineering, Human robot interaction, Industrial robots, Machine Perception, Microrobots, Motion tracking, Gaze direction, Gaze-tracking, Head rotation, Human motions, Indoor activities, Object manipulation, Region-of-interest, Regions of interest, Robots control, Tracking data, Mobile robots
National Category
Human Computer Interaction
Identifiers
URN: urn:nbn:se:oru:diva-118537
DOI: 10.1109/RO-MAN60168.2024.10731190
ISI: 001348918600163
Scopus ID: 2-s2.0-85206976290
ISBN: 9798350375022 (print)
OAI: oai:DiVA.org:oru-118537
DiVA, id: diva2:1927723
Conference
2024 33rd IEEE International Conference on Robot and Human Interactive Communication (ROMAN), Pasadena, CA, USA, 26-30 Aug. 2024
Funder
EU, Horizon 2020, 101017274
Available from: 2025-01-15 Created: 2025-01-15 Last updated: 2025-01-20 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Schreiter, Tim; Magnusson, Martin; Lilienthal, Achim
