Örebro University Publications (oru.se)
A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems, AASS). ORCID iD: 0000-0002-9686-9127
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems, AASS). ORCID iD: 0000-0002-0305-3728
School of Innovation, Design and Engineering, Mälardalen University, Västerås, Sweden.
Örebro University, School of Medical Sciences. ORCID iD: 0000-0002-7173-5579
Show others and affiliations
2019 (English). In: Sensors, E-ISSN 1424-8220, Vol. 19, no 14, article id E3142. Article in journal (Refereed). Published.
Abstract [en]

Estimating distances between people and robots plays a crucial role in understanding social Human-Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and collaborate with people as part of human-robot teams. Different sensors can be employed to estimate the distance between a person and a robot, and the number of challenges the estimation method must address rises as the sensor technology becomes simpler. When estimating distances from individual images taken by a single camera in an egocentric position, it is often required that individuals in the scene face the camera, do not occlude each other, and are visible enough that specific facial or body features can be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method builds on previously proven 2D pose estimation, which tolerates partial occlusions, cluttered backgrounds, and relatively low resolution. It estimates the distance with respect to the camera from the Euclidean distance between the ear and torso of each person in the image plane. The ear and torso characteristic points were selected for their relatively high visibility regardless of a person's orientation, and for a certain degree of uniformity with regard to age and gender. Experimental validation demonstrates the effectiveness of the proposed method.
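The geometric cue described in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual model: the keypoints would come from a 2D pose estimator (e.g. OpenPose), and the calibration constant `K` and the simple inverse mapping `d = K / pixel_distance` are assumptions standing in for whatever fit the authors actually use.

```python
import math

# Hypothetical calibration constant (pixels * metres). The paper fits its
# own mapping from pixel distance to metric distance; this value is only
# for illustration.
K = 900.0

def ear_torso_pixel_distance(ear, torso):
    """Euclidean distance in the image plane between ear and torso keypoints.

    Keypoints are (x, y) pixel coordinates, as produced by a 2D pose
    estimator.
    """
    return math.hypot(ear[0] - torso[0], ear[1] - torso[1])

def estimate_distance(ear, torso):
    """Map the ear-torso pixel span to an approximate metric distance.

    Assumes the inverse model d = K / pixel_distance: the farther a person
    is from the camera, the smaller their ear-torso span in pixels.
    """
    px = ear_torso_pixel_distance(ear, torso)
    if px == 0:
        raise ValueError("ear and torso keypoints coincide")
    return K / px

# Example: a person whose ear-torso span is 300 px would be estimated at
# 3.0 m under this (hypothetical) calibration.
d = estimate_distance(ear=(320, 120), torso=(320, 420))
```

Because the cue is a single scalar per person, it degrades gracefully: as long as the pose estimator recovers one ear and the torso keypoint, partial occlusion of the rest of the body does not prevent an estimate.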

Place, publisher, year, edition, pages
MDPI, 2019. Vol. 19, no 14, article id E3142
Keywords [en]
Human–Robot Interaction, distance estimation, single RGB image, social interaction
National Category
Computer graphics and computer vision
Identifiers
URN: urn:nbn:se:oru:diva-75583
DOI: 10.3390/s19143142
ISI: 000479160300109
PubMedID: 31319523
Scopus ID: 2-s2.0-85070083052
OAI: oai:DiVA.org:oru-75583
DiVA, id: diva2:1343490
Note

Funding Agency:

Örebro University

Available from: 2019-08-16. Created: 2019-08-16. Last updated: 2025-05-12. Bibliographically approved.
In thesis
1. AGIR: A Framework for Mobile Robots to Join Social Group Interactions
2025 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Social group interactions are a fundamental aspect of human communication and collaboration, characterized by dynamic spatial and orientational patterns. As robots become more prevalent in human environments, they are expected to adhere to social norms, including the ability to join these interactions seamlessly without causing disruptions.

Motivated by the need to explore how these norms extend to robots, we investigated the behavior of teleoperators during group interactions. The findings not only demonstrated that teleoperators inherently follow these social norms but also highlighted their preference for robots with autonomous capabilities that can join group interactions seamlessly while adhering to these norms. In this regard, this thesis presents a new and comprehensive framework named "Autonomous Group Interactions for Robots (AGIR)", designed to enable mobile robots to join ongoing social group interactions autonomously through an egocentric camera perspective.

The AGIR framework is built upon principles from social psychology, such as Proxemics and F-formations, to ensure socially acceptable behavior. Its architecture comprises computational models for extracting spatial and orientational information, detecting groups, estimating spatial formations, and identifying optimal positions for the robot in group interactions. Designed to operate in real-time using the robot's onboard sensors, the framework is modular and adaptable to a diverse range of robotic platforms.

AGIR was rigorously evaluated through experiments conducted in both simulated and real-world environments. Real-world experiments were performed in corridor, lab, and home environments; in simulation, three scenes resembling conference-lobby and coffee-break scenarios were developed. Results demonstrated high accuracy in spatial and orientational estimations, group detection, F-formation predictions, and determining optimal robot positions within groups. The framework effectively enabled operating in real-time from an egocentric view and autonomously joining group interactions without disruption. AGIR lays the groundwork for robots to seamlessly integrate into human social environments, enabling practical applications in domains such as elder care, telepresence, and collaborative workspaces.

Place, publisher, year, edition, pages
Örebro: Örebro University, 2025. p. 71
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 105
Keywords
Social Group Interactions, F-formations, HRI, Group Detection, Mobile Robots
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-119738
ISBN: 9789175296234
Public defence
2025-04-25, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:15 (English)
Available from: 2025-03-06. Created: 2025-03-06. Last updated: 2025-12-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
PubMed
Scopus

Authority records

Krishna Pathi, Sai
Kiselev, Andrey
Repsilber, Dirk
Loutfi, Amy
