AGIR: A Framework for Mobile Robots to Join Social Group Interactions
Örebro University, School of Science and Technology. ORCID iD: 0000-0002-9686-9127
2025 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Social group interactions are a fundamental aspect of human communication and collaboration, characterized by dynamic spatial and orientational patterns. As robots become more prevalent in human environments, they are expected to adhere to social norms, including the ability to join these interactions seamlessly without causing disruptions.

Motivated by the need to explore how these norms extend to robots, we investigated the behavior of teleoperators during group interactions. The findings demonstrated not only that teleoperators inherently follow these social norms, but also that they prefer robots with autonomous capabilities that can seamlessly join group interactions while adhering to these norms. To this end, this thesis presents a new and comprehensive framework, named “Autonomous Group Interactions for Robots (AGIR)”, designed to enable mobile robots to join ongoing social group interactions autonomously through an egocentric camera perspective.

The AGIR framework is built upon principles from social psychology, such as Proxemics and F-formations, to ensure socially acceptable behavior. Its architecture comprises computational models for extracting spatial and orientational information, detecting groups, estimating spatial formations, and identifying optimal positions for the robot in group interactions. Designed to operate in real time using the robot’s onboard sensors, the framework is modular and adaptable to a diverse range of robotic platforms.
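The group-detection stage of such a pipeline can be illustrated with a minimal proximity-clustering sketch. Note that the `Person` class, the 2.0 m threshold, and the greedy strategy below are illustrative assumptions, not the thesis's actual models:

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class Person:
    x: float      # position in the ground plane (metres)
    y: float
    theta: float  # body orientation (radians)

def detect_groups(people: List[Person], dist_thresh: float = 2.0) -> List[List[Person]]:
    """Greedy proximity clustering: a person joins the first existing
    group containing someone within dist_thresh, else starts a new group.
    A stand-in for a group-detection stage, not the thesis's model."""
    groups: List[List[Person]] = []
    for p in people:
        for g in groups:
            if any(math.hypot(p.x - q.x, p.y - q.y) < dist_thresh for q in g):
                g.append(p)
                break
        else:
            groups.append([p])
    return groups
```

In a full pipeline, the `Person` instances would come from the perception modules (pose and orientation estimation), and the detected groups would feed the F-formation estimator.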

AGIR was rigorously evaluated through experiments conducted in both simulated and real-world environments. Real-world experiments were performed in corridor, lab, and home environments, while in simulation three scenes were developed, resembling conference lobby and coffee break scenarios. Results demonstrated high accuracy in spatial and orientational estimations, group detection, F-formation predictions, and determining optimal robot positions within groups. The framework effectively enabled the robot to operate in real time from an egocentric view and to join group interactions autonomously without disruption. AGIR lays the groundwork for robots to seamlessly integrate into human social environments, enabling practical applications in domains such as elder care, telepresence, and collaborative workspaces.

Place, publisher, year, edition, pages
Örebro: Örebro University, 2025. p. 71
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 105
Keywords [en]
Social Group Interactions, F-formations, HRI, Group Detection, Mobile Robots
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-119738
ISBN: 9789175296234 (print)
OAI: oai:DiVA.org:oru-119738
DiVA, id: diva2:1942862
Public defence
2025-04-25, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:15 (English)
Available from: 2025-03-06 Created: 2025-03-06 Last updated: 2025-12-11 Bibliographically approved
List of papers
1. F-Formations for Social Interaction in Simulation Using Virtual Agents and Mobile Robotic Telepresence Systems
2019 (English) In: Multimodal Technologies and Interaction, E-ISSN 2414-4088, Vol. 3, no 4, article id 69. Article in journal (Refereed) Published
Abstract [en]

F-formations are a set of possible patterns in which groups of people tend to spatially organize themselves while engaging in social interactions. In this paper, we study the behavior of teleoperators of mobile robotic telepresence systems to determine whether they adhere to spatial formations when navigating to groups. This work uses a simulated environment in which teleoperators are requested to navigate to different groups of virtual agents. The simulated environment represents a conference lobby scenario where multiple groups of virtual agents of varying sizes are placed in different spatial formations. The task requires teleoperators to navigate a robot to join each group using an egocentric-perspective camera. In a second phase, teleoperators are allowed to evaluate their own performance by reviewing how they navigated the robot from an exocentric perspective. The two important outcomes of this study are, firstly, that teleoperators inherently respect F-formations even when operating a mobile robotic telepresence system and, secondly, that teleoperators prefer additional support in order to correctly navigate the robot into a preferred position that adheres to F-formations.

Place, publisher, year, edition, pages
MDPI, 2019
Keywords
telepresence, mobile robotic telepresence, F-formations, simulation, virtual agents, HRI
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-78830 (URN)
10.3390/mti3040069 (DOI)
000623570700005 ()
2-s2.0-85079698695 (Scopus ID)
Note

Funding Agency:

Örebro University

Available from: 2019-12-20 Created: 2019-12-20 Last updated: 2025-09-29 Bibliographically approved
2. A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera
2019 (English) In: Sensors, E-ISSN 1424-8220, Vol. 19, no 14, article id E3142. Article in journal (Refereed) Published
Abstract [en]

Estimating distances between people and robots plays a crucial role in understanding social Human-Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and collaborate with people as part of human-robot teams. Different sensors can be employed for distance estimation between a person and a robot, and the number of challenges the estimation method must address rises with the simplicity of the sensor technology. When estimating distances using individual images from a single camera in an egocentric position, it is often required that individuals in the scene face the camera, do not occlude each other, and are sufficiently visible for specific facial or body features to be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method is based on previously proven 2D pose estimation, which tolerates partial occlusions, cluttered backgrounds, and relatively low resolution. It estimates distance with respect to the camera based on the Euclidean distance between the ear and torso of each person in the image plane. The ear and torso characteristic points were selected for their relatively high visibility regardless of a person's orientation and for a certain degree of uniformity with regard to age and gender. Experimental validation demonstrates the effectiveness of the proposed method.
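As a rough sketch of the ear-torso idea, one might assume a pinhole-style inverse relation between the apparent keypoint separation and the metric distance. The calibration constant `k` and the keypoint coordinates below are hypothetical placeholders, not values or code from the paper:

```python
import math

def pixel_ear_torso_distance(ear, torso):
    """Euclidean distance in the image plane between the ear and torso
    keypoints, each given as an (x, y) pixel tuple."""
    return math.hypot(ear[0] - torso[0], ear[1] - torso[1])

def estimate_distance(ear, torso, k=500.0):
    """Under a pinhole-camera assumption, camera-to-person distance is
    roughly inversely proportional to the apparent size of the ear-torso
    segment; k is a camera-specific calibration constant (hypothetical)."""
    d_px = pixel_ear_torso_distance(ear, torso)
    return k / d_px
```

In practice the keypoints would come from a 2D pose estimator, and `k` would be fitted from images of people at known distances.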

Place, publisher, year, edition, pages
MDPI, 2019
Keywords
Human–Robot Interaction, distance estimation, single RGB image, social interaction
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-75583 (URN)
10.3390/s19143142 (DOI)
000479160300109 ()
31319523 (PubMedID)
2-s2.0-85070083052 (Scopus ID)
Note

Funding Agency:

Örebro University

Available from: 2019-08-16 Created: 2019-08-16 Last updated: 2025-05-12 Bibliographically approved
3. Detecting Groups and Estimating F-Formations for Social Human-Robot Interactions
2022 (English) In: Multimodal Technologies and Interaction, E-ISSN 2414-4088, Vol. 6, no 3, article id 18. Article in journal (Refereed) Published
Abstract [en]

The ability of a robot to detect and join groups of people is of increasing importance in social contexts and for collaboration between teams of humans and robots. In this paper, we propose a framework, Autonomous Group Interactions for Robots (AGIR), that endows a robot with the ability to detect such groups while following the principles of F-formations. Using onboard sensors, this method accounts for a wide spectrum of robot systems, ranging from autonomous service robots to telepresence robots. The presented framework detects individuals, estimates their position and orientation, detects groups, determines their F-formations, and suggests a position for the robot to enter the social group. For evaluation, two simulation scenes were developed based on standard real-world datasets. The first scene contains 20 virtual agents (VAs) interacting in 7 groups of varying sizes and 3 different formations; the second contains 36 VAs positioned in 13 groups of varying sizes and 6 different formations. A model of a Pepper robot is used in both simulated scenes at randomly generated positions. The robot's ability to estimate orientation, detect groups, and estimate F-formations at various locations was used to validate the approaches. The obtained results show high accuracy within each simulated scenario and demonstrate that the framework is able to work from an egocentric view with a robot in real time.
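A toy version of the orientation-based reasoning behind F-formation estimation, for the two-person case only, classifies the formation from the difference between body orientations. The angular thresholds and label set below are simplifying assumptions for illustration, not the paper's estimator:

```python
import math

def f_formation_label(theta_a, theta_b):
    """Rough two-person F-formation label from body orientations alone:
    people facing each other form a vis-a-vis, aligned orientations a
    side-by-side, and roughly perpendicular ones an L-shape.
    Thresholds (pi/4, 3*pi/4) are illustrative assumptions."""
    # Smallest absolute angle between the two orientations, in [0, pi].
    diff = abs((theta_a - theta_b + math.pi) % (2 * math.pi) - math.pi)
    if diff > 3 * math.pi / 4:
        return "vis-a-vis"
    if diff < math.pi / 4:
        return "side-by-side"
    return "L-shape"
```

A real estimator would also use the participants' positions and handle groups larger than two, e.g. circular formations.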

Place, publisher, year, edition, pages
MDPI, 2022
Keywords
human-robot interaction, social robotics, F-formations, group interactions, Kendon formations
National Category
Computer graphics and computer vision
Identifiers
urn:nbn:se:oru:diva-98588 (URN)
10.3390/mti6030018 (DOI)
000776301200001 ()
2-s2.0-85125791499 (Scopus ID)
Note

Funding agency:

Örebro University

Available from: 2022-04-19 Created: 2022-04-19 Last updated: 2025-09-29 Bibliographically approved
4. Estimating Optimal Placement for a Robot in Social Group Interaction
2019 (English) In: IEEE International Workshop on Robot and Human Communication (RO-MAN), IEEE, 2019. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we present a model to propose an optimal placement for a robot in a social group interaction. Our model estimates the O-space according to the F-formation theory. The method automatically calculates a suitable placement for the robot. An evaluation of the method has been performed by conducting an experiment where participants stand in different formations and a robot is teleoperated to join the group. In one condition, the operator positions the robot according to the specified location given by our algorithm. In another condition, operators have the freedom to position the robot according to their personal choice. Follow-up questionnaires were performed to determine which of the placements were preferred by the participants. The results indicate that the proposed method for automatic placement of the robot is supported by the participants. The contribution of this work resides in a novel method to automatically estimate the best placement of the robot, as well as the results from user experiments to verify the quality of this method. These results suggest that teleoperated robots such as mobile robotic telepresence systems could benefit from tools that assist operators in placing the robot in groups in a socially accepted manner.

Place, publisher, year, edition, pages
IEEE, 2019
Keywords
F-formations, Robot Positioning Spot, Mobile Robotic Telepresence, HRI
National Category
Engineering and Technology Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-78832 (URN)
10.1109/RO-MAN46459.2019.8956318 (DOI)
000533896300034 ()
9781728126227 (ISBN)
9781728126234 (ISBN)
Conference
The 28th IEEE International Conference on Robot and Human Interactive Communication – RO-MAN 2019, New Delhi, India, October 14-18, 2019.
Projects
Successful Ageing
Available from: 2019-12-20 Created: 2019-12-20 Last updated: 2025-05-12 Bibliographically approved

Open Access in DiVA

Cover: COVER01.pdf (151 kB)
Fulltext: AGIR: A Framework for Mobile Robots to Join Social Group Interactions, FULLTEXT01.pdf (37623 kB)

Authority records

Krishna Pathi, Sai
