oru.se | Örebro University Publications
The Influence of Feedback Type in Robot-Assisted Training
Örebro University, School of Science and Technology. ORCID iD: 0000-0001-6168-0706
School of Innovation, Design and Engineering, Mälardalen University, Västerås, Sweden.
Örebro University, School of Science and Technology. ORCID iD: 0000-0002-3122-693X
2019 (English). In: Multimodal Technologies and Interaction, E-ISSN 2414-4088, Vol. 3, no. 4. Article in journal (Refereed). Published.
Abstract [en]

Robot-assisted training, in which social robots can be used as motivational coaches, provides an interesting application area. This paper examines how feedback given by a robot agent influences various facets of the participant experience in robot-assisted training. Specifically, we investigated the effects of feedback type on robot acceptance, sense of safety and security, attitude towards robots, and task performance. In the experiment, 23 older participants performed basic arm exercises guided by a social robot and received feedback under different conditions: flattering, positive, and negative feedback. Our results suggest that the robot with flattering and positive feedback was appreciated by older people in general, even if the feedback did not necessarily correspond to objective measures such as performance. Participants in these groups felt better about the interaction and the robot.
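
A minimal analysis sketch of the kind of between-group comparison the abstract describes: ratings of the robot compared across the flattering, positive, and negative feedback conditions. The condition names follow the abstract, but the rating values, the assumed 1-5 scale, and the choice of a one-way ANOVA are illustrative assumptions, not data or methods from the paper.

# Hedged sketch: comparing acceptance ratings across feedback conditions.
# The numbers below are invented placeholders on an assumed 1-5 scale;
# they are not the study's data.
from scipy import stats

ratings_by_condition = {
    "flattering": [4.5, 4.2, 4.8, 4.6, 4.1, 4.7, 4.4, 4.3],
    "positive":   [4.3, 4.0, 4.6, 4.2, 4.4, 4.1, 4.5],
    "negative":   [3.2, 3.6, 2.9, 3.4, 3.1, 3.5, 3.0],
}

# One-way ANOVA: does mean acceptance differ between the three groups?
f_stat, p_value = stats.f_oneway(*ratings_by_condition.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")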

Place, publisher, year, edition, pages
Multidisciplinary Digital Publishing Institute, 2019. Vol. 3, no. 4
Keywords [en]
feedback, acceptance, flattering robot, sense of safety and security, robot-assisted training
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:oru:diva-78492
DOI: 10.3390/mti3040067
ISI: 000623570700003
Scopus ID: 2-s2.0-85079720466
OAI: oai:DiVA.org:oru-78492
DiVA id: diva2:1376112
Funder
EU, Horizon 2020, 721619
Available from: 2019-12-08. Created: 2019-12-08. Last updated: 2024-01-16. Bibliographically approved.
In thesis
1. Perceived Safety in Social Human-Robot Interaction
2022 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

This compilation thesis contributes to a deeper understanding of perceived safety in human-robot interaction (HRI) with a particular focus on social robots. The current understanding of safety in HRI is mostly limited to physical safety, whereas perceived safety has often been neglected and underestimated. However, safe HRI requires a conceptualization of safety that goes beyond physical safety covering also perceived safety of the users. Within this context, this thesis provides a comprehensive analysis of perceived safety in HRI with social robots, considering a diverse set of human-related and robot-related factors.

Two particular challenges for providing perceived safety in HRI are 1) understanding and evaluating human safety perception through direct and indirect measures, and 2) utilizing the measured level of perceived safety to adapt the robot's behaviors. The primary contribution of this dissertation lies in addressing the first challenge. The thesis investigates perceived safety in HRI by alternating between conducting user studies, reviewing the literature, and testing the findings from the literature in user studies.

In this thesis, six main factors influencing perceived safety in HRI are highlighted: the context of robot use, the user's comfort, experience and familiarity with robots, trust, sense of control over the interaction, and transparent and predictable robot behaviors. These factors could provide a common understanding of perceived safety and bridge the theoretical gap in the literature. Moreover, this thesis proposes an experimental paradigm to observe and quantify perceived safety using objective and subjective measures. This contributes to bridging the methodological gap in the literature.

The six factors are reviewed in HRI literature, and the robot features that affect these factors are organized in a taxonomy. Although this taxonomy focuses on social robots, the identified characteristics are relevant to other types of robots and autonomous systems. In addition to the taxonomy, the thesis provides a set of guidelines for providing perceived safety in social HRI. As a secondary contribution, the thesis presents an overview of reinforcement learning applications in social robotics as a suitable learning mechanism for adapting the robots’ behaviors to mitigate psychological harm.
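
The abstract's secondary contribution points to reinforcement learning as a mechanism for adapting robot behavior to mitigate psychological harm. The sketch below is a generic tabular Q-learning loop over hypothetical robot behaviors with a made-up perceived-safety reward; the action names, the single discrete state, and the reward function are assumptions for illustration, not the thesis's implementation.

# Hedged sketch: Q-learning over robot behaviors, rewarded by perceived safety.
import random
from collections import defaultdict

ACTIONS = ["slow_approach", "keep_distance", "verbal_reassurance"]  # hypothetical behaviors
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount factor, exploration rate

q_table = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

def perceived_safety_reward(state, action):
    # Placeholder for a signal derived from subjective ratings and/or
    # objective measures (e.g. physiological or behavioral cues).
    return random.random()

def choose_action(state):
    # Epsilon-greedy selection over the current Q-values.
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(q_table[state], key=q_table[state].get)

state = "user_unfamiliar_with_robot"  # hypothetical discrete user state
for _ in range(100):
    action = choose_action(state)
    reward = perceived_safety_reward(state, action)
    next_state = state  # single-state toy loop; a real system would track state changes
    best_next = max(q_table[next_state].values())
    q_table[state][action] += ALPHA * (reward + GAMMA * best_next - q_table[state][action])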

Place, publisher, year, edition, pages
Örebro: Örebro University, 2022. p. 77
Series
Örebro Studies in Technology, ISSN 1650-8580; 94
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:oru:diva-98102
ISBN: 9789175294322
Public defence
2022-04-28, Örebro universitet, Långhuset, Hörsal L2, Fakultetsgatan 1, Örebro, 13:15 (English)
Available from: 2022-03-17. Created: 2022-03-17. Last updated: 2022-05-04. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Akalin, Neziha; Loutfi, Amy

