A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems (AASS)).
Örebro University, School of Science and Technology (Center for Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0002-0305-3728
School of Innovation, Design and Engineering, Mälardalen University, Västerås, Sweden.
Örebro University, School of Medical Sciences. ORCID iD: 0000-0002-7173-5579
2019 (English) In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 19, no. 14, article id E3142. Article in journal (Refereed) Published
Abstract [en]

Estimating distances between people and robots plays a crucial role in understanding social Human-Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and collaborate with people as part of human-robot teams. Different sensors can be employed for distance estimation between a person and a robot, and the number of challenges the estimation method must address rises with the simplicity of the sensor technology. When estimating distances from individual images of a single camera in an egocentric position, it is often required that individuals in the scene face the camera, do not occlude each other, and are sufficiently visible for specific facial or body features to be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method builds on previously proven 2D pose estimation, which tolerates partial occlusions, cluttered backgrounds, and relatively low resolution. Distance with respect to the camera is estimated from the Euclidean distance between the ear and torso of each person in the image plane. These characteristic points were selected for their relatively high visibility regardless of a person's orientation and their relative uniformity across age and gender. Experimental validation demonstrates the effectiveness of the proposed method.
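The abstract gives no implementation details, but the core idea (a 2D pose estimator supplies ear and torso keypoints, and the pixel-space Euclidean distance between them is mapped to a metric distance) can be illustrated with a short, self-contained Python sketch. The keypoint names, the inverse-proportional calibration model, and all helper functions below are assumptions made for illustration, not the authors' published implementation.

import numpy as np

# Minimal sketch of the idea described in the abstract, NOT the authors' exact
# method: distance to a person is inferred from the pixel-space Euclidean
# distance between an ear keypoint and a torso keypoint produced by any
# off-the-shelf 2D pose estimator. The pixel-to-metre mapping is assumed here
# to be a simple inverse-proportional model fitted on calibration data; the
# paper's actual regression model may differ.

def ear_torso_pixel_distance(keypoints):
    """keypoints: dict of name -> (x, y) pixel coordinates; missing if unseen.

    Uses whichever ear is visible; 'torso' stands for a central torso point
    (e.g. the neck/mid-shoulder keypoint) -- a naming assumption made here.
    """
    ear = keypoints.get("left_ear") or keypoints.get("right_ear")
    torso = keypoints.get("torso")
    if ear is None or torso is None:
        return None  # keypoints occluded or not detected
    return float(np.linalg.norm(np.asarray(ear) - np.asarray(torso)))

def fit_inverse_model(pixel_dists, true_dists_m):
    """Fit d_metric ~= k / d_pixel by least squares on calibration pairs."""
    p = np.asarray(pixel_dists, dtype=float)
    d = np.asarray(true_dists_m, dtype=float)
    # Minimising sum (k/p_i - d_i)^2 gives k = sum(d_i/p_i) / sum(1/p_i^2)
    return np.sum(d / p) / np.sum(1.0 / p**2)

def estimate_distance_m(keypoints, k):
    """Estimate camera-to-person distance in metres for one detected person."""
    p = ear_torso_pixel_distance(keypoints)
    return None if p is None or p <= 0 else k / p

if __name__ == "__main__":
    # Hypothetical calibration pairs: (ear-torso pixel distance, true distance in m)
    k = fit_inverse_model([120, 60, 40, 30], [1.0, 2.0, 3.0, 4.0])
    person = {"left_ear": (310, 140), "torso": (300, 200)}
    print(f"estimated distance: {estimate_distance_m(person, k):.2f} m")

In this toy calibration the pixel distance shrinks as the person moves away, so the fitted constant k directly converts the ear-torso pixel distance into an approximate metric range; any monotone regression model could be substituted for the inverse-proportional one assumed here.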

Place, publisher, year, edition, pages
MDPI, 2019. Vol. 19, no. 14, article id E3142
Keywords [en]
Human–Robot Interaction, distance estimation, single RGB image, social interaction
National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:oru:diva-75583
DOI: 10.3390/s19143142
ISI: 000479160300109
PubMedID: 31319523
Scopus ID: 2-s2.0-85070083052
OAI: oai:DiVA.org:oru-75583
DiVA, id: diva2:1343490
Note

Funding Agency:

Örebro University

Available from: 2019-08-16 Created: 2019-08-16 Last updated: 2019-11-15 Bibliographically approved

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text
PubMed
Scopus

Person records BETA

Krishna, Sai; Kiselev, Andrey; Repsilber, Dirk; Loutfi, Amy
