Inferring human body posture information from reflective patterns of protective work garments
Örebro University, School of Science and Technology (Mobile Robotics & Olfaction (MRO) Lab, Center of Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0002-0804-8637
Örebro University, School of Science and Technology (Mobile Robotics & Olfaction (MRO) Lab, Center of Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0002-2953-1564
Örebro University, School of Science and Technology (Mobile Robotics & Olfaction (MRO) Lab, Center of Applied Autonomous Sensor Systems (AASS)). ORCID iD: 0000-0003-0217-9326
2016 (English). In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Institute of Electrical and Electronics Engineers (IEEE), 2016, p. 4131-4136. Conference paper, published paper (refereed).
Abstract [en]

We address the problem of extracting human body posture labels, upper body orientation and the spatial location of individual body parts from near-infrared (NIR) images depicting patterns of retro-reflective markers. The analyzed patterns originate from the observation of humans equipped with protective high-visibility garments, a common type of safety equipment in the industrial sector. Exploiting the shape of the observed reflectors, we adopt shape matching based on the chamfer distance and infer one of seven discrete body posture labels as well as the approximate upper body orientation with respect to the camera. We then proceed to analyze the NIR images on a pixel scale and estimate a figure-ground segmentation together with human body part labels using classification of densely extracted local image patches. Our results indicate a body posture classification accuracy of 80% and a figure-ground segmentation accuracy of 87%.
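To give a concrete picture of the chamfer-distance shape matching mentioned in the abstract, the sketch below is a minimal illustration in Python with OpenCV, not the authors' implementation. It assumes a binary reflector mask extracted from the NIR image and a small library of posture edge templates (one per discrete posture label), and it omits the orientation and scale search described in the paper; all names and thresholds are illustrative.

```python
import cv2
import numpy as np

def chamfer_score(query_mask: np.ndarray, template_edges: np.ndarray) -> float:
    """Mean distance from template edge pixels to the nearest query edge pixel.

    Lower is better; a perfect overlap yields a score of 0.
    """
    # Edge map of the observed reflector pattern (query_mask is uint8, 0/255).
    query_edges = cv2.Canny(query_mask, 50, 150)
    # Distance transform of the edge complement: each pixel holds the
    # distance to the closest query edge pixel.
    dist = cv2.distanceTransform(255 - query_edges, cv2.DIST_L2, 3)
    ys, xs = np.nonzero(template_edges)
    if xs.size == 0:
        return float("inf")
    return float(dist[ys, xs].mean())

def classify_posture(query_mask: np.ndarray, posture_templates: dict) -> str:
    """Return the posture label whose edge template best matches the query.

    posture_templates maps label -> binary edge image, assumed to be already
    aligned and resized to the query resolution (the scale/orientation search
    is omitted in this sketch).
    """
    scores = {label: chamfer_score(query_mask, edges)
              for label, edges in posture_templates.items()}
    return min(scores, key=scores.get)
```

In this formulation the distance transform is computed once per query image, so scoring many posture templates stays cheap; the per-pixel body part labeling described in the abstract would be a separate classification step over densely sampled patches.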

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2016. p. 4131-4136
Keywords [en]
Computer Vision, Human Detection, Reflective Clothing, Image Segmentation, Active Illumination, Infrared Vision
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:oru:diva-54023
DOI: 10.1109/IROS.2016.7759608
ISI: 000391921704024
Scopus ID: 2-s2.0-85006371512
ISBN: 978-1-5090-3762-9 (print)
OAI: oai:DiVA.org:oru-54023
DiVA id: diva2:1057245
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, October 9-14, 2016
Available from: 2016-12-16. Created: 2016-12-16. Last updated: 2018-01-13. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Mosberger, Rafael; Schaffernicht, Erik; Andreasson, Henrik; Lilienthal, Achim J.
