Virtual sensors for human concepts: building detection by an outdoor mobile robot
Örebro University, Department of Technology. (Center for Applied Autonomous Sensor Systems)
Department of Computing and Informatics, University of Lincoln, Lincoln, UK.
Örebro University, Department of Technology. ORCID iD: 0000-0003-0217-9326
2006 (English). In: Proceedings of the IROS 2006 Workshop: From Sensors to Human Spatial Concepts, IEEE, 2006, p. 21-26. Conference paper, Published paper (Refereed).
Abstract [en]

In human–robot communication it is often important to relate robot sensor readings to concepts used by humans. We suggest the use of a virtual sensor (one or several physical sensors with a dedicated signal processing unit for the recognition of real world concepts) and a method with which the virtual sensor can learn from a set of generic features. The virtual sensor robustly establishes the link between sensor data and a particular human concept. In this work, we present a virtual sensor for building detection that uses vision and machine learning to classify the image content in a particular direction as representing buildings or non-buildings. The virtual sensor is trained on a diverse set of image data, using features extracted from grey level images. The features are based on edge orientation, the configurations of these edges, and on grey level clustering. To combine these features, the AdaBoost algorithm is applied. Our experiments with an outdoor mobile robot show that the method is able to separate buildings from nature with a high classification rate, and to extrapolate well to images collected under different conditions. Finally, the virtual sensor is applied on the mobile robot, combining its classifications of sub-images from a panoramic view with spatial information (in the form of location and orientation of the robot) in order to communicate the likely locations of buildings to a remote human operator.
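The paper does not publish its implementation; as a rough illustration of the boosting step described above, here is a minimal hand-rolled AdaBoost over decision stumps. The toy feature matrix is an invented stand-in for the edge-orientation, edge-configuration, and grey-level-clustering features the abstract mentions, with labels +1 for "building" and -1 for "non-building".

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Train AdaBoost with one-feature decision stumps.
    X: (n_samples, n_features) feature matrix; y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # per-sample weights, start uniform
    stumps = []                        # learned (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustively pick the stump with lowest weighted error
        for j in range(d):
            for t in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, t, pol)
        err = max(best_err, 1e-10)     # avoid division by zero / log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        j, t, pol = best
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred) # up-weight misclassified samples
        w /= w.sum()
        stumps.append((j, t, pol, alpha))
    return stumps

def predict(stumps, X):
    """Weighted vote of all stumps; sign of the sum is the class."""
    score = np.zeros(len(X))
    for j, t, pol, alpha in stumps:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)

# Invented toy features: [vertical-edge fraction, long-line count, grey clusters].
X = np.array([[0.8, 12, 3],
              [0.7,  9, 4],
              [0.2,  1, 9],
              [0.1,  2, 8]], dtype=float)
y = np.array([1, 1, -1, -1])           # +1 = building, -1 = non-building

stumps = train_adaboost(X, y, n_rounds=5)
```

The paper uses many such weak feature-based classifiers; the sketch shows only the generic reweighting-and-voting mechanism, not the actual image features.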

Place, publisher, year, edition, pages
IEEE, 2006. p. 21-26
Keywords [en]
Human–robot communication, Human concepts, Virtual sensor, Automatic building detection, AdaBoost
National Category
Engineering and Technology; Computer and Information Sciences
Research subject
Computer and Systems Science
Identifiers
URN: urn:nbn:se:oru:diva-3958
OAI: oai:DiVA.org:oru-3958
DiVA id: diva2:138257
Conference
IROS Workshop: From Sensors to Human Spatial Concepts, Beijing, China, October 10, 2006
Available from: 2007-08-27. Created: 2007-08-27. Last updated: 2018-06-11. Bibliographically approved.

Open Access in DiVA

Virtual Sensors for Human Concepts: Building Detection by an Outdoor Mobile Robot (429 kB)
File name: FULLTEXT01.pdf
File size: 429 kB
Checksum (SHA-512): da0c63449b5e93bcfc42adb8d783bb512d77a173ca9f5f866a69852f1bb73405f271e80bd9bd698bf38e2a83e07db11956049ed0b0f1d636a014f4cb37eaa78f
Type: fulltext. Mimetype: application/pdf

Authority records

Persson, Martin; Lilienthal, Achim J.
