Örebro University Publications (oru.se)
Cielniak, Grzegorz
Publications (9 of 9)
Cielniak, G., Duckett, T. & Lilienthal, A. J. (2007). Improved data association and occlusion handling for vision-based people tracking by mobile robots. In: 2007 IEEE/RSJ international conference on intelligent robots and systems. Paper presented at IEEE/RSJ international conference on intelligent robots and systems, 2007, IROS 2007, San Diego, CA, USA, 29 Oct.-2 Nov., 2007 (pp. 3436-3441). New York, NY, USA: IEEE.
Improved data association and occlusion handling for vision-based people tracking by mobile robots
2007 (English). In: 2007 IEEE/RSJ international conference on intelligent robots and systems, New York, NY, USA: IEEE, 2007, pp. 3436-3441. Conference paper, Published paper (Refereed).
Abstract [en]

This paper presents an approach for tracking multiple persons using a combination of colour and thermal vision sensors on a mobile robot. First, an adaptive colour model is incorporated into the measurement model of the tracker. Second, a new approach for detecting occlusions is introduced, using a machine learning classifier for pairwise comparison of persons (classifying which one is in front of the other). Third, explicit occlusion handling is incorporated into the tracker.
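
A minimal, hypothetical sketch of the pairwise comparison idea (not the paper's implementation): for two tracked persons whose regions overlap, a binary classifier decides which one is in front from simple per-track feature differences. The feature choice, the synthetic training data, and the use of scikit-learn's AdaBoost are all illustrative assumptions.

```python
# Hypothetical sketch of pairwise occlusion reasoning; names and values are illustrative.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def pairwise_features(track_a, track_b):
    """Feature differences between two tracks (illustrative choices)."""
    return np.array([
        track_a["bbox_bottom"] - track_b["bbox_bottom"],    # lower in image ~ closer
        track_a["bbox_height"] - track_b["bbox_height"],    # larger ~ closer
        track_a["mean_thermal"] - track_b["mean_thermal"],  # partially occluded ~ weaker response
    ])

# Train on labelled pairs: X holds feature differences, y = 1 if A occludes B.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)   # synthetic labels
clf = AdaBoostClassifier(n_estimators=50).fit(X_train, y_train)

# At run time, query the classifier for an overlapping pair of tracks.
a = {"bbox_bottom": 220, "bbox_height": 180, "mean_thermal": 0.8}
b = {"bbox_bottom": 190, "bbox_height": 150, "mean_thermal": 0.6}
a_occludes_b = bool(clf.predict(pairwise_features(a, b).reshape(1, -1))[0])
```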

Place, publisher, year, edition, pages
New York, NY, USA: IEEE, 2007
Keywords
Person tracking, robot vision, occlusion handling
National Category
Engineering and Technology; Computer and Information Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-3271 (URN)
10.1109/IROS.2007.4399507 (DOI)
000254073202089 (ISI)
2-s2.0-51349163493 (Scopus ID)
978-1-4244-0912-9 (ISBN)
Conference
IEEE/RSJ international conference on intelligent robots and systems, 2007, IROS 2007, San Diego, CA, USA, 29 Oct.-2 Nov., 2007
Available from: 2008-11-28. Created: 2008-11-28. Last updated: 2018-06-12. Bibliographically approved.
Cielniak, G. (2007). People tracking by mobile robots using thermal and colour vision. (Doctoral dissertation). Örebro: Örebro universitetsbibliotek
People tracking by mobile robots using thermal and colour vision
2007 (English). Doctoral thesis, monograph (Other academic).
Abstract [en]

This thesis addresses the problem of people detection and tracking by mobile robots in indoor environments. A system that can detect and recognise people is an essential part of any mobile robot that is designed to operate in populated environments. Information about the presence and location of persons in the robot’s surroundings is necessary to enable interaction with the human operator, and also for ensuring the safety of people near the robot.

The presented people tracking system uses a combination of thermal and colour information to robustly track persons. The use of a thermal camera simplifies the detection problem, which is especially difficult on a mobile platform. The system is based on a fast and efficient sample-based tracking method that enables tracking of people in real time. The elliptic measurement model is fast to calculate and allows detection and tracking of persons under different views. An explicit model of the human silhouette effectively distinguishes persons from other objects in the scene. Moreover, the process of detection and localisation is performed simultaneously, so that measurements are incorporated directly into the tracking framework without thresholding of observations. With this approach, persons can be detected independently of current light conditions and in situations where other popular detection methods based on skin colour would fail.

A very challenging situation for a tracking system occurs when multiple persons are present in the scene. The tracking system has to estimate the number and position of all persons in the vicinity of the robot. Tracking of multiple persons in the presented system is realised by an efficient algorithm that mitigates the problems of combinatorial explosion common to other known algorithms. A sequential detector initialises an independent tracking filter for each new person appearing in the image. A single filter is automatically deleted when it stops tracking a person. While thermal vision is good for detecting people, it can be very difficult to maintain the correct association between different observations and persons, due to the unpredictable appearance and social behaviour of humans, especially where they occlude one another. To address these problems, the presented tracking system uses additional information from the colour camera. An adaptive colour model is incorporated into the measurement model of the tracker to improve data association. For this purpose an efficient integral image based method is used to maintain the real-time performance of the tracker. To deal with occlusions, the system uses an explicit method that first detects situations where people occlude each other. This is realised by a new approach based on a machine learning classifier for pairwise comparison of persons that uses both thermal and colour features provided by the tracker. This information is then incorporated into the tracker for occlusion handling and to resolve situations where persons reappear in the scene.

Finally, the thesis presents a comprehensive, quantitative evaluation of the whole system and its different components using a set of well-defined performance measures. The behaviour of the system was investigated on different data sets, including different real office environments and different appearances and behaviours of persons. Moreover, the influence of all important system parameters on the performance of the system was examined and their values optimised based on these results.
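
As a minimal, hypothetical illustration of the integral-image technique mentioned above (not code from the thesis), the sketch below pre-computes a cumulative sum over a frame so that any rectangular colour-region statistic needed by an adaptive colour model can be read out in constant time. All function names and values are illustrative.

```python
# Hypothetical integral-image sketch: one cumulative sum, then O(1) region queries.
import numpy as np

def integral_image(img):
    """Cumulative sum with a zero border so region queries need no edge cases."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1) + img.shape[2:])
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def region_sum(ii, top, left, bottom, right):
    """Sum of pixel values in [top:bottom, left:right), O(1) per query."""
    return ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left]

frame = np.random.rand(240, 320, 3)                        # stand-in colour frame
ii = integral_image(frame)
box = (50, 60, 170, 120)                                   # person bounding box (illustrative)
mean_colour = region_sum(ii, *box) / ((box[2] - box[0]) * (box[3] - box[1]))
```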

Place, publisher, year, edition, pages
Örebro: Örebro universitetsbibliotek, 2007. p. 124
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 26
National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-1111 (URN)
978-91-7668-536-5 (ISBN)
Public defence
2007-04-27, Hörsal T, Örebro universitet, Fakultetsgatan 1, Örebro, 13:00
Available from: 2007-04-05. Created: 2007-04-05. Last updated: 2018-01-13. Bibliographically approved.
Treptow, A., Cielniak, G. & Duckett, T. (2006). Real-time people tracking for mobile robots using thermal vision. Robotics and Autonomous Systems, 54(9), 729-739
Real-time people tracking for mobile robots using thermal vision
2006 (English). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 54, no. 9, pp. 729-739. Article in journal (Refereed), Published.
Abstract [en]

This paper presents a vision-based approach for tracking people on a mobile robot using thermal images. The approach combines a particle filter with two alternative measurement models that are suitable for real-time tracking. With this approach a person can be detected independently from current light conditions and in situations where no skin colour is visible. In addition, the paper presents a comprehensive, quantitative evaluation of the different methods on a mobile robot in an office environment, for both single and multiple persons. The results show that the measurement model that was learned from local greyscale features could improve on the performance of the elliptic contour model, and that both models could be combined to further improve performance with minimal extra computational cost.
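
A minimal, hypothetical sketch of a particle-filter tracking loop of the kind described above (not the paper's implementation). The real measurement models score a hypothesised elliptic contour or learned greyscale features against the thermal image; here that is replaced by a toy likelihood around a fixed image position. All names and parameter values are illustrative.

```python
# Hypothetical particle-filter loop: predict, weight by measurement, resample.
import numpy as np

rng = np.random.default_rng(1)

def predict(particles, motion_std=5.0):
    """Random-walk motion model over (x, y) image positions."""
    return particles + rng.normal(scale=motion_std, size=particles.shape)

def update(weights, likelihoods):
    """Re-weight particles by the measurement likelihood and renormalise."""
    weights = weights * likelihoods
    return weights / weights.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Toy run: the "measurement" prefers particles near a fixed image position.
n = 500
particles = rng.uniform(0, 320, size=(n, 2))
weights = np.full(n, 1.0 / n)
target = np.array([160.0, 120.0])
measurement_model = lambda p: np.exp(-np.sum((p - target) ** 2, axis=1) / (2 * 20.0 ** 2))

for _ in range(10):
    particles = predict(particles)
    weights = update(weights, measurement_model(particles))
    particles, weights = resample(particles, weights)

estimate = particles.mean(axis=0)   # converges towards `target`
```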

Keywords
unified tracking, people detection, autonomous robots, quantitative performance evaluation, adaptive boosting
National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-3442 (URN)
10.1016/j.robot.2006.04.013 (DOI)
000240357200003 (ISI)
2-s2.0-33746947332 (Scopus ID)
Note

Selected papers from the 2nd European Conference on Mobile Robots (ECMR ’05)

Available from: 2007-07-19. Created: 2007-07-19. Last updated: 2023-12-08. Bibliographically approved.
Treptow, A., Cielniak, G. & Duckett, T. (2005). Active people recognition using thermal and grey images on a mobile security robot. In: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005). Paper presented at 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005), Edmonton, Alberta, Canada, 2-6 Aug. 2005 (pp. 2103-2108).
Active people recognition using thermal and grey images on a mobile security robot
2005 (English). In: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005), 2005, pp. 2103-2108. Conference paper, Published paper (Refereed).
Abstract [en]

In this paper we present a vision-based approach to detect, track and identify people on a mobile robot in real time. While most vision systems for tracking people on mobile robots use skin color information, we present an approach using thermal images and a fast contour model together with a Particle Filter. With this method a person can be detected independently of current light conditions and in situations where no skin color is visible (the person is not close or does not face the robot). Tracking in thermal images is used as an attention system to get an estimate of the position of a person. Based on this estimate, we use a pan-tilt camera to zoom to the expected face region and apply a fast face tracker in combination with face recognition to identify the person.

National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-3962 (URN)
10.1109/IROS.2005.1545530 (DOI)
000235632103086 (ISI)
Conference
2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2005, (IROS 2005), Edmonton, Alberta, Canada, 2-6 Aug. 2005
Available from: 2007-08-27. Created: 2007-08-27. Last updated: 2022-11-25. Bibliographically approved.
Treptow, A., Cielniak, G. & Duckett, T. (2005). Comparing measurement models for tracking people in thermal images on a mobile robot. Paper presented at the 2nd European Conference on Mobile Robots, ECMR 2005, Ancona, Italy.
Comparing measurement models for tracking people in thermal images on a mobile robot
2005 (English). Conference paper, Oral presentation only (Refereed).
Abstract [en]

While most vision systems for tracking people on mobile robots use skin color information, we present an approach using thermal images and two different measurement models together with a Particle Filter. With this method a person can be detected independently of current light conditions and in situations where no skin color is visible (the person is not close or does not face the robot). The results show that a measurement model that was learned from local greyscale features improved on the performance of an elliptic contour model, and that both models could be used in combination to further improve performance with minimal extra computational cost.

National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-3967 (URN)
Conference
2nd European Conference on Mobile Robots, ECMR 2005, Ancona, Italy
Available from: 2007-08-29. Created: 2007-08-29. Last updated: 2022-08-04. Bibliographically approved.
Bennewitz, M., Burgard, W., Cielniak, G. & Thrun, S. (2005). Learning motion patterns of people for compliant robot motion. The international journal of robotics research, 24(1), 31-48
Learning motion patterns of people for compliant robot motion
2005 (English). In: The international journal of robotics research, ISSN 0278-3649, E-ISSN 1741-3176, Vol. 24, no. 1, pp. 31-48. Article in journal (Refereed), Published.
Abstract [en]

Whenever people move through their environments they do not move randomly. Instead, they usually follow specific trajectories or motion patterns corresponding to their intentions. Knowledge about such patterns enables a mobile robot to robustly keep track of persons in its environment and to improve its behavior. This paper proposes a technique for learning collections of trajectories that characterize typical motion patterns of persons. Data recorded with laser-range finders is clustered using the expectation maximization algorithm. Based on the result of the clustering process we derive a Hidden Markov Model (HMM) that is applied to estimate the current and future positions of persons based on sensory input. We also describe how to incorporate the probabilistic belief about the potential trajectories of persons into the path planning process. We present several experiments carried out in different environments with a mobile robot equipped with a laser range scanner and a camera system. The results demonstrate that our approach can reliably learn motion patterns of persons, can robustly estimate and predict positions of persons, and can be used to improve the navigation behavior of a mobile robot.
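
As a small, hypothetical illustration of the prediction step described above (not the paper's implementation), the sketch below propagates a belief over a handful of discrete position states through a made-up HMM transition matrix to predict where a person may be a few steps ahead. In the paper the model is derived from clustered laser-range-finder trajectories.

```python
# Hypothetical HMM prediction step: the 4-state chain and its probabilities are made up.
import numpy as np

# A[i, j] = P(next state j | current state i)
A = np.array([
    [0.7, 0.3, 0.0, 0.0],
    [0.0, 0.6, 0.4, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.1, 0.0, 0.0, 0.9],
])

belief = np.array([1.0, 0.0, 0.0, 0.0])   # person currently observed in state 0
horizon = 5
for _ in range(horizon):
    belief = belief @ A                    # one prediction step
# `belief` now approximates P(position after `horizon` steps) and could feed
# a path planner that avoids likely future positions of the person.
```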

Keywords
learning activity models, trajectory clustering, machine learning, mobile robot navigation, human robot interaction
National Category
Robotics and automation
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-3508 (URN)
10.1177/0278364904048962 (DOI)
000226344600002 (ISI)
2-s2.0-11444257830 (Scopus ID)
Available from: 2007-07-22. Created: 2007-07-22. Last updated: 2025-02-09. Bibliographically approved.
Cielniak, G., Treptow, A. & Duckett, T. (2005). Quantitative performance evaluation of a people tracking system on a mobile robot. Paper presented at the 2nd European Conference on Mobile Robots, ECMR 2005, Ancona, Italy.
Quantitative performance evaluation of a people tracking system on a mobile robot
2005 (English). Conference paper, Published paper (Refereed).
Abstract [en]

Future service robots will need to keep track of the persons in their environment. A number of people tracking systems have been developed for mobile robots, but it is currently impossible to make objective comparisons of their performance. This paper presents a comprehensive, quantitative evaluation of a state-of-the-art people tracking system for a mobile robot in an office environment, for both single and multiple persons.
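
Purely as an illustrative, hypothetical example of what a quantitative tracking evaluation can compute (the paper's own performance measures are not reproduced here), the sketch below reports a mean position error against ground truth and the fraction of frames in which the error exceeds a track-loss threshold.

```python
# Hypothetical tracking evaluation: names, units and values are illustrative.
import numpy as np

def evaluate_track(estimates, ground_truth, loss_threshold=0.5):
    """Per-frame Euclidean error (e.g. in metres) summarised as two scores."""
    err = np.linalg.norm(estimates - ground_truth, axis=1)
    return {
        "mean_error": float(err.mean()),
        "track_lost_ratio": float((err > loss_threshold).mean()),
    }

est = np.array([[1.0, 0.5], [1.1, 0.7], [2.5, 2.0]])   # tracker output
gt = np.array([[1.0, 0.5], [1.0, 0.6], [1.1, 0.8]])    # ground-truth positions
print(evaluate_track(est, gt))
```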

National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-3965 (URN)
Conference
2nd European Conference on Mobile Robots, ECMR 2005, Ancona, Italy
Available from: 2007-08-29. Created: 2007-08-29. Last updated: 2022-08-03. Bibliographically approved.
Cielniak, G. & Duckett, T. (2004). People recognition by mobile robots. Journal of Intelligent & Fuzzy Systems, 15(1), 21-27
People recognition by mobile robots
2004 (English). In: Journal of Intelligent & Fuzzy Systems, ISSN 1064-1246, E-ISSN 1875-8967, Vol. 15, no. 1, pp. 21-27. Article in journal (Refereed), Published.
Abstract [en]

This paper addresses the problem of detecting and identifying persons with a mobile robot, by sensory fusion of thermal and colour vision information. In the proposed system, people are first detected with a thermal camera, using image analysis techniques to segment the persons in the thermal images. This information is then used to segment the corresponding regions of the colour images, using an affine transformation to solve the image correspondence between the two cameras. After segmentation, the region of the image containing a person is further divided into regions corresponding to the person's head, torso and legs. Temperature and colour features are then extracted from each region for input to a pattern recognition system. Three alternative classification methods were investigated in experiments with a moving mobile robot and moving persons in an office environment. The best identification performance was obtained with a dynamic recognition method based on a Bayes classifier, which takes into account evidence accumulated in a sequence of images.
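
A minimal, hypothetical sketch of the evidence-accumulation idea behind the dynamic recognition method (not the paper's implementation): the identity posterior is updated recursively from per-frame likelihoods, so the decision sharpens as more images arrive. The identities and likelihood values below are made up.

```python
# Hypothetical recursive Bayes update over person identities.
import numpy as np

def bayes_update(prior, frame_likelihood):
    """posterior ∝ prior * P(observed features | identity), renormalised."""
    posterior = prior * frame_likelihood
    return posterior / posterior.sum()

identities = ["person_A", "person_B", "person_C"]
posterior = np.full(3, 1.0 / 3.0)                    # uniform prior over identities
per_frame = [np.array([0.5, 0.3, 0.2]),               # likelihoods from each image
             np.array([0.6, 0.2, 0.2]),
             np.array([0.7, 0.2, 0.1])]
for likelihood in per_frame:
    posterior = bayes_update(posterior, likelihood)
best_match = identities[int(np.argmax(posterior))]    # decision after the sequence
```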

National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-3541 (URN)
Available from: 2007-07-22. Created: 2007-07-22. Last updated: 2022-08-02. Bibliographically approved.
Cielniak, G., Miladinovic, M., Hammarin, D., Göransson, L., Lilienthal, A. J. & Duckett, T. (2003). Appearance-based tracking of persons with an omnidirectional vision sensor. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops. Paper presented at IEEE Workshop on Omnidirectional Vision, OMNIVIS 2003, Madison, Wisconsin, USA, June 21, 2003. IEEE, 7, Article ID 4624346.
Appearance-based tracking of persons with an omnidirectional vision sensor
2003 (English). In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, IEEE, 2003, Vol. 7, article id 4624346. Conference paper, Published paper (Refereed).
Abstract [en]

This paper addresses the problem of tracking a moving person with a single, omnidirectional camera. An appearance-based tracking system is described which uses a self-acquired appearance model and a Kalman filter to estimate the position of the person. Features corresponding to "depth cues" are first extracted from the panoramic images, then an artificial neural network is trained to estimate the distance of the person from the camera. The estimates are combined using a discrete Kalman filter to track the position of the person over time. The ground truth information required for training the neural network and the experimental analysis was obtained from another vision system, which uses multiple webcams and triangulation to calculate the true position of the person. Experimental results show that the tracking system is accurate and reliable, and that its performance can be further improved by learning multiple, person-specific appearance models.
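
A minimal, hypothetical sketch of a discrete Kalman filter of the general kind described above (not the paper's implementation), with a constant-velocity state [x, y, vx, vy] and a position-only measurement, e.g. an image bearing combined with a distance estimate and converted to Cartesian coordinates. All noise levels and values are illustrative.

```python
# Hypothetical discrete Kalman filter with a constant-velocity model.
import numpy as np

dt = 0.1                                                   # frame interval [s]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)                  # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)                  # measure position only
Q = 0.01 * np.eye(4)                                       # process noise (illustrative)
R = 0.25 * np.eye(2)                                       # measurement noise (illustrative)

def kalman_step(x, P, z):
    """One predict/update cycle for a position measurement z = (x, y)."""
    x = F @ x                                              # predict state
    P = F @ P @ F.T + Q                                    # predict covariance
    S = H @ P @ H.T + R                                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                         # Kalman gain
    x = x + K @ (z - H @ x)                                # correct with measurement
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), np.eye(4)
for z in [np.array([1.0, 0.5]), np.array([1.1, 0.6]), np.array([1.2, 0.7])]:
    x, P = kalman_step(x, P, z)                            # x[:2] tracks the person
```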

Place, publisher, year, edition, pages
IEEE, 2003
National Category
Computer Sciences
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-4022 (URN)
10.1109/CVPRW.2003.10072 (DOI)
2-s2.0-84954442356 (Scopus ID)
0769519008 (ISBN)
Conference
IEEE Workshop on Omnidirectional Vision, OMNIVIS 2003, Madison, Wisconsin, USA, June 21, 2003
Available from: 2007-09-05. Created: 2007-09-05. Last updated: 2022-08-02. Bibliographically approved.