oru.se Publications
Publications (10 of 199)
Fan, H., Hernandez Bennetts, V., Schaffernicht, E. & Lilienthal, A. (2018). A cluster analysis approach based on exploiting density peaks for gas discrimination with electronic noses in open environments. Sensors and actuators. B, Chemical, 259, 183-203
A cluster analysis approach based on exploiting density peaks for gas discrimination with electronic noses in open environments
2018 (English) In: Sensors and actuators. B, Chemical, ISSN 0925-4005, E-ISSN 1873-3077, Vol. 259, p. 183-203. Article in journal (Refereed). Published
Abstract [en]

Gas discrimination in open and uncontrolled environments based on smart low-cost electro-chemical sensor arrays (e-noses) is of great interest in several applications, such as exploration of hazardous areas, environmental monitoring, and industrial surveillance. Gas discrimination for e-noses is usually based on supervised pattern recognition techniques. However, the difficulty and high cost of obtaining extensive and representative labeled training data limits the applicability of supervised learning. Thus, to deal with the lack of information regarding target substances and unknown interferents, unsupervised gas discrimination is an advantageous solution. In this work, we present a cluster-based approach that can infer the number of different chemical compounds, and provide a probabilistic representation of the class labels for the acquired measurements in a given environment. Our approach is validated with the samples collected in indoor and outdoor environments using a mobile robot equipped with an array of commercial metal oxide sensors. Additional validation is carried out using a multi-compound data set collected with stationary sensor arrays inside a wind tunnel under various airflow conditions. The results show that accurate class separation can be achieved with a low sensitivity to the selection of the only free parameter, namely the neighborhood size, which is used for density estimation in the clustering process.
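To make the clustering idea above concrete, the following is a minimal Python sketch of generic density-peaks clustering, the family of methods the paper builds on, not the authors' exact procedure; the function name, the kNN-based density estimate, and the parameters k and n_clusters are illustrative assumptions.

```python
import numpy as np

def density_peaks_cluster(X, k=10, n_clusters=3):
    """Cluster rows of X by local density peaks (Rodriguez & Laio style)."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)      # pairwise distances
    # local density from the mean distance to the k nearest neighbours (larger = denser)
    rho = 1.0 / (np.sort(D, axis=1)[:, 1:k + 1].mean(axis=1) + 1e-12)
    order = np.argsort(-rho)                                        # decreasing density
    delta = np.zeros(n)                                             # distance to nearest denser point
    nearest_denser = np.full(n, -1)
    for rank, i in enumerate(order):
        if rank == 0:
            delta[i] = D[i].max()
            continue
        denser = order[:rank]
        j = denser[np.argmin(D[i, denser])]
        delta[i], nearest_denser[i] = D[i, j], j
    # cluster centres: points where both density and distance to denser points are large
    centers = np.argsort(-(rho * delta))[:n_clusters]
    if order[0] not in centers:
        centers[-1] = order[0]                                      # densest point is always a centre
    labels = np.full(n, -1)
    labels[centers] = np.arange(n_clusters)
    for i in order:                                                 # propagate labels downhill in density
        if labels[i] == -1:
            labels[i] = labels[nearest_denser[i]]
    return labels
```

In the paper the number of clusters is inferred rather than fixed; fixing n_clusters here only keeps the sketch short.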

Place, publisher, year, edition, pages
Amsterdam, Netherlands: Elsevier, 2018
Keywords
Gas discrimination, environmental monitoring, metal oxide sensors, cluster analysis, unsupervised learning
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-63468 (URN), 10.1016/j.snb.2017.10.063 (DOI)
Projects
SmokBot
Funder
EU, Horizon 2020, 645101
Available from: 2017-12-19 Created: 2017-12-19 Last updated: 2018-01-13. Bibliographically approved
Canelhas, D. R., Stoyanov, T. & Lilienthal, A. J. (2018). A Survey of Voxel Interpolation Methods and an Evaluation of Their Impact on Volumetric Map-Based Visual Odometry. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Brisbane, May 21 - 25, 2018: . Paper presented at IEEE International Conference on Robotics and Automation (ICRA).
A Survey of Voxel Interpolation Methods and an Evaluation of Their Impact on Volumetric Map-Based Visual Odometry
2018 (English) In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Brisbane, May 21 - 25, 2018, 2018. Conference paper, Published paper (Refereed)
Abstract [en]

Voxel volumes are simple to implement and lend themselves to many of the tools and algorithms available for 2D images. However, the additional dimension of voxels may be costly to manage in memory when mapping large spaces at high resolutions. While lowering the resolution and using interpolation is a common work-around, in the literature we often find that authors use either trilinear interpolation or nearest neighbors and rarely any of the intermediate options. This paper presents a survey of geometric interpolation methods for voxel-based map representations. In particular, we study the truncated signed distance field (TSDF) and the impact of using fewer than 8 samples to perform interpolation within a depth-camera pose tracking and mapping scenario. We find that lowering the number of samples fetched to perform the interpolation results in performance similar to the commonly used trilinear interpolation method, but leads to higher framerates. We also report that lower bit-depth generally leads to performance degradation, though not as much as may be expected, with voxels containing as few as 3 bits sometimes resulting in adequate estimation of camera trajectories.
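As a concrete reference for the two end points discussed above, here is a minimal Python sketch of nearest-neighbour (1 sample) versus trilinear (8 samples) lookup in a dense voxel grid; the intermediate few-sample schemes surveyed in the paper are not reproduced, and the grid here is a random stand-in for a TSDF volume.

```python
import numpy as np

def nearest_neighbor(vol, p):
    """Read the voxel closest to the continuous position p = (x, y, z)."""
    i, j, k = np.round(p).astype(int)
    return vol[i, j, k]

def trilinear(vol, p):
    """Blend the 8 voxels surrounding p with weights from its fractional offsets."""
    i0, j0, k0 = np.floor(p).astype(int)
    dx, dy, dz = p - np.array([i0, j0, k0], dtype=float)
    c = vol[i0:i0 + 2, j0:j0 + 2, k0:k0 + 2]        # 2x2x2 neighbourhood
    c = c[0] * (1 - dx) + c[1] * dx                 # collapse x
    c = c[0] * (1 - dy) + c[1] * dy                 # collapse y
    return c[0] * (1 - dz) + c[1] * dz              # collapse z

grid = np.random.rand(32, 32, 32).astype(np.float32)   # stand-in for a TSDF volume
p = np.array([10.3, 5.7, 20.1])
print(nearest_neighbor(grid, p), trilinear(grid, p))
```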

Keywords
Voxels, Compression, Interpolation, TSDF, Visual Odometry
National Category
Robotics; Computer Systems
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-67850 (URN)
Conference
IEEE International Conference on Robotics and Automation (ICRA)
Projects
H2020 ILIAD; H2020 Roblog
Funder
EU, Horizon 2020, 732737
Available from: 2018-07-11 Created: 2018-07-11 Last updated: 2018-07-16
Fan, H., Kucner, T. P., Magnusson, M., Li, T. & Lilienthal, A. (2017). A Dual PHD Filter for Effective Occupancy Filtering in a Highly Dynamic Environment. IEEE transactions on intelligent transportation systems (Print), PP(99), 1-17
A Dual PHD Filter for Effective Occupancy Filtering in a Highly Dynamic Environment
2017 (English) In: IEEE transactions on intelligent transportation systems (Print), ISSN 1524-9050, E-ISSN 1558-0016, Vol. PP, no 99, p. 1-17. Article in journal (Refereed). Epub ahead of print
Abstract [en]

Environment monitoring remains a major challenge for mobile robots, especially in densely cluttered or highly populated dynamic environments, where uncertainties originating from the environment and the sensors significantly challenge the robot's perception. This paper proposes an effective occupancy filtering method called the dual probability hypothesis density (DPHD) filter, which models uncertain phenomena, such as births, deaths, occlusions, false alarms, and missed detections, by using random finite sets. The key insight of our method lies in connecting the idea of dynamic occupancy with the concepts of phase space density in gas kinetics and the PHD in multiple-target tracking. By modeling the environment as a mixture of static and dynamic parts, the DPHD filter separates the dynamic part from the static one with a unified filtering process, but has a higher computational efficiency than existing Bayesian Occupancy Filters (BOFs). Moreover, an adaptive newborn function and a detection model that accounts for occlusions are proposed to further improve the filtering efficiency. Finally, a hybrid particle implementation of the DPHD filter is proposed, which uses a box particle filter with constant discrete states and an ordinary particle filter with a time-varying number of particles in a continuous state space to process the static part and the dynamic part, respectively. This filter has a linear complexity with respect to the number of grid cells occupied by dynamic obstacles. Real-world experiments on data collected by a lidar at a busy roundabout demonstrate that our approach can handle monitoring of a highly dynamic environment in real time.
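For orientation, the sketch below shows the measurement-update step of a generic particle (SMC) PHD filter, the building block that the dual PHD filter extends; it is not the DPHD filter itself, and the detection probability, clutter intensity and measurement noise are illustrative values.

```python
import numpy as np

def phd_update(particles, weights, measurements, p_d=0.9, clutter=1e-3, sigma=0.5):
    """particles: (N, 2) positions, weights: (N,) PHD weights, measurements: (M, 2)."""
    updated = (1.0 - p_d) * weights                        # missed-detection term
    for z in measurements:
        d2 = np.sum((particles - z) ** 2, axis=1)
        g = np.exp(-0.5 * d2 / sigma ** 2) / (2.0 * np.pi * sigma ** 2)   # Gaussian likelihood
        num = p_d * g * weights
        updated += num / (clutter + num.sum())             # one term per measurement
    return updated

# toy example: two targets observed without clutter
particles = np.random.uniform(0.0, 10.0, size=(5000, 2))
weights = np.full(5000, 2.0 / 5000)                        # prior expected target count = 2
weights = phd_update(particles, weights, np.array([[2.0, 2.0], [7.0, 7.0]]))
print("expected number of targets:", weights.sum())
```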

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2017
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-63981 (URN), 10.1109/TITS.2017.2770152 (DOI)
Available from: 2018-01-09 Created: 2018-01-09 Last updated: 2018-02-08. Bibliographically approved
Wiedemann, T., Shutin, D., Hernandez Bennetts, V., Schaffernicht, E. & Lilienthal, A. (2017). Bayesian Gas Source Localization and Exploration with a Multi-Robot System Using Partial Differential Equation Based Modeling. In: 2017 ISOCS/IEEE International Symposium on Olfaction and Electronic Nose (ISOEN 2017): Proceedings. Paper presented at International Symposium on Olfaction and Electronic Nose (ISOEN 2017), Montreal, Canada, May 28-31, 2017 (pp. 122-124).
Bayesian Gas Source Localization and Exploration with a Multi-Robot System Using Partial Differential Equation Based Modeling
2017 (English) In: 2017 ISOCS/IEEE International Symposium on Olfaction and Electronic Nose (ISOEN 2017): Proceedings, 2017, p. 122-124. Conference paper, Published paper (Refereed)
Abstract [en]

Here we report on active water sampling devices for underwater chemical sensing robots. Crayfish generate jet-like water currents during food search by waving the flagella of their maxillipeds. The jets generated toward their sides induce an inflow from the surroundings to the jets. Odor sample collection from the surroundings to their olfactory organs is promoted by the generated inflow. Devices that model the jet discharge of crayfish have been developed to investigate the effectiveness of the active chemical sampling. Experimental results are presented to confirm that water samples are drawn to the chemical sensors from the surroundings more rapidly by using the axisymmetric flow field generated by the jet discharge than by the centrosymmetric flow field generated by simple water suction. Results are also presented to show that there is a tradeoff between the angular range of chemical sample collection and the sample collection time.

National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-60688 (URN), 9781509023936 (ISBN), 9781509023929 (ISBN)
Conference
International Symposium on Olfaction and Electronic Nose (ISOEN 2017), Montreal, Canada, May 28-31, 2017
Available from: 2017-09-08 Created: 2017-09-08 Last updated: 2018-01-23. Bibliographically approved
Neumann, P. P., Kohlhoff, H., Hüllmann, D., Lilienthal, A. & Kluge, M. (2017). Bringing Mobile Robot Olfaction to the Next Dimension - UAV-based Remote Sensing of Gas Clouds and Source Localization. In: 2017 IEEE International Conference on Robotics and Automation (ICRA): . Paper presented at 2017 IEEE International Conference on Robotics and Automation (ICRA 2017), Singapore, May 29 - June 3, 2017 (pp. 3910-3916). Institute of Electrical and Electronics Engineers (IEEE)
Bringing Mobile Robot Olfaction to the Next Dimension - UAV-based Remote Sensing of Gas Clouds and Source Localization
2017 (English) In: 2017 IEEE International Conference on Robotics and Automation (ICRA), Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 3910-3916. Conference paper, Published paper (Refereed)
Abstract [en]

This paper introduces a novel robotic platform for aerial remote gas sensing. Spectroscopic measurement methods for remote sensing of selected gases lend themselves to use on mini-copters, which offer a number of advantages for inspection and surveillance. No direct contact with the target gas is needed, and thus the influence of the aerial platform on the measured gas plume can be kept to a minimum. This makes it possible to overcome one of the major issues with gas-sensitive mini-copters. On the other hand, remote gas sensors, most prominently Tunable Diode Laser Absorption Spectroscopy (TDLAS) sensors, have been too bulky given the payload and energy restrictions of mini-copters. Here, we introduce and present the Unmanned Aerial Vehicle for Remote Gas Sensing (UAV-REGAS), which combines a novel lightweight TDLAS sensor with a 3-axis aerial stabilization gimbal for aiming on a versatile hexacopter. The proposed system can be deployed in scenarios that cannot be addressed by currently available robots and thus constitutes a significant step forward for the field of Mobile Robot Olfaction (MRO). It enables tomographic reconstruction of gas plumes and localization of gas sources. We also present first results showing the gas sensing and aiming capabilities under realistic conditions.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2017
Series
IEEE International Conference on Robotics and Automation, ISSN 1050-4729
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-64767 (URN), 10.1109/ICRA.2017.7989450 (DOI), 2-s2.0-85027982828 (Scopus ID), 978-1-5090-4633-1 (ISBN), 978-1-5090-4634-8 (ISBN)
Conference
2017 IEEE International Conference on Robotics and Automation (ICRA 2017), Singapore, May 29 - June 3, 2017
Available from: 2018-02-01 Created: 2018-02-01 Last updated: 2018-02-02. Bibliographically approved
Canelhas, D. R., Schaffernicht, E., Stoyanov, T., Lilienthal, A. & Davison, A. J. (2017). Compressed Voxel-Based Mapping Using Unsupervised Learning. Robotics, 6(3), Article ID 15.
Compressed Voxel-Based Mapping Using Unsupervised Learning
2017 (English) In: Robotics, E-ISSN 2218-6581, Vol. 6, no 3, article id 15. Article in journal (Refereed). Published
Abstract [en]

In order to deal with the scaling problem of volumetric map representations, we propose spatially local methods for high-ratio compression of 3D maps, represented as truncated signed distance fields. We show that these compressed maps can be used as meaningful descriptors for selective decompression in scenarios relevant to robotic applications. As compression methods, we compare using PCA-derived low-dimensional bases to nonlinear auto-encoder networks. Selecting two application-oriented performance metrics, we evaluate the impact of different compression rates on reconstruction fidelity as well as to the task of map-aided ego-motion estimation. It is demonstrated that lossily reconstructed distance fields used as cost functions for ego-motion estimation can outperform the original maps in challenging scenarios from standard RGB-D (color plus depth) data sets due to the rejection of high-frequency noise content.
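To illustrate the PCA branch of the compression scheme, here is a minimal Python sketch that projects small TSDF patches onto a low-dimensional basis and reconstructs them; the patch size, the number of components, and the random stand-in data are assumptions, and the auto-encoder variant is not shown.

```python
import numpy as np
from sklearn.decomposition import PCA

block = 8                                          # side length of a cubic TSDF patch
patches = np.random.randn(1000, block ** 3)        # stand-in for patches extracted from a map

pca = PCA(n_components=32)                         # 512 values -> 32 values per patch (16x smaller)
codes = pca.fit_transform(patches)                 # compressed representation
reconstructed = pca.inverse_transform(codes)       # lossy reconstruction

rmse = np.sqrt(np.mean((patches - reconstructed) ** 2))
print("RMS reconstruction error:", rmse)
```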

Place, publisher, year, edition, pages
Basel, Switzerland: MDPI AG, 2017
Keywords
3D mapping, TSDF, compression, dictionary learning, auto-encoder, denoising
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-64420 (URN), 10.3390/robotics6030015 (DOI), 000419218300002 (), 2-s2.0-85030989493 (Scopus ID)
Note

Funding Agencies:

European Commission FP7-ICT-270350

H2020-ICT 732737

Available from: 2018-01-19 Created: 2018-01-19 Last updated: 2018-01-19. Bibliographically approved
Lilienthal, A. & Schindler, M. (2017). Conducting Dual Portable Eye-Tracking in Mathematical Creativity Research. In: Kaur, B., Ho, W.K., Toh, T.L., & Choy, B.H (Ed.), Proceedings of the 41st Conference of the International Group for the Psychology of Mathematics Education: . Paper presented at The 41st Conference of the International Group for the Psychology of Mathematics Education, Singapore, July 17 – 22, 2017 (pp. 233-233). Singapore: PME, 1
Conducting Dual Portable Eye-Tracking in Mathematical Creativity Research
2017 (English) In: Proceedings of the 41st Conference of the International Group for the Psychology of Mathematics Education / [ed] Kaur, B., Ho, W.K., Toh, T.L., & Choy, B.H, Singapore: PME, 2017, Vol. 1, p. 233-233. Conference paper, Published paper (Refereed)
Abstract [en]

Eye-tracking opens a window to the focus of attention of persons and promises to allow studying, e.g., creative processes “in vivo” (Nüssli, 2011). Most eye-tracking studies in mathematics education research focus on single students. However, following a Vygotskyan notion of learning and development where the individual and the social are dialectically interrelated, eye-tracking studies of collaborating persons appear beneficial for understanding students’ learning in their social facet. Dual eye-tracking, where two persons’ eye-movements are recorded and related to a joint coordinate-system, has hardly been used in mathematics education research. Especially dual portable eye-tracking (DPET) with goggles has hardly been explored due to its technical challenges compared to screen-based eye-tracking.

In our interdisciplinary research project between mathematics education and computer science, we conduct DPET for studying collective mathematical creativity (Levenson, 2011) in a process perspective. DPET offers certain advantages, including to carry out paper and pen tasks in rather natural settings. Our research interests are: conducting DPET (technical), investigating opportunities and limitations of DPET for studying students’ collective creativity (methodological), and studying students’ collective creative problem solving (empirical).

We carried out experiments with two pairs of university students wearing Pupil Pro eye tracking goggles. The students were given 45 min to solve a geometry problem in as many ways as possible. For our analysis, we first programmed MATLAB code to synchronize data from both participants’ goggles; resulting in a video displaying both students’ eye-movements projected on the task sheet, the sound recorded by the goggles, and additional information, e.g. pupil dilation. With these videos we expect to get insights into how students’ attentions meet, if students’ eye-movements follow one another, or verbal inputs, etc. We expect insights into promotive aspects in students’ collaboration: e.g., if pointing on the figure or intensive verbal communication promote students’ joint attention (cf. Nüssli, 2011). Finally, we think that the expected insights can contribute to existing research on collective mathematical creativity, especially to the question of how to enhance students’ creative collaboration.
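The synchronisation step mentioned above was done in MATLAB by the authors; as an illustration only, here is a Python sketch of the same idea, resampling two independently recorded gaze streams onto a common timeline by nearest-timestamp matching (the field layout and the frame rate are assumptions).

```python
import numpy as np

def nearest_idx(t, timeline):
    """Index of the recorded timestamp in t closest to each value of timeline."""
    idx = np.clip(np.searchsorted(t, timeline), 1, len(t) - 1)
    take_left = (timeline - t[idx - 1]) < (t[idx] - timeline)
    return idx - take_left.astype(int)

def sync_gaze(t_a, gaze_a, t_b, gaze_b, fps=30.0):
    """t_*: timestamps in seconds; gaze_*: (N, 2) gaze points in a shared task-sheet frame."""
    t0, t1 = max(t_a[0], t_b[0]), min(t_a[-1], t_b[-1])    # overlapping recording interval
    timeline = np.arange(t0, t1, 1.0 / fps)
    return timeline, gaze_a[nearest_idx(t_a, timeline)], gaze_b[nearest_idx(t_b, timeline)]

# toy usage with synthetic recordings of unequal length and offset start times
t_a, t_b = np.linspace(0, 60, 1800), np.linspace(0.5, 59, 1500)
gaze_a, gaze_b = np.random.rand(1800, 2), np.random.rand(1500, 2)
timeline, a, b = sync_gaze(t_a, gaze_a, t_b, gaze_b)
print(timeline.shape, a.shape, b.shape)
```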

Place, publisher, year, edition, pages
Singapore: PME, 2017
National Category
Didactics
Research subject
Computer Engineering; Computer Science
Identifiers
urn:nbn:se:oru:diva-64763 (URN), 978-138-71-3608-7 (ISBN)
Conference
The 41st Conference of the International Group for the Psychology of Mathematics Education, Singapore, July 17 – 22, 2017
Available from: 2018-02-01 Created: 2018-02-01 Last updated: 2018-02-02. Bibliographically approved
Kucner, T. P., Magnusson, M., Schaffernicht, E., Hernandez Bennetts, V. M. & Lilienthal, A. (2017). Enabling Flow Awareness for Mobile Robots in Partially Observable Environments. IEEE Robotics and Automation Letters, 2(2), 1093-1100
Enabling Flow Awareness for Mobile Robots in Partially Observable Environments
2017 (English) In: IEEE Robotics and Automation Letters, ISSN 2377-3766, E-ISSN 1949-3045, Vol. 2, no 2, p. 1093-1100. Article in journal (Refereed). Published
Abstract [en]

Understanding the environment is a key requirement for any autonomous robot operation. There is extensive research on mapping geometric structure and perceiving objects. However, the environment is also defined by the movement patterns in it. Information about human motion patterns can, e.g., lead to safer and socially more acceptable robot trajectories. Airflow pattern information allows planning energy-efficient paths for flying robots and improving gas distribution mapping. However, modelling the motion of objects (e.g., people) and the flow of continuous media (e.g., air) is a challenging task. We present a probabilistic approach for general flow mapping, which can readily handle both of these examples. Moreover, we present and compare two data imputation methods that allow building dense maps from sparsely distributed measurements. The methods are evaluated using two different data sets: one with pedestrian data and one with wind measurements. Our results show that it is possible to accurately represent multimodal, turbulent flow using a set of Gaussian Mixture Models, and also to reconstruct a dense representation based on sparsely distributed locations.
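To make the representation concrete, below is a minimal Python sketch of the core idea of the flow map: one Gaussian mixture per grid cell, fitted to the 2-D flow vectors observed in that cell. The imputation step that fills unobserved cells is not shown, and the cell size and component count are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def build_flow_map(positions, velocities, cell_size=1.0, n_components=3):
    """positions, velocities: (N, 2) measurement locations and flow vectors."""
    flow_map = {}
    cells = np.floor(positions / cell_size).astype(int)
    for cell in {tuple(c) for c in cells}:
        v = velocities[np.all(cells == cell, axis=1)]
        k = min(n_components, len(v))              # never more components than samples
        flow_map[cell] = GaussianMixture(n_components=k).fit(v)
    return flow_map

# toy usage: a bimodal wind pattern sampled at random locations
pos = np.random.uniform(0, 5, size=(400, 2))
vel = np.where(np.random.rand(400, 1) < 0.5, [1.0, 0.2], [-0.8, 0.5]) + 0.1 * np.random.randn(400, 2)
flow_map = build_flow_map(pos, vel)
print(len(flow_map), "cells modelled")
```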

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2017
Keywords
Field robots; mapping; social human-robot interaction
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-55174 (URN), 10.1109/LRA.2017.2660060 (DOI), 000413736600094 ()
Projects
ILIAD
Funder
Knowledge Foundation, 20140220; 20130196
Note

Funding Agencies:

EU project SPENCER  ICT-2011-600877 

H2020-ICT project SmokeBot  645101 

H2020-ICT project ILIAD  732737 

Available from: 2017-02-01 Created: 2017-02-01 Last updated: 2017-11-23. Bibliographically approved
Vuka, M., Schaffernicht, E., Schmuker, M., Hernandez Bennetts, V., Amigoni, F. & Lilienthal, A. J. (2017). Exploration and Localization of a Gas Source with MOX Gas Sensors on a Mobile Robot: A Gaussian Regression Bout Amplitude Approach. In: 2017 ISOCS/IEEE International Symposium on Olfaction and Electronic Nose (ISOEN 2017): Proceedings. Paper presented at IEEE International Symposium on Olfaction and Electronic Nose (ISOEN 2017), Montreal, QC, Canada, May 28-31, 2017 (pp. 164-166). IEEE
Exploration and Localization of a Gas Source with MOX Gas Sensors on a Mobile Robot: A Gaussian Regression Bout Amplitude Approach
2017 (English) In: 2017 ISOCS/IEEE International Symposium on Olfaction and Electronic Nose (ISOEN 2017): Proceedings, IEEE, 2017, p. 164-166. Conference paper, Published paper (Refereed)
Abstract [en]

Mobile robot olfaction systems combine gas sensors with mobility provided by robots. They relieve humans of dull, dirty and dangerous tasks in applications such as search & rescue or environmental monitoring. We address gas source localization and especially the problem of minimizing exploration time of the robot, which is a key issue due to energy constraints. We propose an active search approach for robots equipped with MOX gas sensors and an anemometer, given an occupancy map. Events of rapid change in the MOX sensor signal (“bouts”) are used to estimate the distance to a gas source. The wind direction guides a Gaussian regression, which interpolates distance estimates. The contributions of this paper are two-fold. First, we extend previous work on gas source distance estimation with MOX sensors and propose a modification to cope better with turbulent conditions. Second, we introduce a novel active search gas source localization algorithm and validate it in a real-world environment.
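To illustrate the “bout” feature mentioned above, here is a minimal Python sketch that smooths a MOX response, takes its derivative, and treats each run of positive slope as a bout whose amplitude is the total rise; the filter width and amplitude threshold are illustrative values, not the paper's.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def bout_amplitudes(signal, dt=0.1, sigma=3, min_amplitude=0.01):
    """Return the amplitudes of positive-slope excursions (bouts) in a MOX signal."""
    deriv = np.diff(gaussian_filter1d(signal, sigma)) / dt
    rising = deriv > 0
    edges = np.diff(rising.astype(int))            # +1 where a bout starts, -1 where it ends
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if rising[0]:
        starts = np.r_[0, starts]
    if rising[-1]:
        ends = np.r_[ends, len(deriv)]
    amps = np.array([deriv[s:e].sum() * dt for s, e in zip(starts, ends)])
    return amps[amps > min_amplitude]

# toy usage: a step-like concentration signal with sensor noise
t = np.arange(0, 60, 0.1)
sig = np.cumsum(np.random.rand(len(t)) < 0.02) + 0.05 * np.random.randn(len(t))
print("bouts detected:", len(bout_amplitudes(sig)))
```

In the paper, statistics of such bouts are combined with the wind direction in a Gaussian regression to estimate the distance to the source; that regression step is not reproduced here.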

Place, publisher, year, edition, pages
IEEE, 2017
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-60672 (URN), 10.1109/ISOEN.2017.7968898 (DOI), 2-s2.0-85027226540 (Scopus ID)
Conference
IEEE International Symposium on Olfaction and Electronic Nose (ISOEN 2017), Montreal, QC, Canada, May 28-31, 2017
Available from: 2017-09-08 Created: 2017-09-08 Last updated: 2018-01-13. Bibliographically approved
Schindler, M. & Lilienthal, A. (2017). Eye-Tracking and its Domain-Specific Interpretation: A Stimulated Recall Study on Eye Movements in Geometrical Tasks. In: Kaur, B., Ho, W.K., Toh, T.L., & Choy, B.H (Ed.), Proceedings of the 41st Conference of the International Group for the Psychology of Mathematics Education: . Paper presented at Annual Meeting of the International Group for the Psychology of Mathematics Education, Singapore, Singapore, July 17 – 22, 2017 (pp. 153-160). Singapore: PME, 4
Eye-Tracking and its Domain-Specific Interpretation: A Stimulated Recall Study on Eye Movements in Geometrical Tasks
2017 (English) In: Proceedings of the 41st Conference of the International Group for the Psychology of Mathematics Education / [ed] Kaur, B., Ho, W.K., Toh, T.L., & Choy, B.H, Singapore: PME, 2017, Vol. 4, p. 153-160. Conference paper, Published paper (Refereed)
Abstract [en]

Eye-tracking offers various possibilities for mathematics education. Yet, even in suitably visually presented tasks, interpretation of eye-tracking data is non-trivial. A key reason is that the interpretation of eye-tracking data is context-sensitive. To reduce ambiguity and uncertainty, we studied the interpretation of eye movements in a specific domain: geometrical mathematical creativity tasks. We present results from a qualitative empirical study in which we analyzed a Stimulated Recall Interview where a student watched the eye-tracking overlaid video of his work on a task. Our results hint at how eye movements can be interpreted and show limitations and opportunities of eye tracking in the domain of mathematical geometry tasks and beyond.

Place, publisher, year, edition, pages
Singapore: PME, 2017
National Category
Didactics
Research subject
Education
Identifiers
urn:nbn:se:oru:diva-64765 (URN), 978-138-71-3613-1 (ISBN)
Conference
Annual Meeting of the International Group for the Psychology of Mathematics Education, Singapore, Singapore, July 17 – 22, 2017
Available from: 2018-02-01 Created: 2018-02-01 Last updated: 2018-02-02. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-0217-9326
