Örebro University Publications (oru.se)
Krug, Robert
Publications (10 of 21)
Krug, R., Lilienthal, A. J., Kragic, D. & Bekiroglu, Y. (2016). Analytic Grasp Success Prediction with Tactile Feedback. In: 2016 IEEE International Conference on Robotics and Automation, ICRA 2016. Paper presented at IEEE International Conference on Robotics and Automation (ICRA), Royal Inst Technol, Ctr Autonomous Syst, Stockholm, Sweden, May 16-21, 2016 (pp. 165-171). New York, USA: IEEE
Analytic Grasp Success Prediction with Tactile Feedback
2016 (English). In: 2016 IEEE International Conference on Robotics and Automation, ICRA 2016. New York, USA: IEEE, 2016, pp. 165-171. Conference paper, Published paper (Refereed).
Abstract [en]

Predicting grasp success is useful for avoiding failures in many robotic applications. Based on reasoning in wrench space, we address the question of how well analytic grasp success prediction works if tactile feedback is incorporated. Tactile information can alleviate contact placement uncertainties and facilitates contact modeling. We introduce a wrench-based classifier and evaluate it on a large set of real grasps. The key finding of this work is that exploiting tactile information allows wrench-based reasoning to perform on a level with existing methods based on learning or simulation. Different from these methods, the suggested approach has no need for training data, requires little modeling effort and is computationally efficient. Furthermore, our method affords task generalization by considering the capabilities of the grasping device and expected disturbance forces/moments in a physically meaningful way.

Place, publisher, year, edition, pages
New York, USA: IEEE, 2016
Series
IEEE International Conference on Robotics and Automation ICRA, ISSN 1050-4729
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-54682 (URN)
10.1109/ICRA.2016.7487130 (DOI)
000389516200021 ()
2-s2.0-84977515559 (Scopus ID)
978-1-4673-8026-3 (ISBN)
Conference
IEEE International Conference on Robotics and Automation (ICRA), Royal Inst Technol, Ctr Autonomous Syst, Stockholm, Sweden, May 16-21, 2016
Available from: 2017-01-13. Created: 2017-01-13. Last updated: 2018-01-13. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Krug, R. & Lilienthal, A. (2016). Empirical evaluation of human trust in an expressive mobile robot. In: Proceedings of RSS Workshop "Social Trust in Autonomous Robots 2016". Paper presented at RSS Workshop "Social Trust in Autonomous Robots 2016", June 19, 2016.
Empirical evaluation of human trust in an expressive mobile robot
2016 (English). In: Proceedings of RSS Workshop "Social Trust in Autonomous Robots 2016", 2016. Conference paper, Published paper (Refereed).
Abstract [en]

A mobile robot that communicates its intentions using Spatial Augmented Reality (SAR) on the shared floor space makes humans feel safer and more comfortable around it, as established in our previous work [1] and several other studies. We build upon that work by adding adaptable information and control to the SAR module. We conducted an empirical study of how a mobile robot builds human trust by communicating its intentions, and we present a novel way of evaluating that trust. The experiments show that adaptation in the SAR module leads to more natural interaction, and the new evaluation system revealed that comfort levels in human-robot interactions approached those of human-human interactions.

Keywords
Human robot interaction, hri, mobile robot, trust, evaluation
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-55259 (URN)
Conference
RSS Workshop "Social Trust in Autonomous Robots 2016", June 19, 2016
Available from: 2017-02-02. Created: 2017-02-02. Last updated: 2018-03-14. Bibliographically approved.
Stoyanov, T., Krug, R., Muthusamy, R. & Kyrki, V. (2016). Grasp Envelopes: Extracting Constraints on Gripper Postures from Online Reconstructed 3D Models. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Paper presented at IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), Daejeon, Korea, October 9-14, 2016 (pp. 885-892). New York: Institute of Electrical and Electronics Engineers (IEEE)
Grasp Envelopes: Extracting Constraints on Gripper Postures from Online Reconstructed 3D Models
2016 (English). In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). New York: Institute of Electrical and Electronics Engineers (IEEE), 2016, pp. 885-892. Conference paper, Published paper (Refereed).
Abstract [en]

Grasping systems that build upon meticulously planned hand postures rely on precise knowledge of object geometry, mass and frictional properties - assumptions which are often violated in practice. In this work, we propose an alternative solution to the problem of grasp acquisition in simple autonomous pick-and-place scenarios by utilizing the concept of grasp envelopes: sets of constraints on gripper postures. We propose a fast method for extracting grasp envelopes for objects that fit within a known shape category, placed in an unknown environment. Our approach is based on grasp envelope primitives, which encode knowledge of human grasping strategies. We use environment models, reconstructed from noisy sensor observations, to refine the grasp envelope primitives and extract bounded envelopes of collision-free gripper postures. Finally, we evaluate the envelope extraction procedure both in a stand-alone fashion and as an integrated component of an autonomous picking system.

Place, publisher, year, edition, pages
New York: Institute of Electrical and Electronics Engineers (IEEE), 2016
National Category
Computer Sciences; Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-53372 (URN)
10.1109/IROS.2016.7759155 (DOI)
000391921701009 ()
978-1-5090-3762-9 (ISBN)
Conference
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), Daejeon, Korea, October 9-14, 2016
Available from: 2016-11-02. Created: 2016-11-02. Last updated: 2018-07-17. Bibliographically approved.
Stoyanov, T., Vaskevicius, N., Mueller, C. A., Fromm, T., Krug, R., Tincani, V., . . . Echelmeyer, W. (2016). No More Heavy Lifting: Robotic Solutions to the Container-Unloading Problem. IEEE Robotics & Automation Magazine, 23(4), 94-106
No More Heavy Lifting: Robotic Solutions to the Container-Unloading Problem
2016 (English). In: IEEE Robotics & Automation Magazine, ISSN 1070-9932, E-ISSN 1558-223X, Vol. 23, no. 4, pp. 94-106. Article in journal (Refereed). Published.
Place, publisher, year, edition, pages
IEEE, 2016
National Category
Computer Sciences; Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-53371 (URN)
10.1109/MRA.2016.2535098 (DOI)
000389874400011 ()
2-s2.0-84981763797 (Scopus ID)
Note

Funding Agency: EU FP7 project ROBLOG ICT-270350

Available from: 2016-11-02. Created: 2016-11-02. Last updated: 2018-07-17. Bibliographically approved.
Bunz, E., Chadalavada, R. T., Andreasson, H., Krug, R., Schindler, M. & Lilienthal, A. (2016). Spatial Augmented Reality and Eye Tracking for Evaluating Human Robot Interaction. In: Proceedings of RO-MAN 2016 Workshop: Workshop on Communicating Intentions in Human-Robot Interaction. Paper presented at RO-MAN 2016 Workshop: Workshop on Communicating Intentions in Human-Robot Interaction, New York, USA, Aug 31, 2016.
Spatial Augmented Reality and Eye Tracking for Evaluating Human Robot Interaction
2016 (English). In: Proceedings of RO-MAN 2016 Workshop: Workshop on Communicating Intentions in Human-Robot Interaction, 2016. Conference paper, Published paper (Refereed).
Abstract [en]

Freely moving autonomous mobile robots may lead to anxiety when operating in workspaces shared with humans. Previous works have given evidence that communicating intentions using Spatial Augmented Reality (SAR) in the shared workspace will make humans more comfortable in the vicinity of robots. In this work, we conducted experiments in which the robot projected various patterns in order to convey its movement intentions during encounters with humans. In these experiments, the trajectories of both humans and robot were recorded with a laser scanner. Human test subjects were also equipped with an eye tracker. We analyzed the eye gaze patterns and the laser scan tracking data in order to understand how the robot's intention communication affects human movement behavior. Furthermore, we used retrospective recall interviews to aid in identifying the reasons that led to behavior changes.

National Category
Computer Sciences
Identifiers
urn:nbn:se:oru:diva-55274 (URN)
Conference
RO-MAN 2016 Workshop: Workshop on Communicating Intentions in Human-Robot Interaction, New York, USA, Aug 31, 2016
Available from: 2017-02-02. Created: 2017-02-02. Last updated: 2018-03-14. Bibliographically approved.
Krug, R., Stoyanov, T., Tincani, V., Andreasson, H., Mosberger, R., Fantoni, G. & Lilienthal, A. J. (2016). The Next Step in Robot Commissioning: Autonomous Picking and Palletizing. IEEE Robotics and Automation Letters, 1(1), 546-553
The Next Step in Robot Commissioning: Autonomous Picking and Palletizing
2016 (English). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 1, no. 1, pp. 546-553. Article in journal (Refereed). Published.
Abstract [en]

So far, autonomous order picking (commissioning) systems have not been able to meet the stringent demands regarding speed, safety, and accuracy of real-world warehouse automation, resulting in reliance on human workers. In this letter, we target the next step in autonomous robot commissioning: automating the currently manual order picking procedure. To this end, we investigate the use case of autonomous picking and palletizing with a dedicated research platform and discuss lessons learned during testing in simplified warehouse settings. The main theoretical contribution is a novel grasp representation scheme which allows for redundancy in the gripper pose placement. This redundancy is exploited by a local, prioritized kinematic controller which generates reactive manipulator motions on-the-fly. We validated our grasping approach by means of a large set of experiments, which yielded an average grasp acquisition time of 23.5 s at a success rate of 94.7%. Our system is able to autonomously carry out simple order picking tasks in a human-safe manner, and as such serves as an initial step toward future commercial-scale in-house logistics automation solutions.

Place, publisher, year, edition, pages
Piscataway, USA: Institute of Electrical and Electronics Engineers (IEEE), 2016
Keywords
Logistics, grasping, autonomous vehicle navigation, robot safety, mobile manipulation
National Category
Computer Sciences; Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:oru:diva-53370 (URN)
10.1109/LRA.2016.2519944 (DOI)
000413719900073 ()
2-s2.0-84981762372 (Scopus ID)
Funder
EU, FP7, Seventh Framework Programme, ICT-270350; Knowledge Foundation, 20140220
Available from: 2016-11-02. Created: 2016-11-02. Last updated: 2018-01-13. Bibliographically approved.
Krug, R., Stoyanov, T. & Lilienthal, A. (2015). Grasp Envelopes for Constraint-based Robot Motion Planning and Control. In: Robotics: Science and Systems Conference: Workshop on Bridging the Gap between Data-driven and Analytical Physics-based Grasping and Manipulation. Paper presented at the 2015 Robotics: Science and Systems Conference (RSS), Rome, Italy, July 13-17, 2015.
Grasp Envelopes for Constraint-based Robot Motion Planning and Control
2015 (English). In: Robotics: Science and Systems Conference: Workshop on Bridging the Gap between Data-driven and Analytical Physics-based Grasping and Manipulation, 2015. Conference paper, Published paper (Refereed).
Abstract [en]

We suggest a grasp representation in the form of a set of enveloping spatial constraints. Our representation transforms the grasp synthesis problem (i.e., the question of where to position the grasping device) from finding a suitable discrete manipulator wrist pose to finding a suitable pose manifold. The corresponding motion planning and execution problem is also relaxed: instead of transitioning the wrist to a discrete pose, it is enough to move it anywhere within the grasp envelope, which allows kinematic redundancy to be exploited.

Keywords
Grasping, Grasp Control, Motion Control
National Category
Computer Systems
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-45346 (URN)
Conference
2015 Robotics: Science and Systems Conference (RSS), Rome, Italy, July 13-17, 2015
Available from: 2015-07-21. Created: 2015-07-21. Last updated: 2018-03-14. Bibliographically approved.
Krug, R., Stoyanov, T., Tincani, V., Andreasson, H., Mosberger, R., Fantoni, G., . . . Lilienthal, A. (2015). On Using Optimization-based Control instead of Path-Planning for Robot Grasp Motion Generation. In: IEEE International Conference on Robotics and Automation (ICRA) - Workshop on Robotic Hands, Grasping, and Manipulation. Paper presented at IEEE International Conference on Robotics and Automation (ICRA) - Workshop on Robotic Hands, Grasping, and Manipulation, Washington, USA, May 26-30, 2015.
On Using Optimization-based Control instead of Path-Planning for Robot Grasp Motion Generation
2015 (English). In: IEEE International Conference on Robotics and Automation (ICRA) - Workshop on Robotic Hands, Grasping, and Manipulation, 2015. Conference paper, Published paper (Refereed).
Keywords
Grasping, Motion Planning, Control
National Category
Robotics
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-44485 (URN)
Conference
IEEE International Conference on Robotics and Automation (ICRA) - Workshop on Robotic Hands, Grasping, and Manipulation, Washington, USA, May 26-30, 2015
Available from: 2015-04-27. Created: 2015-04-27. Last updated: 2018-06-29. Bibliographically approved.
Tincani, V., Catalano, M., Grioli, G., Stoyanov, T., Krug, R., Lilienthal, A. J., . . . Bicchi, A. (2015). Sensitive Active Surfaces on the Velvet II Dexterous Gripper. Paper presented at IEEE International Conference on Robotics and Automation (ICRA) - Workshop "Get in Touch!" Tactile & Force Sensing for Autonomous, Compliant, Intelligent Robots, Seattle, USA, May 30, 2015 (pp. 2744-2750). IEEE
Sensitive Active Surfaces on the Velvet II Dexterous Gripper
2015 (English). Conference paper, Published paper (Refereed).
Place, publisher, year, edition, pages
IEEE, 2015
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-47935 (URN)
Conference
IEEE International Conference on Robotics and Automation (ICRA) - Workshop "Get in Touch!" Tactile & Force Sensing for Autonomous, Compliant, Intelligent Robots, Seattle, USA, May 30, 2015
Available from: 2016-02-04. Created: 2016-02-04. Last updated: 2023-05-25. Bibliographically approved.
Chadalavada, R. T., Andreasson, H., Krug, R. & Lilienthal, A. (2015). That's on my Mind!: Robot to Human Intention Communication through on-board Projection on Shared Floor Space. In: 2015 European Conference on Mobile Robots (ECMR). Paper presented at the 7th European Conference on Mobile Robots (ECMR), Lincoln, Lincolnshire, United Kingdom, September 2-4, 2015. New York: IEEE conference proceedings
That's on my Mind!: Robot to Human Intention Communication through on-board Projection on Shared Floor Space
2015 (English). In: 2015 European Conference on Mobile Robots (ECMR). New York: IEEE conference proceedings, 2015. Conference paper, Published paper (Refereed).
Abstract [en]

The upcoming generation of autonomous vehicles for transporting materials in industrial environments will be more versatile, flexible and efficient than traditional AGVs, which simply follow pre-defined paths. However, freely navigating vehicles can appear unpredictable to human workers and thus cause stress and render joint use of the available space inefficient. Here we address this issue and propose on-board intention projection on the shared floor space for communication from robot to human. We present a research prototype of a robotic forklift equipped with a LED projector to visualize internal state information and intents. We describe the projector system and discuss calibration issues. The robot's ability to communicate its intentions is evaluated in realistic situations where test subjects meet the robotic forklift. The results show that even adding simple information, such as the trajectory and the space to be occupied by the robot in the near future, effectively improves human response to the robot.

Place, publisher, year, edition, pages
New York: IEEE conference proceedings, 2015
Keywords
Human Robot Interaction, Intention Communication, Shared spaces
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-47943 (URN)
10.1109/ECMR.2015.7403771 (DOI)
000380213600058 ()
978-1-4673-9163-4 (ISBN)
Conference
7th European Conference on Mobile Robots (ECMR), Lincoln, Lincolnshire, United Kingdom, September 2-4, 2015
Projects
Action and Intention Recognition (AIR)
Funder
EU, FP7, Seventh Framework Programme, FP7-ICT-600877 (SPENCER); Knowledge Foundation, 20140220 (AIR)
Available from: 2016-02-04. Created: 2016-02-04. Last updated: 2018-01-10. Bibliographically approved.