Örebro University Publications (DiVA)

1 - 37 of 37
  • 1.
    Berglund, Erik
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Palm, Rainer
    Örebro University, School of Science and Technology.
    Krug, Robert
    Örebro University, School of Science and Technology.
    Charusta, Krzysztof
    Örebro University, School of Science and Technology.
    Dimitrov, Dimitar
    Örebro University, School of Science and Technology.
    Mapping between different kinematic structures without absolute positioning during operation, 2012. In: Electronics Letters, ISSN 0013-5194, E-ISSN 1350-911X, Vol. 48, no 18, p. 1110-1112. Article in journal (Refereed)
    Abstract [en]

    When creating datasets for modelling of human skills based on training examples from human motion, one can encounter the problem that the kinematics of the robot does not match the human kinematics. Presented is a simple method of bypassing the explicit modelling of the human kinematics based on a variant of the self-organising map (SOM) algorithm. While the literature contains instances of SOM-type algorithms used for dimension reduction, this reported work deals with the inverse problem: dimension increase, as we are going from 4 to 5 degrees of freedom.
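
    The dimension-increase idea (going from 4 to 5 degrees of freedom) can be illustrated with a generic self-organising map used for associative completion, sketched below. This is not the variant developed in the paper; the node count, training schedule and toy data are assumptions.

```python
# Generic SOM sketch (an assumption-laden illustration, not the paper's method):
# nodes store 5-DOF joint vectors; after training, a 4-DOF query is completed
# to 5 DOF by the best-matching node on the shared dimensions.
import numpy as np

def train_som(samples, n_nodes=50, epochs=50, lr0=0.5, sigma0=10.0):
    """samples: (N, 5) array of 5-DOF joint configurations."""
    rng = np.random.default_rng(0)
    nodes = samples[rng.integers(0, len(samples), n_nodes)].copy()
    idx = np.arange(n_nodes)
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                  # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / epochs), 1.0)  # shrinking neighbourhood
        for x in samples[rng.permutation(len(samples))]:
            bmu = np.argmin(np.linalg.norm(nodes - x, axis=1))
            h = np.exp(-((idx - bmu) ** 2) / (2.0 * sigma ** 2))
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

def complete(nodes, query4):
    """Map a 4-DOF query to 5 DOF via the best-matching node on the first 4 dims."""
    bmu = np.argmin(np.linalg.norm(nodes[:, :4] - query4, axis=1))
    return nodes[bmu]

# toy data: the fifth joint is a function of the first and fourth joints
rng = np.random.default_rng(1)
q4 = rng.uniform(-1.0, 1.0, (500, 4))
samples = np.hstack([q4, 0.5 * (q4[:, :1] + q4[:, 3:4])])
nodes = train_som(samples)
print(np.round(complete(nodes, np.array([0.2, -0.4, 0.1, 0.6])), 2))
```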

  • 2.
    Charusta, Krzysztof
    et al.
    Örebro University, School of Science and Technology.
    Dimitrov, Dimitar
    Örebro University, School of Science and Technology.
    Lilienthal, Achim J.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Extraction of grasp-related features by human dual-hand object exploration, 2009. In: 2009 International Conference on Advanced Robotics, Piscataway, NJ: IEEE conference proceedings, 2009, p. 1-6. Conference paper (Refereed)
    Abstract [en]

    We consider the problem of object exploration for grasping purposes, specifically in cases where vision-based methods are not applicable. A novel dual-hand object exploration method is proposed that benefits from a human demonstration to enrich knowledge about an object. The user handles an object freely using both hands, without restricting the object pose. A set of grasp-related features obtained during exploration is demonstrated and utilized to generate grasp-oriented bounding boxes that form the basis for pre-grasp hypotheses. We believe that such exploration, done in a natural and user-friendly way, creates an important link between an operator's intention and a robot's action.

  • 3.
    Charusta, Krzysztof
    et al.
    Örebro University, School of Science and Technology.
    Krug, Robert
    Örebro University, School of Science and Technology.
    Dimitrov, Dimitar
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Independent contact regions based on a patch contact model, 2012. In: 2012 IEEE International Conference on Robotics and Automation (ICRA), IEEE conference proceedings, 2012, p. 4162-4169. Conference paper (Refereed)
    Abstract [en]

    The synthesis of multi-fingered grasps on nontrivial objects requires a realistic representation of the contact between the fingers of a robotic hand and an object. In this work, we use a patch contact model to approximate the contact between a rigid object and a deformable anthropomorphic finger. This contact model is utilized in the computation of Independent Contact Regions (ICRs) that have been proposed as a way to compensate for shortcomings in the finger positioning accuracy of robotic grasping devices. We extend the ICR algorithm to account for the patch contact model and show the benefits of this solution.

  • 4.
    Charusta, Krzysztof
    et al.
    Örebro University, School of Science and Technology.
    Krug, Robert
    Örebro University, School of Science and Technology.
    Stoyanov, Todor
    Örebro University, School of Science and Technology.
    Dimitrov, Dimitar
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Generation of independent contact regions on objects reconstructed from noisy real-world range data, 2012. In: 2012 IEEE International Conference on Robotics and Automation (ICRA), IEEE conference proceedings, 2012, p. 1338-1344. Conference paper (Refereed)
    Abstract [en]

    The synthesis and evaluation of multi-fingered grasps on complex objects is a challenging problem that has received much attention in the robotics community. Although several promising approaches have been developed, applications to real-world systems are limited to simple objects or gripper configurations. The paradigm of Independent Contact Regions (ICRs) has been proposed as a way to increase the tolerance to grasp positioning errors. This concept is well established, though only on precise geometric object models. This work is concerned with the application of the ICR paradigm to models reconstructed from real-world range data. We propose a method for increasing the robustness of grasp synthesis on uncertain geometric models. The sensitivity of the ICR algorithm to noisy data is evaluated and a filtering approach is proposed to improve the quality of the final result.

  • 5. Hristozov, Iasen
    et al.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Eskiizmirliler, Selim
    A combined feature extraction method for an electronic nose, 2006. In: Modern information processing: from theory to applications / [ed] Bernadette Bouchon-Meunier, Giulianella Coletti, Ronald Yager, Amsterdam: Elsevier, 2006, p. 453-466. Chapter in book (Other academic)
  • 6.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Minimum-time sliding mode control of robot manipulators, 2002. Licentiate thesis, monograph (Other academic)
  • 7.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Minimum-time sliding mode control of robot manipulators, 2004. Doctoral thesis, monograph (Other academic)
    Abstract [en]

    Robot manipulators have complex and highly nonlinear dynamics, accompanied by a high degree of uncertainty. These properties make time-optimal control difficult. The theory of sliding mode control provides methods able to cope with the uncertainty and nonlinearity in the system. However, besides the chattering problem, it does not provide time-optimal behavior. Optimal control theory provides the appropriate design methodology for minimum-time control, but the designed system lacks robustness. In this thesis we combine these two approaches to obtain new control techniques which have the robust properties of sliding mode control and a performance close to the time-optimal control. Two methods for minimum-time sliding mode control based on the concept of the maximum slope sliding line are developed, with a theoretical proof of their properties.

    In the time-optimal sliding mode control we prove that the time-optimal switching line of a simple linear system (double integrator) can be used as a sliding surface for a complex second-order nonlinear system (robot manipulator) if the control gain is sufficiently high. Optimal performance is achieved by scaling the surface in such a way that the maximum control action is efficiently used.

    The fuzzy minimum-time sliding mode control is developed employing a Takagi-Sugeno fuzzy model for the sliding surface. We demonstrate that designs based on a single sliding line tend to be conservative due to the nonlinearities in the robot's dynamics. The Takagi-Sugeno model represents the maximum slope sliding lines for different values of the joint angles, taking into account the variation in the gravity and inertia terms. This gives a convenient way to provide adaptation and incorporate additional knowledge in the controller design.

    Design procedures for all the methods are developed and evaluated in simulation and in experiments with real robot manipulators.
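
    The ingredients named above (a maximum-slope sliding line whose slope depends on the state, enforced by a switching control) can be sketched for a double integrator as below. The slopes, actuator limit and blending function are illustrative assumptions, not the design developed in the thesis.

```python
# Illustrative sketch only: sliding mode control of a double integrator with a
# state-dependent sliding-line slope.  Two "rules" blend a steep and a shallow
# slope (a Takagi-Sugeno-style interpolation) so the line is more aggressive
# near the origin, where less braking effort is needed to stay on it.
import numpy as np

U_MAX = 2.0                          # actuator limit (assumed)
LAM_STEEP, LAM_SHALLOW = 2.0, 0.5    # sliding-line slopes (assumed)

def sliding_slope(x1):
    w_steep = np.exp(-x1 ** 2)       # membership of |x1| in "small"
    return w_steep * LAM_STEEP + (1.0 - w_steep) * LAM_SHALLOW

def control(x1, x2):
    s = x2 + sliding_slope(x1) * x1  # sliding variable
    return -U_MAX * np.sign(s)       # switching control

# simulate xdot1 = x2, xdot2 = u from an initial error of 1.0
x1, x2, dt = 1.0, 0.0, 1e-3
for _ in range(5000):
    u = control(x1, x2)
    x1, x2 = x1 + dt * x2, x2 + dt * u
print(round(x1, 3), round(x2, 3))    # state driven close to the origin
```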

  • 8.
    Iliev, Boyko
    et al.
    Örebro University, Department of Technology.
    Kadmiry, Bourhane
    Örebro University, Department of Technology.
    Palm, Rainer
    Örebro University, Department of Technology.
    Interpretation of human demonstrations using mirror neuron system principles, 2007. In: IEEE 6th international conference on development and learning, ICDL 2007, New York: IEEE, 2007, p. 128-133. Conference paper (Refereed)
    Abstract [en]

    In this article we suggest a framework for programming by demonstration of robotic grasping based on principles of the Mirror Neuron System (MNS) model. The approach uses a hand-state representation inspired by neurophysiological models of human grasping. We show that such a representation not only simplifies the grasp recognition but also preserves the essential part of the reaching motion associated with the grasp. We show that if the hand state trajectory of a demonstration can be reconstructed, the robot is able to replicate the grasp. This can be done using motion primitives, derived by fuzzy time-clustering from the demonstrated reach-and-grasp motions. To illustrate the approach we show how human demonstrations of cylindrical grasps can be modeled, interpreted and replicated by a robot in this framework.

  • 9.
    Iliev, Boyko
    et al.
    Örebro University, Department of Technology.
    Kalaykov, Ivan
    Örebro University, Department of Technology.
    Improved sliding mode robot control: a fuzzy approach, 2002. In: Proceedings of the third international workshop on robot motion and control, 2002. RoMoCo '02, 2002, p. 393-398. Conference paper (Refereed)
    Abstract [en]

    An approach to the design of high-performance sliding mode controllers for robot manipulators is presented. It employs a Takagi-Sugeno fuzzy system to describe the sliding surface. Each rule of this system represents the maximum slope sliding line for a certain set of parameters given in the premise part. Hence, the slope of the surface is adapted according to the current state of the manipulator. This new algorithm provides nearly time-optimal performance and still retains the robustness typical of systems in sliding mode. The maximum slope sliding surfaces are designed using knowledge about the robot's physical properties.

  • 10.
    Iliev, Boyko
    et al.
    Örebro University, Department of Technology.
    Kalaykov, Ivan
    Örebro University, Department of Technology.
    Minimum-time sliding mode control for second-order systems, 2004. In: Proceedings of the 2004 American control conference, 2004: vol 1, 2004, p. 626-631. Conference paper (Refereed)
    Abstract [en]

    Our approach for near time-optimal control is based on a Takagi-Sugeno fuzzy model of the maximum slope SMC sliding surface, used as an adaptive technique for tuning the current slope of the sliding surface to the maximum feasible slope depending on the current state of the system. The stability conditions of this method are proved and respective measures of the feasible maximum slope are presented. Experimental results demonstrate the system behaviour.

  • 11.
    Iliev, Boyko
    et al.
    Örebro University, Department of Technology.
    Lindquist, Malin
    Örebro University, Department of Technology.
    Robertsson, Linn
    Örebro University, Department of Technology.
    Wide, Peter
    Örebro University, Department of Technology.
    A fuzzy technique for food- and water quality assessment with an electronic tongue, 2006. In: Fuzzy sets and systems (Print), ISSN 0165-0114, E-ISSN 1872-6801, Vol. 157, no 9, p. 1155-1168. Article in journal (Refereed)
    Abstract [en]

    The problem of food and water quality assessment is important for many practical applications, such as the food industry and environmental monitoring. In this article we present a method for fast online quality assessment based on electronic tongue measurements. The idea is implemented in two steps. First we apply a fuzzy clustering technique to obtain prototypes corresponding to good and bad quality from a set of training data. During the second, online step we evaluate the membership of the current measurement to each cluster and make a decision about its quality. The result is presented to the user in a simple and understandable way, similar to the concept of traffic light signals: good quality is indicated by a green light, bad quality by a red one, and a yellow light is a warning signal. The approach is demonstrated in two case studies: quality assessment of drinking water and of baby food.
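
    A hedged sketch of the two-step idea follows: prototypes for good and bad quality (here simply class means), then online fuzzy membership evaluation and a traffic-light decision. The thresholds, the fuzzifier and the toy measurements are assumptions, not the authors' implementation.

```python
# Illustration of prototype-based fuzzy quality assessment: an online
# measurement gets inverse-distance memberships (as in fuzzy c-means with
# fuzzifier m = 2) to the "good" and "bad" prototypes.
import numpy as np

def memberships(x, prototypes, m=2.0):
    d = np.linalg.norm(prototypes - x, axis=1) + 1e-12
    w = d ** (-2.0 / (m - 1.0))
    return w / w.sum()

def traffic_light(x, proto_good, proto_bad, warn=0.6):
    u_good, u_bad = memberships(x, np.vstack([proto_good, proto_bad]))
    if u_good >= warn:
        return "green"
    if u_bad >= warn:
        return "red"
    return "yellow"

# toy electronic-tongue feature vectors (assumed 3-D measurements)
rng = np.random.default_rng(0)
good = rng.normal([1.0, 0.0, 0.5], 0.1, (50, 3))
bad = rng.normal([0.0, 1.0, 0.2], 0.1, (50, 3))
proto_good, proto_bad = good.mean(axis=0), bad.mean(axis=0)
print(traffic_light(np.array([0.9, 0.1, 0.5]), proto_good, proto_bad))   # green
print(traffic_light(np.array([0.5, 0.5, 0.35]), proto_good, proto_bad))  # yellow
```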

  • 12.
    Kalaykov, Ivan
    et al.
    Örebro University, School of Science and Technology.
    Ananiev, Anani
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    SME robotics demand flexible grippers and fixtures, 2008. In: Proc. 39th Int. Symposium on Robotics, Seoul, Korea, 2008, p. 62-65. Conference paper (Refereed)
  • 13.
    Kalaykov, Ivan
    et al.
    Örebro University, School of Science and Technology.
    Ananiev, Anani
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Flexible grippers and fixtures, 2008. Conference paper (Refereed)
  • 14.
    Kalaykov, Ivan
    et al.
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Time-optimal sliding mode control of robot manipulator, 2000. In: 26th annual conference of the IEEE industrial electronics society: IECON 2000, 2000, p. 265-270. Conference paper (Refereed)
    Abstract [en]

    We demonstrate a time-optimal control algorithm based on the sliding mode control principle to control a robot manipulator. A designed time-optimal trajectory during the reaching phase is combined with fast sliding dynamics. The discontinuous algorithm gives a time response closer to the analytical time-optimal control solution based on the Pontryagin principle, and robust performance in the presence of plant parameter uncertainties.

  • 15.
    Krug, Robert
    et al.
    Örebro University, School of Science and Technology.
    Dimitrov, Dimitar
    Örebro University, School of Science and Technology.
    Charusta, Krzysztof
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    On the efficient computation of independent contact regions for force closure grasps, 2010. In: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems (IROS 2010), IEEE conference proceedings, 2010, p. 586-591. Conference paper (Other academic)
    Abstract [en]

    Since the introduction of independent contact regions in order to compensate for shortcomings in the positioning accuracy of robotic hands, alternative methods for their generation have been proposed. Due to the fact that (in general) such regions are not unique, the computation methods used usually reflect the envisioned application and/or underlying assumptions made. This paper introduces a parallelizable algorithm for the efficient computation of independent contact regions, under the assumption that a user input in the form of an initial guess for the grasping points is readily available. The proposed approach works on discretized 3D objects with any number of contacts and can be used with any of the following models: frictionless point contact, point contact with friction, and soft finger contact. An example of the computation of independent contact regions comprising a non-trivial task wrench space is given.

  • 16.
    Krug, Robert
    et al.
    Örebro University, School of Science and Technology.
    Dimitrov, Dimitar
    Örebro University, School of Science and Technology.
    Charusta, Krzysztof
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Prioritized independent contact regions for form closure grasps, 2011. In: 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011, p. 1797-1803. Conference paper (Refereed)
    Abstract [en]

    The concept of independent contact regions on a target object's surface, introduced to compensate for shortcomings in the positioning accuracy of robotic grasping devices, is well known. However, the number and distribution of contact points forming such regions is not unique and depends on the underlying computational method. In this work we present a computation scheme that allows contact points to be prioritized for inclusion in the independent regions. This enables a user to affect their shape in order to meet the demands of the targeted application. The introduced method utilizes frictionless contact constraints and is able to efficiently approximate the space of disturbances resistible by all grasps comprising contacts within the independent regions.

  • 17.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Grasp recognition by time-clustering, fuzzy modeling, and Hidden Markov Models (HMM): a comparative study, 2008. In: IEEE international conference on fuzzy systems, FUZZ-IEEE 2008 (IEEE World Congress on Computational Intelligence), New York: IEEE, 2008, p. 599-605. Conference paper (Refereed)
    Abstract [en]

    This paper deals with three different methods for grasp recognition for a human hand. Grasp recognition is a major part of the approach for Programming-by-Demonstration (PbD) for five-fingered robotic hands. A human operator instructs the robot to perform different grasps wearing a data glove. For a number of human grasps, the finger joint angle trajectories are recorded and modeled by fuzzy clustering and Takagi-Sugeno modeling. This leads to grasp models using the time as input parameter and the joint angles as outputs. Given a test grasp by the human operator, the robot classifies and recognizes the grasp and generates the corresponding robot grasp. Three methods for grasp recognition are presented and compared. In the first method the test grasp is compared with model grasps using the difference between the model outputs. In the second one, qualitative fuzzy models are used for recognition and classification. The third method is based on Hidden Markov Models (HMM), which are commonly used in robot learning.
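
    As a rough illustration of the first recognition method (comparing a test grasp with the reference models via the difference between model outputs), the sketch below resamples joint-angle trajectories over normalized time and picks the closest reference model. The resampling stands in for the Takagi-Sugeno time-cluster models and is an assumption, not the authors' implementation.

```python
# Simplified nearest-model grasp classifier: each reference grasp is reduced to
# joint angles sampled at fixed normalized time points; a test grasp is
# assigned to the reference whose model output it matches best.
import numpy as np

def time_model(trajectory, n_points=20):
    """trajectory: (T, n_joints) joint angles; returns (n_points, n_joints)."""
    t_old = np.linspace(0.0, 1.0, len(trajectory))
    t_new = np.linspace(0.0, 1.0, n_points)
    return np.column_stack(
        [np.interp(t_new, t_old, trajectory[:, j]) for j in range(trajectory.shape[1])]
    )

def recognize(test_traj, reference_models):
    """Return the label of the reference grasp whose model output is closest."""
    test_model = time_model(test_traj)
    errors = {label: np.mean((test_model - m) ** 2) for label, m in reference_models.items()}
    return min(errors, key=errors.get)

# toy data: two reference grasps with different joint-angle profiles (5 joints)
t = np.linspace(0, 1, 100)[:, None]
ref = {
    "cylindrical": time_model(np.hstack([t * 1.2] * 5)),
    "pinch": time_model(np.hstack([np.sin(np.pi * t)] * 5)),
}
test = np.hstack([t * 1.1] * 5) + 0.02 * np.random.default_rng(0).normal(size=(100, 5))
print(recognize(test, ref))   # expected: "cylindrical"
```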

  • 18.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Learning and adaptation of robot skills using fuzzy models, 2010. In: 2010 IEEE International Conference on Fuzzy Systems (FUZZ), IEEE conference proceedings, 2010, p. 1-8. Conference paper (Other academic)
    Abstract [en]

    Robot skills can be taught and recognized by a Programming-by-Demonstration technique where first a human operator demonstrates a set of reference skills. The operator's motions are then recorded by a data-capturing system and modeled via fuzzy clustering and a Takagi-Sugeno modeling technique. The resulting skill models use the time as input and the operator's actions as outputs. During the recognition phase, the robot recognizes which skill has been used by the operator in a novel demonstration. This is done by comparison between the time clusters of the test skill and those of the reference skills. Finally, the robot executes the recognized skill by using the corresponding reference skill model. Drastic differences between learned and real-world conditions, which occur during the execution of skills by the robot, are eliminated by using the Broyden update formula for Jacobians. This method was extended for fuzzy models, especially for time-cluster models. After the online training of a skill model, the updated model is used for further executions of the same skill by the robot.
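
    The Broyden update mentioned here can be illustrated in isolation; the sketch below only shows the rank-one Jacobian update on a toy function, not its integration with the fuzzy time-cluster skill models.

```python
# Minimal sketch of the (good) Broyden rank-one Jacobian update:
# J_new = J + ((dy - J dx) dx^T) / (dx^T dx).
import numpy as np

def broyden_update(J, dx, dy):
    """Rank-one update so that J_new @ dx == dy (the secant condition)."""
    dx = dx.reshape(-1, 1)
    dy = dy.reshape(-1, 1)
    return J + (dy - J @ dx) @ dx.T / float(dx.T @ dx)

# toy example: track the Jacobian of f(x) = [x0**2, x0*x1] from observed steps
f = lambda x: np.array([x[0] ** 2, x[0] * x[1]])
x = np.array([1.0, 2.0])
J = np.eye(2)                               # rough initial guess
rng = np.random.default_rng(0)
for _ in range(60):
    dx = rng.normal(scale=0.02, size=2)     # small exploratory step
    dy = f(x + dx) - f(x)
    J = broyden_update(J, dx, dy)
    x = x + dx
print(np.round(J, 2))   # close to the true Jacobian [[2*x0, 0], [x1, x0]] at x
```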

  • 19.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    BAE Systems Bofors AB, Karlskoga, Sweden.
    Programming-by-Demonstration and Adaptation of Robot Skills by Fuzzy Time Modeling, 2014. In: International Journal of Humanoid Robotics, ISSN 0219-8436, Vol. 11, no 1, article id 1450009. Article in journal (Refereed)
    Abstract [en]

    Robot skills are motion or grasping primitives of which a complicated robot task consists. Skills can be directly learned and recognized by a technique named programming-by-demonstration. A human operator demonstrates a set of reference skills where his motions are recorded by a data-capturing system and modeled via fuzzy clustering and a Takagi-Sugeno modeling technique. The skill models use time instants as input and operator actions as outputs. In the recognition phase, the robot identifies the skill shown by the operator in a novel test demonstration. Finally, using the corresponding reference skill model the robot executes the recognized skill. Skill models can be updated online, where drastic differences between learned and real-world conditions are eliminated using the Broyden update formula. This method was extended for fuzzy models, especially for time-cluster models.

  • 20.
    Palm, Rainer
    et al.
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Segmentation and recognition of human grasps for programming-by-demonstration using time-clustering and fuzzy modeling, 2007. In: IEEE international fuzzy systems conference, FUZZ-IEEE 2007, New York: IEEE, 2007, p. 1-6. Conference paper (Refereed)
    Abstract [en]

    In this article we address the problem of programming by demonstration (PbD) of grasping tasks for a five-fingered robotic hand. The robot is instructed by a human operator wearing a data glove capturing the hand poses. For a number of human grasps, the corresponding fingertip trajectories are modeled in time and space by fuzzy clustering and Takagi-Sugeno modeling. This so-called time-clustering leads to grasp models using the time as input parameter and the fingertip positions as outputs. For a test sequence of grasps the control system of the robot hand identifies the grasp segments, classifies the grasps and generates the sequence of grasps shown before. For this purpose, each grasp is correlated with a training sequence. By means of a hybrid fuzzy model the demonstrated grasp sequence can be reconstructed.

  • 21.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Kadmiry, Bourhane
    Grasp recognition by fuzzy modeling and hidden Markov models, 2010. In: Robot intelligence: an advanced knowledge processing approach / [ed] Honghai Liu, Dongbing Gu, Robert J. Howlett, Yonghuai Liu, New York: Springer, 2010, p. 25-47. Chapter in book (Other academic)
    Abstract [en]

    Grasp recognition is a major part of the approach for Programming-by-Demonstration (PbD) for five-fingered robotic hands. This chapter describes three different methods for grasp recognition for a human hand. A human operator wearing a data glove instructs the robot to perform different grasps. For a number of human grasps the finger joint angle trajectories are recorded and modeled by fuzzy clustering and Takagi-Sugeno modeling. This leads to grasp models using time as input parameter and joint angles as outputs. Given a test grasp by the human operator the robot classifies and recognizes the grasp and generates the corresponding robot grasp. Three methods for grasp recognition are compared with each other. In the first method, the test grasp is compared with model grasps using the difference between the model outputs. The second method deals with qualitative fuzzy models which are used for recognition and classification. The third method is based on Hidden Markov Models (HMM), which are commonly used in robot learning.

  • 22.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Kadmiry, Bourhane
    Örebro University, School of Science and Technology.
    Recognition of human grasps by time-clustering and fuzzy modeling, 2009. In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 57, no 5, p. 484-495. Article in journal (Refereed)
    Abstract [en]

    In this paper we address the problem of recognition of human grasps for five-fingered robotic hands and industrial robots in the context of programming-by-demonstration. The robot is instructed by a human operator wearing a data glove capturing the hand poses. For a number of human grasps, the corresponding fingertip trajectories are modeled in time and space by fuzzy clustering and Takagi-Sugeno (TS) modeling. This so-called time-clustering leads to grasp models using time as input parameter and fingertip positions as outputs. For a sequence of grasps the control system of the robot hand identifies the grasp segments, classifies the grasps and generates the sequence of grasps shown before. For this purpose, each grasp is correlated with a training sequence. By means of a hybrid fuzzy model the demonstrated grasp sequence can be reconstructed.

  • 23.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Kadmiry, Bourhane
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Driankov, Dimiter
    Örebro University, School of Science and Technology.
    Recognition and teaching of robot skills by fuzzy time-modeling, 2009. In: Proceedings of the Joint 2009 international fuzzy systems association world congress and 2009 European society of fuzzy logic and technology conference / [ed] J. P. Carvalho, D. U. Kaymak, J. M. C. Sousa, Linz, Austria: Johannes Kepler university, 2009, p. 7-12. Conference paper (Other academic)
    Abstract [en]

    Robot skills are low-level motion and/or grasping capabilities that constitute the basic building blocks from which tasks are built. Teaching and recognition of such skills can be done by a Programming-by-Demonstration approach. A human operator demonstrates certain skills while his motions are recorded by a data-capturing device and modeled, in our case, via fuzzy clustering and a Takagi-Sugeno modeling technique. The resulting skill models use the time as input and the operator's actions and reactions as outputs. Given a test skill by the human operator, the robot control system recognizes the individual phases of skills and generates the type of skill shown by the operator.

  • 24.
    Robertsson, Linn
    et al.
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Palm, Rainer
    Örebro University, Department of Technology.
    Wide, Peter
    Örebro University, Department of Technology.
    Perception modeling for human-like artificial sensor systems, 2007. In: International journal of human-computer studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 65, no 5, p. 446-459. Article in journal (Refereed)
    Abstract [en]

    In this article we present an approach to the design of human-like artificial systems. It uses a perception model to describe how sensory information is processed for a particular task and to correlate human and artificial perception. Since human-like sensors share their principle of operation with natural systems, their response can be interpreted in an intuitive way. Therefore, such sensors allow for easier and more natural human–machine interaction.

    The approach is demonstrated in two applications. The first is an “electronic tongue”, which performs quality assessment of food and water. In the second application we describe the development of an artificial hand for dexterous manipulation. We show that human-like functionality can be achieved even if the structure of the system is not completely biologically inspired.

  • 25.
    Robertsson, Linn
    et al.
    Örebro University, Department of Technology.
    Lindquist, Malin
    Örebro University, Department of Technology.
    Loutfi, Amy
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Wide, Peter
    Örebro University, Department of Technology.
    Human based sensor systems for safety assessment, 2005. In: Proceedings of the 2005 IEEE International conference on computational intelligence for homeland security and personal safety, 2005. CIHSPS 2005, 2005, p. 137-142. Conference paper (Refereed)
    Abstract [en]

    This paper focuses on the assumption that a sensor system for personal use has optimal performance if it is coherent with the human perception system. We provide arguments for this idea by demonstrating two examples. The first example is a personal taste sensor for use in finding abnormal ingredients in food. The second application is a mobile sniffing system, coherent with the behavior of a biological system when detecting unwanted material in hidden structures, e.g. explosives in a traveling bag.

  • 26.
    Skoglund, Alexander
    et al.
    Örebro University, Department of Technology.
    Duckett, Tom
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Lilienthal, Achim J.
    Örebro University, Department of Technology.
    Palm, Rainer
    Örebro University, Department of Technology.
    Teaching by demonstration of robotic manipulators in non-stationary environments, 2006. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2006, IEEE, 2006, p. 4339-4341. Conference paper (Refereed)
    Abstract [en]

    In this paper we propose a system consisting of a manipulator equipped with range sensors that is instructed to follow a trajectory demonstrated by a human teacher wearing a motion-capturing device. During the demonstration, a three-dimensional occupancy grid of the environment is built using the range sensor information and the trajectory. The demonstration is followed by an exploration phase, where the robot undergoes self-improvement of the task, during which the occupancy grid is used to avoid collisions. In parallel, a reinforcement learning (RL) agent, biased by the demonstration, learns a point-to-point task policy. When changes occur in the workspace, both the occupancy grid and the learned policy are updated online by the system.
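
    A minimal sketch of one component described here, an occupancy grid built from range readings and used as a collision check, is given below. The grid resolution, workspace size and collision margin are assumptions, and the reinforcement learning part is omitted.

```python
# Rough sketch (assumptions, not the paper's system): build a 3-D occupancy
# grid from range-sensor hits observed during the demonstration, then use it
# to veto states during the robot's self-improvement phase.
import numpy as np

RES = 0.05                                    # voxel size in metres (assumed)
GRID = np.zeros((100, 100, 100), dtype=bool)  # 5 m x 5 m x 5 m workspace

def to_index(point):
    return tuple(np.clip((np.asarray(point) / RES).astype(int), 0, 99))

def insert_hit(hit_point):
    """Mark the voxel containing a range-sensor hit as occupied."""
    GRID[to_index(hit_point)] = True

def in_collision(position, margin=1):
    """Check the voxel at `position` and its neighbours within `margin` voxels."""
    i, j, k = to_index(position)
    sl = lambda c: slice(max(c - margin, 0), c + margin + 1)
    return bool(GRID[sl(i), sl(j), sl(k)].any())

# toy usage: an obstacle observed during the demonstration blocks nearby states
insert_hit([2.0, 2.0, 1.0])
print(in_collision([2.02, 1.98, 1.0]))   # True  -> this state is rejected
print(in_collision([3.0, 3.0, 1.0]))     # False -> this state is free
```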

  • 27.
    Skoglund, Alexander
    et al.
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Programming by demonstrating robots task primitives, 2007. In: Servo Magazine, ISSN 1546-0592, no 12, p. 46-50. Article in journal (Other academic)
  • 28.
    Skoglund, Alexander
    et al.
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Programming by demonstration of robots using task primitives, 2007. In: Servo magazine, Vol. 5, no 12, p. 46-50. Article in journal (Other (popular science, discussion, etc.))
  • 29.
    Skoglund, Alexander
    et al.
    Örebro University, Department of Technology.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Kadmiry, Bourhane
    Örebro University, Department of Technology.
    Palm, Rainer
    Örebro University, Department of Technology.
    Programming by demonstration of pick-and-place tasks for industrial manipulators using task primitives, 2007. In: International symposium on computational intelligence in robotics and automation, CIRA 2007, New York: IEEE, 2007, p. 368-373. Conference paper (Refereed)
    Abstract [en]

    This article presents an approach to Programming by Demonstration (PbD) to simplify programming of industrial manipulators. By using a set of task primitives for a known task type, the demonstration is interpreted and a manipulator program is automatically generated. A pick-and-place task is analyzed, based on the velocity profile, and decomposed into task primitives. Task primitives are basic actions of the robot/gripper, which can be executed in a sequence to form a complete task. For modeling and generation of the demonstrated trajectory, fuzzy time-clustering is used, resulting in smooth and accurate motions. To illustrate our approach, we carried out experiments on a real industrial manipulator.
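
    The velocity-profile segmentation step can be illustrated with a minimal sketch: motion segments are separated by near-zero-velocity phases. The thresholds and the toy trajectory below are assumptions, not the paper's parameters.

```python
# Hedged sketch of segmenting a pick-and-place demonstration into primitives by
# thresholding the end-effector speed computed from position samples.
import numpy as np

def segment_by_velocity(positions, dt, v_stop=0.02, min_len=5):
    """positions: (T, 3) end-effector path; returns list of (start, end) index pairs."""
    speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    moving = speed > v_stop
    segments, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i
        elif not m and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(moving) - start >= min_len:
        segments.append((start, len(moving)))
    return segments

# toy demonstration: move, pause (grasp), move again (place)
t1 = np.linspace(0, 1, 50)[:, None] * np.array([0.3, 0.0, 0.0])
pause = np.repeat(t1[-1:], 30, axis=0)
t2 = t1[-1] + np.linspace(0, 1, 50)[:, None] * np.array([0.0, 0.2, -0.1])
path = np.vstack([t1, pause, t2])
print(segment_by_velocity(path, dt=0.02))   # two segments: reach and transfer
```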

  • 30.
    Skoglund, Alexander
    et al.
    Örebro University, School of Science and Technology.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Palm, Rainer
    Örebro University, School of Science and Technology.
    A Hand State Approach to Imitation with a Next-State-Planner for Industrial Manipulators, 2008. In: Proceedings of the 2008 International Conference on Cognitive Systems, 2008, p. 130-137. Conference paper (Refereed)
    Abstract [en]

    In this paper we present an approach to reproduce human demonstrations in a reach-to-grasp context. The demonstration is represented in hand-state space. By using the distance to the target object as a scheduling variable, the way in which the robot approaches the object is controlled. The controller that we deploy to execute the motion is formulated as a next-state planner. The planner produces an action from the current state instead of planning the whole trajectory in advance, which can be error-prone in non-static environments. The results have a direct application in Programming-by-Demonstration. It also contributes to cognitive systems, since the ability to reach-to-grasp supports the development of cognitive abilities.
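
    A minimal sketch of a next-state planner with the distance to the target as a scheduling variable follows; the speed schedule and gains are assumptions, and the hand state is reduced to a 3-D position for illustration.

```python
# Illustrative next-state planner (not the authors' controller): each call
# returns only the next state; the distance to the target blends a fast
# transport phase with a slow final approach.
import numpy as np

def next_state(hand_state, target, dt=0.02, v_far=0.4, v_near=0.05, d_switch=0.10):
    """Return the next hand state, moving toward `target` at a distance-scheduled speed."""
    error = target - hand_state
    dist = np.linalg.norm(error)
    if dist < 1e-6:
        return target
    alpha = np.clip(dist / d_switch, 0.0, 1.0)   # 1 when far, 0 at the object
    speed = v_near + alpha * (v_far - v_near)
    step = min(speed * dt, dist)
    return hand_state + step * error / dist

# toy usage: step the planner until the hand state reaches the grasp target
state = np.array([0.5, -0.2, 0.3])
target = np.array([0.0, 0.0, 0.0])
for _ in range(2000):
    state = next_state(state, target)
print(np.round(state, 3))   # converges to the target
```
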
  • 31.
    Skoglund, Alexander
    et al.
    AASS Learning Systems Lab, Örebro Universitet, Örebro, Sweden.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Palm, Rainer
    AASS Learning Systems Lab, Örebro Universitet, Örebro, Sweden.
    Programming-by-demonstration of reaching motions: a next-state-planner approach, 2010. In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 58, no 5, p. 607-621. Article in journal (Refereed)
    Abstract [en]

    This paper presents a novel approach to skill acquisition from human demonstration. A robot manipulator with a morphology which is very different from the human arm simply cannot copy a human motion, but has to execute its own version of the skill. Once a skill has been acquired, the robot must also be able to generalize to other similar skills without a new learning process. By using a motion planner that operates in an object-related world frame called hand-state, we show that this representation simplifies skill reconstruction and preserves the essential parts of the skill.

  • 32. Skoglund, Alexander
    et al.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Palm, Rainer
    Programming-by-demonstration of reaching motions using a next-state-planner, 2010. In: Advances in robot manipulators / [ed] Ernest Hall, Rijeka, Croatia: InTech, 2010, p. 479-501. Chapter in book (Other academic)
  • 33. Skoglund, Alexander
    et al.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Palm, Rainer
    Programming-by-demonstration of robot motions, 2010. In: Robot intelligence: an advanced knowledge processing approach / [ed] Honghai Liu, Dongbing Gu, Robert J. Howlett, Yonghuai Liu, New York: Springer, 2010, p. 1-24. Chapter in book (Other academic)
  • 34.
    Skoglund, Alexander
    et al.
    Örebro University, School of Science and Technology.
    Tegin, Johan
    Mechatronics Laboratory, Machine Design, Royal Institute of Technology, Stockholm, Sweden.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Palm, Rainer
    Örebro University, School of Science and Technology.
    Programming-by-demonstration of reaching motions for robot grasping, 2009. In: ICAR 2009: 14th international conference on advanced robotics, vols 1-2, New York: IEEE conference proceedings, 2009, p. 1-7. Conference paper (Refereed)
    Abstract [en]

    This paper presents a novel approach to skill modeling acquired from human demonstration. The approach is based on fuzzy modeling and uses a planner for generating corresponding robot trajectories. One of the main challenges stems from the morphological differences between the human and robot hand/arm structure, which makes direct copying of human motions impossible in the general case. Thus, the planner works in hand-state space, which is defined such that it is perception-invariant and valid for both the human and the robot hand. We show that this representation simplifies task reconstruction and preserves the essential parts of the task as well as the coordination between reaching and grasping motion. We also show how our approach can generalize observed trajectories based on multiple demonstrations and that the robot can match a demonstrated behavior despite morphological differences. To validate our approach we use a general-purpose robot manipulator equipped with an anthropomorphic three-fingered robot hand.

  • 35.
    Tegin, Johan
    et al.
    KTH, Stockholm, Sweden.
    Ekvall, Staffan
    KTH, Stockholm, Sweden.
    Kragic, Danica
    KTH, Stockholm, Sweden.
    Wikander, Jan
    KTH, Stockholm, Sweden.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Demonstration based learning and control for automatic grasping, 2009. In: Intelligent Service Robotics, ISSN 1861-2776, Vol. 2, no 1, p. 23-30. Article in journal (Refereed)
    Abstract [en]

    We present a method for automatic grasp generation based on object shape primitives in a Programming by Demonstration framework. The system first recognizes the grasp performed by a demonstrator as well as the object it is applied on and then generates a suitable grasping strategy on the robot. We start by presenting how to model and learn grasps and map them to robot hands. We continue by performing dynamic simulation of the grasp execution with a focus on grasping objects whose pose is not perfectly known.

  • 36. Tegin, Johan
    et al.
    Iliev, Boyko
    Örebro University, School of Science and Technology.
    Skoglund, Alexander
    Örebro University, School of Science and Technology.
    Kragic, Danica
    Royal Institute of Technology (KTH).
    Wikander, Jan
    Royal Institute of Technology (KTH).
    Real life grasping using an under-actuated robot hand: simulation and experiments, 2009. In: ICAR 2009: 14th international conference on advanced robotics, vols 1-2, New York: IEEE conference proceedings, 2009, p. 366-373. Conference paper (Refereed)
    Abstract [en]

    We present a system which includes an under-actuated anthropomorphic hand and control algorithms for autonomous grasping of everyday objects. The system comprises a control framework for hybrid force/position control in simulation and reality, a grasp simulator, and an under-actuated robot hand equipped with tactile sensors. We start by presenting the robot hand, the simulation environment and the control framework that enable dynamic simulation of an under-actuated robot hand. We continue by presenting simulation results and also discuss and exemplify the use of simulation in relation to autonomous grasping. Finally, we use the very same controller in real-world grasping experiments to validate the simulations and to exemplify system capabilities and limitations.

  • 37.
    Tegin, Johan
    et al.
    KTH, Stockholm, Sweden.
    Wikander, Jan
    KTH, Stockholm, Sweden.
    Ekvall, Staffan
    KTH, Stockholm, Sweden.
    Kragic, Danica
    KTH, Stockholm, Sweden.
    Iliev, Boyko
    Örebro University, Department of Technology.
    Demonstration based learning and control for automatic grasping, 2007. Conference paper (Other academic)
    Abstract [en]

    We present a method for automatic grasp generation based on object shape primitives in a Programming by Demonstration framework. The system first recognizes the grasp performed by a demonstrator as well as the object it is applied on and then generates a suitable grasping strategy on the robot. We start by presenting how to model and learn grasps and map them to robot hands. We continue by performing dynamic simulation of the grasp execution with a focus on grasping objects whose pose is not perfectly known.
