
Örebro University Publications
1 - 15 of 15
  • 1.
    Bunz, Elsa
    et al.
    Örebro University, Örebro, Sweden.
    Chadalavada, Ravi Teja
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Krug, Robert
    Örebro University, School of Science and Technology.
    Schindler, Maike
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Spatial Augmented Reality and Eye Tracking for Evaluating Human Robot Interaction. 2016. In: Proceedings of RO-MAN 2016 Workshop: Workshop on Communicating Intentions in Human-Robot Interaction, 2016. Conference paper (Refereed)
    Abstract [en]

    Freely moving autonomous mobile robots may lead to anxiety when operating in workspaces shared with humans. Previous works have given evidence that communicating intentions using Spatial Augmented Reality (SAR) in the shared workspace will make humans more comfortable in the vicinity of robots. In this work, we conducted experiments with the robot projecting various patterns in order to convey its movement intentions during encounters with humans. In these experiments, the trajectories of both humans and robot were recorded with a laser scanner. Human test subjects were also equipped with an eye tracker. We analyzed the eye gaze patterns and the laser scan tracking data in order to understand how the robot's intention communication affects the human movement behavior. Furthermore, we used retrospective recall interviews to aid in identifying the reasons that lead to behavior changes.

  • 2.
    Chadalavada, Ravi Teja
    Chalmers University of Technology.
    Human Robot Interaction for Autonomous Systems in Industrial Environments. 2016. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The upcoming new generation of autonomous vehicles for transporting materials in industrial environments will be more versatile, flexible and efficient than traditional Automatic Guided Vehicles (AGV), which simply follow pre-defined paths. However, freely navigating vehicles can appear unpredictable to human workers and thus cause stress and render joint use of the available space inefficient. This work addresses the problem of providing information regarding a service robot’s intention to humans co-populating the environment. The overall goal is to make humans feel safer and more comfortable, even when they are in close vicinity of the robot. A spatial Augmented Reality (AR) system for robot intention communication by means of projecting proxemic information onto shared floor space is developed on a robotic fork-lift by equipping it with an LED projector. This helps in visualizing internal state information and intents on the shared floor spaces. The robot’s ability to communicate its intentions is evaluated in realistic situations where test subjects meet the robotic forklift. A Likert scale-based evaluation, which also includes comparisons to human-human intention communication, was performed. The results show that already adding simple information, such as the trajectory and the space to be occupied by the robot in the near future, is able to effectively improve human response to the robot. This kind of synergistic human-robot interaction in a work environment is expected to increase the robot’s acceptability in the industry.

  • 3.
    Chadalavada, Ravi Teja
    et al.
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Krug, Robert
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Empirical evaluation of human trust in an expressive mobile robot. 2016. In: Proceedings of RSS Workshop "Social Trust in Autonomous Robots 2016", 2016. Conference paper (Refereed)
    Abstract [en]

    A mobile robot communicating its intentions using Spatial Augmented Reality (SAR) on the shared floor space makes humans feel safer and more comfortable around the robot. Our previous work [1] and several other works established this fact. We built upon that work by adding adaptable information and control to the SAR module. An empirical study of how a mobile robot builds trust in humans by communicating its intentions was conducted. A novel way of evaluating that trust is presented, and we show experimentally that adaptation in the SAR module leads to natural interaction; the new evaluation system helped us discover that the comfort levels in human-robot interactions approached those of human-human interactions.

  • 4.
    Chadalavada, Ravi Teja
    et al.
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Krug, Robert
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    That’s on my Mind! Robot to Human Intention Communication through on-board Projection on Shared Floor Space. 2015. In: 2015 European Conference on Mobile Robots (ECMR), New York: IEEE conference proceedings, 2015. Conference paper (Refereed)
    Abstract [en]

    The upcoming new generation of autonomous vehicles for transporting materials in industrial environments will be more versatile, flexible and efficient than traditional AGVs, which simply follow pre-defined paths. However, freely navigating vehicles can appear unpredictable to human workers and thus cause stress and render joint use of the available space inefficient. Here we address this issue and propose on-board intention projection on the shared floor space for communication from robot to human. We present a research prototype of a robotic fork-lift equipped with an LED projector to visualize internal state information and intents. We describe the projector system and discuss calibration issues. The robot’s ability to communicate its intentions is evaluated in realistic situations where test subjects meet the robotic forklift. The results show that already adding simple information, such as the trajectory and the space to be occupied by the robot in the near future, is able to effectively improve human response to the robot.

  • 5.
    Chadalavada, Ravi Teja
    et al.
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Schindler, Maike
    Faculty of Human Sciences, University of Cologne, Cologne, Germany.
    Lilienthal, Achim J.
    Örebro University, School of Science and Technology.
    Implicit intention transference using eye-tracking glasses for improved safety in human-robot interaction. 2019. Conference paper (Refereed)
    Abstract [en]

    Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. We propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, an implicit intention transference system was developed and implemented. The robot was given access to human eye gaze data; it responds to the eye gaze data through spatial augmented reality projections on the shared floor space in real-time, and the robot could also adapt its path. This allows proactive safety approaches in HRI, for example by attempting to get the human's attention when they are in the vicinity of a moving robot. A study was conducted with workers at an industrial warehouse. The time taken to understand the behavior of the system was recorded. Electrodermal activity and pupil diameter were recorded to measure the increase in stress and cognitive load while interacting with an autonomous system, using these measurements as a proxy to quantify trust in autonomous systems.

  • 6.
    Chadalavada, Ravi Teja
    et al.
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Schindler, Maike
    Örebro University, School of Science and Technology.
    Palm, Rainer
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Accessing your navigation plans! Human-Robot Intention Transfer using Eye-Tracking Glasses. 2018. In: Advances in Manufacturing Technology XXXII: Proceedings of the 16th International Conference on Manufacturing Research, incorporating the 33rd National Conference on Manufacturing Research, September 11–13, 2018, University of Skövde, Sweden / [ed] Case K. & Thorvald P., Amsterdam, Netherlands: IOS Press, 2018, p. 253-258. Conference paper (Refereed)
    Abstract [en]

    Robots in human co-habited environments need human-aware task and motion planning, ideally responding to people’s motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. This paper investigates the possibility of human-to-robot implicit intention transference solely from eye gaze data.  We present experiments in which humans wearing eye-tracking glasses encountered a small forklift truck under various conditions. We evaluate how the observed eye gaze patterns of the participants related to their navigation decisions. Our analysis shows that people primarily gazed on that side of the robot they ultimately decided to pass by. We discuss implications of these results and relate to a control approach that uses human eye gaze for early obstacle avoidance.

  • 7.
    Chadalavada, Ravi Teja
    et al.
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Schindler, Maike
    Faculty of Human Sciences, University of Cologne, Germany.
    Palm, Rainer
    Örebro University, School of Science and Technology.
    Lilienthal, Achim J.
    Örebro University, School of Science and Technology.
    Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction. 2020. In: Robotics and Computer-Integrated Manufacturing, ISSN 0736-5845, E-ISSN 1879-2537, Vol. 61, article id 101830. Article in journal (Refereed)
    Abstract [en]

    Safety, legibility and efficiency are essential for autonomous mobile robots that interact with humans. A key factor in this respect is bi-directional communication of navigation intent, which we focus on in this article with a particular view on industrial logistic applications. In the direction robot-to-human, we study how a robot can communicate its navigation intent using Spatial Augmented Reality (SAR) such that humans can intuitively understand the robot's intention and feel safe in the vicinity of robots. We conducted experiments with an autonomous forklift that projects various patterns on the shared floor space to convey its navigation intentions. We analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift and carried out stimulated recall interviews (SRI) in order to identify desirable features for projection of robot intentions. In the direction human-to-robot, we argue that robots in human co-habited environments need human-aware task and motion planning to support safety and efficiency, ideally responding to people's motion intentions as soon as they can be inferred from human cues. Eye gaze can convey information about intentions beyond what can be inferred from the trajectory and head pose of a person. Hence, we propose eye-tracking glasses as safety equipment in industrial environments shared by humans and robots. In this work, we investigate the possibility of human-to-robot implicit intention transference solely from eye gaze data and evaluate how the observed eye gaze patterns of the participants relate to their navigation decisions. We again analyzed trajectories and eye gaze patterns of humans while interacting with an autonomous forklift for clues that could reveal direction intent. Our analysis shows that people primarily gazed on that side of the robot they ultimately decided to pass by. We discuss implications of these results and relate to a control approach that uses human gaze for early obstacle avoidance.

  • 8.
    Molina, Sergi
    et al.
    University of Lincoln, Lincoln, U.K.
    Mannucci, Anna
    Robert Bosch GmbH, Renningen, Germany.
    Magnusson, Martin
    Örebro University, School of Science and Technology.
    Adolfsson, Daniel
    Örebro University, School of Science and Technology.
    Andreasson, Henrik
    Örebro University, School of Science and Technology.
    Hamad, Mazin
    Technical University of Munich, Munich, Germany.
    Abdolshah, Saeed
    Technical University of Munich, Munich, Germany.
    Chadalavada, Ravi Teja
    Örebro University, School of Science and Technology.
    Palmieri, Luigi
    Robert Bosch GmbH, Renningen, Germany.
    Linder, Timm
    Robert Bosch GmbH, Renningen, Germany.
    Swaminathan, Chittaranjan Srinivas
    Örebro University, School of Science and Technology.
    Kucner, Tomasz Piotr
    Aalto University, Aalto, Finland.
    Hanheide, Marc
    University of Lincoln, Lincoln, U.K.
    Fernandez-Carmona, Manuel
    University of Lincoln, Lincoln, U.K.
    Cielniak, Grzegorz
    University of Lincoln, Lincoln, U.K.
    Duckett, Tom
    University of Lincoln, Lincoln, U.K.
    Pecora, Federico
    Örebro University, School of Science and Technology.
    Bokesand, Simon
    Kollmorgen Automation AB, Mölndal, Sweden.
    Arras, Kai O.
    Robert Bosch GmbH, Renningen, Germany.
    Haddadin, Sami
    Technical University of Munich, Munich, Germany.
    Lilienthal, Achim J
    Örebro University, School of Science and Technology.
    The ILIAD Safety Stack: Human-Aware Infrastructure-Free Navigation of Industrial Mobile Robots. 2023. In: IEEE Robotics & Automation Magazine, ISSN 1070-9932, E-ISSN 1558-223X. Article in journal (Refereed)
    Abstract [en]

    Current intralogistics services require keeping up with e-commerce demands, reducing delivery times and waste, and increasing overall flexibility. As a consequence, the use of automated guided vehicles (AGVs) and, more recently, autonomous mobile robots (AMRs) for logistics operations is steadily increasing.

  • 9.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Fuzzy Modeling and Control for Intention Recognition in Human-Robot Systems. 2016. In: Proceedings of the 8th International Joint Conference on Computational Intelligence (IJCCI 2016), Setúbal, Portugal: SciTePress, 2016, Vol. 2, p. 67-74. Conference paper (Refereed)
    Abstract [en]

    The recognition of human intentions from trajectories in the framework of human-robot interaction is a challenging field of research. In this paper, some control problems of human-robot interaction and the agents' intentions to compete or cooperate in shared work spaces are addressed, and the time schedule of the information flow is discussed. The expected human movements relative to the robot are summarized in a so-called "compass dial" from which fuzzy control rules for the robot's reactions are derived. To avoid collisions between robot and human as early as possible, the computation of collision times at predicted human-robot intersections is discussed and a switching controller for collision avoidance is proposed. In the context of recognizing human intentions to move to certain goals, pedestrian tracks are modeled by fuzzy clustering, lanes preferred by human agents are identified, and the identification of degrees of membership of a pedestrian track to specific lanes is discussed. Computations based on simulated and experimental data show the applicability of the methods presented.

  • 10.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Recognition of Human-Robot Motion Intentions by Trajectory Observation. 2016. In: 2016 9th International Conference on Human System Interactions, HSI 2016: Proceedings, New York: Institute of Electrical and Electronics Engineers (IEEE), 2016, p. 229-235. Conference paper (Refereed)
    Abstract [en]

    The intention of humans and autonomous robots to interact in shared spatial areas is a challenging field of research regarding human safety, system stability and performance of the system's behavior. In this paper, intention recognition between human and robot is addressed from the control point of view, and the time schedule of the exchanged signals is discussed. After a description of the kinematic and geometric relations between human and robot, a so-called 'compass dial' with the relative velocities is presented, from which suitable fuzzy control rules are derived. The computation of collision times at intersections and possible avoidance strategies are further discussed. Computations based on simulated and experimental data show the applicability of the methods presented.

  • 11.
    Palm, Rainer
    et al.
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi Teja
    Örebro University, School of Science and Technology.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Fuzzy Modeling, Control and Prediction in Human-Robot Systems. 2019. In: Computational Intelligence: International Joint Conference, IJCCI 2016, Porto, Portugal, November 9–11, 2016, Revised Selected Papers / [ed] Juan Julian Merelo, Fernando Melício, José M. Cadenas, António Dourado, Kurosh Madani, António Ruano, Joaquim Filipe, Switzerland: Springer Publishing Company, 2019, p. 149-177. Chapter in book (Refereed)
    Abstract [en]

    A safe and synchronized interaction between human agents and robots in shared areas requires both long distance prediction of their motions and an appropriate control policy for short distance reaction. In this connection, recognition of mutual intentions in the prediction phase is crucial to improve the performance of short distance control. We suggest an approach for short distance control in which the expected human movements relative to the robot are summarized in a so-called “compass dial” from which fuzzy control rules for the robot’s reactions are derived. To predict possible collisions between robot and human at the earliest possible time, the travel times to predicted human-robot intersections are calculated and fed into a hybrid controller for collision avoidance. By applying the method of velocity obstacles, the relation between a change in the robot’s motion direction and its velocity during an interaction is optimized, and a combination with fuzzy expert rules is used for safe obstacle avoidance. For a prediction of human intentions to move to certain goals, pedestrian tracks are modeled by fuzzy clustering, and trajectories of human and robot agents are extrapolated to avoid collisions at intersections. Examples with both simulated and real data show the applicability of the presented methods and the high performance of the results.

  • 12.
    Rudenko, Andrey
    et al.
    Örebro University, School of Science and Technology. Robotics Research, Bosch Corporate Research, Stuttgart, Germany.
    Kucner, Tomasz Piotr
    Örebro University, School of Science and Technology.
    Swaminathan, Chittaranjan Srinivas
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi Teja
    Örebro University, School of Science and Technology.
    Arras, Kai O.
    Robotics Research, Bosch Corporate Research, Stuttgart, Germany.
    Lilienthal, Achim J.
    Örebro University, School of Science and Technology.
    THÖR: Human-Robot Navigation Data Collection and Accurate Motion Trajectories Dataset. 2020. In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 5, no 2, p. 676-682. Article in journal (Refereed)
    Abstract [en]

    Understanding human behavior is key for robots and intelligent systems that share a space with people. Accordingly, research that enables such systems to perceive, track, learn and predict human behavior as well as to plan and interact with humans has received increasing attention over the last years. The availability of large human motion datasets that contain relevant levels of difficulty is fundamental to this research. Existing datasets are often limited in terms of information content, annotation quality or variability of human behavior. In this paper, we present THÖR, a new dataset with human motion trajectory and eye gaze data collected in an indoor environment with accurate ground truth for position, head orientation, gaze direction, social grouping, obstacles map and goal coordinates. THÖR also contains sensor data collected by a 3D lidar and involves a mobile robot navigating the space. We propose a set of metrics to quantitatively analyze motion trajectory datasets such as the average tracking duration, ground truth noise, curvature and speed variation of the trajectories. In comparison to prior art, our dataset has a larger variety in human motion behavior, is less noisy, and contains annotations at higher frequencies.
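    The metrics named in this abstract, such as speed variation and curvature, can be computed directly from sampled trajectories. The following is an illustrative sketch (not the authors' code; function names and the fixed sampling interval are assumptions) of two such per-trajectory statistics for planar (x, y) tracks:

    ```python
    # Illustrative sketch: two trajectory statistics of the kind named in the
    # THÖR abstract -- speed variation and a discrete curvature proxy --
    # for a planar trajectory sampled at a fixed interval dt (seconds).
    import math

    def speeds(traj, dt):
        """Per-step speeds (m/s) of a trajectory given as (x, y) points."""
        return [
            math.hypot(x1 - x0, y1 - y0) / dt
            for (x0, y0), (x1, y1) in zip(traj, traj[1:])
        ]

    def speed_variation(traj, dt):
        """Standard deviation of the per-step speeds."""
        v = speeds(traj, dt)
        mean = sum(v) / len(v)
        return math.sqrt(sum((s - mean) ** 2 for s in v) / len(v))

    def turning_angles(traj):
        """Absolute heading change (radians) at each interior point;
        a simple discrete proxy for path curvature."""
        angles = []
        for (x0, y0), (x1, y1), (x2, y2) in zip(traj, traj[1:], traj[2:]):
            a = math.atan2(y1 - y0, x1 - x0)
            b = math.atan2(y2 - y1, x2 - x1)
            d = abs(b - a)
            angles.append(min(d, 2 * math.pi - d))  # wrap into [0, pi]
        return angles

    # A straight, constant-speed track has zero speed variation and no turning.
    straight = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
    print(speed_variation(straight, dt=0.5))  # 0.0
    print(max(turning_angles(straight)))      # 0.0
    ```

    The actual dataset analysis would aggregate such statistics over all recorded tracks; this sketch only shows the per-trajectory computation.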

  • 13.
    Rudenko, Andrey
    et al.
    Örebro University, School of Science and Technology.
    Kucner, Tomasz Piotr
    Örebro University, School of Science and Technology.
    Swaminathan, Chittaranjan Srinivas
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi Teja
    Örebro University, School of Science and Technology.
    Arras, Kai Oliver
    Bosch Corporate Research, Renningen, Germany.
    Lilienthal, Achim
    Örebro University, School of Science and Technology.
    Benchmarking Human Motion Prediction Methods. 2020. Conference paper (Other academic)
    Abstract [en]

    In this extended abstract we present a novel dataset for benchmarking motion prediction algorithms. We describe our approach to data collection which generates diverse and accurate human motion in a controlled weakly-scripted setup. We also give insights for building a universal benchmark for motion prediction.

  • 14.
    Schreiter, Tim
    et al.
    Örebro University, School of Science and Technology.
    Morillo-Mendez, Lucas
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi T.
    Örebro University, School of Science and Technology.
    Rudenko, Andrey
    Robert Bosch GmbH, Corporate Research, Stuttgart, Germany.
    Billing, Erik
    Interaction Lab, University of Skövde, Skövde, Sweden.
    Magnusson, Martin
    Örebro University, School of Science and Technology.
    Arras, Kai O.
    Robert Bosch GmbH, Corporate Research, Stuttgart, Germany.
    Lilienthal, Achim J.
    Örebro University, School of Science and Technology. TU Munich, Germany.
    Advantages of Multimodal versus Verbal-Only Robot-to-Human Communication with an Anthropomorphic Robotic Mock Driver. 2023. In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN): Proceedings, IEEE, 2023, p. 293-300. Conference paper (Refereed)
    Abstract [en]

    Robots are increasingly used in shared environments with humans, making effective communication a necessity for successful human-robot interaction. In our work, we study a crucial component: active communication of robot intent. Here, we present an anthropomorphic solution where a humanoid robot communicates the intent of its host robot acting as an "Anthropomorphic Robotic Mock Driver" (ARMoD). We evaluate this approach in two experiments in which participants work alongside a mobile robot on various tasks, while the ARMoD communicates a need for human attention, when required, or gives instructions to collaborate on a joint task. The experiments feature two interaction styles of the ARMoD: a verbal-only mode using only speech and a multimodal mode, additionally including robotic gaze and pointing gestures to support communication and register intent in space. Our results show that the multimodal interaction style, including head movements and eye gaze as well as pointing gestures, leads to more natural fixation behavior. Participants naturally identified and fixated longer on the areas relevant for intent communication, and reacted faster to instructions in collaborative tasks. Our research further indicates that the ARMoD intent communication improves engagement and social interaction with mobile robots in workplace settings.

  • 15.
    Schreiter, Tim
    et al.
    Örebro University, School of Science and Technology.
    Morillo-Mendez, Lucas
    Örebro University, School of Science and Technology.
    Chadalavada, Ravi Teja
    Örebro University, School of Science and Technology.
    Rudenko, Andrey
    Robert Bosch GmbH, Corporate Research, Stuttgart, Germany.
    Billing, Erik Alexander
    Interaction Lab, University of Skövde, Sweden.
    Lilienthal, Achim J.
    Örebro University, School of Science and Technology.
    The Effect of Anthropomorphism on Trust in an Industrial Human-Robot Interaction. 2022. In: SCRITA Workshop Proceedings (arXiv:2208.11090), 2022. Conference paper (Refereed)
    Abstract [en]

    Robots are increasingly deployed in spaces shared with humans, including home settings and industrial environments. In these environments, the interaction between humans and robots (HRI) is crucial for safety, legibility, and efficiency. A key factor in HRI is trust, which modulates the acceptance of the system. Anthropomorphism has been shown to modulate trust development in a robot, but robots in industrial environments are not usually anthropomorphic. We designed a simple interaction in an industrial environment in which an anthropomorphic robotic mock driver (ARMoD) simulates driving an autonomous guided vehicle (AGV). The task consisted of a human crossing paths with the AGV, with or without the ARMoD mounted on top, in a narrow corridor. The human and the system needed to negotiate trajectories when crossing paths, meaning that the human had to attend to the trajectory of the robot to avoid a collision with it. There was a significant increment in the reported trust scores in the condition where the ARMoD was present, showing that the presence of an anthropomorphic robot is enough to modulate trust, even in limited interactions such as the one we present here.
