Örebro University Publications

1 - 12 of 12
  • 1.
    Arad, Boaz
    Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Balendonck, Jos
    Greenhouse Horticulture, Wageningen University & Research, Wageningen, The Netherlands.
    Barth, Ruud
    Greenhouse Horticulture, Wageningen University & Research, Wageningen, The Netherlands.
    Ben-Shahar, Ohad
    Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Hellström, Thomas
    Department of Computing Science, Umeå University, Umeå, Sweden.
    Hemming, Jochen
    Greenhouse Horticulture, Wageningen University & Research, Wageningen, The Netherlands.
    Kurtser, Polina
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Ringdahl, Ola
    Department of Computing Science, Umeå University, Umeå, Sweden.
    Tielen, Toon
    Greenhouse Horticulture, Wageningen University & Research, Wageningen, The Netherlands.
    van Tuijl, Bart
    Greenhouse Horticulture, Wageningen University & Research, Wageningen, The Netherlands.
    Development of a sweet pepper harvesting robot (2020). In: Journal of Field Robotics, ISSN 1556-4959, E-ISSN 1556-4967, Vol. 37, no 6, p. 1027-1039. Article in journal (Refereed).
    Abstract [en]

    This paper presents the development, testing and validation of SWEEPER, a robot for harvesting sweet pepper fruit in greenhouses. The robotic system includes a six degrees of freedom industrial arm equipped with a specially designed end effector, RGB-D camera, high-end computer with graphics processing unit, programmable logic controllers, other electronic equipment, and a small container to store harvested fruit. All is mounted on a cart that autonomously drives on pipe rails and concrete floor in the end-user environment. The overall operation of the harvesting robot is described along with details of the algorithms for fruit detection and localization, grasp pose estimation, and motion control. The main contributions of this paper are the integrated system design and its validation and extensive field testing in a commercial greenhouse for different varieties and growing conditions. A total of 262 fruits were involved in a 4-week long testing period. The average cycle time to harvest a fruit was 24 s. Logistics took approximately 50% of this time (7.8 s for discharge of fruit and 4.7 s for platform movements). Laboratory experiments have proven that the cycle time can be reduced to 15 s by running the robot manipulator at a higher speed. The harvest success rates were 61% for the best fit crop conditions and 18% in current crop conditions. This reveals the importance of finding the best fit crop conditions and crop varieties for successful robotic harvesting. The SWEEPER robot is the first sweet pepper harvesting robot to demonstrate this kind of performance in a commercial greenhouse.

  • 2.
    Harel, Ben
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Kurtser, Polina
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    van Herck, Liesbet
    Proefstation voor de Groenteteelt, Sint-Katelijne-Waver, Belgium.
    Parmet, Yisrael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Sweet pepper maturity evaluation via multiple viewpoints color analyses (2016). Conference paper (Refereed).
    Abstract [en]

    Maturity evaluation is an important feature for selective robotic harvesting. This paper focuses on maturity evaluation derived by a color camera for a sweet pepper robotic harvester. Fruit visibility for sweet peppers is limited to 65% and multiple viewpoints are necessary to detect more than 90% of the fruit. This paper aims to determine the number of viewpoints required to determine the maturity level of a sweet pepper and the best single viewpoint. Different color-based measures to estimate the maturity level of a pepper were evaluated. Two datasets were analyzed: images of 54 yellow bell sweet peppers and 30 red peppers, both harvested at the last fruit setting; all images were taken in uniform illumination conditions with a white background. Each pepper was photographed from 5-6 viewpoints: one photo of the top of the pepper, one photo of the bottom and 3-4 photos of the pepper sides. Each pepper was manually tagged by a human professional observer as ‘mature’ or ‘immature’. Image processing routines were implemented to extract color level measures which included different hue features. Results indicate a high correlation between the side and bottom views; the bottom view shows the best correlation (0.86) in the case of yellow peppers, while the side view shows the best correlation (0.835) in the case of red peppers (the bottom view yields a 0.82 correlation).

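    The abstract above extracts colour-level (hue) measures per viewpoint to judge maturity. The sketch below is a minimal illustration of one such measure, the mean hue inside a saturation-based fruit mask; the synthetic image, threshold value, and function name are invented for the example and are not the paper's method.
```python
# Illustrative colour-based maturity measure: mean hue inside a fruit mask.
# Segmentation by saturation assumes the white background used in the paper;
# the synthetic image below stands in for a real photograph.
import cv2
import numpy as np

def mean_fruit_hue(bgr_image, sat_threshold=60):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = hsv[:, :, 1] > sat_threshold          # white background has low saturation
    if not mask.any():
        return float("nan")
    return float(hsv[:, :, 0][mask].mean())      # OpenCV hue range is 0-179

# Synthetic stand-in: a white frame with a red square "fruit" in the middle.
img = np.full((200, 200, 3), 255, dtype=np.uint8)
img[60:140, 60:140] = (0, 0, 220)                # BGR red patch
print(mean_fruit_hue(img))                       # hue near 0, i.e. red
```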
  • 3.
    Herck, Liesbet van
    Proefstation voor de Groenteteelt (PSKW), Sint-Katelijne-Waver, Belgium.
    Kurtser, Polina
    Örebro University, School of Science and Technology. Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Wittemans, Lieve
    Proefstation voor de Groenteteelt (PSKW), Sint-Katelijne-Waver, Belgium.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Crop design for improved robotic harvesting: A case study of sweet pepper harvesting (2020). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 192, p. 294-308. Article in journal (Refereed).
    Abstract [en]

    Current harvesting robots have limited performance, due to the unstructured and dynamic nature of both the target crops and their environment. Efforts to date focus on improving sensing and robotic systems. This paper presents a parallel approach, to "design" the crop and its environment to best fit the robot, similar to robotic integration in industrial robot deployments.

    A systematic methodology to select and modify the crop "design" (crop and environment) to improve robotic harvesting is presented. We define crop-dependent robotic features for successful harvesting (e.g., visibility, reachability), from which associated crop features are identified (e.g., crop density, internode length). Methods to influence the crop features are derived (e.g., cultivation practices, climate control) along with a methodological approach to evaluate the proposed designs. A case study of crop "design" for robotic sweet pepper harvesting is presented, with statistical analyses of influential parameters. Since comparison of the multitude of existing crops and possible modifications is impossible due to complexity and time limitations, a sequential field experimental setup is planned. Experiments over three years, 10 cultivars, two climate control conditions, two cultivation techniques and two artificial illumination types were performed. Results showed how modifying the crop affects the crop characteristics that influence robotic harvesting, through increased visibility and reachability. The systematic crop "design" approach also led to robot design recommendations. The presented "engineering" the crop "design" framework highlights the importance of close synergy between crop and robot design achieved by strong collaboration between robotic and agronomy experts resulting in improved robotic harvesting performance.

  • 4.
    Kurtser, Polina
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Arad, Boaz
    Department of Computer Science, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Ben-Shahar, Ohad
    Department of Computer Science, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    van Bree, Milan
    Irmato Industrial Solutions Veghel B.V., Veghel, The Netherlands.
    Moonen, Joep
    Irmato Industrial Solutions Veghel B.V., Veghel, The Netherlands.
    van Tuijl, Bart
    Greenhouse Horticulture, Wageningen University and Research Centre, Wageningen, The Netherlands.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Robotic data acquisition of sweet pepper images for research and development (2016). Conference paper (Refereed).
    Abstract [en]

    A main problem limiting the development of robotic harvesters is robust fruit detection [5]. Despite intensive research conducted in identifying the fruits and their location [2,3], current fruit detection algorithms have a limited detection rate of 0.87 which is unfeasible from an economic perspective [5]. The complexity of the fruit detection task is due to the unstructured and dynamic nature of both the objects and the environment [4-6]: the fruit have inherent high variability in size, shape, texture, and location; occlusion and variable illumination conditions significantly influence the detection performance [3].

    A common practice for image processing R&D for complicated problems is the acquisition of a large database (e.g., Labelme open source labeling database [1], Oxford building dataset [2]). These datasets enable to advance vision algorithms development [7] and provide a benchmark for evaluating new algorithms. To the best of our knowledge, to date there is no open dataset available for R&D in image processing of agricultural objects. Evaluation of previously reported algorithms was based on limited data [5]. Previous research indicated the importance of evaluating algorithms for a wide range of sensory, crop, and environmental conditions [5].

    A robotic acquisition system and procedure was developed using a 6 degree of freedom manipulator, equipped with 3 different sensors to automatically acquire images from several viewpoints with different sensors and illumination conditions. Measurements were conducted along the day and at night in a commercial greenhouse and resulted in a total of 1764 images from 14 viewpoints for each scene. Additionally, drawbacks and advantages of the proposed approach as compared to other approaches previously utilized will be discussed along with recommendations for future acquisitions.

  • 5.
    Kurtser, Polina
    Örebro University, School of Science and Technology. Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
    Planning the sequence of tasks for harvesting robots (2020). In: Robotics and Autonomous Systems, ISSN 0921-8890, E-ISSN 1872-793X, Vol. 131, article id 103591. Article in journal (Refereed).
    Abstract [en]

    A methodology for planning the sequence of tasks for a harvesting robot is presented. The fruit targets are situated at unknown locations and must be detected by the robot through a sequence of sensing tasks. Once the targets are detected, the robot must execute a harvest action at each target location. The traveling salesman paradigm (TSP) is used to plan the sequence of sensing and harvesting tasks taking into account the costs of the sensing and harvesting actions and the traveling times. Sensing is planned online. The methodology is validated and evaluated in both laboratory and greenhouse conditions for a case study of a sweet pepper harvesting robot. The results indicate that planning the sequence of tasks for a sweet pepper harvesting robot results in 12% cost reduction. Incorporating the sensing operation in the planning sequence for fruit harvesting is a new approach in fruit harvesting robots and is important for cycle time reduction. Furthermore, the sequence is re-planned as sensory information becomes available and the costs of these new sensing operations are also considered in the planning.
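
    The abstract above casts sensing and harvesting tasks as a traveling-salesman-style sequencing problem over travel and action costs. The sketch below is a minimal, greedy illustration of that idea under assumed task names, coordinates, and costs; it is not the authors' planner and omits the online re-planning step.
```python
# Minimal illustrative sketch of TSP-style task sequencing for a harvesting robot.
# Greedy nearest-neighbour ordering over combined travel + action costs; all task
# names, locations, speeds, and costs below are invented for the example.
import math
from dataclasses import dataclass

@dataclass
class Task:
    name: str           # hypothetical identifier, e.g. "sense_A" or "harvest_1"
    xyz: tuple          # assumed task location in metres
    action_cost: float  # assumed cost of executing the action, in seconds

def travel_time(a, b, speed=0.25):
    """Straight-line travel time between two task locations (speed is assumed)."""
    return math.dist(a, b) / speed

def plan_sequence(tasks, start=(0.0, 0.0, 0.0)):
    """Greedy TSP-like ordering: always pick the cheapest remaining task."""
    remaining, pos, plan, total = list(tasks), start, [], 0.0
    while remaining:
        nxt = min(remaining, key=lambda t: travel_time(pos, t.xyz) + t.action_cost)
        total += travel_time(pos, nxt.xyz) + nxt.action_cost
        plan.append(nxt.name)
        pos = nxt.xyz
        remaining.remove(nxt)
    return plan, total

if __name__ == "__main__":
    tasks = [Task("sense_A", (0.4, 0.0, 1.2), 2.0),
             Task("harvest_1", (0.5, 0.1, 1.1), 10.0),
             Task("sense_B", (1.2, 0.0, 1.0), 2.0)]
    order, cost = plan_sequence(tasks)
    print(order, round(cost, 1))
```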

  • 6.
    Kurtser, Polina
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Statistical models for fruit detectability: spatial and temporal analyses of sweet peppers (2018). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 171, p. 272-289. Article in journal (Refereed).
    Abstract [en]

    Statistical models for fruit detectability were developed to provide insights into preferable variable configurations for better robotic harvesting performance.

    The methodology includes several steps: definition of controllable and measurable variables, data acquisition protocol design, data processing, definition of performance measures and statistical modelling procedures. Given the controllable and measurable variables, a data acquisition protocol is defined to allow adequate variation in the variables, and determine the dataset size to ensure significant statistical analyses. Performance measures are defined for each combination of controllable and measurable variables identified in the protocol. Descriptive statistics of the measures allow insights into preferable configurations of controllable variables given the measurable variables values. The statistical model is performed by back-elimination Poisson regression with a log-link function process. Spatial and temporal analyses are performed.

    The methodology was applied to develop statistical models for sweet pepper (Capsicum annuum) detectability and revealed best viewpoints. 1312 images acquired from 10 to 14 viewpoints for 56 scenes were collected in commercial greenhouses, using an eye-in-hand configuration of a 6 DOF manipulator equipped with a RGB sensor and an illumination rig. Three databases from different sweet-pepper varieties were collected along different growing seasons.

    Target detectability highly depends on the imaging acquisition distance and the sensing system tilt. A minimum of 12 training scenes are necessary to discover the statistically significant spatial variables. Better prediction was achieved at the beginning of the season with slightly better prediction achieved in a temporal split of training and testing sets.
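
    The abstract above models fruit-detectability counts with back-elimination Poisson regression using a log-link function. A minimal sketch of that kind of model with statsmodels is shown below; the predictor names (distance, tilt) and the data frame are assumptions for the example, not the paper's variables or protocol.
```python
# Illustrative Poisson GLM with a log link plus simple backward elimination.
# The predictors and data are hypothetical; the paper's variable set is only
# described at a high level in the abstract.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def backward_eliminate(df, response, predictors, alpha=0.05):
    """Drop the least significant predictor until all p-values fall below alpha."""
    preds = list(predictors)
    while preds:
        formula = f"{response} ~ " + " + ".join(preds)
        fit = smf.glm(formula, data=df, family=sm.families.Poisson()).fit()
        pvals = fit.pvalues.drop("Intercept")
        worst = pvals.idxmax()
        if pvals[worst] <= alpha:
            return fit
        preds.remove(worst)      # eliminate the weakest predictor and refit
    return None

# Hypothetical viewpoint data: detected fruit counts vs. acquisition geometry.
df = pd.DataFrame({
    "detected": [3, 5, 2, 6, 4, 1, 7, 3],
    "distance": [0.6, 0.4, 0.8, 0.3, 0.5, 0.9, 0.3, 0.7],   # metres (assumed)
    "tilt":     [10, 0, 20, 5, 15, 25, 0, 10],              # degrees (assumed)
})
model = backward_eliminate(df, "detected", ["distance", "tilt"])
if model is not None:
    print(model.summary())
```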

  • 7.
    Kurtser, Polina
    Ben-Gurion University of the Negev, Department of Industrial Engineering and Management, Beer-Sheva, Israel.
    Edan, Yael
    Ben-Gurion University of the Negev, Department of Industrial Engineering and Management, Beer-Sheva, Israel.
    The use of dynamic sensing strategies to improve detection for a pepper harvesting robot (2018). In: IEEE International Conference on Intelligent Robots and Systems. Proceedings, ISSN 2153-0858, E-ISSN 2153-0866, p. 8286-8293. Article in journal (Refereed).
    Abstract [en]

    This paper presents the use of dynamic sensing strategies to improve detection results for a pepper harvesting robot. The algorithm decides if an additional viewpoint is needed and selects the best-fit viewpoint location from a predefined set of locations based on the predicted profitability of such an action. The suggestion of a possible additional viewpoint is based on image analysis for fruit and occlusion level detection, prediction of the expected number of additional targets sensed from that viewpoint, and final decision if choosing the additional viewpoint is beneficial. The developed heuristic was applied on 96 greenhouse images of 30 sweet peppers and resulted in up to 19% improved detection. The harvesting utility cost function decreased by up to 10% compared to the conventional single viewpoint strategy.
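
    The abstract above decides whether an additional viewpoint is worthwhile by predicting how many extra targets it would reveal and weighing that against its cost. The sketch below illustrates such a cost-benefit test; the prediction rule, costs, and thresholds are invented placeholders, not the paper's heuristic.
```python
# Illustrative decide-whether-to-look-again heuristic for a harvesting robot.
# All parameters (predicted gains, costs) are invented for the example.

def expected_additional_fruit(occlusion_level, fruit_seen):
    """Crude stand-in for the prediction step: assume more occlusion means
    more hidden fruit, proportional to what is already visible."""
    return occlusion_level * fruit_seen

def worth_extra_viewpoint(fruit_seen, occlusion_level,
                          harvest_value=1.0,       # value of one extra detected fruit
                          viewpoint_cost=0.6):     # cost of the extra sensing motion
    gain = expected_additional_fruit(occlusion_level, fruit_seen) * harvest_value
    return gain > viewpoint_cost

# Example: 4 fruits detected, roughly 30% of the canopy region judged occluded.
print(worth_extra_viewpoint(fruit_seen=4, occlusion_level=0.3))   # True -> move camera
```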

  • 8.
    Kurtser, Polina
    Örebro University, School of Science and Technology.
    Ringdahl, Ola
    Department of Computing Science, Umeå University, Umeå, Sweden.
    Rotstein, Nati
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Berenstein, Ron
    Institute of Agricultural Engineering, Agricultural Research Organization, The Volcani Center, Rishon Lezion, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level RGB-D camera (2020). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 5, no 2, p. 2031-2038. Article in journal (Refereed).
    Abstract [en]

    Current practice for vine yield estimation is based on RGB cameras and has limited performance. In this paper we present a method for outdoor vine yield estimation using a consumer grade RGB-D camera mounted on a mobile robotic platform. An algorithm for automatic grape cluster size estimation using depth information is evaluated both in controlled outdoor conditions and in commercial vineyard conditions. Ten video scans (3 camera viewpoints with 2 different backgrounds and 2 natural light conditions), acquired from a controlled outdoor experiment and a commercial vineyard setup, are used for analyses. The collected dataset (GRAPES3D) is released to the public. A total of 4542 regions of 49 grape clusters were manually labeled by a human annotator for comparison. Eight variations of the algorithm are assessed, both for manually labeled and auto-detected regions. The effect of viewpoint, presence of an artificial background, and the human annotator are analyzed using statistical tools. Results show 2.8-3.5 cm average error for all acquired data and reveal the potential of using low-cost commercial RGB-D cameras for improved robotic yield estimation.
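
    The abstract above estimates grape cluster size from RGB-D depth data. One common formulation, sketched below under a pinhole camera model, converts the cluster's pixel extent to metres using the region's median depth and the focal length; the intrinsics and inputs are assumed values, and this is not necessarily the exact algorithm evaluated in the paper.
```python
# Pinhole-model sketch: convert a cluster's pixel extent to metric size using
# the median depth of the cluster region. Intrinsics and inputs are hypothetical
# values, not calibration data from the paper.
import numpy as np

def cluster_width_m(pixel_width, depths_m, fx=615.0):
    """Width in metres of an object spanning `pixel_width` pixels at the median
    depth of its region, for a camera with focal length fx (in pixels)."""
    z = float(np.median(depths_m))          # median is robust to depth noise
    return pixel_width * z / fx

# Example: a cluster spanning 120 px whose region depths are around 0.85 m.
depths = np.random.normal(0.85, 0.02, size=500)   # simulated depth samples (m)
print(round(cluster_width_m(120, depths), 3), "m")
```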

  • 9.
    Ringdahl, Ola
    Department of Computing Science, Umeå University, Umeå, Sweden.
    Kurtser, Polina
    Örebro University, School of Science and Technology. Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Barth, Ruud
    Greenhouse Horticulture, Wageningen University and Research Centre, Wageningen, the Netherlands.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Operational flow of an autonomous sweet pepper harvesting robot (2016). Conference paper (Refereed).
    Abstract [en]

    Advanced automation is required for greenhouse production systems due to the lack of skilled workforce and increasing labour costs [1]. As part of the EU project SWEEPER, we are working on developing an autonomous robot able to harvest sweet pepper fruits in greenhouses. This paper focuses on the operational flow of the robot for the high level task planning.

    In the SWEEPER project, an RGB camera is mounted on the end effector to detect fruits. Due to the dense plant rows, the camera is located at a maximum of 40 cm from the plants and hence cannot provide an overview of all fruit locations. Only a few ripe fruits at each acquisition can be seen. This implies that the robot must incorporate a search pattern to look for fruits. When at least one fruit has been detected in the image, the search is aborted and a harvesting phase is initiated. The phase starts with directing the manipulator to a point close to the fruit and then activating a visual servo control loop. This motion approach ensures that the fruit is grasped despite the occlusions caused by the stems and leaves. When the manipulator has reached the fruit, it is harvested and automatically released into a container. If there are more fruits that have already been detected, the system continues to pick them. When all detected fruits have been harvested, the system resumes the search pattern again. When the search pattern is finished and no more fruits are detected, the robot base is advanced along the row to the next plant and repeats the operations above.

    To support implementation of the workflow into a program controlling the actual robot, a generic software framework for development of agricultural and forestry robots was used [2]. The framework is constructed with a hybrid robot architecture, using a state machine implementing the following flowchart.

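    The abstract above describes the robot's operational flow: run a search pattern, harvest every detected fruit, resume searching, and advance the platform when nothing more is found. The flowchart referenced at the end of the abstract is not reproduced in this listing; the sketch below expresses the described flow as a small state machine, with illustrative state names, per-plant fruit lists, and stubbed actions rather than the SWEEPER control software.
```python
# Simplified state-machine sketch of the harvesting workflow described above.
# States, fruit lists, and the stubbed sensing/actuation calls are placeholders.
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()
    HARVEST = auto()
    ADVANCE = auto()
    DONE = auto()

def run(row):
    """row: list of per-plant fruit lists, e.g. [["f1", "f2"], [], ["f3"]]."""
    state, plant, queue = State.SEARCH, 0, []
    while state is not State.DONE:
        if state is State.SEARCH:                    # run the search pattern
            queue = list(row[plant])                 # stub fruit detector
            state = State.HARVEST if queue else State.ADVANCE
        elif state is State.HARVEST:                 # visual-servo approach + cut (stubbed)
            fruit = queue.pop(0)
            row[plant].remove(fruit)
            print(f"plant {plant}: harvested {fruit}")
            if not queue:
                state = State.SEARCH                 # resume search after the last detected fruit
        elif state is State.ADVANCE:                 # move the platform along the row
            plant += 1
            state = State.SEARCH if plant < len(row) else State.DONE

run([["f1", "f2"], [], ["f3"]])
```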
  • 10.
    Ringdahl, Ola
    Umeå University, Umeå, Sweden.
    Kurtser, Polina
    Örebro University, School of Science and Technology.
    Edan, Yael
    Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Performance of RGB-D camera for different object types in greenhouse conditions (2019). In: 2019 European Conference on Mobile Robots (ECMR) / [ed] Libor Přeučil, Sven Behnke, Miroslav Kulich, IEEE, 2019, p. 1-6. Conference paper (Refereed).
    Abstract [en]

    RGB-D cameras play an increasingly important role in localization and autonomous navigation of mobile robots. Reasonably priced commercial RGB-D cameras have recently been developed for operation in greenhouse and outdoor conditions. They can be employed for different agricultural and horticultural operations such as harvesting, weeding, pruning and phenotyping. However, the depth information extracted from the cameras varies significantly between objects and sensing conditions. This paper presents an evaluation protocol applied to a commercially available Fotonic F80 time-of-flight RGB-D camera for eight different object types. A case study of autonomous sweet pepper harvesting was used as an exemplary agricultural task. Each of the objects chosen is a possible item that an autonomous agricultural robot must detect and localize to perform well. A total of 340 rectangular regions of interest (ROI) were marked for the extraction of performance measures of point cloud density, and variability around center of mass, 30-100 ROIs per object type. An additional 570 ROIs were generated (57 manually and 513 replicated) to evaluate the repeatability and accuracy of the point cloud. A statistical analysis was performed to evaluate the significance of differences between object types. The results show that different objects have significantly different point density. Specifically, metallic materials and black colored objects had significantly less point density compared to organic and other artificial materials introduced to the scene, as expected. The point cloud variability measures showed no significant differences between object types, except for the metallic knife that presented significant outliers in collected measures. The accuracy and repeatability analysis showed that 1-3 cm errors are due to the difficulty for a human to annotate the exact same area and up to ±4 cm error is due to the sensor not generating the exact same point cloud when sensing a fixed object.
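
    The abstract above scores the camera by point-cloud density and variability around the centre of mass inside rectangular ROIs. The sketch below computes those two measures for one ROI of an organised point cloud; the array layout, dropout model, and ROI coordinates are assumptions for the example, not the evaluation protocol.
```python
# ROI point-density and spread measures for an organised point cloud
# (H x W x 3 array of XYZ values, NaN where the sensor returned no depth).
# The simulated cloud and ROI below are placeholders, not evaluation data.
import numpy as np

def roi_measures(cloud, roi):
    """roi = (row0, row1, col0, col1); returns (valid-point ratio, mean distance
    of valid points from their centre of mass, in the cloud's units)."""
    r0, r1, c0, c1 = roi
    pts = cloud[r0:r1, c0:c1].reshape(-1, 3)
    valid = pts[~np.isnan(pts).any(axis=1)]
    density = len(valid) / len(pts)                   # fraction of pixels with a 3-D point
    if len(valid) == 0:
        return density, float("nan")
    spread = np.linalg.norm(valid - valid.mean(axis=0), axis=1).mean()
    return density, spread

# Simulated 480x640 cloud with 20% dropout, queried on a 50x50 ROI.
cloud = np.random.normal([0.0, 0.0, 1.5], 0.01, size=(480, 640, 3))
cloud[np.random.rand(480, 640) < 0.2] = np.nan
print(roi_measures(cloud, (100, 150, 200, 250)))
```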

  • 11.
    Ringdahl, Ola
    Department of Computing Science, Umeå University, Umeå, Sweden.
    Kurtser, Polina
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Strategies for selecting best approach direction for a sweet-pepper harvesting robot (2017). In: Towards Autonomous Robotic Systems (Taros 2017) / [ed] Yang Gao, Saber Fallah, Yaochu Jin, Constantina Lekakou, Cham: Springer, 2017, p. 516-525. Conference paper (Refereed).
    Abstract [en]

    An autonomous sweet pepper harvesting robot must perform several tasks to successfully harvest a fruit. Due to the highly unstructured environment in which the robot operates and the presence of occlusions, the current challenges are to improve the detection rate and lower the risk of losing sight of the fruit while approaching the fruit for harvest. Therefore, it is crucial to choose the best approach direction with least occlusion from obstacles.

    The value of ideal information regarding the best approach direction was evaluated by comparing it to a method attempting several directions until successful harvesting is performed. A laboratory experiment was conducted on artificial sweet pepper plants using a system based on eye-in-hand configuration comprising a 6DOF robotic manipulator equipped with an RGB camera. The performance is evaluated in laboratorial conditions using both descriptive statistics of the average harvesting times and harvesting success as well as regression models. The results show roughly 40–45% increase in average harvest time when no a-priori information of the correct harvesting direction is available with a nearly linear increase in overall harvesting time for each failed harvesting attempt. The variability of the harvesting times grows with the number of approaches required, causing lower ability to predict them.

    Tests show that occlusion of the front of the peppers significantly impacts the harvesting times. The major reason for this is the limited workspace of the robot often making the paths to positions to the side of the peppers significantly longer than to positions in front of the fruit which is more open.

  • 12.
    Zemmour, Elie
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Kurtser, Polina
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Edan, Yael
    Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
    Dynamic thresholding algorithm for robotic apple detection (2017). In: 2017 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), IEEE, 2017, p. 240-246. Conference paper (Refereed).
    Abstract [en]

    This paper presents a dynamic thresholding algorithm for robotic apple detection. The algorithm enables robust detection in highly variable lighting conditions. The image is dynamically split into variable sized regions, where each region has approximately homogeneous lighting conditions. Nine thresholds were selected so as to accommodate three different illumination levels for three different dimensions in the natural difference index (NDI) space by quantifying the required relation between true positive rate and false positive rate. This rate can change along the robotic harvesting process, aiming to decrease FPR from far views (to minimize cycle times) and to increase TPR from close views (to increase grasping accuracy). Analyses were conducted on apple images acquired in outdoor conditions. The algorithm improved previously reported results and achieved 91.14% true positive rate (TPR) with 3.05% false positive rate (FPR) using the NDI first dimension and a noise removal process.
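
    The abstract above splits the image into regions of roughly homogeneous illumination and thresholds an NDI channel per region. The sketch below illustrates the general idea with a red-green normalised difference and a per-tile adaptive threshold; the index definition, tile size, and offset are assumptions, not the paper's nine tuned thresholds.
```python
# Illustrative per-region thresholding of a red-green difference index.
# A simplified stand-in for the paper's dynamic NDI thresholding; parameters
# here are invented for the example.
import numpy as np

def difference_index(rgb):
    """(R - G) / (R + G), high for red fruit against green foliage."""
    r, g = rgb[..., 0].astype(float), rgb[..., 1].astype(float)
    return (r - g) / (r + g + 1e-6)

def dynamic_threshold(rgb, tile=64, offset=0.15):
    """Threshold the index tile by tile so each tile adapts to its own lighting."""
    ndi = difference_index(rgb)
    mask = np.zeros(ndi.shape, dtype=bool)
    for y in range(0, ndi.shape[0], tile):
        for x in range(0, ndi.shape[1], tile):
            block = ndi[y:y + tile, x:x + tile]
            mask[y:y + tile, x:x + tile] = block > block.mean() + offset
    return mask

# Hypothetical usage on a random image stand-in.
rgb = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
print(dynamic_threshold(rgb).mean())   # fraction of pixels flagged as fruit
```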
