In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level RGB-D camera
Örebro University, School of Science and Technology (Centre for Applied Autonomous Sensor Systems). ORCID iD: 0000-0003-4685-379X
Department of Computing Science, Umeå University, Umeå, Sweden. ORCID iD: 0000-0002-4600-8652
Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel. ORCID iD: 0000-0001-6265-4497
Institute of Agricultural Engineering, Agricultural Research Organization, The Volcani Center, Rishon Lezion, Israel.
Authors: Kurtser, Polina; Ringdahl, Ola; Rotstein, Nati; Edan, Yael
2020 (English). In: IEEE Robotics and Automation Letters, ISSN 2377-3766, E-ISSN 1949-3045, Vol. 5, no. 2, p. 2031-2038. Article in journal (Refereed). Published.
Abstract [en]

Current practice for vine yield estimation is based on RGB cameras and has limited performance. In this paper we present a method for outdoor vine yield estimation using a consumer-grade RGB-D camera mounted on a mobile robotic platform. An algorithm for automatic grape cluster size estimation using depth information is evaluated both in controlled outdoor conditions and in commercial vineyard conditions. Ten video scans (3 camera viewpoints with 2 different backgrounds and 2 natural light conditions), acquired from a controlled outdoor experiment and a commercial vineyard setup, are used for analysis. The collected dataset (GRAPES3D) is released to the public. A total of 4542 regions of 49 grape clusters were manually labeled by a human annotator for comparison. Eight variations of the algorithm are assessed, both for manually labeled and auto-detected regions. The effects of viewpoint, presence of an artificial background, and the human annotator are analyzed using statistical tools. Results show a 2.8-3.5 cm average error for all acquired data and reveal the potential of using low-cost commercial RGB-D cameras for improved robotic yield estimation.
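The abstract only summarizes the method, and the paper's actual algorithm is not reproduced here. As a rough illustration of the underlying idea (recovering a metric cluster size from a labeled image region plus per-pixel depth), the following is a minimal pinhole-camera sketch. The function name, the bounding-box and median-depth choices, and the intrinsics parameters `fx`/`fy` are all illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def cluster_size_from_depth(region_mask, depth_map, fx, fy):
    """Estimate the physical width and height (in metres) of a labeled
    grape-cluster region using the pinhole camera model.

    region_mask : boolean array, True where the cluster was detected
    depth_map   : per-pixel depth in metres, same shape as the mask
    fx, fy      : camera focal lengths in pixels (RGB-D intrinsics)
    """
    rows, cols = np.nonzero(region_mask)

    # Robust depth estimate: median over valid (non-zero) depth pixels,
    # since consumer depth sensors drop out on shiny or thin surfaces.
    depths = depth_map[region_mask]
    depths = depths[depths > 0]
    z = np.median(depths)

    # Pixel extent of the region's bounding box.
    px_w = cols.max() - cols.min() + 1
    px_h = rows.max() - rows.min() + 1

    # Back-project pixel extent to metric size: size = pixels * depth / focal.
    width_m = px_w * z / fx
    height_m = px_h * z / fy
    return width_m, height_m
```

Using the median depth makes the estimate tolerant of missing or noisy depth pixels, which are common for consumer RGB-D cameras outdoors; the scaling step shows why accurate depth is what turns a 2-D detection into a physical size estimate.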

Place, publisher, year, edition, pages
IEEE, 2020. Vol. 5, no. 2, p. 2031-2038
Keywords [en]
Field Robots, RGB-D Perception, Agricultural Automation, Robotics in Agriculture and Forestry
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computer and Systems Sciences
Identifiers
URN: urn:nbn:se:oru:diva-80966
DOI: 10.1109/LRA.2020.2970654
ISI: 000526520700001
Scopus ID: 2-s2.0-85079829054
OAI: oai:DiVA.org:oru-80966
DiVA, id: diva2:1421196
Funder
Knowledge Foundation
Note

Funding Agencies:
  • Israeli Ministry of Science 20187
  • Ben Gurion University of the Negev through the Helmsley Charitable Trust
  • Agricultural, Biological and Cognitive Robotics Initiative
  • Marcus Endowment Fund
  • Rabbi W. Gunther Plaut Chair in Manufacturing Engineering

Available from: 2020-04-02. Created: 2020-04-02. Last updated: 2020-04-30. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Kurtser, Polina; Ringdahl, Ola

