VisGraB: A Benchmark for Vision-Based Grasping

Gert Kootstra, Mila Popovic, Jimmy Alison Jørgensen, Danica Kragic, Henrik Gordon Petersen, Norbert Krüger

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

We present a database and a software tool, VisGraB, for benchmarking methods for vision-based grasping of unknown objects, i.e., objects for which no prior model is available. The benchmark is a combined real-world and simulated experimental setup. Stereo images of real scenes containing several objects in different configurations are included in the database. The user provides a method that generates grasps from this real visual input. The grasps are then planned, executed, and evaluated by the provided grasp simulator, where several grasp-quality measures are used for evaluation. This setup has the advantage that a large number of grasps can be executed and evaluated while dealing with the dynamics of grasping and the noise and uncertainty present in real-world images. VisGraB thus enables a fair comparison among different grasping methods. Furthermore, the user does not need to deal with robot hardware and can focus on the vision methods instead. As a baseline, benchmark results of our own grasp strategy are included.
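To make the workflow concrete, the following is a minimal sketch of the benchmark loop described in the abstract: a user-supplied vision method proposes grasps from the stereo images of each scene, and the provided simulator plans, executes, and scores them. All names here (Grasp, generate_grasps, simulator.execute, the scene fields) are hypothetical placeholders for illustration, not the actual VisGraB interface.

```python
# Illustrative sketch only -- not the VisGraB API.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Grasp:
    """A grasp hypothesis: gripper pose and preshape (fields are illustrative)."""
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float, float]  # unit quaternion (x, y, z, w)
    opening_width: float

def generate_grasps(left_image, right_image) -> List[Grasp]:
    """User-supplied vision method: propose grasps from one stereo image pair."""
    raise NotImplementedError("Plug in your vision-based grasp generator here.")

def run_benchmark(scenes, simulator) -> dict:
    """For each database scene, generate grasps from the stereo images and
    let the simulator plan, execute, and score them."""
    results = {}
    for scene in scenes:
        grasps = generate_grasps(scene.left_image, scene.right_image)
        # The simulator returns quality measures per grasp,
        # e.g. grasp-wrench-space scores and lifting success.
        results[scene.id] = [simulator.execute(scene.id, g) for g in grasps]
    return results
```

In the benchmark itself, the simulation side is handled by the provided software tool; only the grasp-generation step has to be implemented by the user.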
Original language: English
Journal: Paladyn: Journal of Behavioral Robotics
Volume: 3
Issue number: 2
Pages (from-to): 54-62
Number of pages: 9
ISSN: 2080-9778
DOI: 10.2478/s13230-012-0020-5
Publisher: SP Versita
Publication status: Published - 2012


Cite this

Kootstra, G., Popovic, M., Jørgensen, J. A., Kragic, D., Petersen, H. G., & Krüger, N. (2012). VisGraB: A Benchmark for Vision-Based Grasping. Paladyn: Journal of Behavioral Robotics, 3(2), 54-62. https://doi.org/10.2478/s13230-012-0020-5