Leveraging Big Data for Grasping
We propose a new large-scale database containing grasps that are applied to a large set of objects from numerous categories.
These grasps are generated in simulation and are annotated with the standard epsilon-metric and a new physics-metric.
We use a descriptive and efficient representation of the local object shape at which the grasp is applied.
Each grasp is annotated with the proposed metrics and representation.
We use crowdsourcing to analyze the correlation of the two metrics with grasp success as predicted by humans.
The results confirm that the proposed physics-metric is a more consistent
predictor of grasp success than the epsilon-metric.
Furthermore, they support the hypothesis that human labels are not required
for good ground-truth grasp data; the physics-metric computed on simulation
data can be used instead.
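For context, the standard epsilon-metric (Ferrari-Canny) is the radius of the largest origin-centred ball contained in the convex hull of the contact wrenches. The following is a rough, self-contained sketch of that idea in 3-D force space rather than the full 6-D wrench space; it is our illustration, not the database's actual implementation:

```python
import numpy as np
from scipy.spatial import ConvexHull

def epsilon_quality(wrenches):
    """Ferrari-Canny epsilon metric: radius of the largest origin-centred
    ball inside the convex hull of the contact wrenches.
    Returns 0.0 if the origin lies outside the hull (no force closure)."""
    hull = ConvexHull(wrenches)
    # hull.equations rows are [n, b] with unit normal n and n.x + b <= 0
    # inside the hull, so the signed distance from the origin to each
    # facet is -b; the metric is the smallest such distance.
    offsets = -hull.equations[:, -1]
    return float(max(0.0, offsets.min()))

# Toy example: unit forces along +/-x, +/-y, +/-z form an octahedron;
# the inscribed ball has radius 1/sqrt(3).
forces = np.vstack([np.eye(3), -np.eye(3)])
print(round(epsilon_quality(forces), 3))  # -> 0.577
```

A grasp whose wrench hull does not contain the origin gets a score of zero, which matches the usual convention that such grasps are not in force closure.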
Registration for database information
We hope that this database will increase over time and
the community will contribute back.
Therefore, we encourage people to subscribe to the
database information service.
We will send notification mails as soon as new data,
code changes, or other updates related to the database
become available.
Grasp Database Data & Software
Getting the Data & Software:
We provide the raw data and a basic python software package
to visualize and interact with the data.
To obtain the data and more detailed information
about the database, please visit our dedicated project page.
If you are interested in contributing to the database
or the software packages, please do not hesitate to contact us.
Every database is stored in the HDF5 file format.
Existing tools such as ViTables can be used to inspect
the database manually. The software package includes
scripts to obtain the raw HDF5 files of different versions
of the database.
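Besides GUI tools like ViTables, such a file can be explored in a few lines with `h5py`. The sketch below builds a tiny stand-in file first so it runs anywhere; the group and dataset names are hypothetical, not the database's actual schema:

```python
import h5py
import numpy as np

# Build a tiny stand-in database so the walk below runs anywhere;
# the group/dataset names are hypothetical, not the actual schema.
with h5py.File("grasp_db_demo.h5", "w") as f:
    g = f.create_group("mug/object_01")
    g["grasp_poses"] = np.zeros((5, 7))            # e.g. position + quaternion
    g["epsilon_metric"] = np.linspace(0.0, 1.0, 5)

# Recursively print every dataset with its shape and dtype,
# similar to the tree view a GUI tool like ViTables shows.
def walk(name, node):
    if isinstance(node, h5py.Dataset):
        print(f"{name}: shape={node.shape}, dtype={node.dtype}")

with h5py.File("grasp_db_demo.h5", "r") as f:
    f.visititems(walk)
```

`visititems` walks the whole group hierarchy, so the same snippet works regardless of how deeply the categories and objects are nested.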
To provide a good user experience, we provide a
Docker image which has been tested successfully on Ubuntu
12.04/14.04 and Mac. The image ships with a pre-compiled
software package to visualize grasps stored in the
database and to store the grasp templates.
All dependencies needed to compile our code are listed in the
Docker build description, making it straightforward for users
to build our code outside of this environment.
The source code is also provided and shows how to
acquire data from the HDF5 files using our Python library.
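A minimal sketch of pulling grasp records out of an HDF5 file with plain `h5py` (the dataset names and shapes below are hypothetical, not the database's actual schema):

```python
import h5py
import numpy as np

# Create a small stand-in file; names and shapes are illustrative only.
with h5py.File("grasps_demo.h5", "w") as f:
    f["scores/physics_metric"] = np.array([0.9, 0.1, 0.7, 0.3])
    f["grasps/poses"] = np.arange(4 * 7, dtype=float).reshape(4, 7)

# Load the scores, keep only grasps above a threshold, and fetch the
# matching poses with fancy indexing; h5py reads lazily, so only the
# selected rows are pulled from disk.
with h5py.File("grasps_demo.h5", "r") as f:
    scores = f["scores/physics_metric"][:]
    keep = np.flatnonzero(scores > 0.5)
    poses = f["grasps/poses"][keep]
print(poses.shape)  # -> (2, 7)
```

Filtering by a quality metric before loading poses keeps memory use low even for large versions of the database.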
Grasp Database Examples
The grasp database consists of 87 categories. Below,
we show some example objects for the three groups:
small, medium, and large objects. We want to stress
that more models are available in our database; the
subset shown here is chosen to illustrate the
variety of models we use.
Real Data Analysis
Coming soon, not yet available.