---
license: bsd-3-clause
task_categories:
- robotics
tags:
- fingernet
- asfinger
modalities:
- tabular
configs:
- config_name: finger
  data_files: data/finger/data_*.parquet
- config_name: finger_surf
  data_files: data/finger_surf/data_*.parquet
dataset_info:
- config_name: finger
  features:
  - name: motion
    list: float64
  - name: force
    list: float64
  - name: nodes
    list:
      list: float64
- config_name: finger_surf
  features:
  - name: motion
    list: float64
  - name: force
    list: float64
  - name: nodes
    list:
      list: float64
size_categories:
- 100K<n<1M
---
# FingerNet-100K

This dataset contains 100K samples for FingerNet, generated by finite element simulations.
## Dataset Schema
There are two subsets in this dataset:
- `finger`: Contains 100,000 samples of a typical asFinger.
- `finger_surf`: Contains 100,000 samples of an asFinger with a contact surface.
Each sample in this dataset contains three components:
| Field Name | Type | Shape | Description |
|---|---|---|---|
| `motion` | `List[float64]` | `[6]` | The 6D motion of the finger: translation (dx, dy, dz) in mm and rotation (rx, ry, rz) in rad. |
| `force` | `List[float64]` | `[6]` | The 6D force and torque on the bottom surface of the finger, corresponding to (fx, fy, fz, tx, ty, tz) in N and N·mm. |
| `nodes` | `List[List[float64]]` | `[N, 3]` | The 3D displacements of the N surface nodes of the finger, where each node is represented as [dx, dy, dz] in mm. |
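As a quick sanity check, the fields above map directly onto NumPy arrays of the stated shapes. A minimal sketch using a synthetic sample (the values below are illustrative stand-ins, not real dataset entries):

```python
import numpy as np

# Hypothetical sample mirroring the schema above; values are illustrative only.
sample = {
    "motion": [0.5, -0.2, 1.0, 0.01, 0.0, -0.02],    # (dx, dy, dz) in mm, (rx, ry, rz) in rad
    "force":  [0.3, 0.1, -2.4, 5.0, -1.2, 0.0],      # (fx, fy, fz) in N, (tx, ty, tz) in N·mm
    "nodes":  [[0.1, 0.0, -0.3], [0.2, 0.1, -0.4]],  # N x 3 surface-node displacements in mm
}

motion = np.asarray(sample["motion"], dtype=np.float64)  # shape (6,)
force = np.asarray(sample["force"], dtype=np.float64)    # shape (6,)
nodes = np.asarray(sample["nodes"], dtype=np.float64)    # shape (N, 3)

assert motion.shape == (6,)
assert force.shape == (6,)
assert nodes.ndim == 2 and nodes.shape[1] == 3
```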
## Usage

```python
from datasets import load_dataset

dataset = load_dataset("asRobotics/fingernet-100k")

# Access the 'finger' subset
for sample in dataset['finger']:
    motion = sample['motion']  # [6]
    force = sample['force']    # [6]
    nodes = sample['nodes']    # [N, 3]

# Access the 'finger_surf' subset
for sample in dataset['finger_surf']:
    motion = sample['motion']
    force = sample['force']
    nodes = sample['nodes']
```
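For supervised learning (e.g., regressing force from motion), samples can be stacked into fixed-shape batches. A minimal sketch with synthetic stand-in samples (these dictionaries only mimic the schema; they are not real dataset values):

```python
import numpy as np

# Synthetic stand-ins matching the dataset schema; real samples come from load_dataset.
samples = [
    {"motion": [0.0] * 6, "force": [0.0] * 6},
    {"motion": [1.0] * 6, "force": [0.5] * 6},
]

# Stack per-sample 6D vectors into (batch, 6) arrays.
X = np.stack([np.asarray(s["motion"], dtype=np.float64) for s in samples])
y = np.stack([np.asarray(s["force"], dtype=np.float64) for s in samples])
# X and y now have shape (2, 6), ready for a motion -> force regression model.
```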
## Citation

If you use this dataset in your research, please cite the following papers:
```bibtex
@article{liu2024proprioceptive,
  title={Proprioceptive learning with soft polyhedral networks},
  author={Liu, Xiaobo and Han, Xudong and Hong, Wei and Wan, Fang and Song, Chaoyang},
  journal={The International Journal of Robotics Research},
  volume={43},
  number={12},
  pages={1916--1935},
  year={2024},
  publisher={SAGE Publications Sage UK: London, England},
  doi={10.1177/02783649241238765}
}

@article{wu2025magiclaw,
  title={MagiClaw: A Dual-Use, Vision-Based Soft Gripper for Bridging the Human Demonstration to Robotic Deployment Gap},
  author={Wu, Tianyu and Han, Xudong and Sun, Haoran and Zhang, Zishang and Huang, Bangchao and Song, Chaoyang and Wan, Fang},
  journal={arXiv preprint arXiv:2509.19169},
  year={2025}
}
```