
Overview

The embodied intelligence industry currently faces significant development challenges, the most critical being the lack of high-quality data, particularly omnimodal data that integrates force and tactile sensing. PaXini introduces the PX OmniSharing Dataset, built on the PaXini Super EID Factory, which enables large-scale, high-fidelity human data collection across diverse tasks and scenarios. The dataset includes multi-dimensional tactile data, multi-view visual data, voice, text, proprioception, and spatial trajectories, directly addressing the challenge of rapid generalization for embodied agents across diverse scenarios. Together with the PX OmniSharing Toolkit, it provides an end-to-end pipeline for efficient data processing and model development.


Get Started

Download the Dataset

To download the full dataset, use the following commands. If you encounter any issues, please refer to the official Hugging Face documentation.

# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install

# When prompted for a password, use an access token with write permissions.
# Generate one from your settings: https://huggingface.co/settings/tokens
git clone https://huggingface.co/datasets/paxini/Omnisharing_DB_SampleData

To download only a subfolder (e.g., part_07), use sparse checkout:

# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install

# Initialize an empty Git repository
git init Omnisharing_DB_SampleData
cd Omnisharing_DB_SampleData

# Set the remote repository
git remote add origin https://huggingface.co/datasets/paxini/Omnisharing_DB_SampleData

# Enable sparse-checkout
git sparse-checkout init

# Specify the folders and files
git sparse-checkout set data/part_07

# Pull the data
git pull origin main

The full dataset covers 10 different tasks. You can find the relevant task information in the meta group inside each HDF5 file.
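After downloading, a quick sanity check is to open an episode with h5py and list its groups and meta information. The sketch below is illustrative only: it first builds a tiny synthetic file (with made-up attribute names and filename, not the dataset's real schema) so it runs standalone, then inspects it the same way you would a real episode:

```python
import h5py

# Hypothetical file name; real episodes follow the naming
# episode_11_111219_112_120024.hdf5 described below.
path = "episode_example.hdf5"

# Build a tiny synthetic stand-in so this sketch is self-contained.
with h5py.File(path, "w") as f:
    meta = f.create_group("meta")
    meta.attrs["task"] = "pick_and_place"   # illustrative attribute
    meta.attrs["episode_id"] = "example"    # illustrative attribute
    f.create_group("action")
    f.create_group("observation")

# Inspect top-level groups and the meta group's attributes.
with h5py.File(path, "r") as f:
    print("top-level groups:", sorted(f.keys()))
    for key, value in f["meta"].attrs.items():
        print(f"meta.{key} = {value}")
```

The same read-only loop works on a downloaded episode; simply point `path` at a real `.hdf5` file and browse whichever groups it exposes.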


Dataset Structure

The PX OmniSharing Toolkit processing workflow involves four primary data categories: DF-1, DF-2, DF-2R, and DF-3. The corresponding data structure formats are illustrated below:

| Data Format Naming | Description | Custom Format | Suffix | Data File Name Example |
|---|---|---|---|---|
| DF-1 | The overall input: raw data after preprocessing and quality inspection | Yes (HDF5) | No suffix | episode_11_111219_112_120024.hdf5 |
| DF-2 | 1st output: DF-1 with encoder and tactile data parsed; adds bimanual and object poses; includes both action and observation | Yes (HDF5) | "_glove" | episode_11_111219_112_120024_glove.hdf5 |
| DF-2R | 2nd output: DF-2 retargeted to a dexterous hand model | Yes (HDF5) | "_{MODEL}" | episode_11_111219_112_120024_dh13.hdf5 (retargeted to DexH13)<br>episode_11_111219_112_120024_mano.hdf5 (retargeted to MANO)<br>... |
| DF-3 | 3rd output: converts DF-2R to the LeRobot dataset format; can be used for VLA model training | No | No suffix | - |

The sample data has undergone pose estimation and parsing, and is provided in the standardized DF-2 format. For access to raw data, please contact us at omnisharingdb@paxini.com or visit our dataset marketplace: https://dataset-mall.paxini.com/. For advanced formats such as DF-2R or DF-3, you can perform the conversion and further processing with the PX OmniSharing Toolkit, available on GitHub: https://github.com/px-DataCollection/px_omnisharing_dataprocess_kit.

DF-2 (Data Format-2):

/dataset
β”œβ”€β”€ attributes                    # e.g., generated_time, data_id (compressed error info)
β”œβ”€β”€ action                        # Action signals (no tactile)
β”‚   β”œβ”€β”€ lefthand
β”‚   β”‚   β”œβ”€β”€ attributes            # description, etc.
β”‚   β”‚   β”œβ”€β”€ joints
β”‚   β”‚   β”‚   β”œβ”€β”€ data              # (n, 29) joint angles in URDF joint order
β”‚   β”‚   β”‚   └── attributes        # joint_names = [...]
β”‚   β”‚   └── handpose
β”‚   β”‚       β”œβ”€β”€ data              # (n, 7)
β”‚   β”‚       └── attributes        # order = [x, y, z, qw, qx, qy, qz]
β”‚   └── righthand
β”‚       β”œβ”€β”€ attributes            # description, hand_name, urdf, etc.
β”‚       β”œβ”€β”€ joints
β”‚       β”‚   β”œβ”€β”€ data              # (n, 29)
β”‚       β”‚   └── attributes        # joint_names = [...]
β”‚       └── handpose
β”‚           β”œβ”€β”€ data              # (n, 7)
β”‚           └── attributes        # order = [x, y, z, qw, qx, qy, qz]
└── observation                   # Episode state
    β”œβ”€β”€ audio                     # Compressed audio stream (includes text)
    β”œβ”€β”€ image
    β”‚   β”œβ”€β”€ RGB_CameraXXX
    β”‚   β”‚   β”œβ”€β”€ data              # 1D compressed payload
    β”‚   β”‚   β”œβ”€β”€ extrinsics
    β”‚   β”‚   └── intrinsics        # attrs include width/height
    β”‚   β”œβ”€β”€ RGBD_XXX
    β”‚   β”‚   β”œβ”€β”€ data              # 1D compressed payload
    β”‚   β”‚   β”œβ”€β”€ extrinsics
    β”‚   β”‚   β”œβ”€β”€ intrinsics
    β”‚   β”‚   └── attributes        # width/height
    β”‚   └── [...]
    β”œβ”€β”€ lefthand
    β”‚   β”œβ”€β”€ attributes            # description, etc.
    β”‚   β”œβ”€β”€ joints
    β”‚   β”‚   β”œβ”€β”€ data              # (n, 29)
    β”‚   β”‚   └── attributes        # joint_names = [...]
    β”‚   β”œβ”€β”€ handpose
    β”‚   β”‚   β”œβ”€β”€ data              # (n, 7)
    β”‚   β”‚   └── attributes        # order = [x, y, z, qw, qx, qy, qz]
    β”‚   └── tactile
    β”‚       β”œβ”€β”€ data              # (n, 3465)
    β”‚       └── attributes        # sensor_names, sensor_lengths, etc.
    β”œβ”€β”€ righthand
    β”‚   β”œβ”€β”€ attributes
    β”‚   β”œβ”€β”€ joints
    β”‚   β”‚   β”œβ”€β”€ data              # (n, 29)
    β”‚   β”‚   └── attributes
    β”‚   β”œβ”€β”€ handpose
    β”‚   β”‚   β”œβ”€β”€ data              # (n, 7)
    β”‚   β”‚   └── attributes
    β”‚   └── tactile
    β”‚       β”œβ”€β”€ data              # (n, 3465)
    β”‚       └── attributes
    β”œβ”€β”€ obj1
    β”‚   β”œβ”€β”€ data                  # (n, 17)
    β”‚   └── attributes            # obj_name, obj_id, order/detail
    β”œβ”€β”€ obj2
    └── [...]
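To make the layout concrete, the sketch below reads the per-frame arrays of one hand with h5py, following the shapes documented in the tree above ((n, 29) joints, (n, 7) handpose in [x, y, z, qw, qx, qy, qz] order, (n, 3465) tactile). It first writes a miniature synthetic stand-in for a real DF-2 episode so it runs standalone:

```python
import h5py
import numpy as np

n = 5  # number of frames (illustrative)
path = "df2_example.hdf5"

# Build a miniature DF-2-shaped file (synthetic stand-in, zeros only).
with h5py.File(path, "w") as f:
    for side in ("lefthand", "righthand"):
        grp = f.create_group(f"observation/{side}")
        grp.create_dataset("joints/data", data=np.zeros((n, 29)))
        pose = grp.create_dataset("handpose/data", data=np.zeros((n, 7)))
        pose.attrs["order"] = ["x", "y", "z", "qw", "qx", "qy", "qz"]
        grp.create_dataset("tactile/data", data=np.zeros((n, 3465)))

# Read back and check the arrays against the documented layout.
with h5py.File(path, "r") as f:
    joints = f["observation/lefthand/joints/data"][:]    # (n, 29) joint angles
    handpose = f["observation/lefthand/handpose/data"][:]  # (n, 7) pose
    tactile = f["observation/lefthand/tactile/data"][:]  # (n, 3465) tactile
    print(joints.shape, handpose.shape, tactile.shape)
```

On a real DF-2 file the same paths apply, with the `action` subtree mirroring the `observation` hand groups minus the tactile data.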

License and Citation

All the data within this repo is licensed under CC BY-NC-SA 4.0. Please consider citing our project if it contributes to your research.

@misc{px_omnisharing_db,
    title        = {PX OmniSharing DB},
    author       = {PX OmniSharing DB},
    howpublished = {\url{https://huggingface.co/datasets/paxini/Omnisharing_DB_SampleData}},
    year         = {2026}
}