---
license: mit
language:
- en
task_categories:
- zero-shot-image-classification
- zero-shot-classification
- feature-extraction
- image-feature-extraction
- tabular-classification
- tabular-regression
- depth-estimation
tags:
- tactile
- robotics
pretty_name: Sensor-Invariant Tactile Representation
size_categories:
- 1M<n<10M
viewer: false
---
# SITR Dataset & Weights
This repository hosts both the dataset and pre-trained model weights for the Sensor-Invariant Tactile Representation (SITR) paper. The dataset supports training and evaluating models for sensor-invariant tactile representations across simulated and real-world settings, while the pre-trained weights enable immediate deployment and fine-tuning for various tactile perception tasks.
The codebase implementing SITR is available on GitHub: [SITR Codebase](https://github.com/hgupt3/gsrl)
For more details on the underlying methods and experiments, please visit our [project website](https://hgupt3.github.io/sitr/) and read the [arXiv paper](https://arxiv.org/abs/2502.19638).
---
## Pre-trained Model Weights
The pre-trained model weights are available for immediate use in inference or fine-tuning. These weights were trained on our large-scale simulated dataset and have been validated across multiple real-world sensors.
### Downloading the Weights
```bash
wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/checkpoints.zip
unzip checkpoints.zip -d your_desired_directory
```
### Weights Directory Structure
The unzipped weights directory is organized as follows:
```
checkpoints/
├── SITR_B18.pth            # Base pre-trained model weights (371MB)
├── classification/         # Classification task weights
│   └── SITR_base/          # Base model with fine-tuned head for classification on 1 sensor
│       ├── sensor_0000.pth   # Weights for sensor 0
│       ├── sensor_0001.pth   # Weights for sensor 1
│       └── ...
└── pose_estimation/        # Pose estimation task weights
    └── SITR_base/          # Base model with fine-tuned head for pose estimation on 1 sensor
        ├── sensor_0000.pth   # Weights for sensor 0
        ├── sensor_0001.pth   # Weights for sensor 1
        └── ...
```
You can use the `SITR_B18.pth` weights for:
1. **Zero-shot inference** on new tactile data
2. **Fine-tuning** for specific tasks
3. **Feature extraction** for downstream applications
For detailed usage instructions and examples, please refer to the [SITR Codebase](https://github.com/hgupt3/gsrl).
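As a quick sanity check after downloading, you can inspect the checkpoint directly with PyTorch. This is a minimal sketch: the SITR model class itself lives in the codebase linked above, and the handling of a possible `state_dict` wrapper key is an assumption about the checkpoint format rather than a documented guarantee.
```python
# Sanity-check sketch: load the raw checkpoint and count its parameters.
import torch

ckpt = torch.load("checkpoints/SITR_B18.pth", map_location="cpu")

# Some checkpoints are a bare state dict, others wrap it in a
# {"state_dict": ...} dictionary; handle both (an assumption, not a guarantee).
state_dict = ckpt.get("state_dict", ckpt)

n_params = sum(t.numel() for t in state_dict.values() if torch.is_tensor(t))
print(f"{len(state_dict)} entries, ~{n_params / 1e6:.1f}M parameters")
```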
---
## Dataset Overview
The SITR dataset consists of three main parts:
1. **Simulated Tactile Dataset**
A large-scale synthetic dataset generated using physics-based rendering (PBR) in Blender. This dataset spans 100 unique simulated sensor configurations with tactile signals, calibration images, and corresponding surface normal maps. It includes 10K unique contact configurations generated using 50 high-resolution 3D meshes of common household objects, resulting in a pre-training dataset of 1M samples.
2. **Classification Tactile Dataset**
Data collected from 7 real sensors (including variants of GelSight Mini, GelSight Hex, GelSight Wedge, and DIGIT). For the classification task, 20 objects are pressed against each sensor at various poses and depths, yielding 1K tactile images per object (140K images in total, 20K per sensor). We used 16 of the 20 objects in our classification experiments; the remainder were deemed unsuitable, a decision made before any experiments were run. The dataset is provided as separate train (80%) and test (20%) sets.
3. **Pose Estimation Tactile Dataset**
For pose estimation, tactile signals are recorded using a modified Ender-3 Pro 3D printer equipped with 3D-printed indenters. This setup provides accurate ground truth (x, y, z coordinates) for contact points, where all coordinates are specified in millimeters. Data were collected for 6 indenters across 4 sensors, resulting in 1K samples per indenter (24K images in total, 6K per sensor). This dataset is also organized into train (80%) and test sets (20%).
---
## Download and Setup
### Simulated Tactile Dataset
The simulated dataset is split into two parts due to its size:
- `renders_part_aa`
- `renders_part_ab`
Download both files using:
```bash
wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/renders_part_aa
wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/renders_part_ab
```
**To merge and unzip:**
1. **Merge the parts into a single zip file:**
```bash
cat renders_part_aa renders_part_ab > renders.zip
rm renders_part_aa renders_part_ab # Remove the split files
```
2. **Unzip the merged file:**
```bash
unzip renders.zip -d your_desired_directory
rm renders.zip
```
### Real-World Datasets (Classification & Pose Estimation)
Download the classification dataset:
```bash
wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/classification_dataset.zip
unzip classification_dataset.zip -d your_desired_directory
rm classification_dataset.zip
```
Download the pose estimation dataset:
```bash
wget https://huggingface.co/datasets/hgupt3/sitr_dataset/resolve/main/pose_dataset.zip
unzip pose_dataset.zip -d your_desired_directory
rm pose_dataset.zip
```
Each dataset contains:
- `train_set/` (80% of the data)
- `test_set/` (20% of the data)
---
## File Structure
### 1. Simulated Tactile Dataset
```
data_root/
├── sensor_0000/
│   ├── calibration/   # Calibration images
│   │   ├── 0000.png   # Background image
│   │   ├── 0001.png
│   │   └── ...
│   ├── samples/       # Tactile sample images
│   │   ├── 0000.png
│   │   ├── 0001.png
│   │   └── ...
│   ├── dmaps/         # (Optional) Depth maps
│   │   ├── 0000.npy
│   │   └── ...
│   └── norms/         # (Optional) Surface normals
│       ├── 0000.npy
│       └── ...
├── sensor_0001/
└── ...
```
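As a rough illustration of reading this layout, here is a minimal Python sketch; the paths are hypothetical, and the optional depth and normal arrays are loaded only when present.
```python
# Loading sketch for one simulated sample (paths assume the layout above).
from pathlib import Path

import numpy as np
from PIL import Image

sensor = Path("data_root/sensor_0000")  # hypothetical data root
tactile = np.asarray(Image.open(sensor / "samples" / "0000.png"))
background = np.asarray(Image.open(sensor / "calibration" / "0000.png"))

# Depth maps and surface normals are optional, so check before loading.
norm_file = sensor / "norms" / "0000.npy"
normals = np.load(norm_file) if norm_file.exists() else None

print(tactile.shape, background.shape,
      None if normals is None else normals.shape)
```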
### 2. Classification Dataset
Each of the `train_set/` and `test_set/` directories follows this structure:
```
train_set/ (or test_set/)
├── sensor_0000/
│   ├── calibration/   # Calibration images
│   └── samples/       # Organized by class
│       ├── class_0000/
│       │   ├── 0000.png
│       │   └── ...
│       ├── class_0001/
│       │   ├── 0000.png
│       │   └── ...
│       └── ...
├── sensor_0001/
└── ...
```
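For reference, a minimal sketch (with a hypothetical root path) of turning the class folders above into (image path, label) pairs for one sensor:
```python
# Labelling sketch: pair each tactile image with its class index.
from pathlib import Path

samples = Path("train_set/sensor_0000/samples")  # hypothetical path
class_dirs = sorted(samples.glob("class_*"))
pairs = [
    (img, cls_idx)
    for cls_idx, cls_dir in enumerate(class_dirs)
    for img in sorted(cls_dir.glob("*.png"))
]
print(f"{len(pairs)} labelled images across {len(class_dirs)} classes")
```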
### 3. Pose Estimation Dataset
Each of the `train_set/` and `test_set/` directories is structured as follows:
```
train_set/ (or test_set/)
├── sensor_0000/
│   ├── calibration/   # Calibration images
│   ├── samples/       # Tactile sample images
│   │   ├── 0000.png
│   │   ├── 0001.png
│   │   └── ...
│   └── locations/     # Pose/location data (x, y, z in mm)
│       ├── 0000.npy
│       ├── 0001.npy
│       └── ...
├── sensor_0001/
└── ...
```
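Each sample image has a matching `.npy` file holding the ground-truth contact location. A minimal pairing sketch, again with hypothetical paths:
```python
# Pairing sketch: load one tactile image and its (x, y, z) label.
from pathlib import Path

import numpy as np
from PIL import Image

sensor = Path("train_set/sensor_0000")  # hypothetical path
img = np.asarray(Image.open(sensor / "samples" / "0000.png"))
xyz = np.load(sensor / "locations" / "0000.npy")  # contact point in millimeters

print(img.shape, xyz)
```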
---
## Citation
If you use this dataset or model weights in your research, please cite:
```bibtex
@inproceedings{gupta2025sensorinvariant,
  title={Sensor-Invariant Tactile Representation},
  author={Harsh Gupta and Yuchen Mo and Shengmiao Jin and Wenzhen Yuan},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
}
```
---
## License
This dataset and model weights are licensed under the MIT License. See the LICENSE file for details.
If you have any questions or need further clarification, please feel free to reach out. |