TokenUnify Models

This repository contains TokenUnify models of different sizes trained on wafer electron microscopy data, along with a superhuman baseline model.

Available Models

  • TokenUnify-1B.pth: 1B parameter TokenUnify model
  • TokenUnify-500M.pth: 500M parameter TokenUnify model
  • TokenUnify-200M.pth: 200M parameter TokenUnify model
  • TokenUnify-100M.pth: 100M parameter TokenUnify model
  • superhuman.pth: Superhuman baseline model

Model Details

  • Architecture: TokenUnify (based on Mamba)
  • Training Data: Wafer electron microscopy images
  • Task: Image Segmentation
  • Framework: PyTorch

Usage

import torch

# Load a specific model
model_path = "TokenUnify-1B.pth"  # or any other model file
checkpoint = torch.load(model_path, map_location='cpu')

# Your model loading code here
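Depending on how the training script saved the checkpoint, `torch.load` may return either a raw state dict or a dictionary that wraps the weights under a key such as `'model'` or `'state_dict'` (the key names here are assumptions; verify them against the actual training code). A minimal sketch of unwrapping whichever form you get, demonstrated with a dummy checkpoint:

```python
import os
import tempfile

import torch


def extract_state_dict(checkpoint):
    """Return the weight dict from a loaded checkpoint.

    Checkpoints may be a raw state dict, or a dict wrapping the
    weights under a key like 'state_dict', 'model', or
    'model_state_dict' (assumed names; check the training script).
    """
    if isinstance(checkpoint, dict):
        for key in ("state_dict", "model", "model_state_dict"):
            if key in checkpoint and isinstance(checkpoint[key], dict):
                return checkpoint[key]
    return checkpoint


# Demonstrate with a dummy checkpoint written to a temp file.
path = os.path.join(tempfile.gettempdir(), "dummy_checkpoint.pth")
torch.save({"model": {"layer.weight": torch.zeros(2, 2)}}, path)

# weights_only=True restricts unpickling to tensors and plain
# containers, which is safer for downloaded checkpoint files.
loaded = torch.load(path, map_location="cpu", weights_only=True)
state_dict = extract_state_dict(loaded)
print(sorted(state_dict.keys()))  # prints ['layer.weight']
```

Once you have the state dict, pass it to your instantiated TokenUnify model with `model.load_state_dict(state_dict)`.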

Model Sizes

Model                 Parameters   File Name
TokenUnify Large      1B           TokenUnify-1B.pth
TokenUnify Medium     500M         TokenUnify-500M.pth
TokenUnify Small      200M         TokenUnify-200M.pth
TokenUnify Tiny       100M         TokenUnify-100M.pth
Superhuman Baseline   -            superhuman.pth

Citation

If you use these models in your research, please cite the TokenUnify paper.
