GH200 SEMamba Docker Image

This Docker image provides a pre-configured development environment for running SEMamba, a Mamba-based speech enhancement model, on NVIDIA GH200 hardware. It was created to save setup time and to ensure reproducibility on ARM64/aarch64 systems with CUDA 12.8.

The image includes all required dependencies and configurations for Mamba-based speech enhancement research and other sequence modeling tasks.

⚠️ Warning on Package Source

This Docker image installs PyTorch, Mamba-SSM, Triton, decord, vLLM, and FlashAttention from the Jetson SBSA CUDA 12.8 index, which is a custom external package repository.

While it enables compatibility with ARM64 + CUDA 12.8 setups, users should be aware that:

  • It is not an official PyPI source
  • Packages may have undocumented patches or modifications
  • There are potential security and reproducibility risks

Use with discretion, especially in sensitive or production-level environments.
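
If you want to verify exactly which builds ended up in the image, a quick check from inside a running container might look like the following (a minimal sketch; it assumes python and pip are on the PATH and uses the usual PyPI names for the listed packages):

# Print the PyTorch version and the CUDA version it was built against
python -c "import torch; print(torch.__version__, torch.version.cuda)"

# Show the versions and install locations of the packages pulled from the custom index
pip show torch mamba-ssm triton decord vllm flash-attn

# List any custom index URLs configured in the image
pip config list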


Contents

  • Python 3.12
  • PyTorch 2.7 (CUDA 12.8 build)
  • Mamba-SSM, Triton, decord, vLLM, and FlashAttention, installed from the Jetson SBSA CUDA 12.8 index

Usage

Download Docker Image

wget https://huggingface.co/datasets/rc19477/gh200-semamba-docker/resolve/main/gh200_semamba_py312_pt27_cuda128.tar
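
If the download is interrupted (the tarball is likely several gigabytes), wget can resume it instead of starting over:

wget -c https://huggingface.co/datasets/rc19477/gh200-semamba-docker/resolve/main/gh200_semamba_py312_pt27_cuda128.tar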

Load Docker Image

docker load < gh200_semamba_py312_pt27_cuda128.tar
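
To confirm the image was imported, and to check the repository/tag Docker assigned to it (the run command below assumes it loads as gh200_semamba_py312_pt27_cuda128), list your local images:

docker images | grep gh200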

Run Container

docker run --gpus all -it -v "$(pwd)":/workspace gh200_semamba_py312_pt27_cuda128

This will mount your current directory into /workspace inside the container.
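
Once inside the container, a quick sanity check that the GPU is visible to both the driver and PyTorch might look like this (assuming nvidia-smi and the image's Python are on the PATH):

nvidia-smi
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.get_device_name(0))"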


Notes

  • This image is intended only for GH200 (ARM64) systems with CUDA 12.8.
  • Python packages are installed via a custom PyPI index for ARM64 provided by Jetson AI Lab.
  • Do not use this image on x86 systems; it will not work (a quick architecture check is shown after this list).
  • This environment was built to support projects like SEMamba that use selective state space models.
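
Before loading the image, you can confirm that the host is actually an ARM64 machine; on a GH200 system the reported architecture should be aarch64:

uname -m    # expected output: aarch64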

License & Attribution

  • This Docker image is shared for non-commercial research purposes.
  • All third-party packages, including PyTorch, FlashAttention, and Mamba-SSM, retain their original licenses.
  • PyTorch was installed from a community-provided index: https://pypi.jetson-ai-lab.dev
  • Users are responsible for complying with the licenses of any included or downloaded components.

Acknowledgments

Thanks to:

  • The Jetson AI Lab team, for maintaining ARM64-compatible PyTorch wheels.

Maintainer

For any issues, feel free to open a discussion or contact me.
