---
tags:
- docker
- x86
- a100
- rtx4090
- semamba
- cuda
- pytorch
- mamba
license: mit
library_name: docker
datasets: []
---
# x86 SEMamba Docker Image
This Docker image provides a pre-configured development environment for running SEMamba models on x86_64 systems with CUDA-capable NVIDIA GPUs such as the A100 and RTX 4090. It ships Python 3.12 and PyTorch 2.2.2, built on top of Ubuntu 22.04 with CUDA 12.4.
## Contents
- OS: Ubuntu 22.04 (x86_64)
- Python: 3.12 (via Miniconda)
- CUDA: 12.4 (base image)
- PyTorch: 2.2.2
- TorchVision: 0.17.2
- TorchAudio: 2.2.2
- Mamba-SSM: 1.2.0
- Essential packages: git, vim, screen, htop, tmux, openssh, etc.
## Usage

### Download the Docker Image

```shell
wget https://huggingface.co/datasets/rc19477/x86-semamba-docker/resolve/main/x86_semamba_py312_pt222_cuda124.tar
```
### Load the Docker Image

```shell
docker load < x86_semamba_py312_pt222_cuda124.tar
```
### Run the Container

```shell
docker run --gpus all -it -v $(pwd):/workspace x86_semamba_py312_pt222_cuda124
```

This mounts your current directory at `/workspace` inside the container.
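Once inside the container, you can sanity-check that the version-pinned stack matches the Contents list above. The sketch below is a hypothetical smoke test (the module names and pinned versions are taken from this card, not from the image itself):

```python
# Hypothetical smoke test: verify that the pinned libraries listed on
# this card are importable and report the expected versions.
import importlib


def check_versions(required):
    """Return {module: 'ok' | 'missing' | 'version X (wanted Y)'}."""
    results = {}
    for name, wanted in required.items():
        try:
            mod = importlib.import_module(name)
        except ImportError:
            results[name] = "missing"
            continue
        found = getattr(mod, "__version__", None)
        if wanted is None or found == wanted:
            results[name] = "ok"
        else:
            results[name] = f"version {found} (wanted {wanted})"
    return results


if __name__ == "__main__":
    # Pins as stated in the Contents section of this card.
    pinned = {
        "torch": "2.2.2",
        "torchvision": "0.17.2",
        "torchaudio": "2.2.2",
        "mamba_ssm": "1.2.0",
    }
    for name, status in check_versions(pinned).items():
        print(f"{name}: {status}")
```

Run it as `python check_env.py` inside the container; every line should print `ok` if the image loaded correctly.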
## Purpose
- Simplifies setup for SEMamba on x86 GPU systems
- Provides a reproducible environment with version-pinned core libraries
## License & Attribution
- This Docker image is shared for non-commercial research purposes.
- All included libraries retain their original licenses.
- Based on PyTorch, Miniconda, and Mamba.
## Maintainer
For questions or issues, feel free to open a discussion or connect via GitHub.