---
tags:
- docker
- x86
- a100
- rtx4090
- semamba
- cuda
- pytorch
- mamba
license: mit
library_name: docker
datasets: []
---
# x86 SEMamba Docker Image
This Docker image provides a pre-configured development environment for running [SEMamba](https://github.com/RoyChao19477/SEMamba) models on x86_64 systems with CUDA-compatible NVIDIA GPUs such as the A100 and RTX 4090. It bundles Python 3.12 and PyTorch 2.2.2 on top of Ubuntu 22.04 with CUDA 12.4.
---
## Contents
- **OS**: Ubuntu 22.04 (x86_64)
- **Python**: 3.12 (via Miniconda)
- **CUDA**: 12.4 (base image)
- **PyTorch**: 2.2.2
- **TorchVision**: 0.17.2
- **TorchAudio**: 2.2.2
- **Mamba-SSM**: 1.2.0
- **Essential packages**: git, vim, screen, htop, tmux, openssh, etc.
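
A quick way to double-check these pinned versions from inside a running container (a minimal sketch; it assumes the bundled Miniconda environment's `python` and `pip` are on `PATH`):
```bash
# Report the Python interpreter and the pinned PyTorch stack
python --version
python -c "import torch, torchvision, torchaudio; print(torch.__version__, torchvision.__version__, torchaudio.__version__)"

# CUDA version PyTorch was built against (should report 12.x)
python -c "import torch; print(torch.version.cuda)"

# Confirm the Mamba-SSM package version
pip show mamba-ssm | grep -i '^version'
```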
---
## Usage
### Download Docker Image
```bash
wget https://huggingface.co/datasets/rc19477/x86-semamba-docker/resolve/main/x86_semamba_py312_pt222_cuda124.tar
```
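If `wget` is not available, `curl` works as well; either way, a quick size check confirms the archive downloaded in full (the exact size depends on the uploaded tarball):
```bash
# Alternative download with curl (follows the Hugging Face redirect)
curl -L -O https://huggingface.co/datasets/rc19477/x86-semamba-docker/resolve/main/x86_semamba_py312_pt222_cuda124.tar

# Sanity-check the downloaded archive
ls -lh x86_semamba_py312_pt222_cuda124.tar
```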
### Load Docker Image
```bash
docker load < x86_semamba_py312_pt222_cuda124.tar
```
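After loading, `docker images` should list the new image; the exact repository name and tag depend on how the image was tagged when it was saved:
```bash
# List local images and filter for the SEMamba image
docker images | grep -i semamba
```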
### Run Container
```bash
docker run --gpus all -it -v "$(pwd)":/workspace x86_semamba_py312_pt222_cuda124
```
This will mount your current directory into `/workspace` inside the container.
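Once inside the container, a quick check confirms that the GPUs are visible and usable from PyTorch (a minimal sketch, assuming the NVIDIA Container Toolkit is installed on the host):
```bash
# Host driver and GPU visibility as seen from the container
nvidia-smi

# PyTorch's view of the GPUs
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.device_count())"
```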
---
## Purpose
- Simplifies setup for SEMamba on x86 GPU systems
- Provides a reproducible environment with version-pinned core libraries
---
## License & Attribution
- This Docker image is shared for **non-commercial research purposes**.
- All included libraries retain their original licenses.
- Based on [PyTorch](https://pytorch.org/), [Miniconda](https://docs.conda.io/en/latest/miniconda.html), and [Mamba](https://github.com/state-spaces/mamba).
---
## Maintainer
For questions or issues, feel free to open a discussion or connect via GitHub.