---
title: Omniseal Leaderboard
emoji: 🦀
colorFrom: red
colorTo: green
sdk: docker
pinned: false
short_description: Leaderboard for watermarking models
---
## Build Instructions
### Prerequisites
- Docker installed on your system
- Git repository cloned locally
### Build Steps (conda)
1. Create and activate the conda environment:
```bash
cd backend
conda env create -f environment.yml -y
conda activate omniseal-benchmark-backend
```
2. Build the frontend (outputs html, js, and css into `frontend/dist`). Note: you only need this step if you are updating the frontend; the repository already has a build checked in at `frontend/dist`.
```bash
cd frontend
npm install
npm run build -- --mode prod
```
3. Run the backend server from the project root. This serves the frontend files at `http://localhost:7860`:
```bash
gunicorn --chdir backend -b 0.0.0.0:7860 app:app --reload
```
4. The server will be running at `http://localhost:7860`
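As an optional sanity check (this simply assumes the root path serves the built frontend page):
```bash
# Should return an HTTP 200 response for the frontend's index page
curl -I http://localhost:7860
```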
### Build Steps (Docker, Hugging Face)
1. Build the Docker image from the project root:
```bash
docker build -t omniseal-benchmark .
```
OR
```bash
docker buildx build -t omniseal-benchmark .
```
2. Run the container (it runs in auto-reload mode when you update Python files in the `backend` directory). Note that the `-v` argument mounts the local `backend` directory so the backend can hot-reload:
```bash
docker run -p 7860:7860 -v $(pwd)/backend:/app/backend omniseal-benchmark
```
3. Access the application at `http://localhost:7860`
### Local Development
When updating the backend, you can use either of the build setups above to take advantage of hot-reload, so you don't have to restart the server.
For the frontend:
1. Create a `.env.local` file in the `frontend` directory and set `VITE_API_SERVER_URL` to where your backend server is running; when running locally this is `VITE_API_SERVER_URL=http://localhost:7860`. This overrides the configuration in `.env`, so the frontend connects to the backend URL of your choice (see the example after this list).
2. Run the development server with hot-reload:
```bash
cd frontend
npm install
npm run dev
```
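For reference, a minimal `.env.local` can be created from the project root like this (the URL is the local backend from step 1; adjust it if your backend runs elsewhere):
```bash
# Point the dev frontend at the locally running backend
cat > frontend/.env.local <<'EOF'
VITE_API_SERVER_URL=http://localhost:7860
EOF
```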
### Local datasets
By default, datasets are loaded over the network based on `backend/config.py`. See that file and modify it if you want to load different datasets.
`ABS_DATASET_DOMAIN` and `ABS_DATASET_PATH` control where datasets are loaded from and are used in `DATASET_CONFIGS` and `EXAMPLE_CONFIGS`. Any dataset or example you add must be registered in these constants to be visualized in the UI.
There is commented-out code that sets `ABS_DATASET_DOMAIN` to the `backend/data` directory. In that directory you can see the data formats of the CSV/JSON files required to render the leaderboard, as well as examples.
By default, the `data` directory matches the path structure used when loading over the network. Each dataset should be placed under `data/omnisealbench` as a directory, e.g. `data/omnisealbench/sav_val_full_v2`, containing the following files:
- `{type}_benchmark.csv` for leaderboard tables
- `{type}_attacks_variations.csv` for leaderboard chart
- `examples_eval_results.json` for examples
Please see the reference CSV and JSON files for what these need to look like.
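As a concrete sketch (assuming the `data` directory referenced above is `backend/data`, and using the `sav_val_full_v2` example; the actual modality prefix replacing `{type}` depends on the benchmark):
```bash
# List one dataset directory; it should contain the three files described above
ls backend/data/omnisealbench/sav_val_full_v2
# expected:
#   {type}_benchmark.csv
#   {type}_attacks_variations.csv
#   examples_eval_results.json
```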