---
title: Omniseal Dev
emoji: 🦀
colorFrom: red
colorTo: green
sdk: docker
pinned: false
short_description: POC development
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

# Docker Build Instructions

## Prerequisites

  • Docker installed on your system
  • Git repository cloned locally

## Build Steps (conda)

1. Initialize the conda environment:

   ```shell
   cd backend
   conda env create -f environment.yml -y
   ```

2. Build the frontend (outputs HTML, JS, and CSS into `frontend/dist`):

   ```shell
   cd frontend
   npm install
   npm run build
   ```

3. Run the backend server, which serves the built frontend files:

   ```shell
   gunicorn --chdir backend -b 0.0.0.0:7860 app:app --reload
   ```

4. The server will be running at http://localhost:7860.
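The `app:app` argument tells gunicorn which WSGI callable to load: the variable `app` inside `backend/app.py`. The actual backend module is not shown in this README; purely as an illustration of that convention, a minimal stdlib-only WSGI app that serves built files from `frontend/dist` (a hypothetical sketch, not the repo's real code) might look like:

```python
# HYPOTHETICAL sketch of a "app:app" gunicorn target -- the real backend
# in this repo is not shown in the README and will differ.
import mimetypes
import os

# Directory holding the built frontend assets (html, js, css).
DIST_DIR = os.path.join("frontend", "dist")

def app(environ, start_response):
    """Serve a static file from DIST_DIR, or 404 if it doesn't exist."""
    path = environ.get("PATH_INFO", "/").lstrip("/") or "index.html"
    full = os.path.normpath(os.path.join(DIST_DIR, path))
    # Refuse paths that escape DIST_DIR, and anything that isn't a file.
    if full.startswith(os.path.normpath(DIST_DIR)) and os.path.isfile(full):
        ctype = mimetypes.guess_type(full)[0] or "application/octet-stream"
        with open(full, "rb") as f:
            body = f.read()
        start_response("200 OK", [("Content-Type", ctype)])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not Found"]
```

With this layout, `gunicorn --chdir backend -b 0.0.0.0:7860 app:app` imports `app.py` and calls the `app` function for each request.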

## Build Steps (Docker, Hugging Face)

1. Navigate to the project directory:

   ```shell
   cd /path/to/omniseal_dev
   ```

2. Build the Docker image:

   ```shell
   docker build -t omniseal-benchmark .
   ```

   or:

   ```shell
   docker buildx build -t omniseal-benchmark .
   ```

3. Run the container. The `-v` argument mounts your local `backend` directory into the container, so the server hot-reloads when you update Python files in that directory:

   ```shell
   docker run -p 7860:7860 -v $(pwd)/backend:/app/backend omniseal-benchmark
   ```

4. Access the application at http://localhost:7860.
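The Dockerfile itself lives in the repo and may differ in its details; purely as a hypothetical sketch of the shape the steps above imply (conda environment for the backend, npm build for the frontend, gunicorn bound to port 7860):

```dockerfile
# HYPOTHETICAL sketch only -- see the actual Dockerfile in this repo.
FROM continuumio/miniconda3
WORKDIR /app

# Backend: recreate the conda environment from environment.yml.
COPY backend/environment.yml backend/environment.yml
RUN conda env create -f backend/environment.yml

# Frontend: build the static assets into frontend/dist.
RUN apt-get update && apt-get install -y nodejs npm
COPY frontend frontend
RUN cd frontend && npm install && npm run build

# Backend source last, so code edits don't invalidate the layers above.
COPY backend backend
EXPOSE 7860
CMD ["gunicorn", "--chdir", "backend", "-b", "0.0.0.0:7860", "app:app", "--reload"]
```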

# Local Development

The backend runs with hot-reload under either set of build steps above, so you don't have to restart the server when you update backend files.

For the frontend to take advantage of hot-reload:

1. Create a `.env.local` file in the `frontend` directory and set `VITE_API_SERVER_URL` to wherever your backend server is running. This overrides the configuration in `.env`, so the frontend connects to the backend URL of your choice. When running locally:

   ```
   VITE_API_SERVER_URL=http://localhost:7860
   ```

2. Run the development server with hot-reload:

   ```shell
   cd frontend
   npm install
   npm run dev
   ```