Danilo Novais committed
Added prompt

.bolt/prompt ADDED (+89 -0)
@@ -0,0 +1,89 @@
I need a DevOps/Infra repository with:

Tech Stack:
- Containerization: Docker & Docker Compose
- Orchestration: Hugging Face Spaces (Docker build)
- Database: Supabase (Postgres with SSL required)
- Integrations: GitHub, Google Cloud CLI, Vertex AI, LangChain, Community Nodes
- AI Features: Vector Store, AI Agents, AI Assistant, LangChain pipelines
- Automation: n8n (self-hosted Space), CI/CD with GitHub Actions

Core Features:
- Infrastructure as code for n8n running on Hugging Face Spaces
- Supabase Postgres as the database backend (SSL enforced)
- Secure secrets injection (HF Space secrets → environment variables)
- CI/CD pipeline that updates the Space from the GitHub repo
- Workflow sync via the n8n API, with export/backup into GitHub
- Integration with multiple GitHub repos as a knowledge base:
  - https://github.com/danilonovaisv/CHATGPT-knowledge-base/tree/main/projects/n8n
  - https://github.com/danilonovaisv/CHATGPT-knowledge-base/tree/main/projects/videos-e-animacoes
  - https://github.com/danilonovaisv/CHATGPT-knowledge-base/tree/main/projects/midjorney-prompt
- Vector database integration to store knowledge embeddings for the workflows (see the sync sketch below)
- Built-in nodes + community nodes for LangChain, Google APIs, Vertex AI

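A minimal sketch of what that knowledge sync could look like, assuming a sparse checkout of the three project folders; the ingest step at the end is hypothetical and stands in for whatever embedding/upsert routine feeds the vector store:

```bash
#!/usr/bin/env bash
# scripts/sync-knowledge.sh (sketch): mirror the knowledge repos, then ingest.
set -euo pipefail

KB_REPO="https://github.com/danilonovaisv/CHATGPT-knowledge-base.git"
DEST="knowledge"

# Sparse checkout keeps only the three project folders this repo mirrors.
if [ ! -d "$DEST/.git" ]; then
  git clone --filter=blob:none --sparse "$KB_REPO" "$DEST"
  git -C "$DEST" sparse-checkout set \
    projects/n8n projects/videos-e-animacoes projects/midjorney-prompt
else
  git -C "$DEST" pull --ff-only
fi

# Hypothetical ingest step: embed the JSON/MD files and upsert them into the
# vector store. Replace with the real embedding/upsert routine.
# python scripts/ingest.py --source "$DEST/projects" --table knowledge_embeddings
```
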
Start with the following repository structure:

/n8n-infra
  /docker
    Dockerfile (pin n8n version, ENV configs; see the sketch below)
    docker-compose.yml (services: n8n, Supabase connection, vector store)
  /config
    .env.example (secrets template; see the placeholder list below)
    credentials/ (API keys and OAuth tokens; keep out of git)
  /workflows
    backup/ (exported workflow JSON)
  /knowledge
    n8n/ (mirror of projects/n8n)
    videos-e-animacoes/ (mirror of projects/videos-e-animacoes)
    midjourney-prompt/ (mirror of projects/midjorney-prompt)
  /scripts
    backup.sh (pg_dump for Supabase, workflow export)
    restore.sh (restore the database)
    sync-knowledge.sh (pull repos, upsert into DB/vector store)
  /.github
    workflows/
      deploy-to-hf.yml (GitHub Action: build & push image to HF Space)
      backup-workflows.yml (GitHub Action: nightly workflow export via API)
      sync-knowledge.yml (GitHub Action: pull knowledge repos and update DB/vector store)
  README.md (instructions: how to deploy, secrets, CI/CD, rollback strategy)

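For the Dockerfile, a minimal sketch of the pinned-version approach; the version tag is a placeholder, and the database settings arrive at runtime from Space secrets rather than being baked into the image:

```dockerfile
# docker/Dockerfile (sketch): pin n8n to an exact version, never `latest`.
# The tag below is a placeholder; pick a tested release and bump it deliberately.
FROM n8nio/n8n:1.64.0

# HF Spaces route traffic to a single exposed port; 7860 is the Spaces default.
ENV N8N_PORT=7860
EXPOSE 7860

# Database and encryption settings (DB_POSTGRESDB_*, N8N_ENCRYPTION_KEY, ...)
# are injected as environment variables from Space secrets at runtime.
```
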
Configuration & Integrations:
- Dockerfile pinned to a specific n8n version (no `latest`)
- Docker Compose with services for n8n + the Supabase connection + a vector DB (e.g. pgvector or Chroma)
- GitHub Actions for CI/CD (see the deploy sketch below):
  - build the image
  - push to the Hugging Face Space
  - trigger a rebuild
- Backup workflow:
  - pg_dump from Supabase
  - export workflows from the n8n API
- Sync job:
  - clone the knowledge-base repos from GitHub
  - ingest JSON/MD into the vector DB
- Integrate LangChain + AI Assistant via community nodes
- Configure built-in nodes for Google/Vertex AI
- Add CLI tools for Google Cloud

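A sketch of deploy-to-hf.yml under those requirements. It relies on the documented Hugging Face pattern of pushing the repository to the Space's git remote; <hf-user>/<space> are placeholders, and HF_TOKEN is expected as a repository secret:

```yaml
# .github/workflows/deploy-to-hf.yml (sketch): sync main to the Space remote.
name: Deploy to Hugging Face Space
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history; shallow clones cannot be pushed cleanly
      # <hf-user>/<space> are placeholders for the actual Space path.
      - name: Push to Space
        run: |
          git push --force \
            "https://<hf-user>:${{ secrets.HF_TOKEN }}@huggingface.co/spaces/<hf-user>/<space>" \
            main
```

Since Docker Spaces build the image on the Hub after each push, the "build" and "trigger rebuild" steps collapse into the push itself; a separate registry build is only needed if the image should also be published elsewhere.
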
Please leave placeholders for the variables/secrets in .env.example:
- N8N_ENCRYPTION_KEY=
- N8N_USER_MANAGEMENT_JWT_SECRET=
- DB_TYPE=postgresdb
- DB_POSTGRESDB_HOST=
- DB_POSTGRESDB_PORT=5432
- DB_POSTGRESDB_DATABASE=
- DB_POSTGRESDB_USER=
- DB_POSTGRESDB_PASSWORD=
- DB_POSTGRESDB_SSL_ENABLED=true
- WEBHOOK_URL=
- HF_TOKEN=
- GITHUB_TOKEN=
- GOOGLE_PROJECT_ID=
- GOOGLE_CREDENTIALS_PATH=

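For local development, a compose sketch that wires those values into the n8n service. The vector-store sidecar is an assumption for offline work; in production the embeddings can live in Supabase itself, which ships the pgvector extension:

```yaml
# docker/docker-compose.yml (sketch): n8n against Supabase Postgres over SSL.
services:
  n8n:
    build: .
    ports:
      - "7860:7860"                 # matches N8N_PORT pinned in the Dockerfile
    env_file: ../config/.env        # local only; on the Space, secrets arrive as env vars
    environment:
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_SSL_ENABLED=true
    volumes:
      - n8n_data:/home/node/.n8n    # persists the encryption key between restarts

  # Optional local vector store; treat this service as an assumption.
  vectorstore:
    image: pgvector/pgvector:pg16
    environment:
      - POSTGRES_PASSWORD=devonly   # local-only placeholder
    volumes:
      - vector_data:/var/lib/postgresql/data

volumes:
  n8n_data:
  vector_data:
```
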
Document all steps in the README:
- How to deploy locally with Docker Compose
- How to deploy on Hugging Face Spaces
- How to configure Supabase
- How to run backups & restores (see the backup sketch below)
- How to integrate workflows with LangChain/agents
- How to keep the knowledge repos synced

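Finally, a sketch of scripts/backup.sh for the backup/restore section. The workflow export uses n8n's public REST API (X-N8N-API-KEY header); SUPABASE_DB_URL, N8N_BASE_URL, and N8N_API_KEY are placeholder environment variables:

```bash
#!/usr/bin/env bash
# scripts/backup.sh (sketch): dump the Supabase database, export n8n workflows.
set -euo pipefail

STAMP="$(date +%Y%m%d-%H%M%S)"
mkdir -p backups workflows/backup

# 1) Database dump over SSL. SUPABASE_DB_URL is a placeholder for a full
#    connection string, e.g. postgresql://user:pass@host:5432/postgres?sslmode=require
pg_dump "$SUPABASE_DB_URL" --format=custom --file="backups/db-$STAMP.dump"

# 2) Workflow export via the n8n public API.
curl --fail --silent \
  --header "X-N8N-API-KEY: $N8N_API_KEY" \
  "$N8N_BASE_URL/api/v1/workflows" \
  > "workflows/backup/workflows-$STAMP.json"
```

restore.sh would then be the mirror image: `pg_restore --clean --dbname "$SUPABASE_DB_URL" backups/db-<stamp>.dump`, plus re-importing the saved workflow JSON through the same API.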