---
license: apache-2.0
language:
- en
base_model:
- deepseek-ai/deepseek-coder-6.7b-instruct
tags:
- code
- Festi
- php
- developer-agent
---

# Festi Coder Full 2025-06

This is a fully fine-tuned version of `deepseek-ai/deepseek-coder-6.7b-instruct`, built by [Festi](https://festi.io) to support advanced backend development on the Festi Framework. The model is trained on real-world Festi codebases and supports tasks such as plugin generation, trait and service scaffolding, and backend automation.

---

## Model Details

### Model Description

- **Developed by:** Festi
- **Model type:** Causal Language Model (full fine-tune)
- **Base model:** [`deepseek-ai/deepseek-coder-6.7b-instruct`](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
- **Language(s):** English, PHP (Festi syntax)
- **License:** Apache-2.0
- **Fine-tuned with:** Transformers (full-parameter fine-tune, no LoRA)

---

## Uses

### Direct Use

This model is intended for developers working in the Festi ecosystem who want to:

- Generate Festi plugins, services, CLI commands, and traits
- Edit and extend existing Festi modules
- Explain and document PHP code following Festi patterns

### Out-of-Scope Use

- Natural language chat or general NLP tasks
- Use with non-Festi PHP frameworks (e.g., Laravel, Symfony)
- Autonomous execution without human validation

---

## Bias, Risks, and Limitations

This is a domain-specific model and is not suited to general-purpose programming. Generated code may contain syntactic or semantic issues and should be reviewed by experienced developers before production use.

### Recommendations

- Validate model output before use
- Use only in backend contexts aligned with Festi's architecture
- Do not expose model output to end users directly
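
As a minimal illustration of the first recommendation, generated PHP can be passed through a quick structural sanity check before human review. The sketch below is not part of the model or the Festi tooling; it only flags obvious problems, and real validation should use `php -l` plus tests:

```python
def quick_php_sanity_check(code: str) -> list[str]:
    """Return a list of obvious structural problems in generated PHP.

    Illustrative only -- real validation should use `php -l` and tests.
    """
    problems = []
    if not code.lstrip().startswith("<?php"):
        problems.append("missing <?php opening tag")
    if code.count("{") != code.count("}"):
        problems.append("unbalanced braces")
    if code.count("(") != code.count(")"):
        problems.append("unbalanced parentheses")
    return problems

# Example: a fragment with a missing closing brace is flagged
snippet = "<?php\nclass EmailSubscriberPlugin {\n    public function register() {\n}\n"
print(quick_php_sanity_check(snippet))  # ['unbalanced braces']
```

A check like this catches truncated generations (a common failure mode with a fixed `max_new_tokens` budget) before they reach a code review.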

---

## How to Get Started with the Model

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "Festi/festi-coder-full-2025-06"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

prompt = "<|user|>\nCreate a plugin to subscribe users via email.\n<|assistant|>\n"
output = generator(prompt, max_new_tokens=300)
print(output[0]["generated_text"])
```
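
Note that the pipeline returns the full generated string, prompt included. A small helper (a sketch, assuming the `<|user|>`/`<|assistant|>` markers shown in the prompt above) can strip the prompt and keep only the model's reply:

```python
def extract_reply(generated_text: str) -> str:
    """Return only the assistant's portion of a generated string.

    Assumes the <|assistant|> marker used in the prompt above; returns
    an empty string if the marker is absent.
    """
    marker = "<|assistant|>\n"
    _, _, reply = generated_text.partition(marker)
    return reply.strip()

full = "<|user|>\nCreate a plugin.\n<|assistant|>\n<?php // plugin code"
print(extract_reply(full))  # <?php // plugin code
```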