---
license: mit
language:
- en
base_model:
- deepseek-ai/deepseek-coder-6.7b-instruct
tags:
- code
- Festi
---
# Festi Coder LoRA 2025-06
This is a LoRA fine-tuned version of `deepseek-coder-6.7b-instruct`, optimized for generating and understanding code built on the [Festi Framework](https://festi.io). The model is designed to assist with plugin generation, trait and service scaffolding, and other automation tasks relevant to the Festi ecosystem.
---
## Model Details
### Model Description
- **Developed by:** Festi
- **Model type:** Causal Language Model with LoRA fine-tuning
- **Base model:** [`deepseek-coder-6.7b-instruct`](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
- **Language(s):** English, PHP (Festi-specific syntax and DSL)
- **License:** MIT (per the card metadata); usage is also subject to the base model's license terms
- **Fine-tuned with:** PEFT + LoRA
---
## Uses
### Direct Use
This model is intended for developers using the Festi Framework who want to:
- Generate new plugins (e.g., `SubscribePlugin`)
- Scaffold services, traits, and CLI commands
- Complete and explain Festi-specific PHP code
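As an illustration, a plugin-generation request can be wrapped in the `<|user|>`/`<|assistant|>` turn markers shown in the quickstart below. This is a minimal sketch; the helper name and exact template are assumptions, not an official API:

```python
def build_prompt(request: str) -> str:
    """Wrap a natural-language request in the chat-style turn markers
    used in the quickstart example (assumed format, not an official API)."""
    return f"<|user|>\n{request}\n<|assistant|>\n"

# Example: ask the model to scaffold a plugin.
prompt = build_prompt("Scaffold a SubscribePlugin that collects newsletter emails.")
print(prompt)
```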
### Out-of-Scope Use
- General NLP tasks (e.g., chat, summarization)
- Non-Festi PHP applications
- High-stakes decision making
---
## Bias, Risks, and Limitations
This model is domain-specific and not suited to general-purpose programming tasks. Generated code may require manual review, especially before use in production. The model also inherits the limitations and biases of its base model (`deepseek-coder-6.7b-instruct`).
### Recommendations
- Always review generated code.
- Do not expose model outputs directly to end-users without validation.
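As a sketch of what "validation before exposure" might look like, here is a cheap sanity gate for generated PHP. This is illustrative only (the function name and checks are ours, not part of this model card); a real parser or `php -l` is a much stronger gate in production:

```python
def looks_like_valid_php(snippet: str) -> bool:
    """Cheap sanity check for generated PHP output: requires an opening
    tag and balanced curly braces. Illustrative only -- run `php -l`
    or a real parser before trusting generated code in production."""
    if "<?php" not in snippet:
        return False
    depth = 0
    for ch in snippet:
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth < 0:  # closing brace before any opening brace
                return False
    return depth == 0

print(looks_like_valid_php("<?php class SubscribePlugin {}"))  # True
print(looks_like_valid_php("class SubscribePlugin {}"))        # False: no opening tag
```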
---
## How to Get Started with the Model
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

peft_model_id = "Festi/festi-coder-lora-2025-06"
base_model = "deepseek-ai/deepseek-coder-6.7b-instruct"

# Load the base model and tokenizer, then attach the LoRA adapter.
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto")
model = PeftModel.from_pretrained(model, peft_model_id)

# Build a chat-style prompt and generate.
prompt = "<|user|>\nCreate a plugin to collect emails for a newsletter subscription.\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```