# amkyawdev/amkyaw-dev-v1
## Model Overview
- Model Name: amkyaw-coder-1.5b-instruct
- Type: Code Generation / Instruction Following
- Size: 1.5B parameters
- Format: GGUF (quantized)
## Quick Start

```sh
# Run the model
ollama run amkyawdev/amkyaw-dev-v1

# Or run with a specific tag
ollama run amkyawdev/amkyaw-dev-v1:latest
```
## Features
- Code generation
- Instruction following
- Burmese language support
- English language support
## System Requirements
- Ollama installed
- At least 2 GB of available RAM
- No GPU required (runs on CPU)
## Configuration

Default generation parameters:
| Parameter | Value |
|---|---|
| Temperature | 0.8 |
| Top P | 0.9 |
| Top K | 40 |
| Context Length | 4096 |
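If you want to pin these parameters locally, one option is an Ollama Modelfile. This is a minimal sketch, assuming the published tag above as the base; the custom model name `my-amkyaw` is illustrative:

```
FROM amkyawdev/amkyaw-dev-v1

# Generation defaults from the table above
PARAMETER temperature 0.8
PARAMETER top_p 0.9
PARAMETER top_k 40
PARAMETER num_ctx 4096
```

Build it with `ollama create my-amkyaw -f Modelfile`, then run `ollama run my-amkyaw`.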
## Usage Examples

```python
import ollama

response = ollama.generate(
    model='amkyawdev/amkyaw-dev-v1',
    prompt='Write a Python function to calculate factorial'
)
print(response['response'])
```
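The same request can also be sent to Ollama's local REST API. A sketch, assuming a server running on the default port 11434:

```sh
# Non-streaming generate request against the local Ollama server
curl http://localhost:11434/api/generate -d '{
  "model": "amkyawdev/amkyaw-dev-v1",
  "prompt": "Write a Python function to calculate factorial",
  "stream": false
}'
```

With `"stream": false` the server returns a single JSON object instead of a stream of partial responses.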
## License

See the Hugging Face model page for license information.
## Troubleshooting

If you encounter issues:

- Make sure Ollama is running: `ollama serve`
- Check that the model is installed: `ollama list`
- Try restarting Ollama: `pkill ollama && ollama serve`