---
license: mit
---

### bitnet_b1_58-3B-Coder

A code-finetuned version of [bitnet_b1_58-3B](https://huggingface.co/1bitLLM/bitnet_b1_58-3B).

### Usage

```python
from tokenization_bitnet import BitnetTokenizer
from transformers import AutoModelForCausalLM
import torch
PROMPT = """### Instruction
{instruction}
### Response
"""
instruction = "<Your code instruction here>"  # replace with your coding task
prompt = PROMPT.format(instruction=instruction)
# Load the tokenizer shipped with the repo (remote code must be trusted).
tokenizer = BitnetTokenizer.from_pretrained(
    "TechxGenus/bitnet_b1_58-3B-Coder",
    trust_remote_code=True,
)
# Load the model in float16 and place it on the available device(s).
model = AutoModelForCausalLM.from_pretrained(
    "TechxGenus/bitnet_b1_58-3B-Coder",
    torch_dtype=torch.float16,
    device_map="auto",
)
# Tokenize the prompt and generate up to 2048 new tokens.
inputs = tokenizer.encode(prompt, return_tensors="pt")
outputs = model.generate(input_ids=inputs.to(model.device), max_new_tokens=2048)
print(tokenizer.decode(outputs[0]))
```
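For reference, here is what the formatted prompt looks like with a concrete instruction filled in (the instruction below is a hypothetical example; any coding task description works):

```python
# Hypothetical example instruction filled into the template above.
instruction = "Write a Python function that checks whether a string is a palindrome."
print(PROMPT.format(instruction=instruction))
# ### Instruction
# Write a Python function that checks whether a string is a palindrome.
# ### Response
```

The generated code should follow the `### Response` header in the decoded output.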

### Note

The model may sometimes make errors, produce misleading content, or struggle with tasks unrelated to coding. It has undergone very limited testing; additional safety testing should be performed before any real-world deployment.