---
license: mit
base_model:
- Salesforce/codet5-base
pipeline_tag: text2text-generation
tags:
- code
- mathematics
- theorem-proving
---

## Usage

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

model_name = "amitayusht/ProofWala-Multilingual"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# device=0 for GPU, -1 for CPU. Named `generator` so the
# `pipeline` factory function is not shadowed.
generator = pipeline("text2text-generation", model=model, tokenizer=tokenizer, device=-1)

# Example usage
state = """
Goals to prove:
[GOALS]
[GOAL] 1
forall n : nat, n + 1 = 1 + n
[END]
"""
result = generator(state, max_length=100, num_return_sequences=1)
print(result[0]['generated_text'])
# Output:
# [RUN TACTIC]
# induction n; trivial.
# [END]
```
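The `[GOALS]`/`[GOAL]`/`[END]` prompt layout above can be built programmatically instead of by hand. A minimal sketch: the `format_goals` helper below is hypothetical (not part of ProofWala), and its multi-goal numbering is an assumption extrapolated from the single-goal example.

```python
def format_goals(goals):
    """Build a proof-state prompt in the [GOALS]/[GOAL] i/[END] layout
    shown in the example above. Numbering of multiple goals is an
    assumption extrapolated from the single-goal case."""
    lines = ["Goals to prove:", "[GOALS]"]
    for i, goal in enumerate(goals, start=1):
        lines.append(f"[GOAL] {i}")
        lines.append(goal)
    lines.append("[END]")
    return "\n".join(lines)

state = format_goals(["forall n : nat, n + 1 = 1 + n"])
print(state)
```

The resulting string can be passed to the pipeline exactly like the hand-written `state` in the example above.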