AjayP13 committed (verified)
Commit 3078d54 · 1 Parent(s): e758bcb

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -28,7 +28,7 @@ pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, pad_token_i
  # Run inference to templatize the query
  inputs = ["What volleyball exercises should I do I'm almost in high school and i do volleyball excellence five times a week (basically an advanced class in school with experienced volleyball coaches) , we have 2-3 skill training sessions a week which i feel like isn't enough for me as I would like to improve my skills almost every day.\n\n​\n\nWhat i wanted to know was what setting, digging, serving and spiking exercises could i do that would help me improve all of my skills (I have a large area to practice all these things so space isn't an issue)."]
  prompts = [tokenizer.apply_chat_template([{'role': 'user', 'content': i}], tokenize=False, add_generation_prompt=True) for i in inputs]
- generations = pipe(prompts, max_length=131072, truncation=True, temperature=0.0, top_p=0.0, do_sample=False)
+ generations = pipe(prompts, max_length=131072, truncation=True, temperature=None, top_p=None, do_sample=False)
  output = generations[0][0]['generated_text']
  print(output)
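
For context, the change keeps greedy decoding (`do_sample=False`) but passes `temperature=None` and `top_p=None` instead of `0.0`, which is the usual way to avoid transformers warning that sampling parameters are set while sampling is disabled. Below is a minimal sketch of the updated snippet as a self-contained script; the model id and the abbreviated query are placeholder assumptions, and the `pad_token_id` argument is inferred from the truncated hunk header rather than confirmed by this diff.

```python
# Minimal sketch of the updated README snippet (placeholder model id, not the repo's actual checkpoint).
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

MODEL_ID = "some-org/some-instruction-model"  # placeholder assumption
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
# pad_token_id kwarg assumed from the truncated "pad_token_i" in the hunk header.
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer,
                pad_token_id=tokenizer.pad_token_id)

# Run inference to templatize the query (abbreviated example input)
inputs = ["What volleyball exercises should I do ..."]
prompts = [tokenizer.apply_chat_template([{'role': 'user', 'content': i}],
                                          tokenize=False, add_generation_prompt=True)
           for i in inputs]

# Greedy decoding: do_sample=False, with temperature/top_p left as None so
# transformers does not warn about unused sampling flags.
generations = pipe(prompts, max_length=131072, truncation=True,
                   temperature=None, top_p=None, do_sample=False)
print(generations[0][0]['generated_text'])
```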