---
license: mit
license_link: https://huggingface.co/microsoft/wavecoder-ds-6.7b/blob/main/LICENSE
language:
- en
library_name: transformers
datasets:
- humaneval
pipeline_tag: text-generation
tags:
- code
metrics:
- code_eval
---

![](https://lh7-us.googleusercontent.com/docsz/AD_4nXfrlKyH6elkxeyrKw4el9j8V3IOQLsqTVngg19Akt6se1Eq2xaocCEjOmc1w8mq5ENHeYfpzRWjYB8D4mtmMPsiH7QyX_Ii1kEM7bk8eMzO68y9JEuDcoJxJBgbNDzRbTdVXylN9_zjrEposDwsoN7csKiD?key=xt3VSDoCbmTY7o-cwwOFwQ)

# QuantFactory/wavecoder-ds-6.7b-GGUF

This is a quantized version of [microsoft/wavecoder-ds-6.7b](https://huggingface.co/microsoft/wavecoder-ds-6.7b) created using llama.cpp.
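Because this repo ships GGUF weights, the files can also be run without `transformers`. Below is a minimal sketch using the `llama-cpp-python` bindings; the GGUF filename and generation settings are illustrative placeholders, so substitute the quantization variant you actually downloaded from this repo:

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Placeholder path: point this at the GGUF file you downloaded from this repo.
llm = Llama(
    model_path="./wavecoder-ds-6.7b.Q4_K_M.gguf",
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to the GPU when one is available
)

output = llm(
    "Write a Python function that checks whether a number is prime.",
    max_tokens=256,
    temperature=0.2,
)
print(output["choices"][0]["text"])
```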

# Original Model Card

<h1 align="center">
🌊 WaveCoder: Widespread And Versatile Enhanced Code LLM
</h1>

<p align="center">
<a href="https://arxiv.org/abs/2312.14187"><b>[📜 Paper]</b></a> •
<a href="https://github.com/microsoft/WaveCoder"><b>[🐱 GitHub]</b></a>
<br>
<a href="https://twitter.com/TeamCodeLLM_AI"><b>[🐦 Twitter]</b></a> •
<a href="https://www.reddit.com/r/LocalLLaMA/comments/19a1scy/wavecoderultra67b_claims_to_be_the_2nd_best_model/"><b>[💬 Reddit]</b></a> •
<a href="https://www.analyticsvidhya.com/blog/2024/01/microsofts-wavecoder-and-codeocean-revolutionize-instruction-tuning/">[🍀 Unofficial Blog]</a>
</p>

<p align="center">
Repo for "<a href="https://arxiv.org/abs/2312.14187" target="_blank">WaveCoder: Widespread And Versatile Enhanced Instruction Tuning with Refined Data Generation</a>"
</p>

## 🔥 News

- [2024/04/10] 🔥🔥🔥 WaveCoder repo and models released at [🤗 HuggingFace](https://huggingface.co/microsoft/wavecoder-ultra-6.7b)!
- [2023/12/26] WaveCoder paper released.

## 💡 Introduction

WaveCoder 🌊 is a series of large language models (LLMs) for the coding domain, designed to solve code-related problems through instruction-following learning. Its training data was generated from a subset of CodeSearchNet using an LLM-based generator-discriminator framework we proposed, and it covers four general code-related tasks: code generation, code summarization, code translation, and code repair.

| Model | HumanEval | MBPP (500) | HumanEval<br>Fix (Avg.) | HumanEval<br>Explain (Avg.) |
| -------------------------------------------------------------------------------- | --------- | ---------- | ----------------------- | --------------------------- |
| GPT-4 | 85.4 | - | 47.8 | 52.1 |
| [🌊 WaveCoder-DS-6.7B](https://huggingface.co/microsoft/wavecoder-ds-6.7b) | 65.8 | 63.0 | 49.5 | 40.8 |
| [🌊 WaveCoder-Pro-6.7B](https://huggingface.co/microsoft/wavecoder-pro-6.7b) | 74.4 | 63.4 | 52.1 | 43.0 |
| [🌊 WaveCoder-Ultra-6.7B](https://huggingface.co/microsoft/wavecoder-ultra-6.7b) | 79.9 | 64.6 | 52.3 | 45.7 |

## 🪁 Evaluation

Please refer to WaveCoder's [GitHub repo](https://github.com/microsoft/WaveCoder) for inference, evaluation, and training code.
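The metadata above lists `code_eval` as the reported metric. As an illustration only (not the repo's official evaluation harness), the sketch below scores candidate completions with the `code_eval` metric from the Hugging Face `evaluate` library; the toy problem and candidate solutions are invented for the example:

```python
import os

import evaluate

# code_eval executes model-generated code, so it must be enabled explicitly.
os.environ["HF_ALLOW_CODE_EVAL"] = "1"

code_eval = evaluate.load("code_eval")

# One test case per problem; a list of candidate completions per problem.
test_cases = ["assert add(2, 3) == 5"]
candidates = [[
    "def add(a, b):\n    return a + b",  # passes the test
    "def add(a, b):\n    return a - b",  # fails the test
]]

pass_at_k, results = code_eval.compute(
    references=test_cases,
    predictions=candidates,
    k=[1, 2],
)
print(pass_at_k)  # {'pass@1': 0.5, 'pass@2': 1.0}
```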

## How to get started with the model

```python
# Load the model and tokenizer directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("microsoft/wavecoder-ds-6.7b")
model = AutoModelForCausalLM.from_pretrained("microsoft/wavecoder-ds-6.7b")
```
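Once the model and tokenizer are loaded, generation works through the standard `transformers` API. A minimal sketch; the prompt and `max_new_tokens` value are illustrative choices, not recommendations from the model authors:

```python
prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding by default; tune generation parameters for your use case.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```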

## 📖 License

This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to its [License](https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/LICENSE-MODEL).

## ☕️ Citation

If you find this repository helpful, please consider citing our paper:

```bibtex
@article{yu2023wavecoder,
  title={Wavecoder: Widespread and versatile enhanced instruction tuning with refined data generation},
  author={Yu, Zhaojian and Zhang, Xin and Shang, Ning and Huang, Yangyu and Xu, Can and Zhao, Yishujie and Hu, Wenxiang and Yin, Qiufeng},
  journal={arXiv preprint arXiv:2312.14187},
  year={2023}
}
```

## Note

WaveCoder models are trained on synthetic data generated by OpenAI models. Please pay attention to OpenAI's [terms of use](https://openai.com/policies/terms-of-use) when using the models and the datasets.