DanielSan7 committed
Commit e2c6c74
1 Parent(s): baf7215

update README.md more details

Files changed (1)
1. README.md (+50 -1)
README.md CHANGED
@@ -109,10 +109,59 @@ special_tokens:

# deepseek-coder-1.3b-typescript

- This model is a fine-tuned version of [deepseek-ai/deepseek-coder-1.3b-base](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-base) on the the-stack dataset, using 0.5B of tokens of typescript only.
+ CodeGPTPlus/deepseek-coder-1.3b-typescript is a fine-tuned version of [deepseek-ai/deepseek-coder-1.3b-base](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-base), built by the CodeGPT team to generate expert TypeScript code. Fine-tuned on a dataset of 0.5B TypeScript tokens, it produces precise and efficient solutions in this language.
+ A 16K context window and an additional fill-in-the-middle (FIM) task are used to deliver project-level code completion.
+ This makes the model a strong choice for anyone seeking a code generator specialized in TypeScript.
+
It achieves the following results on the evaluation set:
- Loss: 0.7681

+ **Model Developers** CodeGPT Team
+ **Variations** 1.3B
+ **Input** Models input text only.
+ **Output** Models generate text only.
+
+ ## How to Use
+ This model is for completion purposes only. The examples below show how to use it.
+
+ #### Running the model on a GPU
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ # Load the tokenizer and model, then move the model to the GPU
+ tokenizer = AutoTokenizer.from_pretrained(
+     "CodeGPTPlus/deepseek-coder-1.3b-typescript", trust_remote_code=True
+ )
+ model = AutoModelForCausalLM.from_pretrained(
+     "CodeGPTPlus/deepseek-coder-1.3b-typescript", trust_remote_code=True
+ ).cuda()
+
+ # FIM prompt: the model generates the code that belongs at <|fim▁hole|>
+ input_text = """<|fim▁begin|>function quickSort(arr: number[]): number[] {
+   if (arr.length <= 1) {
+     return arr;
+   }
+   const pivot = arr[0];
+   const left = [];
+   const right = [];
+ <|fim▁hole|>
+   return [...quickSort(left), pivot, ...quickSort(right)];
+ }<|fim▁end|>"""
+
+ inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
+ outputs = model.generate(**inputs, max_length=256)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
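+ For interactive use, output can also be streamed token by token. Below is a minimal sketch (an illustration, not part of the original card) that reuses the `model`, `tokenizer`, and `input_text` defined above:
+ ```python
+ from transformers import TextStreamer
+
+ # Print tokens to stdout as they are generated, omitting the prompt itself
+ streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
+ inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
+ model.generate(**inputs, max_length=256, streamer=streamer)
+ ```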
+
+ ### Fill-in-the-Middle (FIM)
+ The model completes the code at the `<|fim▁hole|>` marker, given a prompt of the following form:
+ ```
+ <|fim▁begin|>function quickSort(arr: number[]): number[] {
+   if (arr.length <= 1) {
+     return arr;
+   }
+   const pivot = arr[0];
+   const left = [];
+   const right = [];
+ <|fim▁hole|>
+   return [...quickSort(left), pivot, ...quickSort(right)];
+ }<|fim▁end|>
+ ```
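+ As an illustration (a hypothetical helper, not part of the original card), a FIM prompt can be assembled from the code before and after the cursor:
+ ```python
+ # Hypothetical helper: wrap a prefix and suffix in the FIM special tokens
+ def build_fim_prompt(prefix: str, suffix: str) -> str:
+     return f"<|fim▁begin|>{prefix}<|fim▁hole|>{suffix}<|fim▁end|>"
+
+ prompt = build_fim_prompt(
+     "function add(a: number, b: number): number {\n",
+     "\n}",
+ )
+ ```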
+
## Training procedure

### Training hyperparameters