Commit 4083802 (parent 5ba2560) by 01GangaPutraBheeshma: Update README.md
Here's a brief description of my project.

## Introduction

colab_code_generator_FT_code_gen_UT is an instruction-following large language model trained on Google Colab Pro with a T4 GPU, fine-tuned from 'Salesforce/codegen-350M-mono', which is licensed for commercial use. Code Generator_UT was trained on ~19k instruction/response fine-tuning records from 'iamtarun/python_code_instructions_18k_alpaca'.
### Loading the fine-tuned Code Generator
```python
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# Load the fine-tuned (PEFT-adapted) model and its tokenizer from the Hub
test_model_UT = AutoPeftModelForCausalLM.from_pretrained("01GangaPutraBheeshma/colab_code_generator_FT_code_gen_UT")
test_tokenizer_UT = AutoTokenizer.from_pretrained("01GangaPutraBheeshma/colab_code_generator_FT_code_gen_UT")
```
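Once the model and tokenizer are loaded, they can be prompted for code generation. The helpers below are a minimal sketch, not part of the released model card: the Alpaca-style prompt template is an assumption based on the 'iamtarun/python_code_instructions_18k_alpaca' fine-tuning dataset, and the generation parameters are illustrative.

```python
def build_prompt(instruction: str) -> str:
    # Alpaca-style template (assumed from the fine-tuning dataset's format)
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


def generate_code(model, tokenizer, instruction: str, max_new_tokens: int = 128) -> str:
    # Tokenize the formatted prompt and generate greedily (parameters are illustrative)
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For example, `generate_code(test_model_UT, test_tokenizer_UT, "Write a Python function that reverses a string")` would return the decoded completion after the `### Response:` marker.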