Gerson Fabian Buenahora Ormaza committed on
Commit 79de40a
1 Parent(s): b324b19

Update README.md

Files changed (1): README.md (+51 -1)
README.md CHANGED
@@ -10,4 +10,54 @@ pipeline_tag: text-generation
  library_name: transformers
  tags:
  - code
- ---
+ ---
+ # Model Card
+
+ GPT2Coder is a language model based on OpenAI's GPT-2 architecture. It was
+ pre-trained on a mix of code data, primarily Python, and natural-language text
+ in Spanish and English. The pre-trained model was then fine-tuned for the task
+ of taking a textual code request as input and generating code as output.
+
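+ As a minimal usage sketch (the repository id below is an assumption, not confirmed by this card; substitute the actual checkpoint):
+
+ ```python
+ # Hypothetical inference example; "BueormLLC/GPT2Coder" is a placeholder repo id.
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ model_id = "BueormLLC/GPT2Coder"  # assumed identifier, adjust as needed
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(model_id)
+
+ prompt = "Write a Python function that returns the factorial of a number."
+ inputs = tokenizer(prompt, return_tensors="pt")
+ outputs = model.generate(
+     **inputs,
+     max_length=1024,  # matches the reported maximum sequence length
+     do_sample=True,
+     temperature=0.7,
+     pad_token_id=tokenizer.eos_token_id,
+ )
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+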
+ ## Model Details
+
+ - **Developed by:** BueormAI
+ - **Shared by:** BueormLLC
+ - **Model type:** Transformer
+ - **Language(s) (NLP):** English (en), Spanish (es)
+ - **License:** MIT
+ - **Finetuned from model:** GPT-2 architecture
+
+ ## Bias, Risks, and Limitations
+
+ The model can generate unexpected code and output, including offensive text and non-functional code.
+
+ ### Recommendations
+
+ We recommend using the model with caution and reviewing its outputs carefully, as they may be non-functional or contain harmful or dangerous code.
+
+ ## Training Details
+
+ ### Training Hyperparameters
+
+ - **Training regime:** fp16 mixed precision
+ - **Max length:** 1024 tokens
+ - **Pre-training epochs:** 1 epoch
+ - **Fine-tuning epochs:** 2 epochs
+
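+ For reference, a sketch of a comparable fine-tuning setup with the Hugging Face `Trainer` (batch size, learning rate, and dataset are assumptions; this is not the original training script):
+
+ ```python
+ # Hypothetical configuration mirroring the reported hyperparameters.
+ from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments
+
+ tokenizer = AutoTokenizer.from_pretrained("gpt2")
+ tokenizer.pad_token = tokenizer.eos_token
+ model = AutoModelForCausalLM.from_pretrained("gpt2")
+
+ args = TrainingArguments(
+     output_dir="gpt2coder-finetune",
+     fp16=True,                       # fp16 mixed precision, as reported
+     num_train_epochs=2,              # fine-tuning epochs, as reported
+     per_device_train_batch_size=4,   # assumption, not reported
+     learning_rate=5e-5,              # assumption, not reported
+ )
+
+ # `train_dataset` is assumed to be a tokenized dataset of code-request/code pairs,
+ # truncated to the 1024-token maximum length.
+ # trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
+ # trainer.train()
+ ```
+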
+ ## Environmental Impact
+
+ - **Hardware Type:** P100 GPU
+ - **Hours used:** 18 hours
+ - **Cloud Provider:** Kaggle
+
+ # By Bueorm
+
+ Thanks to everyone who downloads and supports our projects and shares our vision for the future of AI. We hope you will support us so we can keep advancing and releasing more models.
+
+ - [PayPal Donations](https://paypal.me/bueorm)
+ - [Patreon Subscription](https://patreon.com/bueorm)