
fix: fix bibtex format for CodeRL

#5 by zhuwenq - opened
Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -26,6 +26,7 @@ We validate the effectiveness of this checkpoint pretrained with simplified stra
 
 This model can be easily loaded using the `T5ForConditionalGeneration` functionality:
 
+
 ```python
 from transformers import AutoTokenizer, T5ForConditionalGeneration
 tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large")
@@ -50,7 +51,7 @@ print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
 year = {2021}
 }
 
-@article{CodeRL2022
+@article{CodeRL2022,
 author = {Hung Le, Yue Wang, Akhilesh Deepak Gotmare, Silvio Savarese, Steven C.H. Hoi},
 title = {CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning},
 journal = {arXiv preprint},
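
The substantive change is in the second hunk: BibTeX requires a comma after the entry key, so `@article{CodeRL2022` is malformed and fails to parse until it becomes `@article{CodeRL2022,`. The first hunk only adds a blank line before the code fence.

For context on that first hunk: the diff shows just the opening lines of the README's usage snippet. Below is a minimal runnable sketch of the loading pattern it refers to; the example prompt and the `max_length` value are illustrative assumptions, not taken from this diff, though the second hunk's header confirms the README's snippet ends with the same `decode`/`print` line.

```python
# Sketch of the README's T5ForConditionalGeneration usage pattern.
# The prompt and max_length are assumed for illustration; only the
# import/tokenizer lines and the final print are visible in the diff context.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-large")

# T5-style span masking: <extra_id_0> marks the span the model should fill in
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids

generated_ids = model.generate(input_ids, max_length=10)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```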