KnutJaegersberg committed on
Commit 4042167
1 Parent(s): 992d6cd

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -10,6 +10,8 @@ pipeline_tag: text-generation
 
 In this case the tokenizer is the yi_tokenizer; loading it requires trust_remote_code=True
 
+Have some fun with this fellow.
+
 
 Introduction
 The Yi series models are large language models trained from scratch by developers at 01.AI. The first public release contains two bilingual (English/Chinese) base models with parameter sizes of 6B (Yi-6B) and 34B (Yi-34B). Both are trained with a 4K sequence length, which can be extended to 32K at inference time.
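
The tokenizer note above implies the usual transformers loading pattern. A minimal sketch, assuming a text-generation model and using a placeholder repo id (substitute this repository's actual path); trust_remote_code=True lets transformers run the custom yi_tokenizer code shipped with the repo:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id for illustration -- replace with the actual repository path.
model_id = "KnutJaegersberg/Yi-34B"

# The custom yi_tokenizer is defined inside the repo, so transformers must be
# allowed to execute that remote code when loading the tokenizer (and model).
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Simple generation example.
inputs = tokenizer("Write a short poem about the sea.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```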