FPHam committed
Commit 7978287
1 Parent(s): 3e19015

Update README.md

Files changed (1)
  1. README.md +3 -0
README.md CHANGED
@@ -39,6 +39,8 @@ Instead of confidently proclaiming something (or confidently hallucinating other
 
 The correct jinja chat_template is in tokenizer_config.json
 
+It was not trained with a system message, you can further use various system messages to steer the model.
+
 **Parameters**
 
 It's up to you to discover the best parameters that works.
@@ -46,3 +48,4 @@ It's up to you to discover the best parameters that works.
 I tested it in oobabooga WebUi using very off-the-shelf min_p preset: Temperature: 1, Top_p: 1, Top_k: 0, Typical_p: 1, min_p: 0.05, repetition_penalty: 1
 
 Different parameters, like temperature will affect the models talkativnes and self-reflecting properties. If you find something really good, let me know and I'll post it here.
+
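
For reference, a minimal sketch (not part of the commit) of how the chat template stored in tokenizer_config.json is applied with `transformers`, including an optional system message to steer the model as the added line suggests. The repository id is a placeholder, and it is assumed the jinja template accepts a `system` role.

```python
from transformers import AutoTokenizer

# Placeholder repository id, not the actual model name.
tokenizer = AutoTokenizer.from_pretrained("FPHam/<model-name>")

messages = [
    # The model was not trained with a system message, but one can still be supplied to steer it.
    {"role": "system", "content": "You are a careful assistant that admits uncertainty."},
    {"role": "user", "content": "What is the capital of Australia?"},
]

# apply_chat_template uses the jinja template shipped in tokenizer_config.json.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```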
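And a hedged sketch of the "off-the-shelf min_p preset" listed in the README, run directly through `transformers` instead of the oobabooga WebUI. The repository id is again a placeholder, and `min_p` is assumed to be supported by the installed transformers version.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "FPHam/<model-name>"  # placeholder, not the actual repository name
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16, device_map="auto")

messages = [{"role": "user", "content": "Explain what a chat template does."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sampling settings from the min_p preset mentioned above.
output = model.generate(
    input_ids,
    do_sample=True,
    temperature=1.0,
    top_p=1.0,
    top_k=0,             # 0 disables top-k filtering in transformers
    typical_p=1.0,
    min_p=0.05,          # requires a transformers version with min_p sampling
    repetition_penalty=1.0,
    max_new_tokens=256,
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```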