Maani committed on
Commit eea6f54
1 Parent(s): 1d2fb2b

Update README.md

Files changed (1)
  1. README.md +12 -8
README.md CHANGED
@@ -4,22 +4,26 @@ language:
 - en
 - es
 pipeline_tag: text-generation
+license: apache-2.0
 ---
 
 # Model Card for Model ID
 
-Introducing Pixie Zehir Nano
-
-Excelling in writing
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6320e992beec1969845be447/25pTrbjySoblu8cuiHASu.png)
+)
+
+Introducing Pixie Zehir Nano.
+
+Excelling in writing.
 
-Fine tuned on HQ DATA™ from Pixie Zehir.
+A fine tune of H2O Danube 1.8b on HQ DATA™ from Pixie Zehir.
 
 ## Model Details
 
 - **Developed by:** [Maani x BLNKBLK]
 - **Language(s) (NLP):** [English, Spanish]
-- **License:** [More Information Needed]
-- **Finetuned from model [optional]:** [h2oai/h2o-danube-1.8b-chat]
+- **License:** [Apache 2.0]
+- **Finetuned from model :** [h2oai/h2o-danube-1.8b-chat]
 
 
 ## Agreements
@@ -44,7 +48,7 @@ pipe = pipeline(
 # We use the HF Tokenizer chat template to format each message
 # https://huggingface.co/docs/transformers/main/en/chat_templating
 messages = [
-    {"role": "user", "content": "Why is drinking water so healthy?"},
+    {"role": "user", "content": "Write a haiku."},
 ]
 prompt = pipe.tokenizer.apply_chat_template(
     messages,
@@ -56,4 +60,4 @@ res = pipe(
     max_new_tokens=256,
 )
 print(res[0]["generated_text"])
-# <|prompt|>Write a haiku.</s><|answer|> In the windowless room, Digital dreams consume, Unseen sun sets on a white rabbit's ears: [...]
+# <|prompt|>Write a haiku.</s><|answer|> In the windowless room, Digital dreams consume, Unseen sun sets on a white rabbit's ears: [...]
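
For reference, a minimal end-to-end sketch of the usage snippet these hunks touch, assembled from the fragments visible above plus the standard transformers text-generation pipeline pattern. The model repository id, the `torch_dtype`/`device_map` arguments, and the `apply_chat_template` keyword arguments are assumptions not shown in the diff.

```python
# Minimal sketch of the usage snippet the hunks above modify.
# Assumptions not shown in the diff: the model repository id placeholder,
# torch_dtype / device_map, and the apply_chat_template keyword arguments.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="<pixie-zehir-nano-repo-id>",  # hypothetical placeholder; use the actual repo id
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# We use the HF Tokenizer chat template to format each message
# https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
    {"role": "user", "content": "Write a haiku."},
]
prompt = pipe.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)

res = pipe(
    prompt,
    max_new_tokens=256,
)
print(res[0]["generated_text"])
# From the diff, the output begins: <|prompt|>Write a haiku.</s><|answer|> ...
```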