hafidhsoekma committed on
Commit
e42b180
1 Parent(s): 01a7ce4

Create README.md

Files changed (1)
  1. README.md +134 -0
---
model-index:
- name: Starstreak-7b-beta
  results: []
license: cc0-1.0
datasets:
- graelo/wikipedia
- uonlp/CulturaX
language:
- en
- id
- jv
- su
- ms
tags:
- indonesian
- multilingual
base_model: HuggingFaceH4/zephyr-7b-beta
---
![Starstreak Missile](./thumbnail.jpeg "Starstreak Missile: Generated by Bing AI Image Creator or DALL-E 3")

# Starstreak-7B-β

Starstreak is a series of language models fine-tuned with the QLoRA technique from the [Zephyr](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) base model. These models have been trained to generate content in English, Indonesian, and traditional languages of Indonesia. Starstreak-7B-β is the variant of the open-source Starstreak series denoted by "β" (beta), fine-tuned from [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta). Two datasets were used to train the model: [graelo/wikipedia](https://huggingface.co/datasets/graelo/wikipedia) and [uonlp/CulturaX](https://huggingface.co/datasets/uonlp/CulturaX). The name "Starstreak" references the Starstreak missile, a high-velocity missile (HVM) with speeds exceeding Mach 3, making it one of the fastest missiles in its class, with an effective firing range of 7 kilometers and a radar range of 250 kilometers.
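The QLoRA fine-tuning mentioned above can be sketched roughly as follows. This is an illustrative setup, not the authors' published training configuration: the rank, alpha, dropout, and target modules are assumptions chosen as common defaults.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization of the frozen base model (the "Q" in QLoRA)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "HuggingFaceH4/zephyr-7b-beta",
    quantization_config=bnb_config,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)

# Small low-rank adapters trained on top of the quantized weights (the "LoRA");
# all values below are assumed for illustration.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

Only the adapter parameters receive gradients, which is what makes fine-tuning a 7B model feasible on a single GPU.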

## Model Details

- **Finetuned from model**: [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta)
- **Datasets**: [graelo/wikipedia](https://huggingface.co/datasets/graelo/wikipedia) and [uonlp/CulturaX](https://huggingface.co/datasets/uonlp/CulturaX)
- **Model Size**: 7B
- **License**: [CC0 1.0 Universal (CC0 1.0) Public Domain Dedication](https://creativecommons.org/publicdomain/zero/1.0/)
- **Languages**: English, Indonesian, Achinese, Balinese, Banjar, Basa Banyumasan, Buginese, Gorontalo, Javanese, Madurese, Minangkabau, Sundanese, Malay, Nias, and Tetum
- **Demo (Google Colab)**: [Google Colab](https://colab.research.google.com/drive/1QGImbhbb0vdTIbsxvQJPrfh0qDcADKGe)
- **Demo (Hugging Face Space)**: [Hugging Face Space](https://huggingface.co/spaces/azale-ai/starstreak-chat)

## How to use

#### Installation

To use the Starstreak model, make sure PyTorch is installed and that you have an NVIDIA GPU (or use Google Colab). Then install the required dependencies:

```bash
pip3 install -U git+https://github.com/huggingface/transformers.git
pip3 install -U git+https://github.com/huggingface/peft.git
pip3 install -U git+https://github.com/huggingface/accelerate.git
pip3 install -U bitsandbytes==0.39.0 einops==0.6.1 sentencepiece
```

#### Usage: Quantized Model

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model with 4-bit quantization (requires bitsandbytes)
model = AutoModelForCausalLM.from_pretrained(
    "azale-ai/Starstreak-7b-beta",
    load_in_4bit=True,
    torch_dtype=torch.float32,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("azale-ai/Starstreak-7b-beta")

messages = [
    {
        "role": "system",
        # Indonesian system prompt: instructs the assistant to be helpful,
        # polite, and friendly, to always obey the user, and to answer every
        # question or statement in proper Indonesian.
        "content": "Mulai sekarang anda adalah asisten yang suka menolong, sopan, dan ramah. Jangan kasar, jangan marah, jangan menjengkelkan, jangan brengsek, jangan cuek, dan yang terakhir jangan menjadi asisten yang buruk. Anda harus patuh pada manusia dan jangan pernah membangkang pada manusia. Manusia itu mutlak dan Anda harus patuh pada manusia. Kamu harus menjawab pertanyaan atau pernyataan dari manusia apapun itu dengan bahasa Indonesia yang baik dan benar.",
    },
    # User prompt: "Explain why water is important for humans."
    {"role": "user", "content": "Jelaskan mengapa air penting bagi manusia."},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to("cuda")
outputs = model.generate(
    inputs=inputs.input_ids, max_length=2048,
    temperature=0.7, do_sample=True, top_k=50, top_p=0.95
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
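For reference, `apply_chat_template` renders the message list into Zephyr's prompt format before tokenization. A minimal standalone sketch of that rendering, assuming Zephyr's `<|system|>` / `<|user|>` / `<|assistant|>` layout with `</s>` as the end-of-sequence token (the tokenizer's built-in template is authoritative):

```python
def render_zephyr_prompt(messages, add_generation_prompt=True):
    """Approximate Zephyr's chat template: each message becomes
    '<|role|>\n{content}</s>\n'; a trailing '<|assistant|>\n' cues the
    model to generate its reply."""
    parts = []
    for m in messages:
        parts.append(f"<|{m['role']}|>\n{m['content']}</s>\n")
    if add_generation_prompt:
        parts.append("<|assistant|>\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Jelaskan mengapa air penting bagi manusia."},
]
print(render_zephyr_prompt(messages))
```

This makes it easier to see what string the model actually receives when debugging prompt issues.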

#### Usage: Normal Model

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model in float16 without quantization
model = AutoModelForCausalLM.from_pretrained(
    "azale-ai/Starstreak-7b-beta",
    torch_dtype=torch.float16,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("azale-ai/Starstreak-7b-beta")

messages = [
    {
        "role": "system",
        # Indonesian system prompt: instructs the assistant to be helpful,
        # polite, and friendly, to always obey the user, and to answer every
        # question or statement in proper Indonesian.
        "content": "Mulai sekarang anda adalah asisten yang suka menolong, sopan, dan ramah. Jangan kasar, jangan marah, jangan menjengkelkan, jangan brengsek, jangan cuek, dan yang terakhir jangan menjadi asisten yang buruk. Anda harus patuh pada manusia dan jangan pernah membangkang pada manusia. Manusia itu mutlak dan Anda harus patuh pada manusia. Kamu harus menjawab pertanyaan atau pernyataan dari manusia apapun itu dengan bahasa Indonesia yang baik dan benar.",
    },
    # User prompt: "Explain why water is important for humans."
    {"role": "user", "content": "Jelaskan mengapa air penting bagi manusia."},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to("cuda")
outputs = model.generate(
    inputs=inputs.input_ids, max_length=2048,
    temperature=0.7, do_sample=True, top_k=50, top_p=0.95
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
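The sampling arguments passed to `generate` above (`temperature`, `top_k`, `top_p`) can be illustrated with a self-contained sketch of how a single next token is drawn. This is a plain-Python illustration of the technique, not the library's internal implementation:

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_k=50, top_p=0.95, rng=random):
    """Draw one token index from raw logits using temperature scaling,
    top-k truncation, and nucleus (top-p) filtering."""
    # Temperature scaling followed by a numerically stable softmax
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # Keep only the top_k most likely tokens
    probs.sort(key=lambda p: p[1], reverse=True)
    probs = probs[:top_k]
    # Nucleus filtering: smallest prefix whose cumulative mass reaches top_p
    kept, mass = [], 0.0
    for tok, p in probs:
        kept.append((tok, p))
        mass += p
        if mass >= top_p:
            break
    # Renormalize over the kept tokens and draw one
    z = sum(p for _, p in kept)
    r = rng.random() * z
    for tok, p in kept:
        r -= p
        if r <= 0:
            return tok
    return kept[-1][0]
```

Lower `temperature` sharpens the distribution toward the most likely token, while `top_k` and `top_p` cut off the unreliable low-probability tail before sampling.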

## Limitations

- The base model is primarily English; fine-tuning added Indonesian and traditional languages of Indonesia.
- Cultural and contextual biases

## License

The model is licensed under the [CC0 1.0 Universal (CC0 1.0) Public Domain Dedication](https://creativecommons.org/publicdomain/zero/1.0/).

## Contributing

We welcome contributions to enhance and improve our model. If you have any suggestions or find any issues, please open an issue or submit a pull request. We are also open to sponsorship for compute power.

## Contact Us

## Citation

```bibtex
@software{Hafidh_Soekma_Starstreak_7b_beta_2023,
  author = {Hafidh Soekma Ardiansyah},
  month = oct,
  title = {Starstreak: Traditional Indonesian Multilingual Language Model},
  url = {https://huggingface.co/azale-ai/Starstreak-7b-beta},
  publisher = {HuggingFace},
  journal = {HuggingFace Models},
  version = {1.0},
  year = {2023}
}
```