dahara1 committed on
Commit 75a727e
1 Parent(s): 9f519b7

Update README.md

Files changed (1)
  1. README.md +94 -3
README.md CHANGED
@@ -6,14 +6,105 @@ tags:
- RyzenAI
---

- LLama 3.1 model specialized for translation tasks.

- Currently supports translation between Japanese, English, French, and Chinese.
+ ## llama-translate
+
+ llama-translate is a Llama 3.1-based model specialized for translation tasks.
+
+ Currently supports translation between Japanese, English, French, and Chinese (Mandarin).

For RyzenAI Driver Version 1.2 users, I have prepared the 4_0 version.


- sample prompt
+ ## sample prompt
+
```
./llama-cli -m ./llama-translate-Q4_0.gguf -e -n 400 -p "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\nYou are a highly skilled professional translator.<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n### Instruction:\nTranslate Japanese to Mandarin.\n\n### Input:\n生成AIは近年、その活用用途に広がりを見せている。企業でも生成AIを取り入れようとする動きが高まっており、人手不足の現状を打破するための生産性向上への活用や、ビジネスチャンス創出が期待されているが、実際に活用している企業はどれほどなのだろうか。帝国データバンクが行った「現在の生成AIの活用状況について調査」の結果を見ると、どうやらまだまだ生成AIは普及していないようだ。\n\n### Response:\n<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
```
+ On Windows, please use the [llama-server command](https://github.com/ggerganov/llama.cpp/tree/master/examples/server), because it may not be possible to enter Japanese or Chinese characters in Windows CMD.
+
+ ## Example
+ Server start command
+ ```
+ .\llama.cpp\build\bin\Release\llama-server -m .\llama-translate.f16.Q4_K_M.gguf -c 2048
+ ```
+
+ Client script example
+ ```
+ import requests
+ import json
+
+ def translation(instruction, input_text):
+     # System prompt that defines the translator persona expected by the model
+     system = """<|start_header_id|>system<|end_header_id|>\nYou are a highly skilled professional translator. You are a native speaker of English, Japanese, French and Mandarin. Translate the given text accurately, taking into account the context and specific instructions provided. Steps may include hints enclosed in square brackets [] with the key and value separated by a colon:. If no additional instructions or context are provided, use your expertise to consider what the most appropriate context is and provide a natural translation that aligns with that context. When translating, strive to faithfully reflect the meaning and tone of the original text, pay attention to cultural nuances and differences in language usage, and ensure that the translation is grammatically correct and easy to read. For technical terms and proper nouns, either leave them in the original language or use appropriate translations as necessary. Take a deep breath, calm down, and start translating.<|eot_id|><|start_header_id|>user<|end_header_id|>"""
+
+     prompt = f"""{system}
+ ### Instruction:
+ {instruction}
+
+ ### Input:
+ {input_text}
+
+ ### Response:
+ <|eot_id|><|start_header_id|>assistant<|end_header_id|>
+ """
+
+     # Prepare the payload for the POST request
+     payload = {
+         "prompt": prompt,
+         "n_predict": 128  # Adjust this parameter as needed
+     }
+
+     # Define the URL and headers for the POST request
+     url = "http://localhost:8080/completion"
+     headers = {
+         "Content-Type": "application/json"
+     }
+
+     # Send the POST request and capture the response
+     response = requests.post(url, headers=headers, data=json.dumps(payload))
+     # print(response)
+     # print(response.json())
+
+     # Check if the request was successful
+     if response.status_code != 200:
+         print(f"Error: {response.text}")
+         return None
+
+     # Parse the response JSON
+     response_data = response.json()
+
+     # Extract the 'content' field from the response
+     response_content = response_data.get('content', '').strip()
+
+     return response_content
+
+
+ if __name__ == "__main__":
+     translated_line = translation("Translate Japanese to English.", "アメリカ代表が怒涛の逆転劇で五輪5連覇に王手…セルビア下し開催国フランス代表との決勝へ")
+     print(translated_line)
+
+     translated_line = translation("Translate Japanese to Mandarin.", "石川佳純さんの『中国語インタビュー』に視聴者驚き…卓球女子の中国選手から笑顔引き出し、最後はハイタッチ「めちゃ仲良し」【パリオリンピック】")
+     print(translated_line)
+
+     translated_line = translation("Translate Japanese to French.", "開催国フランス すでに史上最多のメダル数に パリオリンピック")
+     print(translated_line)
+
+     translated_line = translation("Translate English to Japanese.", "U.S. Women's Volleyball Will Try For Back-to-Back Golds After Defeating Rival Brazil in Five-Set Thriller")
+     print(translated_line)
+
+     translated_line = translation("Translate Mandarin to Japanese.", "2024巴黎奥运中国队一日三金!举重双卫冕,花游历史首金,女曲再创辉煌")
+     print(translated_line)
+
+     translated_line = translation("Translate French to Japanese.", "Handball aux JO 2024 : Laura Glauser et Hatadou Sako, l’assurance tous risques de l’équipe de France")
+     print(translated_line)
+
+ ```
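The client script above posts to the llama.cpp server's /completion endpoint and reads the generated text from the content field of the JSON response. As a quick sanity check that the server is reachable before running the script, a request can also be sent with curl. This is only a sketch, assuming the default port 8080 used in the script and a deliberately short, hypothetical test prompt rather than the full translation template:
```
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello", "n_predict": 16}'
```
If this returns a JSON object with a content field, the server is ready for the client script.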