louisbrulenaudet committed on
Commit 11ab34e
1 Parent(s): 5d87d74

Update README.md

Files changed (1):
  1. README.md +57 -7
README.md CHANGED
@@ -4,21 +4,54 @@ language:
  - en
  library_name: transformers
  pipeline_tag: text-generation
- ---
- 9o
- ---
- license:
- base_model:
  tags:
  - merge
  - mergekit
- - lazymergekit
+ - ConvexAI/Luminex-34B-v0.2
+ - fblgit/UNA-34BeagleSimpleMath-32K-v1
+ - chemistry
+ - biology
+ - math
+ base_model:
+ - ConvexAI/Luminex-34B-v0.2
+ - fblgit/UNA-34BeagleSimpleMath-32K-v1
+ model-index:
+ - name: Maxine-34B-stock
+   results:
+   - task:
+       type: text-generation
+     metrics:
+     - name: Average
+       type: Average
+       value: 77.28
+     - name: ARC
+       type: ARC
+       value: 74.06
+     - name: GSM8K
+       type: GSM8K
+       value: 72.18
+     - name: Winogrande
+       type: Winogrande
+       value: 83.9
+     - name: TruthfulQA
+       type: TruthfulQA
+       value: 70.18
+     - name: HellaSwag
+       type: HellaSwag
+       value: 86.74
+     source:
+       name: Open LLM Leaderboard
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
  ---
  
+ <center><img src='https://i.imgur.com/dU9dUh0.png' width='500px'></center>
+ 
  # Maxine-34B-stock
  
  Maxine-34B-stock is a merge of the following models:
  
+ **04-07-2024 - To date, louisbrulenaudet/Maxine-34B-stock is the "Best 🤝 base merges and moerges model of around 30B" on the Open LLM Leaderboard.**
+ 
  ## Configuration
  
  ```yaml
@@ -53,4 +86,21 @@ pipeline = transformers.pipeline(
  
  outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
  print(outputs[0]["generated_text"])
- ```
+ ```
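Only the tail of the card's inference snippet is visible in this hunk. A self-contained sketch of the pipeline it implies might look as follows; the model ID and sampling parameters come from this card, while the ChatML-style prompt template is an assumption, not something the diff confirms:

```python
# Hedged reconstruction of the truncated inference snippet.
# MODEL_ID comes from this card; the sampling parameters come from the
# visible `outputs = pipeline(...)` line. The ChatML-style prompt
# template below is an assumption.

MODEL_ID = "louisbrulenaudet/Maxine-34B-stock"


def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt (ChatML-style, assumed)."""
    return (
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


def main() -> None:
    # Imported lazily so the prompt helper is usable without transformers.
    import transformers

    # A 34B model needs significant GPU memory; device_map="auto"
    # shards layers across available devices (requires accelerate).
    pipeline = transformers.pipeline(
        "text-generation",
        model=MODEL_ID,
        torch_dtype="auto",
        device_map="auto",
    )
    prompt = build_prompt("Explain what a model merge is.")
    outputs = pipeline(
        prompt,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
        top_k=50,
        top_p=0.95,
    )
    print(outputs[0]["generated_text"])


if __name__ == "__main__":
    main()
```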
+ 
+ ## Citing & Authors
+ 
+ If you use this code in your research, please use the following BibTeX entry.
+ 
+ ```BibTeX
+ @misc{louisbrulenaudet2024,
+   author = {Louis Brulé Naudet},
+   title = {Maxine-34B-stock, an extraordinary 34B model},
+   year = {2024},
+   howpublished = {\url{https://huggingface.co/louisbrulenaudet/Maxine-34B-stock}},
+ }
+ ```
+ 
+ ## Feedback
+ 
+ If you have any feedback, please reach out at [[email protected]](mailto:[email protected]).
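The card's `## Configuration` YAML is elided in this diff. Given the model's name and the two base models in the metadata, the merge was plausibly built with mergekit's `model_stock` method; a hypothetical config of that shape (not the card's actual file) could look like:

```yaml
# Hypothetical mergekit config sketch -- the card's real YAML is elided
# in this diff. Model IDs are taken from the base_model metadata above.
models:
  - model: ConvexAI/Luminex-34B-v0.2
  - model: fblgit/UNA-34BeagleSimpleMath-32K-v1
merge_method: model_stock
base_model: ConvexAI/Luminex-34B-v0.2
dtype: bfloat16
```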