sophosympatheia committed
Commit fed9415
1 Parent(s): e4b31b4

Update README.md
Files changed (1)
  1. README.md +22 -6
README.md CHANGED
@@ -6,9 +6,25 @@ tags:
  - merge

  ---
- # midnight-miqu-103b-v1.1
+ <div style="width: auto; margin-left: auto; margin-right: auto">
+ <img src="https://i.imgur.com/Tn9MBg6.png" alt="MidnightMiqu" style="width: 100%; min-width: 400px; display: block; margin: auto;">
+ </div>

- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+ ### Overview
+
+ This is a 103B frankenmerge of [sophosympatheia/Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0) with itself. Please see that model card for details and usage instructions.
+ This model is based on Miqu, so it's capable of 32K context.
+
+ ### Quantizations
+ * Pending
+ * If you don't see something you're looking for, [try searching Hugging Face](https://huggingface.co/models?search=midnight-miqu-103b). There may be newer quants available than what I've documented here.
+
+ ### Licence and usage restrictions
+
+ <font color="red">152334H/miqu-1-70b-sf was based on a leaked version of one of Mistral's models.</font>
+ All miqu-derived models, including this merge, are **only suitable for personal use.** Mistral has been cool about it so far, but you should be aware that by downloading this merge you are assuming whatever legal risk is inherent in acquiring and using a model based on leaked weights.
+ This merge comes with no warranties or guarantees of any kind, but you probably already knew that.
+ I am not a lawyer and I do not profess to know what we have gotten ourselves into here. You should consult with a lawyer before using any Hugging Face model beyond private use... but definitely don't use this one for that!

  ## Merge Details
  ### Merge Method
@@ -18,7 +34,7 @@ This model was merged using the passthrough merge method.
  ### Models Merged

  The following models were included in the merge:
- * /home/llm/mergequant/models/midnight-miqu-70b-slerp-v0.2
+ * [sophosympatheia/Midnight-Miqu-70B-v1.0](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0)

  ### Configuration

@@ -27,13 +43,13 @@ The following YAML configuration was used to produce this model:
  ```yaml
  slices:
  - sources:
-   - model: /home/llm/mergequant/models/midnight-miqu-70b-slerp-v0.2
+   - model: /home/llm/mergequant/models/midnight-miqu-70b
      layer_range: [0, 40] # 40
  - sources:
-   - model: /home/llm/mergequant/models/midnight-miqu-70b-slerp-v0.2
+   - model: /home/llm/mergequant/models/midnight-miqu-70b
      layer_range: [20, 60] # 40
  - sources:
-   - model: /home/llm/mergequant/models/midnight-miqu-70b-slerp-v0.2
+   - model: /home/llm/mergequant/models/midnight-miqu-70b
      layer_range: [40, 80] # 40
  merge_method: passthrough
  dtype: float16
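
For anyone curious how three overlapping 40-layer slices of a 70B model end up in the "103B" class, here is a rough back-of-the-envelope sketch in Python. It only restates the `layer_range` entries from the YAML above; the 80-layer count and the embedding/per-layer parameter split are assumptions typical of Llama-2-70B-style models such as Miqu, not figures taken from this model card.

```python
# Rough sketch of the passthrough frankenmerge arithmetic (illustrative assumptions).
slices = [(0, 40), (20, 60), (40, 80)]   # layer_range values from the config above
merged_layers = sum(end - start for start, end in slices)
print(merged_layers)                      # 120 layers stacked end to end

base_layers = 80        # assumed layer count of a Llama-2-70B-style base
base_params = 69e9      # its actual parameter count is just under 70B
embed_params = 0.5e9    # rough guess: token embeddings + output head
per_layer = (base_params - embed_params) / base_layers

merged_params = embed_params + per_layer * merged_layers
print(f"~{merged_params / 1e9:.0f}B")     # prints ~103B
```

Since passthrough concatenates slices without blending weights, the layers in the overlap regions (20-40 and 40-60) simply appear twice in the merged stack, which is where the extra 40 layers come from.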