tclf90 committed
Commit • e624633
Parent(s): c37fb83
"Model tuning; fine-tuning to optimize quantization loss"

- LICENSE +84 -0
- README.md +79 -0
- config.json +1 -1
- model-00001-of-00002.safetensors +2 -2
- model-00002-of-00002.safetensors +2 -2

LICENSE
ADDED
@@ -0,0 +1,84 @@
The glm-4-9b License

1. Definitions

“Licensor” means the glm-4-9b Model Team that distributes its Software.
“Software” means the glm-4-9b model parameters made available under this license.

2. License

Subject to the terms and conditions of this License, Licensor hereby grants you a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable, royalty-free copyright license.
This license allows you to use all open source models in this repository free of charge for academic research. Users who wish to use the models for commercial purposes must complete registration [here](https://open.bigmodel.cn/mla/form). Registered users may use this model for commercial activities free of charge, but must comply with all terms and conditions of this license.
The copyright notice and this license notice shall be included in all copies or substantial portions of the Software.
If you distribute or provide THUDM / Zhipu AI materials on the glm-4 open source model (or any derivative works thereof), or products or services that use any materials therein (including all open source models of the glm-4 series), you shall:

(A) provide a copy of this Agreement with any such THUDM / Zhipu AI materials;
(B) prominently display "Built with glm-4" on the relevant website, user interface, blog post, about page, or product documentation.
If you use materials from THUDM / Zhipu AI's glm-4 open source model to create, train, fine-tune, or otherwise improve an AI model that is distributed or made available, you shall also add "glm-4" at the beginning of the name of any such AI model.

3. Restrictions

You shall not use, copy, modify, merge, publish, distribute, reproduce, or create derivative works of all or any part of this Software for any military or unlawful purposes.
You shall not use this Software to engage in any conduct that endangers national security or national unity, harms the public interest or public order and good morals, or infringes upon the rights and interests of others, including trade secrets, intellectual property, reputation, portrait rights, and property rights.
In using the Software, you shall comply with the applicable laws, regulations, policies, and ethical standards of the place of use.

4. Disclaimer

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

5. Limitation of Liability

EXCEPT TO THE EXTENT PROHIBITED BY APPLICABLE LAW, IN NO EVENT AND UNDER NO LEGAL THEORY, WHETHER BASED IN TORT,
NEGLIGENCE, CONTRACT, LIABILITY, OR OTHERWISE WILL ANY LICENSOR BE LIABLE TO YOU FOR ANY DIRECT, INDIRECT, SPECIAL,
INCIDENTAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES, OR ANY OTHER COMMERCIAL LOSSES, EVEN IF THE LICENSOR HAS BEEN ADVISED
OF THE POSSIBILITY OF SUCH DAMAGES.

6. Dispute Resolution

This license shall be governed by and construed in accordance with the laws of the People's Republic of China. Any dispute
arising from or in connection with this License shall be submitted to the Haidian District People's Court in Beijing.

Note that the license is subject to update to a more comprehensive version. For any questions related to the license and
copyright, please contact us at [email protected].
README.md
ADDED
@@ -0,0 +1,79 @@
---
license: GLM-4
license_name: glm-4-9b
license_link: LICENSE
pipeline_tag: text-generation
tags:
- chatglm
- gptq
- int4
- 量化修复
- vLLM
---

# GLM-4-9B-Chat-GPTQ-Int4-量化修复

Original model: [ZhipuAI/glm-4-9b-chat](https://www.modelscope.cn/models/ZhipuAI/glm-4-9b-chat)


### 【Model Update Date】
`2024-06-06 00:20`

### 【Model Size】
`6.9GB`

### 【Status Notes, 06-06】

1. For now, the model must be launched via a vLLM entrypoint.
2. I have gone back and forth on this model several times; it is hard to quantize.
   The root cause is that the original authors chose a rather bold `layernorm_epsilon: 1.5625e-07`,
   and a model trained with that value is very hard to keep accurate in `fp16/half`.
3. Calibration is now mostly done; the `2024-06-06 00:20` version is the one to use.
4. For competitions, I instead recommend the int8 model [GLM-4-9B-Chat-GPTQ-Int8-量化修复](https://www.modelscope.cn/models/tclf90/glm-4-9b-chat-GPTQ-Int8), which is more robust.

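The `layernorm_epsilon` concern above can be made concrete: 1.5625e-07 falls in float16's subnormal range, where representable values are spaced 2^-24 ≈ 6e-08 apart, so the epsilon loses roughly 14% of its value when stored in half precision, while the repaired 2e-06 survives with about 1% error. A minimal stdlib sketch (the helper name is mine, not from this repo):

```python
import struct

def to_fp16_and_back(x: float) -> float:
    """Round-trip a Python float through IEEE-754 half precision ('e' format)."""
    return struct.unpack("e", struct.pack("e", x))[0]

for eps in (1.5625e-07, 2e-06):
    rt = to_fp16_and_back(eps)
    rel_err = abs(rt - eps) / eps
    print(f"{eps:g} -> {rt:g} (relative error {rel_err:.1%})")
```

Both values are subnormal in fp16, but the larger epsilon sits far enough above the 2^-24 quantum that rounding barely moves it.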
### 【Changelog】

```
2024-06-06 00:20
1. Recalibrated the model
2. Fixed the incorrect layernorm_epsilon value
3. Fixed dual-GPU startup failures on some devices (may not be fully resolved)

2024-06-05 21:00
1. Attempted to fix the "!!!" exclamation-mark spam issue
2. Changed group_size to 64 to reduce quantization precision loss
```

### 【Introduction】

GLM-4-9B is the open-source version of the latest generation of Zhipu AI's GLM-4 series of pre-trained models. On benchmarks covering semantics, mathematics, reasoning, code, and knowledge, GLM-4-9B and its human-preference-aligned version GLM-4-9B-Chat both perform strongly. Besides multi-turn dialogue, GLM-4-9B-Chat offers advanced capabilities such as web browsing, code execution, custom tool calling (Function Call), and long-text reasoning (up to 128K context). This generation adds multilingual support for 26 languages, including Japanese, Korean, and German. A model supporting 1M context length (about 2 million Chinese characters) has also been released.

[More details...](https://www.modelscope.cn/models/ZhipuAI/glm-4-9b-chat/summary)

### 【Quantization Repair】

The quantization strategies of the existing `AWQ` and `GPTQ` algorithms have been tuned. An `Int3` model carrying the `量化修复` (quantization repair) tag can rival an `Int8` model produced by the default `AWQ` or `GPTQ` algorithms.

1. Quantization repair greatly reduces the failure modes through which quantization loss makes a model unusable, such as (1) garbled output, (2) infinite loops, and (3) loss of long-text capability.

2. After tuning, the `AWQ` and `GPTQ` models show no clear difference in capability. Since `GPTQ` has the best concurrent inference efficiency on the `vLLM` engine, `AWQ` models are no longer produced.

3. More to be added once this work is complete...

### 【Quantization-Repair Models in This Batch】
To be added once this work is complete...

### 【Model Download】
```python
from modelscope import snapshot_download
# Replace '模型名' (model name) and "本地路径" (local cache path) with your own values.
model_dir = snapshot_download('tclf90/模型名', cache_dir="本地路径")
```

### 【[vLLM](https://github.com/vllm-project/vllm) Inference (currently Linux only)】
#### 1. Quick debugging in Python

To be added once this work is complete...

#### 2. ChatGPT-style RESTful API server
```
>>> python -m vllm.entrypoints.openai.api_server --model 本地路径/tclf90/模型名称
```
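Once the server above is running, it exposes an OpenAI-compatible `/v1/chat/completions` route. A minimal client sketch using only the standard library; the port (vLLM's default 8000) is an assumption, and the model string must match whatever you passed to `--model`:

```python
import json
import urllib.request

# Assumed: must match the --model value passed to the api_server.
MODEL = "本地路径/tclf90/模型名称"

def build_chat_request(prompt: str, model: str = MODEL) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
        "max_tokens": 256,
    }

def send(payload: dict, base_url: str = "http://localhost:8000") -> dict:
    """POST the payload to the vLLM OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("你好")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

Call `send(payload)` against a live server; the reply text is at `choices[0].message.content` in the response JSON.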
config.json
CHANGED
@@ -28,7 +28,7 @@
     "hidden_dropout": 0.0,
     "hidden_size": 4096,
     "kv_channels": 128,
-    "layernorm_epsilon": 1.5625e-07,
+    "layernorm_epsilon": 2e-06,
     "model_type": "chatglm",
     "multi_query_attention": true,
     "multi_query_group_num": 2,
model-00001-of-00002.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:608cf244087fcfd5906dc2fd7d1aadf50c1af0b0200367233041b5dfe79e55bb
+size 4995499776
model-00002-of-00002.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:11781ced3a34947f3d4872aad2ca650488cbff962d825d5b1f192e9a3be0628b
+size 1893310824
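The two safetensors entries above are Git LFS pointer files: the repository stores only a `sha256` oid and a byte size, and LFS fetches the actual weights from those fields. A small sketch of parsing such a pointer (the helper name is mine; the pointer text is the first shard's from the diff above):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:608cf244087fcfd5906dc2fd7d1aadf50c1af0b0200367233041b5dfe79e55bb
size 4995499776
"""

info = parse_lfs_pointer(pointer)
print(info["oid"], int(info["size"]))
```

Comparing the parsed `size` against the downloaded file's byte count is a quick way to check that LFS actually resolved the pointer rather than leaving the stub in place.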