diff --git a/ptuning/README.md b/ptuning/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e3339ce65fc91dd46c027f36d4038e97355cb535
--- /dev/null
+++ b/ptuning/README.md
@@ -0,0 +1,248 @@
+# ChatGLM-6B-PT
+本仓库实现了对于 ChatGLM-6B 模型基于 [P-Tuning v2](https://github.com/THUDM/P-tuning-v2) 的微调。P-Tuning v2 将需要微调的参数量减少到原来的 0.1%,再通过模型量化、Gradient Checkpoint 等方法,最低只需要 7GB 显存即可运行。
+
+下面以 [ADGEN](https://aclanthology.org/D19-1321.pdf) (广告生成) 数据集为例介绍代码的使用方法。
+
+*Read this in [English](README_en.md).*
+
+## 软件依赖
+运行微调需要 4.27.1 版本的 `transformers`。除 ChatGLM-6B 的依赖之外,还需要安装以下依赖:
+```
+pip install rouge_chinese nltk jieba datasets
+```
+## 使用方法
+
+### 下载数据集
+ADGEN 数据集任务为根据输入(content)生成一段广告词(summary)。
+
+```json
+{
+    "content": "类型#上衣*版型#宽松*版型#显瘦*图案#线条*衣样式#衬衫*衣袖型#泡泡袖*衣款式#抽绳",
+    "summary": "这件衬衫的款式非常的宽松,利落的线条可以很好的隐藏身材上的小缺点,穿在身上有着很好的显瘦效果。领口装饰了一个可爱的抽绳,漂亮的绳结展现出了十足的个性,配合时尚的泡泡袖型,尽显女性甜美可爱的气息。"
+}
+```
+
+从 [Google Drive](https://drive.google.com/file/d/13_vf0xRTQsyneRKdD1bZIr93vBGOczrk/view?usp=sharing) 或者 [Tsinghua Cloud](https://cloud.tsinghua.edu.cn/f/b3f119a008264b1cabd1/?dl=1) 下载处理好的 ADGEN 数据集,将解压后的 `AdvertiseGen` 目录放到本目录下。
+
+### 训练
+
+#### P-tuning v2
+
+运行以下指令进行训练:
+```shell
+bash train.sh
+```
+`train.sh` 中的 `PRE_SEQ_LEN` 和 `LR` 分别是 soft prompt 长度和训练的学习率,可以进行调节以取得最佳的效果。P-Tuning v2 方法会冻结全部的模型参数,可通过调整 `quantization_bit` 来设置原始模型的量化等级,不加此选项则为 FP16 精度加载。
+
+在默认配置 `quantization_bit=4`、`per_device_train_batch_size=1`、`gradient_accumulation_steps=16` 下,INT4 的模型参数被冻结,一次训练迭代会以 1 的批处理大小进行 16 次累加的前后向传播,等效为 16 的总批处理大小,此时最低只需 6.7G 显存。若想在同等批处理大小下提升训练效率,可在二者乘积不变的情况下,加大 `per_device_train_batch_size` 的值,但也会带来更多的显存消耗,请根据实际情况酌情调整。
+
+如果你想要[从本地加载模型](https://github.com/THUDM/ChatGLM-6B#%E4%BB%8E%E6%9C%AC%E5%9C%B0%E5%8A%A0%E8%BD%BD%E6%A8%A1%E5%9E%8B),可以将 `train.sh` 中的 `THUDM/chatglm-6b` 改为你本地的模型路径。
+
+#### Finetune
+
+如果需要进行全参数的 Finetune,需要安装 [Deepspeed](https://github.com/microsoft/DeepSpeed),然后运行以下指令:
+
+```shell
+bash ds_train_finetune.sh
+```
+
+### 推理
+
+将 `evaluate.sh` 中的 `CHECKPOINT` 更改为训练时保存的 checkpoint 名称,运行以下指令进行模型推理和评测:
+```shell
+bash evaluate.sh
+```
+**[2023/04/10更新]** 在 P-tuning v2 训练时模型只保存 PrefixEncoder 部分的参数,所以在推理时需要同时加载原 ChatGLM-6B 模型以及 PrefixEncoder 的权重,因此需要指定参数(已更新 `evaluate.sh`):
+
+```shell
+--model_name_or_path THUDM/chatglm-6b
+--ptuning_checkpoint $CHECKPOINT_PATH
+```
+
+仍然兼容旧版全参保存的 Checkpoint,只需要跟之前一样设定 `model_name_or_path`:
+
+```shell
+--model_name_or_path $CHECKPOINT_PATH
+```
+
+评测指标为中文 Rouge score 和 BLEU-4。生成的结果保存在
+`./output/adgen-chatglm-6b-pt-8-1e-2/generated_predictions.txt`。
+
+### 例子
+#### 示例1
+* Input: 类型#上衣\*材质#牛仔布\*颜色#白色\*风格#简约\*图案#刺绣\*衣样式#外套\*衣款式#破洞
+* Label: 简约而不简单的牛仔外套,白色的衣身十分百搭。衣身多处有做旧破洞设计,打破单调乏味,增加一丝造型看点。衣身后背处有趣味刺绣装饰,丰富层次感,彰显别样时尚。
+* Output[微调前]: 这件上衣的材质是牛仔布,颜色是白色,风格是简约,图案是刺绣,衣样式是外套,衣款式是破洞。
+* Output[微调后]: 这是一款简约的牛仔外套,破洞设计,将牛仔布破洞,带来一种随意与个性。破洞的牛仔外套,展现出时尚气息,带来一种休闲感。同时,刺绣图案,让整件外套更加立体。
+
+#### 示例2
+
+* Input: 类型#裙\*版型#显瘦\*风格#文艺\*风格#简约\*图案#印花\*图案#撞色\*裙下摆#压褶\*裙长#连衣裙\*裙领型#圆领
+* Label: 文艺个性的印花连衣裙,藏青色底蕴,低调又大气,撞色太阳花分布整个裙身,绚丽而美好,带来时尚减龄的气质。基础款的舒适圆领,简约不失大方,勾勒精致脸庞。领后是一粒包布扣固定,穿脱十分方便。前片立体的打褶设计,搭配后片压褶的做工,增添层次和空间感,显瘦又有型。
+* Output[微调前]: 类型#裙*版型#显瘦*风格#文艺*风格#简约*图案#印花*图案#撞色*裙下摆#压褶*裙长#连衣裙*裙领型#圆领 1\. 连衣裙:简约风格,裙长为膝盖以上,裙领型为圆领。2\. 裙下摆:压褶设计,使裙摆呈现出流畅的褶皱效果。3\. 裙领型:裙领型为圆领,使穿上连衣裙后更加有型。4\. 版型:采用显瘦设计,让连衣裙看起来更加苗条。5\. 风格:文艺风格,让连衣裙更加有内涵和品味。6\. 图案:印花设计,在连衣裙上印有独特的图案。7\. 
撞色:采用撞色设计,让连衣裙在色彩上更加鲜明、富有层次感。 +* Output[微调后]: 这是一款文艺范的连衣裙,以印花为元素,采用简约的印花,既能够突出文艺气质,又能够展现简约风。在印花的同时又有领子和裙摆的压褶设计,更加凸显文艺气质。简约而不会过于单调,搭配出街,穿着十分舒适。 + +### 评估结果 + +| | Finetune | P-tuning v2 | LoRA | +| ------------- | ----------- | ----- | ------------- | +| BLEU-4 | 8.01 | 8.10 | 7.62 | +| Rouge-1 | 31.23 | 31.12 | 30.60 | +| Rouge-2 | 7.36 | 7.11 | 6.96 | +| Rouge-l | 25.08 | 24.97 | 24.80 | +| Training Loss | 3.00 | 3.74 | 3.32 | + + + +#### 实验设置 + + ``` +max_source_length=64 +max_target_length=64 +max_steps=3000 + ``` + +##### P-tuning v2 + +``` +pre_seq_len=128 +learning_rate=2e-2 +quantization_bit=4 +per_device_train_batch_size=16 +gradient_accumulation_steps=1 +``` + +##### Finetune + +``` +learning_rate=1e-4 +fp16 +num_gpus=4 +per_device_train_batch_size=4 +gradient_accumulation_steps=1 +``` + +##### LoRA + +实现采用的是 [simple_thu_chatglm6b](https://github.com/yuanzhoulvpi2017/zero_nlp/tree/main/simple_thu_chatglm6b) + +``` +learning_rate=5e-4 +per_device_train_batch_size=16 +gradient_accumulation_steps=1 +``` + + + +## 模型部署 +首先载入Tokenizer: + +```python +import os +import torch +from transformers import AutoConfig, AutoModel, AutoTokenizer + +# 载入Tokenizer +tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True) +``` + +1. 如果需要加载的是新 Checkpoint(只包含 PrefixEncoder 参数): + +```python +config = AutoConfig.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True, pre_seq_len=128) +model = AutoModel.from_pretrained("THUDM/chatglm-6b", config=config, trust_remote_code=True) +prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin")) +new_prefix_state_dict = {} +for k, v in prefix_state_dict.items(): + if k.startswith("transformer.prefix_encoder."): + new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v +model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict) +``` +注意你可能需要将 `pre_seq_len` 改成你训练时的实际值。如果你是[从本地加载模型](https://github.com/THUDM/ChatGLM-6B#%E4%BB%8E%E6%9C%AC%E5%9C%B0%E5%8A%A0%E8%BD%BD%E6%A8%A1%E5%9E%8B)的话,需要将 `THUDM/chatglm-6b` 改成本地的模型路径(注意不是checkpoint路径)。 + +2. 如果需要加载的是旧 Checkpoint(包含 ChatGLM-6B 以及 PrefixEncoder 参数),或者进行的是全参数微调,则直接加载整个 Checkpoint: + +```python +model = AutoModel.from_pretrained(CHECKPOINT_PATH, trust_remote_code=True) +``` + +之后根据需求可以进行量化,也可以直接使用: + +```python +# Comment out the following line if you don't use quantization +model = model.quantize(4) +model = model.half().cuda() +model.transformer.prefix_encoder.float() +model = model.eval() + +response, history = model.chat(tokenizer, "你好", history=[]) +``` + +## 使用自己的数据集 +修改 `train.sh` 和 `evaluate.sh` 中的 `train_file`、`validation_file`和`test_file`为你自己的 JSON 格式数据集路径,并将 `prompt_column` 和 `response_column` 改为 JSON 文件中输入文本和输出文本对应的 KEY。可能还需要增大 `max_source_length` 和 `max_target_length` 来匹配你自己的数据集中的最大输入输出长度。 + +## 对话数据集 + +如需要使用多轮对话数据对模型进行微调,可以提供聊天历史,例如 + +```json +{ + "prompt": "是的。上下水管都好的", + "response": "那就要检查线路了,一般风扇继电器是由电脑控制吸合的,如果电路存在断路,或者电脑坏了的话会出现继电器不吸合的情况!", + "history": [ + [ + "长城h3风扇不转。继电器好的。保险丝好的传感器新的风扇也新的这是为什么。就是继电器缺一个信号线", + "用电脑能读数据流吗?水温多少" + ], + [ + "95", + "上下水管温差怎么样啊?空气是不是都排干净了呢?" + ] + ] +} +``` + +训练时需要指定 `--history_column` 为数据中聊天历史的 key(在此例子中是 `history`),将自动把聊天历史拼接,例如: + +- Input + + ``` + [Round 0] + 问:长城h3风扇不转。继电器好的。保险丝好的传感器新的风扇也新的这是为什么。就是继电器缺一个信号线 + 答:用电脑能读数据流吗?水温多少 + [Round 1] + 问:95 + 答:上下水管温差怎么样啊?空气是不是都排干净了呢? + [Round 2] + 问:是的。上下水管都好的 + 答: + ``` + +- Label + + ``` + 那就要检查线路了,一般风扇继电器是由电脑控制吸合的,如果电路存在断路,或者电脑坏了的话会出现继电器不吸合的情况! 
+  ```
+
+要注意超过输入长度 `max_source_length` 的内容会被截断。
+
+可以参考以下指令:
+
+```shell
+bash train_chat.sh
+```
+
+## 引用
+
+```
+@inproceedings{liu2022p,
+  title={P-tuning: Prompt tuning can be comparable to fine-tuning across scales and tasks},
+  author={Liu, Xiao and Ji, Kaixuan and Fu, Yicheng and Tam, Weng and Du, Zhengxiao and Yang, Zhilin and Tang, Jie},
+  booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)},
+  pages={61--68},
+  year={2022}
+}
+```
+
+
+
diff --git a/ptuning/README_en.md b/ptuning/README_en.md
new file mode 100644
index 0000000000000000000000000000000000000000..9282da32c467eb17316c05b65a5522f99d149340
--- /dev/null
+++ b/ptuning/README_en.md
@@ -0,0 +1,115 @@
+# ChatGLM-6B-PT
+This repository implements fine-tuning of the ChatGLM-6B model based on [P-Tuning v2](https://github.com/THUDM/P-tuning-v2). P-Tuning v2 reduces the number of parameters that need to be optimized to 0.1% of those required for full fine-tuning, and combined with model quantization, gradient checkpointing and other techniques, it can run with as little as 7GB of GPU memory.
+
+The following uses the [ADGEN](https://aclanthology.org/D19-1321.pdf) (advertising generation) dataset as an example to introduce how to use the code.
+
+## Software dependencies
+Running P-Tuning requires version 4.27.1 of `transformers`. In addition to the dependencies of ChatGLM-6B, install the following dependencies:
+```
+pip install rouge_chinese nltk jieba datasets
+```
+## Instructions
+
+### Download the dataset
+The task of the ADGEN dataset is to generate a piece of advertising copy (summary) based on the input (content).
+
+```json
+{
+    "content": "类型#上衣*版型#宽松*版型#显瘦*图案#线条*衣样式#衬衫*衣袖型#泡泡袖*衣款式#抽绳",
+    "summary": "这件衬衫的款式非常的宽松,利落的线条可以很好的隐藏身材上的小缺点,穿在身上有着很好的显瘦效果。领口装饰了一个可爱的抽绳,漂亮的绳结展现出了十足的个性,配合时尚的泡泡袖型,尽显女性甜美可爱的气息。"
+}
+```
+
+Download the processed ADGEN dataset from [Google Drive](https://drive.google.com/file/d/13_vf0xRTQsyneRKdD1bZIr93vBGOczrk/view?usp=sharing) or [Tsinghua Cloud](https://cloud.tsinghua.edu.cn/f/b3f119a008264b1cabd1/?dl=1), and put the decompressed `AdvertiseGen` directory into this directory.
+
+### Training
+Run the following commands for training:
+```shell
+bash train.sh
+```
+`PRE_SEQ_LEN` and `LR` in `train.sh` are the soft prompt length and the training learning rate respectively, and can be adjusted to achieve the best results. The P-Tuning v2 method freezes all of the original model's parameters; the quantization level of the original model can be set via `quantization_bit`. If this option is omitted, the model is loaded in FP16 precision.
+
+Under the default configuration of `quantization_bit=4`, `per_device_train_batch_size=1` and `gradient_accumulation_steps=16`, the INT4 model parameters are frozen, and one training iteration performs 16 accumulated forward and backward passes with a batch size of 1, which is equivalent to a total batch size of 16; in this setup a minimum of only 6.7GB of GPU memory is required. If you want to improve training efficiency at the same total batch size, you can increase `per_device_train_batch_size` while keeping the product of the two unchanged, but this also brings more GPU memory consumption, so please adjust it according to your actual situation.
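+
+For reference, the following is a minimal sketch of what a `train.sh` invocation with the parameters discussed above could look like. The entry-point script name (`main.py`) and the data file names under `AdvertiseGen/` are assumptions rather than details taken from this README; every flag shown is either defined in `arguments.py` or a standard `Seq2SeqTrainingArguments` option. Check the actual `train.sh` shipped in this directory for the authoritative version:
+
+```shell
+# Hypothetical sketch of a P-Tuning v2 run; main.py and the dataset file names are assumptions.
+PRE_SEQ_LEN=128   # soft prompt length
+LR=2e-2           # learning rate
+
+CUDA_VISIBLE_DEVICES=0 python3 main.py \
+    --do_train \
+    --train_file AdvertiseGen/train.json \
+    --validation_file AdvertiseGen/dev.json \
+    --prompt_column content \
+    --response_column summary \
+    --model_name_or_path THUDM/chatglm-6b \
+    --output_dir output/adgen-chatglm-6b-pt-$PRE_SEQ_LEN-$LR \
+    --max_source_length 64 \
+    --max_target_length 64 \
+    --per_device_train_batch_size 1 \
+    --gradient_accumulation_steps 16 \
+    --max_steps 3000 \
+    --learning_rate $LR \
+    --pre_seq_len $PRE_SEQ_LEN \
+    --quantization_bit 4
+```
+
+With `per_device_train_batch_size=1` and `gradient_accumulation_steps=16`, the effective batch size is 1 × 16 = 16, matching the default configuration described above.
+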
+### Inference
+
+Change `CHECKPOINT` in `evaluate.sh` to the checkpoint name saved during training, and run the following commands for model inference and evaluation:
+```shell
+bash evaluate.sh
+```
+
+The evaluation metrics are Chinese Rouge score and BLEU-4. The generated results are saved in
+`./output/adgen-chatglm-6b-pt-8-1e-2/generated_predictions.txt`.
+
+### Examples
+#### Example 1
+* Input: 类型#上衣\*材质#牛仔布\*颜色#白色\*风格#简约\*图案#刺绣\*衣样式#外套\*衣款式#破洞
+* Label: 简约而不简单的牛仔外套,白色的衣身十分百搭。衣身多处有做旧破洞设计,打破单调乏味,增加一丝造型看点。衣身后背处有趣味刺绣装饰,丰富层次感,彰显别样时尚。
+* Output[微调前]: 这件上衣的材质是牛仔布,颜色是白色,风格是简约,图案是刺绣,衣样式是外套,衣款式是破洞。
+* Output[微调后]: 这是一款简约的牛仔外套,破洞设计,将牛仔布破洞,带来一种随意与个性。破洞的牛仔外套,展现出时尚气息,带来一种休闲感。同时,刺绣图案,让整件外套更加立体。
+
+#### Example 2
+
+* Input: 类型#裙\*版型#显瘦\*风格#文艺\*风格#简约\*图案#印花\*图案#撞色\*裙下摆#压褶\*裙长#连衣裙\*裙领型#圆领
+* Label: 文艺个性的印花连衣裙,藏青色底蕴,低调又大气,撞色太阳花分布整个裙身,绚丽而美好,带来时尚减龄的气质。基础款的舒适圆领,简约不失大方,勾勒精致脸庞。领后是一粒包布扣固定,穿脱十分方便。前片立体的打褶设计,搭配后片压褶的做工,增添层次和空间感,显瘦又有型。
+* Output[微调前]: 类型#裙*版型#显瘦*风格#文艺*风格#简约*图案#印花*图案#撞色*裙下摆#压褶*裙长#连衣裙*裙领型#圆领 1\. 连衣裙:简约风格,裙长为膝盖以上,裙领型为圆领。2\. 裙下摆:压褶设计,使裙摆呈现出流畅的褶皱效果。3\. 裙领型:裙领型为圆领,使穿上连衣裙后更加有型。4\. 版型:采用显瘦设计,让连衣裙看起来更加苗条。5\. 风格:文艺风格,让连衣裙更加有内涵和品味。6\. 图案:印花设计,在连衣裙上印有独特的图案。7\. 撞色:采用撞色设计,让连衣裙在色彩上更加鲜明、富有层次感。
+* Output[微调后]: 这是一款文艺范的连衣裙,以印花为元素,采用简约的印花,既能够突出文艺气质,又能够展现简约风。在印花的同时又有领子和裙摆的压褶设计,更加凸显文艺气质。简约而不会过于单调,搭配出街,穿着十分舒适。
+
+### Evaluation Results
+
+|         | P-tuning v2 | LoRA  |
+| ------- | ----------- | ----- |
+| BLEU-4  | 7.71        | 6.13  |
+| Rouge-1 | 31.35       | 28.36 |
+| Rouge-2 | 7.19        | 4.38  |
+| Rouge-l | 25.17       | 17.54 |
+
+#### Experiment Settings
+
+```
+max_source_length=64
+max_target_length=64
+per_device_train_batch_size=1
+gradient_accumulation_steps=16
+max_steps=3000
+```
+
+##### P-tuning v2
+
+```
+pre_seq_len=128
+learning_rate=2e-2
+quantization_bit=4
+```
+
+##### LoRA
+
+```
+learning_rate=5e-4
+```
+
+The implementation uses [simple_thu_chatglm6b](https://github.com/yuanzhoulvpi2017/zero_nlp/tree/main/simple_thu_chatglm6b).
+
+
+
+## Model Deployment
+Replace `THUDM/chatglm-6b` in the corresponding demo or code with the path of the checkpoint saved after P-Tuning (in the example, `./output/adgen-chatglm-6b-pt-8-1e-2/checkpoint-3000`). Note that the current fine-tuning does not support multi-turn data, so only the response from the first turn of the conversation is used for fine-tuning.
+
+## Use your own dataset
+Modify `train_file`, `validation_file` and `test_file` in `train.sh` and `evaluate.sh` to the paths of your own JSON-format dataset, and change `prompt_column` and `response_column` to the keys in the JSON file corresponding to the input text and output text.
+
+## TODO
+* [ ] Support for chat data
+* [ ] Support for full finetuning
+
+## Citation
+
+```
+@inproceedings{liu2022p,
+  title={P-tuning: Prompt tuning can be comparable to fine-tuning across scales and tasks},
+  author={Liu, Xiao and Ji, Kaixuan and Fu, Yicheng and Tam, Weng and Du, Zhengxiao and Yang, Zhilin and Tang, Jie},
+  booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)},
+  pages={61--68},
+  year={2022}
+}
+```
\ No newline at end of file
diff --git a/ptuning/arguments.py b/ptuning/arguments.py
new file mode 100644
index 0000000000000000000000000000000000000000..fda1f3522261f50768984402d9ac691557ea63f3
--- /dev/null
+++ b/ptuning/arguments.py
@@ -0,0 +1,224 @@
+from dataclasses import dataclass, field
+from typing import Optional
+
+
+@dataclass
+class ModelArguments:
+    """
+    Arguments pertaining to which model/config/tokenizer we are going to fine-tune from.
+ """ + + model_name_or_path: str = field( + metadata={"help": "Path to pretrained model or model identifier from huggingface.co/models"} + ) + ptuning_checkpoint: str = field( + default=None, metadata={"help": "Path to p-tuning v2 checkpoints"} + ) + config_name: Optional[str] = field( + default=None, metadata={"help": "Pretrained config name or path if not the same as model_name"} + ) + tokenizer_name: Optional[str] = field( + default=None, metadata={"help": "Pretrained tokenizer name or path if not the same as model_name"} + ) + cache_dir: Optional[str] = field( + default=None, + metadata={"help": "Where to store the pretrained models downloaded from huggingface.co"}, + ) + use_fast_tokenizer: bool = field( + default=True, + metadata={"help": "Whether to use one of the fast tokenizer (backed by the tokenizers library) or not."}, + ) + model_revision: str = field( + default="main", + metadata={"help": "The specific model version to use (can be a branch name, tag name or commit id)."}, + ) + use_auth_token: bool = field( + default=False, + metadata={ + "help": ( + "Will use the token generated when running `huggingface-cli login` (necessary to use this script " + "with private models)." + ) + }, + ) + resize_position_embeddings: Optional[bool] = field( + default=None, + metadata={ + "help": ( + "Whether to automatically resize the position embeddings if `max_source_length` exceeds " + "the model's position embeddings." + ) + }, + ) + quantization_bit: Optional[int] = field( + default=None + ) + pre_seq_len: Optional[int] = field( + default=None + ) + prefix_projection: bool = field( + default=False + ) + + +@dataclass +class DataTrainingArguments: + """ + Arguments pertaining to what data we are going to input our model for training and eval. + """ + + lang: Optional[str] = field(default=None, metadata={"help": "Language id for summarization."}) + + dataset_name: Optional[str] = field( + default=None, metadata={"help": "The name of the dataset to use (via the datasets library)."} + ) + dataset_config_name: Optional[str] = field( + default=None, metadata={"help": "The configuration name of the dataset to use (via the datasets library)."} + ) + prompt_column: Optional[str] = field( + default=None, + metadata={"help": "The name of the column in the datasets containing the full texts (for summarization)."}, + ) + response_column: Optional[str] = field( + default=None, + metadata={"help": "The name of the column in the datasets containing the summaries (for summarization)."}, + ) + history_column: Optional[str] = field( + default=None, + metadata={"help": "The name of the column in the datasets containing the history of chat."}, + ) + train_file: Optional[str] = field( + default=None, metadata={"help": "The input training data file (a jsonlines or csv file)."} + ) + validation_file: Optional[str] = field( + default=None, + metadata={ + "help": ( + "An optional input evaluation data file to evaluate the metrics (rouge) on (a jsonlines or csv file)." + ) + }, + ) + test_file: Optional[str] = field( + default=None, + metadata={ + "help": "An optional input test data file to evaluate the metrics (rouge) on (a jsonlines or csv file)." 
+ }, + ) + overwrite_cache: bool = field( + default=False, metadata={"help": "Overwrite the cached training and evaluation sets"} + ) + preprocessing_num_workers: Optional[int] = field( + default=None, + metadata={"help": "The number of processes to use for the preprocessing."}, + ) + max_source_length: Optional[int] = field( + default=1024, + metadata={ + "help": ( + "The maximum total input sequence length after tokenization. Sequences longer " + "than this will be truncated, sequences shorter will be padded." + ) + }, + ) + max_target_length: Optional[int] = field( + default=128, + metadata={ + "help": ( + "The maximum total sequence length for target text after tokenization. Sequences longer " + "than this will be truncated, sequences shorter will be padded." + ) + }, + ) + val_max_target_length: Optional[int] = field( + default=None, + metadata={ + "help": ( + "The maximum total sequence length for validation target text after tokenization. Sequences longer " + "than this will be truncated, sequences shorter will be padded. Will default to `max_target_length`." + "This argument is also used to override the ``max_length`` param of ``model.generate``, which is used " + "during ``evaluate`` and ``predict``." + ) + }, + ) + pad_to_max_length: bool = field( + default=False, + metadata={ + "help": ( + "Whether to pad all samples to model maximum sentence length. " + "If False, will pad the samples dynamically when batching to the maximum length in the batch. More " + "efficient on GPU but very bad for TPU." + ) + }, + ) + max_train_samples: Optional[int] = field( + default=None, + metadata={ + "help": ( + "For debugging purposes or quicker training, truncate the number of training examples to this " + "value if set." + ) + }, + ) + max_eval_samples: Optional[int] = field( + default=None, + metadata={ + "help": ( + "For debugging purposes or quicker training, truncate the number of evaluation examples to this " + "value if set." + ) + }, + ) + max_predict_samples: Optional[int] = field( + default=None, + metadata={ + "help": ( + "For debugging purposes or quicker training, truncate the number of prediction examples to this " + "value if set." + ) + }, + ) + num_beams: Optional[int] = field( + default=None, + metadata={ + "help": ( + "Number of beams to use for evaluation. This argument will be passed to ``model.generate``, " + "which is used during ``evaluate`` and ``predict``." + ) + }, + ) + ignore_pad_token_for_loss: bool = field( + default=True, + metadata={ + "help": "Whether to ignore the tokens corresponding to padded labels in the loss computation or not." + }, + ) + source_prefix: Optional[str] = field( + default="", metadata={"help": "A prefix to add before every source text (useful for T5 models)."} + ) + + forced_bos_token: Optional[str] = field( + default=None, + metadata={ + "help": ( + "The token to force as the first generated token after the decoder_start_token_id." + "Useful for multilingual models like mBART where the first generated token" + "needs to be the target language token (Usually it is the target language token)" + ) + }, + ) + + + + def __post_init__(self): + if self.dataset_name is None and self.train_file is None and self.validation_file is None and self.test_file is None: + raise ValueError("Need either a dataset name or a training/validation/test file.") + else: + if self.train_file is not None: + extension = self.train_file.split(".")[-1] + assert extension in ["csv", "json"], "`train_file` should be a csv or a json file." 
+ if self.validation_file is not None: + extension = self.validation_file.split(".")[-1] + assert extension in ["csv", "json"], "`validation_file` should be a csv or a json file." + if self.val_max_target_length is None: + self.val_max_target_length = self.max_target_length + diff --git a/ptuning/datasets/AdvertiseGen/dev.json b/ptuning/datasets/AdvertiseGen/dev.json new file mode 100644 index 0000000000000000000000000000000000000000..d035bb02c899cb81c23cdf7f9f329c4ae3ddc75a --- /dev/null +++ b/ptuning/datasets/AdvertiseGen/dev.json @@ -0,0 +1,1070 @@ +{"content": "类型#上衣*材质#牛仔布*颜色#白色*风格#简约*图案#刺绣*衣样式#外套*衣款式#破洞", "summary": "简约而不简单的牛仔外套,白色的衣身十分百搭。衣身多处有做旧破洞设计,打破单调乏味,增加一丝造型看点。衣身后背处有趣味刺绣装饰,丰富层次感,彰显别样时尚。"} +{"content": "类型#裙*材质#针织*颜色#纯色*风格#复古*风格#文艺*风格#简约*图案#格子*图案#纯色*图案#复古*裙型#背带裙*裙长#连衣裙*裙领型#半高领", "summary": "这款BRAND针织两件套连衣裙,简约的纯色半高领针织上衣,修饰着颈部线,尽显优雅气质。同时搭配叠穿起一条背带式的复古格纹裙,整体散发着一股怀旧的时髦魅力,很是文艺范。"} +{"content": "类型#上衣*风格#嘻哈*图案#卡通*图案#印花*图案#撞色*衣样式#卫衣*衣款式#连帽", "summary": "嘻哈玩转童年,随时,没错,出街还是要靠卫衣来装酷哦!时尚个性的连帽设计,率性有范还防风保暖。还有胸前撞色的卡通印花设计,靓丽抢眼更富有趣味性,加上前幅大容量又时尚美观的袋鼠兜,简直就是孩子耍帅装酷必备的利器。"} +{"content": "类型#裤*风格#英伦*风格#简约", "summary": "裤子是简约大方的版型设计,带来一种极简主义风格而且不乏舒适优雅感,是衣橱必不可少的一件百搭单品。标志性的logo可以体现出一股子浓郁的英伦风情,轻而易举带来独一无二的体验。"} +{"content": "类型#裙*裙下摆#弧形*裙腰型#高腰*裙长#半身裙*裙款式#不规则*裙款式#收腰", "summary": "这款来自梵凯的半身裙富有十足的设计感,采用了别致的不规则设计,凸显出时尚前卫的格调,再搭配俏皮的高腰设计,收腰提臀的同时还勾勒出优美迷人的身材曲线,而且还帮你拉长腿部比例,释放出优雅娇俏的小女人味。并且独特的弧形下摆还富有流畅的线条美,一颦一动间展现出灵动柔美的气质。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*图案#线条*衣样式#衬衫*衣袖型#泡泡袖*衣款式#抽绳", "summary": "这件衬衫的款式非常的宽松,利落的线条可以很好的隐藏身材上的小缺点,穿在身上有着很好的显瘦效果。领口装饰了一个可爱的抽绳,漂亮的绳结展现出了十足的个性,配合时尚的泡泡袖型,尽显女性甜美可爱的气息。"} +{"content": "类型#裙*材质#蕾丝*风格#宫廷*图案#刺绣*图案#蕾丝*裙型#大裙摆*裙下摆#花边*裙袖型#泡泡袖", "summary": "宫廷风的甜美蕾丝设计,清醒的蕾丝拼缝处,刺绣定制的贝壳花边,增添了裙子的精致感觉。超大的裙摆,加上精细的小花边设计,上身后既带着仙气撩人又很有女人味。泡泡袖上的提花面料,在细节处增加了浪漫感,春日的仙女姐姐。浪漫蕾丝布满整个裙身,美丽明艳,气质超仙。"} +{"content": "类型#裤*版型#显瘦*颜色#黑色*风格#简约*裤长#九分裤", "summary": "个性化的九分裤型,穿着在身上,能够从视觉上拉长你的身体比例,让你看起来更加的有范。简约的黑色系列,极具时尚的韵味,充分凸显你专属的成熟韵味。修身的立体廓形,为你塑造修长的曲线。"} +{"content": "类型#裙*版型#显瘦*风格#文艺*风格#简约*图案#印花*图案#撞色*裙下摆#压褶*裙长#连衣裙*裙领型#圆领", "summary": "文艺个性的印花连衣裙,藏青色底蕴,低调又大气,撞色太阳花分布整个裙身,绚丽而美好,带来时尚减龄的气质。基础款的舒适圆领,简约不失大方,勾勒精致脸庞。领后是一粒包布扣固定,穿脱十分方便。前片立体的打褶设计,搭配后片压褶的做工,增添层次和空间感,显瘦又有型。"} +{"content": "类型#裙*颜色#蓝色*风格#清新*图案#蝴蝶结", "summary": "裙身处采用立体蝴蝶结装饰辅以蓝色条带点缀,令衣身造型饱满富有层次的同时为其注入一丝甜美气息。将女孩清新娇俏的一面衬托而出。"} +{"content": "类型#裙*颜色#白色*风格#清新*图案#碎花*裙腰型#松紧腰*裙长#长裙*裙衣门襟#拉链*裙款式#拉链", "summary": "这条颜色素雅的长裙,以纯净的白色作为底色,辅以印在裙上的点点小碎花,勾勒出一幅生动优美的“风景图”,给人一种大自然的清新之感,好似吸收新鲜空气的那种舒畅感。腰间贴心地设计成松紧腰,将腰线很好地展现出来,十分纤巧,在裙子的侧边,有着一个隐形的拉链,能够让你穿脱自如。"} +{"content": "类型#裤*材质#羊毛*裤长#九分裤*裤口#微喇裤", "summary": "不同于一般的西服裤。这款小喇叭羊毛裤在样式上显得更加时髦优雅,特地采用微微的九分喇叭裤腿设计,视觉上将脚踝处显得更加纤细。并且特地甄选柔软的羊毛材质,就算直接贴肤穿着,也不会觉得寒冷,比较适合初秋穿噢。"} +{"content": "类型#上衣*风格#简约*衣门襟#拉链*衣款式#口袋*衣款式#拉链", "summary": "上衣与裤子的连体式设计从整体看起来十分的具有大牌的风范。简约,没有任何的其他装饰,把自己的独特尽情展现。上衣胸口两边设有两个加大口袋,更增添了层次感。衣襟了拉链,让穿脱更加的方便,轻轻一点,显得更加时尚。"} +{"content": "类型#上衣*版型#宽松*风格#英伦*风格#复古*图案#格子*图案#复古*图案#线条*衣样式#外套*衣样式#西装*衣领型#翻领", "summary": "这件西装外套选用了经久不衰的格纹元素,通过色彩的明暗对比,展现出丰富的视觉层次,又缔造了英伦风的复古气息。法式的大翻领,延长颈部线条,彰显出女性帅气干练的特殊魅力。宽松舒适的版型完美掩藏了身材的小秘密,给身体自由活动空间。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*颜色#纯色*风格#知性*风格#高贵*风格#性感*图案#纯色*图案#蕾丝*裙型#背带裙*裙型#包臀裙*裙型#鱼尾裙*裙长#连衣裙*裙袖型#喇叭袖", "summary": "蕾丝喇叭袖上衣,搭配鱼尾包臀背带裙,整体造型给人甜美可人的感觉。偏爱蕾丝的浪漫柔情,流露别致女人味。喇叭袖的设计凸显别样浪漫,透露隐约小性感。两件套连衣裙,平添视觉层次感。鱼尾的设计修身显瘦,喇叭袖时尚减龄,纯色设计更加凸显女性知性高贵的气质。"} +{"content": "类型#裙*风格#淑女*风格#清新*风格#性感*图案#碎花*图案#线条*裙型#大裙摆*裙下摆#荷叶边", "summary": "性感的挂脖领设计,展现出迷人的肩部线条,尽显女人的妩媚气息。清新的碎花点缀裙身,凸显出秀雅温柔的韵味,衬的人很是气质不俗。灵动的荷叶边装饰,让整件上衣多了一些柔美和俏皮。散开的大摆裙剪裁,修饰出身材的小缺陷,行走间尽显温婉的淑女气质。"} +{"content": "类型#裙*材质#网纱*图案#蝴蝶结*裙下摆#层叠*裙长#半身裙*裙衣门襟#系带", "summary": 
"层叠网纱,仙气飘飘,却不会过于膨胀。腰间的蝴蝶结系带,恰到好处的增添了柔美感。膝盖以下,长度刚刚好的半身裙,比起“一览无遗魅力尽显”,专注于“完美隐藏”"} +{"content": "类型#裙*版型#宽松*颜色#焦糖色*风格#简约*风格#ol*风格#职场*裙型#百褶*裙长#连衣裙*裙领型#翻领*裙款式#腰带*裙款式#衬衫式", "summary": "来自自制的连衣裙采用今年大热的焦糖色,就像巧克力一样,甜蜜又不腻人。腰带的贴心设计,让宽松的版型也能拥有s曲线。上身简约的衬衫式翻领,衬托小v脸,带来一股职场ol风,加以百褶下摆的点缀,一起述说无尽温柔。"} +{"content": "类型#裙*裙型#鱼尾裙*裙款式#收腰", "summary": "率性大方的裙身,加上经典百搭的收腰版型,轻松打造出了时尚大方感呢。更有着俏皮可爱的鱼尾裙摆以及靓丽打眼的鹅黄色裙身,尽显元气少女风范,减龄效果特别好还特别的有青春活力气息,适合各个年龄阶段的女生们穿着。"} +{"content": "类型#上衣*颜色#红色*风格#青春*衣样式#外套*衣长#短款*衣款式#口袋", "summary": "这款外套对于个子矮小的妹纸来说就是福音了,短款穿在身上搭配起起来,立马就能变成大长腿,把整体身长比例拉长,呈现出黄金比例效果。鲜艳活泼的红色,穿在身上,视觉上给人呈现出青春的活力,元气满满的少女,还能衬托出肌肤的白皙,拥有一整天的好气色。大大的口袋,既可以作为装饰,出门携带东西也是非常的方便,还能增加整体的层次感。"} +{"content": "类型#上衣*材质#牛仔布*颜色#浅蓝色*颜色#深蓝色*风格#休闲*风格#潮*衣样式#外套*衣款式#拼接*衣款式#口袋*衣款式#纽扣", "summary": "BRAND牛仔外套,浅蓝色的衣身和深蓝色形成拼接的设计,充满了潮流的时尚感,翻折的领口造型,衬托在颈部肌肤,能修饰脸型。领口下有单排金属的纽扣门襟,开合很方便,很实用可以保暖。两侧有翻盖的口袋和斜插的口袋,在视觉上很有层次感。看起来很休闲。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*图案#蝴蝶结*图案#蕾丝*裙下摆#花边*裙下摆#压褶*裙长#半身裙*裙袖长#长袖*裙领型#立领*裙款式#拼接*裙款式#钉珠", "summary": "成熟韵味和优雅气质并存的时尚两件套。上衣立领系蝴蝶结造型,俏皮优雅。喇叭长袖拼接压褶蕾丝花边,气质减龄。高腰包臀半身裙,修身效果特别好,收腹展示曼妙的身材曲线。两侧手工钉珠装饰,时髦立体,视觉拉长腿型,整体上身彰显成熟女人魅力。显瘦百搭。"} +{"content": "类型#裙*材质#蕾丝*图案#刺绣*图案#蕾丝*裙衣门襟#拉链*裙款式#拉链*裙款式#吊带*裙款式#收腰", "summary": "蕾丝吊带显露出精致的锁骨,让颈部显得更加修长。腰部采用款腰围收腰的方式,小蛮腰更诱人。裙摆上大朵刺绣花朵,非常逼真,仿佛真正的花朵撒在裙子上摇曳生姿。背后贴心的珍珠扣,美观的同时又避免了生活中忘记拉拉链的尴尬情况,精致不失优雅。"} +{"content": "类型#裙*版型#显瘦*图案#线条*裙下摆#花边*裙腰型#松紧腰*裙长#连衣裙", "summary": "连衣裙采用了松紧腰的设计,凸显出腰部纤细的线条,再加上过膝的长度,可以遮掩掉大腿上的小肉肉,更加显瘦,走路飘逸十足。采用了圆形领口的设计,修饰颈部线条,衣身上加了层次感分明的荷叶花边作为装饰,颇显甜美气质。"} +{"content": "类型#上衣*图案#字母*图案#文字*图案#印花*衣样式#外套*衣领型#圆领*衣长#中长款*衣袖长#长袖", "summary": "圆领款式设计的这一件长袖中长款的外套最大的设计亮点在于衣身上面的印花字母的设计哦,印花字母这样的款式的设计使得整一件外套看起来的感觉是很不错的呐,既显得个性又是很时髦的哟。"} +{"content": "类型#上衣*版型#宽松*风格#街头*风格#休闲*风格#青春*图案#印花*衣样式#卫衣*衣款式#连帽", "summary": "赋予活力标签的连帽连帽卫衣,是穿的出的舒适感,看得见的休闲风。这款卫衣在版式延续了经典的宽松廓形,让身体无拘无束的同时,更显放肆的青春减龄感。前中的人头印花点缀,个性而鲜明,轻松打造活跃于街头的潮酷风采,倍显时尚洒脱范儿。"} +{"content": "类型#裙*颜色#黑白*图案#条纹*图案#线条*裙下摆#荷叶边*裙下摆#压褶*裙长#连衣裙*裙领型#一字领*裙衣门襟#拉链*裙款式#口袋*裙款式#拉链*裙款式#吊带*裙款式#抽褶", "summary": "集甜美的少女感和简洁风格为一体的连衣裙,胸前延伸一圈的压褶荷叶边设计,增加了立体层次感,让黑白条纹呈现出水波般荡漾。明线外缝,凸出褶皱的线条,形成对比收边。两侧斜插口袋方便,背后拉链拉和顺滑,吊带一字肩型设计,贴合肩部的织带,可根据身形伸缩长短,非常具有实穿性。"} +{"content": "类型#裤*颜色#黑色*风格#简约*图案#条纹", "summary": "传承动感简约气质的条纹衣身,结合包边圆领和半开襟设计,造型显得活力有范,又不失男孩子的时尚帅气。胸前单侧小口袋点缀,让男宝宝帅气加倍。搭配纯黑色的底裤,整体显得层次十足,视觉也十分有美感,男宝宝穿起来独特魅力尽显。"} +{"content": "类型#裙*材质#雪纺*风格#休闲*裙下摆#花边*裙长#半身裙*裙领型#v领*裙款式#拼接*裙款式#吊带", "summary": "优美而动感的上衣。采用半透的雪纺材质工艺,深黑色系给您以非常魅惑的穿着体验,内里需要搭配深黑色的吊带。花边v字领口连襟拼接,举手投足更加优雅迷人,适合搭配各种半身裙和休闲长裤。"} +{"content": "类型#上衣*版型#宽松*材质#针织*材质#混纺*材质#纤维*风格#运动*风格#休闲*衣样式#卫衣*衣袖长#长袖*衣袖型#落肩袖*衣款式#拼接*衣款式#抽绳*衣款式#抽褶*衣款式#连帽*衣款式#罗纹", "summary": "柔和的纤维混纺针织面料,手感舒适且回弹性强不易褶皱,肌理感强,布面干爽抗起球性能好。休闲运动感连帽卫衣设计,加以流苏感织带抽绳点缀,品质感尽显。宽松舒适落肩拼接长袖,袖口罗纹窄口处理,打造立闲适运动廓形。"} +{"content": "类型#裙*颜色#宝蓝色*风格#复古*图案#复古*裙下摆#开叉*裙腰型#高腰*裙衣长#短款", "summary": "闭眼入的一款裙子,选用了光色感饱满的面料,营造出轻盈欢快的愉悦感。宝蓝与复古绿,轻熟风不二之色。侧开叉的设计,行走起来步履之间尽显女性柔美。高腰的版型设计,拉长身形比例,搭配任何短款上衣都会让你高挑吸睛~"} +{"content": "类型#上衣*版型#宽松*材质#棉*风格#简约*风格#潮*衣样式#衬衫*衣领型#v领*衣款式#拼接*衣款式#荷叶边", "summary": "这款来自massimodutti的衬衫,精选高品质棉质混纤,轻薄质地,吸湿透气,结实耐穿。整体的版型简约大方,在宽松的廓形下感受随性的时尚格调。v领领口的设计,简约之中展现干练硬朗的气场,潮味十足。袖口处双层荷叶边的拼接,低调吸睛,富有层次感。"} +{"content": "类型#上衣*材质#针织*衣样式#毛衣*衣袖长#短袖*衣袖长#长袖*衣袖型#落肩袖", "summary": "长袖的基础设计,活动舒适自在。微微落肩袖的设计,上身更修饰身形。这款毛衣有两个款式,一件是套头毛衣的款,斜下摆的设计。又让整体更具特色了一些;另一件是短袖针织连衣裙的款式。"} +{"content": "类型#上衣*版型#宽松*颜色#卡其色*风格#复古*图案#复古*衣样式#风衣*衣款式#腰带", "summary": "的风衣,没有之一,灵感来源于复古的欧洲军装,肩章排扣和腰带这些细节设计就能展现,然后搭配长款版型,上身自带气场。而且整体采用宽松直筒版型,穿着舒适不显臃肿,还能起到修饰身形的作用。而配色采用经典又时髦的卡其色,可搭性颇高,轻松穿出独特气场。"} +{"content": "类型#裙*颜色#白色*风格#清新*图案#刺绣*裙下摆#花边*裙长#连衣裙*裙领型#v领*裙款式#抽褶", "summary": 
"简单大气纯白色连衣裙,是开春季节最美好的穿搭单品。简单的小v领点缀领部,加以独特的花边绣花点缀,满满的清新活力悠然散发。加以纯粹的白色选料,上身亲肤透气,自带自然的褶皱肌理。同时,中长款式,修饰好身材,十分美腻。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#青春*风格#清新*图案#条纹*图案#线条*图案#撞色*衣样式#衬衫*衣领型#翻领*衣款式#腰带", "summary": "非常清新百搭的一款衬衫,蓝白撞色的条纹设计青春又减龄。经典的翻领造型暗藏小心机,领型是故意往后穿的设计,上身自然随性,不经意间展露迷人肩颈线条。下摆的开叉细节也是一大亮点,搭配腰间配置的腰带装饰,形成自然收紧的伞状效果,修饰腰身更加显瘦。腰带是可拆卸的设计,采用本布包扣装饰更显精致细节。将腰带去除时就是比较宽松的bf风款式,自带慵懒随意的感觉。"} +{"content": "类型#裤*版型#宽松*颜色#红色*风格#复古*风格#文艺*图案#复古*裤款式#木耳边", "summary": "轻盈有垂感的面料,上身舒适亲肤,甩腿的裤摆设计,行走画风干净利落。复古文艺的木耳边元素,细碎褶束腰,凸显腰身。显瘦高腰的宽松裤腿能够整体拉长腰身,使得摩登BRAND少女气质凸显。红色米色温柔色系,百搭时尚。"} +{"content": "类型#裙*材质#蚕丝*风格#性感*图案#印花*裙下摆#垂坠*裙领型#v领*裙款式#拼接*裙款式#飘带", "summary": "精选优质飘逸桑蚕丝面料,质感垂顺柔软,手感舒适细腻;优雅时尚的v领拼接飘带领口设计,展现完美颈部曲线,性感迷人;透视印花罩衫款式设计,给你更多的搭配选择。"} +{"content": "类型#裙*版型#显瘦*颜色#白色*颜色#黑色*图案#线条*裙型#a字*裙腰型#高腰*裙款式#不规则", "summary": "这款裙子采用黑色的颜色打底,裙身上装饰着白色的线条以及扣子装饰,丰富视觉上的变化。另外整体上a字裙裙型搭配高腰的设计,修身效果出众,还有着不规则的裙摆,展现出十足的设计感。"} +{"content": "类型#裙*材质#针织*颜色#黑色*风格#休闲*风格#性感*裙下摆#荷叶边*裙长#连衣裙*裙款式#拼接", "summary": "这款经典的黑色连衣裙,整体采用针织和冰丝两种材料拼接而成,使裙子在休闲中又透着些许法式优雅感。领口采用v形设计,修饰脸型,同时凸出性感又不过分的气质。肩部的荷叶边拼接,显得飘逸灵动,衬托出了女性活泼浪漫的魅力。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#蚕丝*图案#线条*衣样式#衬衫*衣领型#翻领*衣袖型#喇叭袖", "summary": "气质翻领设计,衬托出脖子优雅的线条,桑蚕丝面料,适合敏感肌穿着,穿着清凉透气。很显气质的一件衬衫,完美符合亚洲女性身材,扬长避短,版型很正。修身塑形效果尤为出众,小喇叭袖更是十分具有艺术感,小翻领挺括更显气质,怎么搭配都好看,系上丝巾更有女人味。宽松的版型更能衬托女性的娇小,慵懒和帅气的完美结合。"} +{"content": "类型#裤*材质#牛仔布*风格#街头*风格#复古*风格#文艺*风格#简约*风格#休闲*风格#潮*图案#复古*裤型#直筒裤*裤款式#拉链", "summary": "牛仔裤本身多添了复古文艺,简约直筒休闲款式,修饰腿型尽显挺拔俊朗,出街搭配更有范。时尚猫须磨白做旧处理,复古文艺更具街头潮流风尚。金属拉链暗门襟设计,推拉顺滑更富有质感。"} +{"content": "类型#上衣*版型#h*材质#棉麻*风格#简约*风格#清新*图案#字母*图案#文字*图案#刺绣*衣样式#西装*衣领型#翻领*衣款式#口袋", "summary": "BRAND棉麻西装采用别致的翻领设计,立体修饰脖型,挺括有精神,彰显优雅与品味。两侧翻盖口袋设计,可方便插手的同时,还可放置随身携带的物品,美观又实用。后背清新的字母刺绣,穿着利落精干,彰显女性优雅英气。简约的格调,舒适的h版型,打破对身体的束缚,让你穿得更轻松自在。"} +{"content": "类型#上衣*颜色#纯色*风格#简约*图案#条纹*图案#纯色*衣样式#外套*衣款式#口袋*衣款式#对称", "summary": "来自巴拉巴拉的女童长款外套,设计师采用直筒式衣袖裁剪,并在袖口加饰有纯色条纹,在打破了整体的单一性的同时,还增添了一丝简约时尚气息。再加上对称的斜插口袋,既能给予娇嫩双手温暖,同时还可放置孩子的随身物品,暖心又很实用呢。"} +{"content": "类型#裙*版型#显瘦*风格#淑女*风格#清新*图案#格子*裙下摆#荷叶边*裙长#连衣裙*裙款式#腰带", "summary": "这款连衣裙采用清新的格子元素,干净大方,视觉上给人十分舒服的体验。甜美的荷叶边从上延伸到下面,给人丰富的层次感,十分的淑女,一抹浪漫的气息油然而生,女神气质爆表。腰带的设计既起到装饰作用又显瘦。"} +{"content": "类型#裙*版型#宽松*风格#复古*图案#复古*裙袖型#灯笼袖", "summary": "袖子有灯笼袖的既视感,中世纪的复古韵味轻松展现,版型宽松舒适,上身贴合身材,不会显胖。超级百搭,秋季单穿,搭配裙子裤子都ok!冬天也能做打底,外搭毛呢大衣,气质满满。"} +{"content": "类型#裤*材质#雪纺*风格#性感*图案#线条*裤长#连体裤", "summary": "非常少女感的连体裤,做工非常的细致,面料是雪纺的面料,手感非常好,上身舒适,非常的亲肤。一字领加上吊带的设计,展现了迷人的锁骨,突出了颈部线条,非常的性感。腰间的系带飘逸灵动。荷叶边的袖口浪漫梦幻,非常的有女人味,优雅十足。"} +{"content": "类型#上衣*版型#显瘦*颜色#白色*颜色#纯色*图案#纯色*图案#碎花*衣样式#衬衫*衣袖长#长袖*衣款式#收腰", "summary": "推荐这款来自时尚服装品牌tedbaker的长袖衬衫。这款衬衫采用纯色色调搭载小碎花的点缀设计,同时修身版型,穿起来又显得大气时尚,给人十分成熟的感觉,胸前白色扣子设计更显时尚的气息,束紧的袖口设计是的活动更加自然。排扣设计更有容易穿脱。修身版型设计,收腰,勾勒出纤细的腰线。适合25岁到30岁的年轻男性穿着,彰显气质。"} +{"content": "类型#裤*图案#线条*图案#撞色*裤长#连体裤*裤款式#拉链", "summary": "率性的一定是喜欢拥有利落线条的服装,不重复不拖沓,这件连体裤便具有这样干练的气质。宽大松垮的廓型给身体自由舒展的空间,慵懒又个性。拉链从领口直通到裤脚,双头的拉链可以从裤脚拉开,解决了去的问题。酷酷的高领设计,利落的窄袖,外穿内搭都合适。撞色的处理,更加精致、牢固。"} +{"content": "类型#裙*风格#文艺*风格#知性*风格#潮*裙长#连衣裙*裙衣门襟#系带", "summary": "唯美时髦的连衣裙,充满了文艺的气息,呈现出知性洒脱的风采,随时随地都能显得曼妙别致。配以绚丽精美的系带设计,呈现出了新潮时尚的风采,优雅大气,给人以眼前一亮的视觉惊喜。一抹个性的系带,在领口处精心设计。展现出了潇洒知性的魔力,时刻都能够带来新潮的视觉美感,靓丽十足。"} +{"content": "类型#上衣*材质#棉*材质#斜纹*风格#文艺*衣样式#风衣", "summary": "喜欢这样的风衣,简单又带点文艺气的,在春天的时候一穿,很难不让人为她着迷。用了有挺括度和质感很好的面料,有点儿斜纹的样子,手感摸起来有棉的样子,灰常的实在。"} +{"content": "类型#裙*版型#显瘦*材质#蚕丝*风格#欧美*风格#复古*风格#潮*图案#条纹*图案#格子*图案#复古*裙领型#polo领", "summary": "气质的格子设计,复古时尚,polo领的设计,简洁大方时尚,越简单的版型,亮点就越让人惊艳,上身效果很好,款式很时尚有范,面料质感很高档。真丝材质摸起来手感顺滑,很有高级感。欧美范大气经典,修身遮挡腹部赘肉,让你穿上充满自信的魅力。条纹是经久不过时的潮流元素,尽显女性时尚气质。"} +{"content": "类型#上衣*风格#韩版*衣样式#衬衫", "summary": 
"这一款衬衫手工磨边的设计,做工精湛特别考究,精致的韩版设计,符合女性的身材曲线,自然衬托纤美妙身姿。时尚的双贴袋装饰,立体时尚美观实用。精挑细选的天丝棉布料,丝滑垂坠亲肤细腻。"} +{"content": "类型#上衣*风格#复古*风格#知性*图案#格子*图案#复古*图案#线条*衣样式#衬衫*衣领型#v领*衣袖型#灯笼袖*衣款式#抽绳", "summary": "轻薄舒适的衬衫,质地飘逸,上身后有种轻纱般的朦胧美感,让人爱不释手。淡雅的格纹做点缀,彰显复古的时尚韵味,很好的衬托出白皙的肌肤。流畅的v领,修饰纤细的颈部线条,展现知性的都市风情。袖口处有抽绳收紧的处理,呈现出微蓬的灯笼袖质感,包容手臂的线条。下摆处开衩,体贴又温柔,减少束缚感。"} +{"content": "类型#裤*版型#显瘦*颜色#红色*风格#性感*图案#线条*裤长#连体裤*裤型#直筒裤*裤腰型#高腰", "summary": "连体裤采用了优质面料,精心剪裁而成,修身的版型,让身材凹凸有致,轻松塑造黄金比例。亮眼的红色系设计,凸显出你张扬的个性,同时让你具有出挑的魅力。精美的吊带领设计,修饰颈部线条,还可以防止滑落,美观又兼具实用性。v字领口,让裸露的锁骨更显性感,个性又时尚。高腰线直筒裤型,让身材更显高挑,无论是单穿还是作为内搭,都是不错的选择。"} +{"content": "类型#裙*版型#宽松*颜色#粉色*裙型#百褶*裙领型#圆领*裙袖型#泡泡袖", "summary": "的这款礼服裙,采用丝质面料,以营造亲肤柔软的穿着体验。泡泡袖娃娃圆领的版型,可爱俏皮。衣身以粉色色彩简洁,视觉丰富,展现乖巧甜美的气息,举手投足间,尽显温婉活泼气质。下摆的大百褶设计,宽松无束缚感的同时,还展现孩童的公主风范。"} +{"content": "类型#上衣*版型#显瘦*颜色#黑色*风格#青春*图案#刺绣*衣样式#针织衫*衣样式#开衫*衣领型#圆领*衣款式#钉珠", "summary": "本款的针织衫图案真的太美啦!精美刺绣加手工钉珠,图案俏皮可爱。五角星、流行都是一个个珠子串上去的,纯人工,很费时的,阳光下blingbling,如同黑夜里的星空,多彩欢乐又别致。黑色基础款开衫,百搭的圆领,上身减龄显瘦,总而言之就是一款很美的开衫。"} +{"content": "类型#上衣*风格#潮*风格#性感*图案#刺绣*衣样式#雪纺衫*衣款式#露肩*衣款式#不规则", "summary": "这件非常精美时尚的雪纺衫,最最别样灵动的设计亮点和设计的曼妙之处就在于。在整体领口的设计上搭配了别样灵动的一字的露肩的搭配,足以轻轻松松的打造了整体的性感优雅的气质,独特灵动的绣花搭配不规则的下摆,完美的彰显了时尚潮流感。"} +{"content": "类型#裤*版型#宽松*材质#棉*材质#牛仔布*材质#水洗*材质#混纺*图案#线条*裤长#长裤*裤长#连体裤*裤型#阔腿裤*裤款式#抽褶", "summary": "这款BRAND的阔腿连体长裤,精选质量上乘的含棉混纺牛仔布料裁剪缝制而成,经过特殊水洗工艺处理更加柔软亲肤,混纺布料相比纯棉布料更具有弹性不易褶皱变形。阔腿裤的版型宽松自在不挑身材容易驾驭。背带连体裤款式百搭无疑是减龄单品。背后对称贴袋设计修饰臀部线条丰富整体层次感。"} +{"content": "类型#上衣*颜色#粉色*图案#刺绣*衣样式#卫衣*衣款式#露肩*衣款式#荷叶边", "summary": "一款刺绣荷叶边露肩粉色卫衣,粉色的设计,甜美而又俏皮,满足了每个女孩心中的少女心。荷叶边的设计,更添几分灵动感。花草刺绣的设计十分的别致,笼罩着浪漫的气息,更显精致的生活品味。"} +{"content": "类型#裤*图案#字母*图案#文字*裤腰型#松紧腰*裤口#小脚", "summary": "舒适的松紧裤腰,穿脱十分方便。而且一点也不会勒着孩子的肚子,很受宝妈们青睐。优雅的松紧裤脚收口设计,既可以防止裤腿灌风,又可以展现出酷帅的气质。炫酷的字母印图装饰,挡不住的时尚感,塑造出活力四射的时尚男孩形象。"} +{"content": "类型#裙*风格#淑女*风格#简约*风格#知性*风格#高贵*裙领型#v领*裙衣门襟#双排扣*裙款式#绑带", "summary": "这款适合在明媚动人的温暖季节,张扬你淑女的迷人风情。简单优雅的合身版型,精致细腻的v字领口,凸显柔美颈部,领口的绑带搭配金属装饰的设计,简约中增添细节感,诠释高雅大气。包裙的设计,彰显女性高贵大方,配上经典的双排扣,自然流露知性典雅的气质。"} +{"content": "类型#裤*风格#复古*风格#简约*风格#休闲*图案#字母*图案#文字*图案#复古*图案#线条*图案#撞色*裤型#直筒裤*裤款式#抽褶", "summary": "裤子整体风格样式简约休闲,直筒版型穿起来更添笔直挺拔。裤面褶皱工艺理念渗透,营造复古做旧的质感,穿起来典雅绅士。裤脚自然微微收束,修饰腿部线条更添高挑帅气。腰头撞色字母点缀,协调色调更有范。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*图案#线条*裤长#九分裤*裤款式#破洞", "summary": "这么一款修身的牛仔裤版型设计,上身遮肉显瘦的同时更显百搭耐看,腰部的裁剪大方迷人。收腰的版型做工巧妙的勾勒出纤细的腰身线条,裤身上的破洞做工精致巧妙,凸显时尚感。而利落干净的九分裤裁剪,视觉上尽显腿部修长。"} +{"content": "类型#裤*版型#宽松*材质#棉*风格#简约*风格#休闲*裤款式#口袋", "summary": "这款宽松休闲裤精选优质纯棉面料,以能够藏肉的宽松剪裁打板,成品在着用感舒适的同时修饰你的身形。此外,简约版型的基础上配以多口袋的细节设计,吸睛度和时髦度满满。更值得一提的是,它简约的配色,能让你成为更好的自己。是一款春夏季节里休闲场合的实用单品。"} +{"content": "类型#裤*材质#羊毛", "summary": "精纺羊毛面料加入了少量的尼龙,性能得到了改良和发挥既有羊毛的透气亲肤,也有尼龙的抗皱抗起球这种羊毛面料轻薄柔和,做成裤子太合适,穿着舒适又好打理"} +{"content": "类型#裤*材质#羊毛", "summary": "底衫是以美利奴羊毛精制而成。布料极度舒适,排汗力绝佳并且具有天然的透气效果。这款女性短袖底衫是依女性身型量身打造,精致的内嵌式v领设计,收边比男性款式更加精巧。其他特点包括内嵌式袖筒和绷缝线织法,确保衣料不会造成擦伤,同时减少因连身和背包肩带造成的磨擦。"} +{"content": "类型#上衣*版型#显瘦*衣样式#卫衣", "summary": "设计师把整个风格定义为,一件卫衣承包了整个秋季。经典又百搭的卫衣绝对是秋季首选。不仅舒适还让人觉得温馨又阳光。最重要的是,它不仅减龄还非常显瘦。"} +{"content": "类型#裙*风格#性感*图案#印花*裙长#连衣裙*裙领型#v领*裙衣门襟#系带", "summary": "超长的睡袍系带款式的连衣裙,柔软的材质彰显轻柔的优雅。性感的v领以及围裹式的版型,搭配热带的印花,营造出性感种带有慵懒的度假风情。"} +{"content": "类型#上衣*风格#文艺*风格#青春*风格#潮*图案#撞色*衣样式#衬衫*衣领型#翻领*衣款式#拼接*衣款式#口袋", "summary": "充满文艺风格的衬衫,在经典的同时,又拥有活泼的设计元素。在右上方点缀了撞色的贴标,不但拥有了立体感,显得青春率性,而且也迎合了当下时尚潮流。采用翻领的领口设计,还可以凸显出干练的男性气质,尽显挺拔身姿。拼接了多个立体的大口袋,还可以凸显出男孩的率性魅力。"} +{"content": "类型#裙*材质#网纱*风格#性感*裙型#网纱裙*裙腰型#高腰*裙款式#钉珠", "summary": "轻盈的网纱裙上绣上树叶和花朵,别致的造型更显甜美,没有大面积的运用,适当留白更有艺术气息,钉珠的加入又为它增加了奢华感,立体精致。大面积的网纱为你营造出浪漫气息,朦胧的的小性感呼之欲出,胸部和裙摆都配上内衬,贴心防走光。高腰线的运用,凸显挺拔,过膝长度更女神范,优雅大气。"} +{"content": "类型#裤*版型#显瘦*颜色#黑色*风格#复古*风格#简约*风格#职场*图案#条纹*图案#复古*图案#线条*裤腰型#高腰", "summary": 
"经典的纯黑色调,最为基础百搭不易出错,融合入线条干净利落的版型设计之中,衬托出干练气势的职场风范,更有视觉上的显瘦效果。复古的条纹元素的加入,为单调的正装之中增添丝丝时髦气息,配合高腰的设计,提高腰线巧妙的纵向拉长腿部比例。简约的窄腿裤,避免了软塌没精神,上身更加精致有气势。"} +{"content": "类型#上衣*颜色#姜黄色*风格#休闲*衣样式#风衣*衣样式#外套*衣样式#打底衫*衣长#短款*衣袖型#收口*衣款式#螺纹", "summary": "利落有型的短款风衣外套,颜色采用了衬托肤色的姜黄色,内搭简洁打底衫,休闲随意,彰显青春活力。衣身做了机器人图案装饰,童趣十足,美观大方,给略显单的衣身了几分情趣,穿出孩子独有的青春活力。衣身四周螺纹的收口,松紧度好,服帖舒适,防风保暖。"} +{"content": "类型#裙*材质#棉*材质#牛仔布*材质#水洗*风格#复古*风格#潮*图案#复古*裙下摆#开叉*裙款式#拼接*裙款式#不对称", "summary": "精选进口纯棉牛仔面料经过特殊的水洗做旧工艺,弥漫着别样复古的味道,解构式拼接腰头设计极具前卫感,加上不对称裙摆和前后开叉剪裁打破沉闷的造型,都更加有当下潮流个性。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*颜色#黑色*图案#条纹*裙型#鱼尾裙*裙下摆#荷叶边*裙款式#螺纹", "summary": "本品版型宽松,造型独特时尚。竖坑条设计,拉伸黑色条纹,视觉显瘦感好,凸显你的优雅好身材。浪漫荷叶边设计,裙摆蹁跹优雅,螺纹搭配鱼尾,美观又蓬松,给身体预留出足够的空间,穿着的同时,可以尽情展现自己。"} +{"content": "类型#上衣*颜色#纯色*颜色#黑白*风格#文艺*风格#高贵*图案#纯色*衣样式#棒球服*衣袖型#灯笼袖*衣款式#收腰", "summary": "这款连衣裙选用黑白两种色彩设计,简单纯色设计带有精致高贵的气质。两件套设计设计可以让下面的纱裙带有朦胧隐约的美感。七分灯笼袖设计与棒球服版型结合,凸显出混搭的别致风格演绎专属的美感。收腰设计可以展现出你的腰身,文艺范下摆张扬你的气质。"} +{"content": "类型#裙*材质#牛仔布*材质#水洗*裙型#牛仔裙*裙款式#破洞", "summary": "束脚的牛仔裤在现在也是非常少见的,现在大部分的人都穿过破洞的乞丐裤。因此束脚的别样成为我们这款商品的。大腿的水洗效果是为了冲破本是牛仔的规格,腰间的卡扣是这款牛仔裤和其他牛仔裤的别样之处。小腿的束脚更是为略带微风的夏日带来保暖的我效果。"} +{"content": "类型#裙*材质#牛仔布*材质#水洗*裙型#牛仔裙*裙款式#破洞", "summary": "这款牛仔做了特殊的水洗工艺,呈现出的效果有反穿感,但是不会突兀,可以说是相当有自己个性的裤子啦。跟日常见到中规中矩的牛仔裤不一样,裤脚部分的折叠呈现的内浅效果。让裤子的层次感更丰富,破洞效果随性自然。用本身自带的水洗色变化,打造出裤子的与众不同。牛仔裤什么的,只要你喜欢,属于闭眼入不会后悔系列。耐穿耐用,起来也非常方便,省心省力,一句,喜欢不要错过哦~"} +{"content": "类型#裙*材质#牛仔布*材质#水洗*裙型#牛仔裙*裙款式#破洞", "summary": "这款时尚牛仔裤选用优质的牛仔面料制作而成,质地厚实,纹路清晰,经水洗磨白破洞工艺处理后,增强了设计效果,上身穿着尽现不羁潮感;而半月型前袋,加入牛仔风格小口袋,裁剪贴合手型,穿着舒适更美观。"} +{"content": "类型#裙*风格#清新*风格#性感*图案#碎花*图案#线条*裙长#连衣裙*裙款式#勾花镂空*裙款式#收腰", "summary": "这款连衣裙采用了丰富的碎花图案,一眼就给人清新明媚的视觉体验,而且还衬托出了女性优雅温柔的一面。收腰版型,更加凸显纤细小蛮腰。领口的v形镂空设计,不仅在视觉上拉长了颈部线条,还体现了几分不过分的性感气质。"} +{"content": "类型#裙*颜色#黑色*风格#潮*裙款式#腰带", "summary": "以个性张扬的黑色为主打,配以大胆前卫的装饰瞬间提高了裙子的整体形象。炫酷的腰带点缀与纤细柔软的之上,一股潮流酷感迸发而出。加之舒适合体的版型设计,诠释出女子婀娜多姿的身姿,亦带来几许高级时髦格调。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*材质#牛仔布*图案#线条*图案#刺绣*裤型#直筒裤*裤款式#破洞*裤款式#亮片*裤口#毛边", "summary": "这款短袖牛仔裤两件套,上衣凭着经典圆领设计,轻松显露脖颈线条。修身显瘦的宽松直筒版型,百搭上身显气质。衣面和裤面上点缀的亮片猫咪刺绣,耐看显气质,瞬间提升衣品。破洞和毛边细节,兼具个性与质感,大大丰富上身时尚度。"} +{"content": "类型#裙*颜色#粉红色*图案#条纹*图案#印花*裙长#连衣裙", "summary": "这款粉红色条纹连衣裙精美大方,充满青春活力气息,十分唯美大气,尽显女性俏丽活泼感。且配以可爱亮眼的印花设计,更显女性甜美气息。"} +{"content": "类型#裙*材质#蕾丝*图案#蝴蝶结*图案#撞色*图案#蕾丝*裙下摆#花边*裙长#连衣裙*裙领型#圆领*裙袖型#荷叶袖*裙款式#拼接", "summary": "一款比较甜美精致的连衣裙,裙身多处拼接了撞色蕾丝花边,丰富了视觉层次,带来了几分甜美浪漫气息,激起满满的少女心,上身甜美减龄。小圆领领型,领口还有蝴蝶结点缀,温婉含蓄,增加了细节看点。多层荷叶袖袖型,更显精致甜美,提升了裙子的设计感。"} +{"content": "类型#上衣*颜色#黄色*风格#简约*图案#印花*衣样式#卫衣*衣门襟#系带*衣款式#抽绳*衣款式#连帽", "summary": "如今卫衣已经成功逆袭身为自带时髦属性的利器,年轻又减龄。靓丽的黄色调在清冷的季节带来一丝暖意,胸前简约的英文字母印花点缀,丰富层次感,让伴随着我们。连帽抽绳系带的版型,为我们带来了无限的青春活力。"} +{"content": "类型#裙*颜色#黑色*颜色#墨绿色*风格#复古*风格#ol*风格#职场*图案#复古*裙长#半身裙*裙款式#拼接", "summary": "衣身以复古的墨绿色为主体,拼接内敛的黑色,既时髦又不乏庄重的气质感。非常适合职场ol一族,与半身裙组合在一起,或者作为大衣的内搭,都不失为吸睛的组合。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#蚕丝*图案#印花*衣样式#衬衫", "summary": "来自BRAND,美元印花真丝衬衫。采用真丝材质打造的衣身,穿着亲肤透气。宽松版型穿着舒适显瘦,毫无束缚感。衣身饰有美元印花图案装饰,视觉效果强烈。后背配有彩色的大号标签装饰,辨识。"} +{"content": "类型#裙*颜色#蓝色*风格#淑女*风格#性感*裙长#连衣裙*裙领型#一字领*裙袖型#喇叭袖", "summary": "不管是什么季节连衣裙都是那一抹优雅的存在,就像newlook家的这款连衣裙,深沉的蓝色调,上身不仅衬托出肌肤的白皙。搭配着性感的一字肩,又展现出性感不失优雅的淑女气息。而袖口出精致的喇叭袖,灵动的样子,举手投足间散发出浪漫格调。"} +{"content": "类型#裙*材质#网纱*材质#雪纺*风格#复古*图案#手绘*图案#复古*图案#线条", "summary": "轻盈透气的双层雪纺面料,手感柔软仙气满满。复古的的双层网纱领口,修饰颈脖线条,勾勒柔美脸型。斜开门襟结合开合,新颖独特穿脱便捷。裙身手兰花设计浪漫吸睛,优雅女人味十足。过膝的版型微露脚踝,端庄大方。"} +{"content": "类型#裙*风格#简约*裙袖长#七分袖*裙领型#圆领*裙款式#拼接", "summary": "典雅圆领设计展露净白脖颈,简约七分袖设计将女性优雅气质完美呈现,裙摆拼接细腻轻纱彰显浪漫唯美气质,表面细致纹理彰显自然纯粹的美感。"} +{"content": "类型#上衣*版型#宽松*版型#h*风格#休闲*图案#撞色*衣样式#外套", "summary": "丰盈厚实的毛呢面料,内里做了加棉加厚处理,提升外套的防风保暖性,让你的春秋季节温暖倍增。撞色的羊羔毛领,带来冲击感的视觉,并为颈部添加柔软与温暖。加长的衣身,h版型宽松不挑身材,慵懒休闲风打造。"} +{"content": 
"类型#裙*材质#网纱*材质#蕾丝*风格#高贵*风格#潮*风格#性感*图案#蝴蝶结*图案#刺绣*图案#蕾丝*裙型#百褶*裙型#抹胸裙*裙款式#勾花镂空*裙款式#腰带", "summary": "镂空蕾丝结合精美的绣花,奢侈华丽加一点点的小性感,结合里面抹胸的内衬,增加了优雅不失潮流。丝绸的腰带以蝴蝶结的形式,装饰在腰间,凸显腰部纤细。百褶纱网的裙子,呈现出朦胧神秘感,更是增加母女们高贵优雅的气质。"} +{"content": "类型#裙*图案#印花*裙长#长裙*裙长#连衣裙", "summary": "这条优雅的长裙,简直是春夏季节,每个爱美女生衣柜里不可或缺的单品,连衣裙设计,省却了整体搭配造型的苦恼,穿在身上尽显仙女气质,细节之处精美绝伦,印花设计配合饱满的色彩,让你立刻成为人群中的焦点。不管是旅游还是出街拍照又或者是春游踏青,穿上它都是妥妥的完美。"} +{"content": "类型#上衣*风格#潮*风格#中国风*图案#刺绣*衣样式#外套", "summary": "这款BRAND的刺绣外套,制作工艺非常的精致立体,并采用了如丝滑般的面料,凸显满满的高级质感,穿着也倍增清爽舒适度;侧面衣袖并设计了logo标志,增添了酷炫个性,可谓在中国风中注入潮流,彰显独特韵味,超有态度款式,出街尽显国潮男子。"} +{"content": "类型#裤*风格#青春*风格#性感", "summary": "这一款弹力窄腿版形的速干休闲裤,设计亮点在于裤子的紧身版形设计。这样的表现手法使得整休裤子看起来非常性感,既充满时尚感又彰显青春动感的气息,动感个性。"} +{"content": "类型#上衣*材质#牛仔布*颜色#灰色*图案#刺绣*衣样式#衬衫*衣款式#口袋", "summary": "如果只是单纯色的灰色未免过于单调,因此,b在胸前加入牛仔衬衫口袋设计,不仅能让衣身更加立体还能提升时尚指数和色彩饱满度。并在口袋上增加小黑熊刺绣,让可爱的孩子穿着萌趣感立显。袖口、领口和下摆的平滑衍缝减少缝边对宝贝肌肤的摩擦感,保护娇嫩肌肤穿着舒适加倍。"} +{"content": "类型#裤*材质#牛仔布*风格#复古*风格#休闲*图案#复古*裤长#长裤*裤款式#拼接", "summary": "休闲范十足的牛仔长裤,贴布点缀背面矩形贴袋,与做旧裤身营造的复古感觉相契合,裤脚两端巧妙拼接,使裤身造型更为立体饱满,并展现不失率性的休闲少年感。裤脚的设计很有小心机,可以放下,也可以卷起来,怎么穿都很潮。"} +{"content": "类型#上衣*颜色#白色*风格#休闲*图案#刺绣*衣样式#衬衫*衣袖长#长袖*衣款式#纽扣", "summary": "一款以优雅白色为主打色调的简洁衬衫,以休闲长袖的版型赋予穿搭者舒适的体验。简洁的衣身设计,搭配大方的纽扣门襟,自带满满的青春活力,日常穿搭显得尤为便捷。同时,精致的绣花点缀衣身,烘托出小女生的精致与美腻,倍显少女感。"} +{"content": "类型#裤*材质#水洗*图案#印花*裤款式#破洞", "summary": "要人,不做!个性磨破破洞,时尚印花设计,街拍潮人必备。舒适耐磨不变形不掉色,走线工整,水洗工艺。年轻,要“裤”!"} +{"content": "类型#裙*材质#羊毛*材质#针织*风格#通勤*风格#淑女*裙型#包臀裙*裙长#半身裙", "summary": "此款羊毛针织半裙女采用半身裙设计,半身包臀勾勒出曲线与美腿,展现轻熟优雅的淑女气质。轻松搞定日常通勤打扮,很好的修饰了身材比例,魅力大方诱惑人心。精选优质羊毛面料,穿着舒适自然。"} +{"content": "类型#上衣*版型#显瘦*风格#清新*图案#印花*衣样式#衬衫*衣领型#翻领*衣款式#纽扣", "summary": "更容易引起关注率的印花元素,添加在衬衫上,穿着既不单调乏味又能很好凸显自我。经典的翻领,凸显气质优雅大方,别致的鹅黄色印花,洋气十足,浪漫更显小清新,还好藏肉显瘦,纽扣的位子添加粗黑条点缀,提升整体的吸睛力,穿着上身更利落更时髦。"} +{"content": "类型#上衣*材质#棉*颜色#白色*风格#简约*风格#休闲*风格#清新*图案#拼色*衣样式#卫衣", "summary": "来自BRAND的这款卫衣,采用柔软透气的纯棉面料,营造出舒适的穿着体验。简约的衣身,白色调,清新干净,融入了拼色的按扣装饰,既可以系上扣子休闲舒适的穿着,也可以将扣子解开,露出诱人的香肩,打造多变得穿着,让你不做平庸女。"} +{"content": "类型#上衣*版型#显瘦*颜色#黑色*图案#字母*图案#文字*图案#印花*图案#撞色*衣样式#卫衣*衣领型#高领*衣袖型#罗纹袖口*衣袖型#收口*衣款式#拼接*衣款式#螺纹", "summary": "精选100%绒卫衣面料,舒适透气。捏省设计+罗纹收口塑造蝙蝠廓形,还起到点缀衣身的作用,底摆螺纹收口腰位勾勒出纤细线头,让廓形版型也显瘦。小高领设计凸显气质,弹力螺纹拼接穿脱方便。原创字母印花有一点点叛逆的,凸显自由追求。字母和黑色衣身大胆撞色,时尚感十足。"} +{"content": "类型#裙*风格#性感*裙下摆#荷叶边*裙领型#v领*裙款式#抽褶", "summary": "袖子的荷叶边与领口,很有层次感,尽显甜美优雅的气息。立体褶皱的荷叶边裙摆更显得浪漫与梦幻,v字领口露出锁骨线,尽显性感的情调。"} +{"content": "类型#上衣*版型#显瘦*材质#针织*风格#复古*图案#格子*图案#复古*衣样式#衬衫*衣款式#拼接", "summary": "显型男格调的一款修身衬衫,采用了前后幅拼接的设计款,在肩部通过精致的车缝线连接,呈现出不同与众的独特品位。前幅有致密格纹提花大气装饰,显出些许复古气息,个性的弧形下摆,以及针织贴袋的装饰点缀,带来俊朗帅气的穿衣风格。后幅为平纹肌理,平实而质朴,增添稳重气质。"} +{"content": "类型#上衣*材质#针织*颜色#浅蓝色*风格#清新*衣样式#开衫*衣领型#v领*衣款式#纽扣", "summary": "长款针织开衫为这个春夏而打造,半透明的薄款设计在即能满足搭配,又能略微的保暖,也能作为空调的穿着。清新淡雅的浅蓝色衣身有着湖水般的清澈,非常干净美好。v领的领口设计穿着更加舒适,前短后长的开叉下摆更显良好比例的身型。同色纽扣浑然一体。"} +{"content": "类型#上衣*版型#显瘦*版型#立体剪裁*颜色#白色*颜色#黑白*风格#复古*图案#复古*图案#波点*图案#线条*图案#印花*衣样式#衬衫*衣长#短款", "summary": "这款衬衫复古风的黑白波点印花款,非常的时髦百搭!小短款的设计,小个子女生也能轻松驾驭。修身的立体剪裁,对身材包容性更大,上身遮肉显瘦。复古别致的小尖角领面,采用白色的线条包边,系上更显甜美浪漫的法式风。衣身的隐藏式小口袋,和袖口,设计简洁大气,出街实力吸睛!"} +{"content": "类型#裙*版型#宽松*材质#牛仔布*颜色#黑色*风格#复古*风格#简约*风格#休闲*图案#复古*裙型#牛仔裙*裙款式#抽绳", "summary": "这款来自ehe的牛仔裤,采用黑色棉感牛仔制成,表层泛着淡淡的白点,复古静谧感十足。宽松的裤型,休闲自在,结合松紧式的抽绳腰头设计,上身毫无束缚感,舒适有型。简约的设计搭上利落的剪裁,适合日常多种造型。"} +{"content": "类型#裤*风格#简约*风格#ol*裤型#阔腿裤", "summary": "阔腿裤是人手必备一条的百搭单品,这款阔腿裤整体造型简约大方,简单百搭的同时时髦又自然,带了气质的ol感。垂感极佳的麻纱面料光滑亲肤,走起来飘逸显气质,气场自动释放。"} +{"content": "类型#上衣*版型#显瘦*风格#韩版*图案#字母*图案#文字*图案#印花*衣样式#卫衣*衣领型#圆领*衣门襟#套头", "summary": "这一款卫衣字母印花,青春活力自然减龄,精致裁剪加上韩版的设计,显瘦的同时拉长身型比例,将整个身材比例拉伸,凸显曲线。套头设计,方便穿脱气质利落。简洁圆领,修饰脖颈落落大方。"} +{"content": "类型#裤*版型#宽松*风格#简约*风格#休闲*图案#菱形*裤型#直筒裤*裤型#阔腿裤", "summary": 
"简约大方,时尚休闲,宽松直筒阔腿裤的版型设计简约却不简单,使得行走间自带清风,随性洒脱的魅力令人无法抵挡,同时,彰显出英姿飒爽的女王范儿。结合立体感的菱形提花面料,使得这条阔腿裤富有肌理感,低调而不失奢华地诠释着精致魅力。"} +{"content": "类型#上衣*材质#蕾丝*图案#刺绣*图案#蕾丝*衣样式#衬衫*衣款式#勾花镂空", "summary": "这款衬衣镂空的蕾丝花边领口,装点着梦幻般的唯美视觉感受;衣袖处相同面料的蕾丝装饰,充满了统一感;姿态各异的纯白花朵,以刺绣的工艺,仿佛朵朵鲜花绽放在衣摆上,优雅动人。"} +{"content": "类型#裙*颜色#紫色*颜色#纯色*颜色#粉色*风格#淑女*风格#简约*图案#纯色*裙型#鱼尾裙*裙款式#拼接*裙款式#抽褶", "summary": "纯色的过膝裙,有粉色、紫色和棕色三种款式可供选择,全面大幅度的纯色底色能很好地铺陈出一种别样的简约而质朴的视觉美感。而裙子的拼接式褶皱鱼尾裙下摆打破了传统的视觉比例,凸显出优雅的淑女之美。"} +{"content": "类型#裙*颜色#纯色*风格#简约*风格#性感*图案#纯色*裙型#直筒裙*裙款式#腰带", "summary": "这款BRAND的吊带衫。纯色的衣身,简约又不失大气,还很百搭。直筒的版型,巧妙的挡住赘肉,视觉上更显苗条。腰带的添加,勾勒出女性曼妙的身姿。多了一丝性感的调调。"} +{"content": "类型#上衣*版型#宽松*风格#英伦*风格#休闲*图案#线条*衣样式#风衣*衣款式#口袋", "summary": "风衣大多为英伦复古风,这款继承了基础版型的风衣,设计成宽松的廓型样式,版型上更偏休闲风格。大气驳领剪裁加上两侧的斜插口袋设计,帅气个性具有独特韵味搭配。两袖袖口的搭扣样式,使其具有修饰手臂线条的作用,并且防止冷风灌入。较长的款式剪裁,能够很好的遮住身型的不完美,打造修长曲线美感。"} +{"content": "类型#上衣*版型#宽松*风格#简约*风格#休闲*衣样式#卫衣*衣领型#一字领*衣袖型#喇叭袖*衣款式#吊带", "summary": "blank的这款一字领卫衣设计贴心简约。一字领展露玲珑锁骨,花边堆褶的衣领围绕在肩部,凸显女性气质。吊带设计轻巧精致,还免除了衣领滑落的可能,增强自信。双层喇叭袖设计甜美可爱,减龄又俏皮。宽松版型适合多种身材,穿着舒适休闲。"} +{"content": "类型#裙*材质#牛仔布*材质#水洗*颜色#浅蓝色*风格#休闲*图案#线条*裙型#牛仔裙*裙腰型#高腰", "summary": "别小看基础款牛仔裤的魅力,这款水洗牛仔裤,遵循一贯极简的设计,却更符合大众的口味,可见其亲和力之高。猫抓痕的经典设计,增添几分不浮夸的潮范感。结合浅蓝色的水洗牛仔设计,更是凸显干净休闲的feel。中高腰的版型,展现完美腿部线条。"} +{"content": "类型#裤*颜色#纯色*图案#条纹*图案#纯色*裤长#九分裤*裤型#哈伦裤*裤款式#口袋*裤腰型#高腰", "summary": "高腰的款式设计时尚大气,选用纯色的版型,搭配哈伦裤的样式,简洁利落中不失腔调。九分裤的设计风格,微微露出脚裸不得不分,帅气感爆棚。双侧的口袋和细腻的条纹搭配,呈现出饱满的层次感,唯美大方。"} +{"content": "类型#上衣*版型#宽松*颜色#粉色*颜色#绿色*图案#印花*衣样式#卫衣*衣袖型#落肩袖*衣款式#抽绳*衣款式#连帽", "summary": "一眼看上去就能吸引住眼光的一款卫衣,荧光绿色在人群中显得特别的显眼,搭配着身前同样很鲜艳的贴布印花,上身满满的个性,粉色的抽绳也与整体卫衣的风格融合起来。整体是不挑人的宽松落肩袖版型,经典的连帽设计穿起来很有活力感。"} +{"content": "类型#裙*材质#牛仔布*材质#水洗*颜色#白色*风格#街头*风格#运动*图案#条纹*裙型#牛仔裙*裙款式#拼接*裙款式#破洞*裙款式#抽绳", "summary": "水洗做旧工艺加上裤身上的磨破破洞元素,营造出饱满的街头气息,同时裤腰加宽裤袢的设计。辅以白色抽绳,以及裤侧的白色条纹拼接,代入轻快灵动的减龄运动款型,区别于传统的牛仔设计。"} +{"content": "类型#裙*颜色#白色*颜色#粉色*风格#性感*图案#线条*裙型#直筒裙*裙长#连衣裙*裙领型#v领", "summary": "对于一些粉色控的小仙女来说,这款粉色连衣裙绝对是不能错过的存在:大气的直筒版型不挑身材,而且也能让你的身形看着更加的高挑。白色的包边处理不会影响整体淡雅的气质韵味,而且也让衣身的线条更显清爽,交叉的v领剪裁能让你秀出精致的锁骨,无意中也添了几分性感的气息了。"} +{"content": "类型#裙*材质#蕾丝*风格#青春*图案#蕾丝*裙下摆#花边*裙款式#拼接*裙款式#吊带", "summary": "灵动的蕾丝元素给人很青春甜美的感觉,巧妙的将女性的柔美优雅尽情展现。而这款吊带大方的利用蕾丝花边拼接,增添一丝女人味,加上细腻的提花设计,萦绕裙身犹如精灵般动人,瞬间让整个人变得柔和起来,尽显不拘一格的时尚格调。"} +{"content": "类型#上衣*版型#宽松*风格#性感*衣样式#衬衫*衣袖型#灯笼袖*衣款式#荷叶边", "summary": "洋气的荷叶边设计的一款衬衫上衣,不管你是选择单穿,还是选择搭配穿着,都是非常的百搭时尚,还富有个性感。荷叶边的领口设计,修饰脸型,增添服饰的层次,显露甜美感的穿着。灯笼袖的袖口设计,轻松遮挡了腰部的赘肉,显露穿着的个性与独特感。宽松的衣型轮廓,不显身材,不挑人穿,好驾驭。"} +{"content": "类型#上衣*材质#棉麻*颜色#白色*颜色#黑色*风格#文艺*图案#卡通*图案#波点*图案#印花*衣样式#外套*衣款式#抽褶*衣款式#连帽", "summary": "自带吸睛技能的外套,醒目清爽的白色调,加上装饰的连帽,以及一身黑色的波点印花,轻松营造出趣味可爱的卡通波点狗形象,让宝宝上身简直不要萌活泼;双层棉麻面料的褶皱独特性,使整体看起来恬静文艺,而又有着透气亲肤的特性,赋予宝宝舒适自在的体验。"} +{"content": "类型#裤*版型#宽松*材质#牛仔布*材质#水洗*风格#复古*风格#休闲*图案#复古*裤长#长裤*裤型#直筒裤*裤款式#拼接*裤款式#破洞*裤款式#不规则*裤腰型#高腰", "summary": "BRAND带来的这款长裤,后幅采用解构式双腰头设计,加以小心机的高腰处理,能够有效提高腰线;前后拼接结合水洗磨白工艺,带来富有层次感的复古牛仔视效;宽松的直筒裤型,对身材有良好的包容性,打造休闲随性的穿着视效;加以裤身不规则的破洞设计,尽显叛逆不羁的个性。"} +{"content": "类型#上衣*风格#简约*风格#知性*图案#条纹*图案#线条*衣样式#衬衫*衣领型#一字领*衣款式#勾花镂空", "summary": "sitiselected这款条纹一字领衬衫,简约的一字领设计,尽显优雅知性。镂空排扣袖设计,修饰手臂的额线条。下摆两侧开衩裁剪,方便穿着,提升整衣的细节感。"} +{"content": "类型#上衣*颜色#黑色*风格#复古*图案#复古*图案#线条*衣样式#衬衫*衣款式#纽扣", "summary": "这是一款时尚感十足的小上衣,它采用了衬衫的设计款式,具有美观且百搭的穿着效果。袖口处采用纽扣设计,看上去十分大气优雅。它采用了精致的v型领口设计,能够凸显出女性独特的颈部柔美线条。然后再其配合黑色的贴布图案,十分个性且新颖。贴布上有着金属环的装饰设计,无形之中就能够增添许多复古的腔调,尽显你的时髦与大气。"} +{"content": "类型#裤*版型#显瘦*风格#简约*裤款式#口袋*裤口#卷边", "summary": "很简约时髦的一款纸袋裤,简约的版型设计,上身修身时髦,穿出优雅的气质。多口袋设计,增添时髦气息,更方便实用。裤脚卷边装饰,更独特时尚。"} +{"content": "类型#裙*风格#淑女*风格#复古*风格#清新*图案#复古*裙长#连衣裙*裙款式#抽褶*裙款式#收腰", "summary": "一款汉元素的连衣裙让我们的心瞬间沉静下来,它清新温婉的色系仿佛让时光都了,上身真的很显文静淑女。复古的一片式交领设计轻便利落,而收腰的版型又增加了层次感,更加好穿显高。蓬松的下摆自然褶皱,更显气质绰约。"} +{"content": 
"类型#裙*材质#雪纺*裙型#a字*裙长#连衣裙*裙领型#v领*裙衣门襟#系带", "summary": "暗红色的调调,可以完美的胜过所有妩媚的颜色;布满风琴褶的雪纺料,深v系带的a字连衣裙,不如为这件带有法式田园的风格的裙子,挑选几件富有现代感单品来搭配,比如踩上一双芭蕾舞平底鞋,背上天鹅绒,来平衡单品的法式田园风的气质。"} +{"content": "类型#上衣*版型#宽松*风格#简约*风格#性感*衣样式#开衫*衣样式#毛衣*衣领型#v领", "summary": "百搭糖果色开衫毛衣,时尚的大v领,彰显性感,简约百搭宽松,这样的设计也很显肤白,质地端庄大气,却可以衬托出你与众不同的气质来,适合花开的季节穿。"} +{"content": "类型#上衣*版型#宽松*颜色#纯色*风格#简约*风格#性感*图案#纯色*图案#创意*图案#线条*衣样式#开衫*衣领型#v领*衣款式#勾花镂空*衣款式#纽扣", "summary": "宽松的开衫版型,采用了单排纽扣,给人一种简约随性的气息。精致的v字领口,衬托颈部的线条,又增加了性感干练的魅力。纯色的衣身,采用了勾花镂空的设计,富余创意性的美感,灵动又欢脱。袖口微收,能够贴合腕部的线条。"} +{"content": "类型#上衣*版型#显瘦*颜色#粉色*颜色#深蓝色*风格#运动*风格#青春*图案#条纹*衣样式#外套*衣领型#立领*衣袖型#喇叭袖", "summary": "出自品牌fiveplus的这款棒球外套,充满运动风情。衣身制作采用光泽细腻,触感柔滑的纤材质,搭配流畅的剪裁手法,打造出立体修身的立领喇叭袖造型。既能巧妙凸显出白皙肌肤与精致五官,同时还能用飘逸柔美的袖管中和衣型的中性气息,烘托出娇俏少女韵味。深蓝色调与袖管两侧粉色的条纹装饰结合,演绎出了沉静而不失甜美的观感,能助你诠释出活力青春范儿。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*颜色#浅蓝色*风格#简约*风格#休闲*风格#清新*图案#印花*衣样式#polo", "summary": "时尚简约款polo衫,宽松休闲版型以清新浅蓝色为设计,舒缓炎热夏季带来的感,给视觉带来清凉舒爽。撞色系印花点缀简约衣身,给衣身增添亮点时尚。宽松衣身,简单修身,凸显男性时尚魅力。"} +{"content": "类型#上衣*版型#宽松*颜色#黑色*风格#简约*衣样式#风衣*衣样式#外套*衣领型#翻领*衣门襟#双排扣*衣款式#纽扣", "summary": "一款宽松版型设计的风衣外套,直筒宽松的衣型轮廓,不显身材的设计,不挑人穿的穿着,无论你是高个子,还是小个子,都可以轻松的驾驭。翻领的领口设计,修饰脸型的轮廓,显脸小的视觉效果。黑色的双排扣纽扣点缀,为简约版型的衣身,增加了层次感,让上身穿着没有单调感,显露满满的个性魅力。"} +{"content": "类型#上衣*材质#牛仔布*风格#清新*图案#创意*衣样式#衬衫*衣款式#破洞", "summary": "一直很喜欢这个风格,衬衫款又不会太过俏皮的像小女生,适合更多年龄层穿着。个性破洞的效果,打破单调,不像一般那样硬朗,更生趣。清新的牛仔蓝,整体的风格清爽干净。前短后长的下摆设计,造型时尚有趣,创意感强,并增添了整衣的层次效果,即便简单一搭,就能让自己很有品位的感觉。"} +{"content": "类型#裙*风格#运动*风格#性感*裙型#包臀裙*裙型#一步裙*裙长#短裙*裙衣门襟#拉链*裙款式#拉链", "summary": "来自品牌BRAND的这款一步短裙,设计师采用包臀的设计让这条原本运动学院风的单品多了一份性感,并且可展示出穿着者的好身材。其次,还在背面设计了一个小拉链,这样的装饰设计,不会让裙子显得单调,而且更加的有设计感。"} +{"content": "类型#上衣*版型#h*颜色#白色*风格#复古*图案#复古*衣样式#毛衣", "summary": "市面上很少见到肌理感如此顺畅的毛衣,从领口开始一直下摆,让整件毛衣的条理感清晰的好似一行诗,随性的故意做长袖口是为了让你可以把袖口告别呆板表情。h版型身材包容性高,适合外穿也适合内搭,三种颜色各有各的性格,白色安静温柔,深绿一如既往的内敛,则复古显白,总之总有颜色适合你。"} +{"content": "类型#裙*版型#显瘦*风格#休闲*风格#潮*裙衣门襟#拉链*裙款式#拉链*裙款式#收腰", "summary": "立体修身的版型,符合人体工学,藏肉显瘦,修饰美腿,拒绝臃肿,适合大多数体型,更好的修饰身形,根据人体美学打造属于你的黄金比例,隐形的品质拉链,经久耐用,收腰裙头尽显品质。适合潮流爱美女性在休闲的场合穿着。"} +{"content": "类型#裙*版型#显瘦*材质#绸缎*风格#复古*风格#知性*图案#格子*图案#复古*图案#线条*图案#撞色*裙腰型#高腰*裙款式#拼接*裙款式#木耳边", "summary": "优雅经典的撞色格纹,复古而知性的韵味搭配柔美木耳边,灵动中展现令人窒息的浪漫。修身显瘦的版型搭配绸缎腰部拼接,收腰身的同时大大提升了气场,魅力挡不住。高腰线拉升腿部线条,打造小蛮腰不费力,恰到好处的裙长衬显双腿纤长高挑。"} +{"content": "类型#上衣*材质#牛仔布*材质#水洗*风格#简约*风格#休闲*衣样式#衬衫*衣领型#翻领", "summary": "想用简单的单品或者搭配穿出时尚感,牛仔绝对是首要选择。而这个春季刮起的大热复古风,牛仔衬衫是一大热门单品,无论是想穿出帅气,还是休闲的酷感,牛仔衬衫都可以使你成为抢镜的焦点。江南布衣休闲衬衣经典的小翻领,搭载水洗的视觉效果,简约休闲又百搭。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙衣长#短款", "summary": "蕾丝面料哒,女人味达到造ji的位置连衣短款裙,优雅时尚范满满,名媛气质仙女时尚风格满满裙身细节部分也是real感人,不低俗,竟然还过分的仙袖子处的若隐若现,撩人到无懈可击polo领,有点禁欲,总体搭配出的效果翻倍!短款剪裁"} +{"content": "类型#裙*材质#棉*材质#羊毛*材质#混纺*颜色#黑色*风格#通勤*风格#休闲*风格#潮*裙型#包臀裙*裙腰型#自然腰*裙长#半身裙", "summary": "这是一款设计很别具一格的半裙,是都市女性日常通勤或者休闲时必备的单品。精选混纺的羊毛和棉料面料,有型更有范。沉稳大气的黑色配上包臀半裙的廓形,具有良好的包容性,迷人又俏丽。舒适的自然腰穿脱方便,新潮又时髦。"} +{"content": "类型#裙*风格#复古*图案#格子*图案#复古*图案#线条*裙型#鱼尾裙*裙腰型#高腰*裙长#半身裙", "summary": "看到这款别致的棕色系格纹裙,让人忍不住想穿上它,展现自己的复古气质。高腰的版型和立体的剪裁,让这款半身裙勾勒出你婀娜多姿的线条。优雅的鱼尾裙摆,穿上以后更是平添了几分灵动飘逸感,打造自信满满的你。"} +{"content": "类型#裙*版型#宽松*风格#文艺*风格#清新*图案#环保*裙长#连衣裙", "summary": "这一款连衣裙采用了天然环保的苎麻面料,拥有良好的亲肤性和透气性,穿着舒适,带来亲近自然的自由感觉。宽松的廓形设计有种度假的韵味,不挑身材不挑人,可以轻松驾驭。裙身上的花果图案装饰配色文艺清新,营造淡雅的气韵。"} +{"content": "类型#裙*风格#淑女*风格#潮*图案#撞色*裙下摆#开叉*裙长#连衣裙*裙款式#拼接*裙款式#口袋*裙款式#连帽", "summary": "连衣裙采用了拼接的撞色设计,在视觉上带来层次感。时髦的连帽款式上身彰显活力十足,引领时尚潮流趋势,胸前的口袋设计更显立体,再加上下摆处的开衩设计,露出拼接的网裙摆,带来新颖的设计感,上身彰显甜美的可爱淑女气质。"} +{"content": "类型#裙*版型#显瘦*图案#线条*裙型#背带裙*裙领型#v领", "summary": "v字的领型是在背带裙的设计当中经常会出现的领型,其中一个重要原因就是v字领型的修身效果跟背带裙个性相搭,用在背带裙的设计上十分应景。而且v字的领型线条比较利落,它有着视觉上面的拉长感,让女孩子轻松穿出显高显瘦的效果。"} +{"content": "类型#裤*裤长#连体裤*裤型#连衣裤", "summary": 
"这款连衣裤,柔软面料亲和宝贝每一寸肌肤,同时不起球,的开档开合设计,能够方便更换尿布;同时加以品质暗扣,可以带给宝贝零摩擦的舒适体验,穿着更加自在。"} +{"content": "类型#裤*颜色#迷彩色*风格#街头*风格#潮*风格#摇滚*图案#字母*图案#条纹*图案#文字*图案#迷彩*图案#音乐*图案#线条*图案#刺绣*裤款式#拼接*裤腰型#松紧腰", "summary": "裤身两侧的条纹拼接,为整体增添潮流元素,精致的字母刺绣搭配,起到很好的点缀作用,让整体看起来美观又大方。裤身迷彩拼接,十分有街头摇滚风的味道,可以尽情展现属于你的风格;松紧裤型,美观而实用修饰出笔直而修长的腿部线条。"} +{"content": "类型#上衣*颜色#黑色*颜色#粉色*图案#线条*衣样式#针织衫*衣领型#圆领*衣款式#露背", "summary": "一件前后风格截然不同而有和谐共处的针织衫。温柔明媚的少女粉色,让人忍不住想捧在手心里好好呵护;大圆领设计,修饰脖颈线条;露背设计,搭配黑色弹力织带,交叉之中隐约露肤,时髦独特,尽显优雅气质;"} +{"content": "类型#裙*材质#蕾丝*颜色#纯色*图案#纯色*图案#线条*图案#蕾丝*裙下摆#层叠*裙下摆#花边*裙衣门襟#拉链*裙款式#勾花镂空*裙款式#拉链*裙款式#吊带", "summary": "超美的蕾丝裙,优雅的花边领口,贴合肌肤,修饰柔美颈部线条,精致动人。袖口的蕾丝花边彰显几分俏皮感。背部隐形水滴拉链,精巧大气,拉合顺畅,方便穿脱。肩部镂空设计,不但精致,而且很清凉哦。流畅的叠层裙摆,丰富层次感。一体吊带内搭,避免走光。纯色的裙装,有种很圣洁的气质。"} +{"content": "类型#裙*版型#宽松*材质#牛仔布*颜色#浅蓝色*风格#街头*风格#休闲*裙型#牛仔裙*裙下摆#毛边*裙款式#不规则", "summary": "休闲的牛仔裙裤无论在什么时候都不会过时,永远会给人一种时尚前卫的感觉。不规则的毛边裤口,彰显年轻新潮流和不一样的时尚品味,还搭配出满满的街头风格。它宽松舒适的版型可以修饰你笔直的大长腿,牛仔布设计穿起来不仅舒适又有质感。浅蓝色色调,看起来非常彰显年轻活力。"} +{"content": "类型#上衣*版型#显瘦*图案#条纹*衣样式#衬衫*衣款式#露肩*衣款式#抽褶", "summary": "BRAND以蓝白细条纹打造的这款上衣,通过竖条纹的运用结合相对修身的剪裁,带来较为显瘦且好穿的单品。比较特别的是,设计师为这款衬衫做了露肩的处理效果,变化感鲜明。此外,衬衫通体褶皱的处理,显得非常别致且出彩。"} +{"content": "类型#裙*材质#网纱*裙型#蛋糕*裙型#抹胸裙*裙长#连衣裙*裙款式#亮片", "summary": "这款连衣裙第一眼就美得让人窒息,在温柔的网纱织面上,点缀了炫目晶莹的亮片元素,看起来层次丰富而梦幻,流露出的朦胧感特别美妙,颇具华丽隆重的贵族气息。甜美的抹胸式设计更加有女人味,可以尽情展现女生的曼妙身姿。三层蛋糕裙摆仙气满满,着每一位有着少女心的girl,简单一件就能让你秒变小公举。"} +{"content": "类型#裙*材质#棉*颜色#纯色*图案#格子*图案#纯色*图案#拼色*裙长#半身裙*裙款式#抽褶*裙款式#收腰", "summary": "松紧收腰设计,拼色格纹极具度假情调,纯棉面料具有极好的透气性与亲肤感,下摆褶皱处理少女感十足,纯色极简百搭,日常搭配小脚都很好看。"} +{"content": "类型#裤*版型#显瘦*版型#h*材质#蕾丝*风格#ol*风格#潮*图案#蝴蝶结*图案#蕾丝*裤款式#绑带*裤腰型#高腰*裤口#微喇裤", "summary": "优雅气质的花边领口设计,凸显服装的时尚新潮。时尚喇叭袖口搭配绑带蝴蝶结,蕾丝裙摆设计,穿着飘逸大方,彰显女神范。高腰设计,拉长腿部比例,a字裙摆,遮肉显瘦,有范优雅显气质,谁都能hold住的实穿款。"} +{"content": "类型#裙*风格#潮*风格#性感*裙型#a字*裙型#鱼尾裙*裙长#连衣裙*裙领型#立领*裙款式#钉珠*裙款式#木耳边", "summary": "a字型轮廓的一条连衣裙,不显身材的设计,还不挑人穿,无论你是高个子,还是小个子,都可以轻松的驾驭,让你轻松展现魅惑的女人味。木耳花边的设计,显露穿着甜美感,立领的领口,修饰脸型,显脸小的视觉效果。鱼尾的裙摆,是个性感的设计。钉珠的点缀,增添服装的层次,与潮流感。"} +{"content": "类型#裙*版型#显瘦*材质#牛仔布*图案#线条*裙型#牛仔裙*裙下摆#弧形*裙腰型#高腰*裙长#短裙*裙款式#拼接", "summary": "显瘦牛仔短裙,穿着美观又时尚。优雅又大方的高腰设计,穿着非常的美观气质,勾勒出纤细的腰部曲线。多裁片拼接的造型,别致时尚又个性,增添线条美感。前短后长的版型,优雅的圆弧裙摆,美观气质又优雅。"} +{"content": "类型#上衣*颜色#迷彩色*风格#复古*风格#潮*图案#复古*图案#迷彩*图案#刺绣*衣样式#衬衫*衣样式#卫衣*衣款式#口袋", "summary": "采用经典美式复古版型,定制的刺绣臂章,彰显品质,方便实用迷彩大口袋设计,增大大提高了储物空间,采用标准版型裁剪工艺,能够容纳多种不同身材。随意搭配卫衣或者衬衫,非常适宜春季各类都市潮流装扮。"} +{"content": "类型#裤*版型#显瘦*版型#立体剪裁*颜色#深蓝色*风格#复古*图案#花色*图案#碎花*图案#复古*图案#线条*图案#撞色*裤型#阔腿裤*裤款式#拼接*裤腰型#高腰", "summary": "阔腿裤的设计垂感较好的a字版型,很显瘦呢!复古的深蓝色颜色十分得衬托肤色,深浅不一的小碎花点缀着吊带裤设计,让你看起来更显魅力。拼接撞色花色罗纹领口设计,也带有几分女人味。高腰的立体剪裁更是修饰腰部的线条感,凹凸有致的好身材表露无遗。"} +{"content": "类型#裙*版型#显瘦*材质#针织*风格#复古*风格#简约*风格#休闲*风格#清新*风格#职场*图案#条纹*图案#复古*裙下摆#花边*裙长#短裙*裙长#长裙*裙领型#半高领", "summary": "花边半高领唤醒了18世纪的浪漫主义风格,凸显气质,焕发复古优雅魅力。细带设计,沿袭宫廷式的至美元素。竖条纹弹力修身,显瘦纹理,贴合女性身材曲线,简约优雅。搭配优雅职场短裙,清新休闲长裙,打造不一样的气质。秋冬内搭不可或缺的打底针织。"} +{"content": "类型#裤*风格#复古*风格#性感*图案#复古*图案#印花*图案#撞色*裤长#九分裤*裤款式#木耳边*裤款式#抽褶", "summary": "领口和袖口的木耳褶皱元素穿上之后凸显女人的性感魅力,这样的设计元素你一定不会拒绝。领口的多层蝴蝶系带,有着少女的减龄可爱,不会显得太过老气。九分的喇叭袖设计迎合了几年来的复古风潮,而且可以修饰胳膊上的赘肉。微微廓形的裙摆不会闷汗束缚,无论是外穿还是内搭都很好看。热烈的撞色印花元素很适合度假穿着。"} +{"content": "类型#裙*材质#棉*颜色#红色*裙长#半身裙*裙袖长#长袖*裙款式#抽绳*裙款式#连帽", "summary": "红色斜襟抽绳,气场十足,回头率高,长袖连帽,适合早春时节。可以搭配半裙也可单穿,显得活力十足有气质,格调也随之上升,棉质材质,穿着很舒适,手感丝滑有韧性,还在等什么,回家吧。"} +{"content": "类型#裙*版型#宽松*颜色#墨绿色*风格#清新*图案#印花*裙型#直筒裙*裙长#连衣裙*裙款式#口袋*裙款式#纽扣", "summary": "十分秀气的一件气质连衣裙,宽松的直筒裙型一点也不挑身材,墨绿色的基调中给到了清新的小印花装饰,加上小巧的口袋和纽扣装饰,更是大方又端庄实穿。"} +{"content": "类型#裙*版型#宽松*材质#雪纺*风格#复古*风格#简约*图案#复古*图案#波点*裙款式#拼接", "summary": "复古俏皮的波点图案装点裙身,色彩亮丽吸睛,洋溢着浪漫甜美气息,拼接雪纺袖子设计,碰撞出别样个性优雅感。简约流畅宽松版型,结合a摆廓形,更好的包容和修饰身材曲线。"} +{"content": "类型#上衣*颜色#纯色*图案#纯色*衣样式#外套*衣领型#立领*衣门襟#拉链*衣款式#口袋*衣款式#拉链", "summary": 
"这款外套采用立体的版型,搭配立领的设计,让外套挺括有型,穿起来更显精神。而光面的设计,让外套表面的色泽会根据的变化而产生变化,彰显出低调的奢华感。同时也让外套穿起来温暖舒适,兼具了保暖性和美观性。拉链胸袋的装饰,让纯色的衣身不显单调,丰富了视觉效果,让衣身两侧的口袋点缀,让你的双手能够随时取暖。"} +{"content": "类型#上衣*材质#棉*材质#纤维*颜色#黑色*图案#线条*图案#撞色*衣样式#针织衫*衣领型#翻领*衣门襟#拉链*衣款式#拉链*衣款式#罗纹", "summary": "BRAND这款针织衫,采用手感细腻的棉和聚酯纤维面料精心制作而成,轻薄且透气性好,给人带来柔软亲肤的穿着体验。设计师以简明的线条勾勒撞色罗纹翻领版型,修饰脸部;并将黑色作为主底色,彰显型男的低调轻熟韵味;门襟拉链的设计,增加了层次,兼具美观性和实用性。"} +{"content": "类型#裤*裤长#连体裤*裤型#阔腿裤*裤款式#拉链", "summary": "连体裤的版型设计,在一定的条件上可以很好的拉长,女人整体的身材比例。应用的翻领展示出了干练的一面,采用的拉链设计,让你可以随心所欲的变换风格。阔腿的样式,双腿比例更加的完美,女神非你莫属。"} +{"content": "类型#裙*风格#高贵*图案#渐变*裙型#蛋糕*裙型#蓬蓬裙*裙型#公主裙", "summary": "甜美风格的小公主裙,在外形上融入了渐变的色彩,使得裙子的立体效果增强,突出宝贝的高贵之气。裙子在设计时融合蛋糕裙与洛丽塔的双重风格。在风格的混搭之间,营造出来更多的浪漫之感。蓬蓬的小裙摆,强烈的视觉效果,展示出来宝贝更多的活力,让小公主如同天鹅一般神秘迷人。"} +{"content": "类型#裙*材质#雪纺*风格#清新*风格#性感*图案#印花*裙长#半身裙*裙衣长#中长款", "summary": "一款中长款的雪纺半身裙,有种精致的花朵印花点缀,很清新的感觉,甜美减龄又充满浪漫的气息。腰部是松紧的设计,穿着舒适有度,更加轻松自在。双层的裙摆,微微透视的效果,层次清晰,裙摆侧边分叉,更加的性感诱惑。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*材质#雪纺*颜色#黑色*图案#蝴蝶结*裤型#直筒裤*裤型#背带裤*裤腰型#高腰", "summary": "穿惯挺括的牛仔裤腿裤,在这个春夏不妨试试这款垂坠柔滑的雪纺背带裤。小有厚度能抵挡早春的料峭。黑色直筒裤型,修饰腿型显瘦自在。撞色感的亮银色圆环,暗色调的裤子也变得明快起来,背带穿过扣眼,半扎起蝴蝶结就能随性的调节背带长短。高腰设计,优化身材比例,有秒变大长腿的技能。"} +{"content": "类型#上衣*材质#蕾丝*颜色#杏色*图案#蕾丝*衣样式#衬衫*衣领型#v领*衣领型#翻领*衣门襟#一粒扣*衣款式#拼接", "summary": "让人一见倾心的粉杏色衬衫,采用了挺括的梭织面料,穿着透气舒适,自在又随性。翻领与v领的完美结合,既能勾勒出迷人的天鹅颈,又能衬托出娇俏的小脸。一粒扣开合,方便穿脱的同时又能起到装饰的作用。拼接的七分蕾丝袖,在婉转中展现一分利落,后幅收褶设计,让美妙得以展现。"} +{"content": "类型#裤*材质#牛仔布*颜色#浅色*裤款式#口袋", "summary": "新款女裤的时尚设计,结合了大众的审美,并注入了别有风格的设计。浅色磨白的图案选择,展现出女生天生具有的恬静优雅的气质。男生看到女生第一眼看到的是腿,这条休闲裤在腿部的设计通过微喇的版型,掩盖了腿型缺陷。还视觉上拉长小腿比例,塑造大长腿。后口袋的盾形设计,将焦点汇聚到臀线中间,打造立体臀型。时尚的裤脚做了三条杠的设计,让这条牛仔裤不显单调。"} +{"content": "类型#上衣*材质#蚕丝*风格#休闲*衣样式#衬衫*衣样式#西装*衣领型#立领*衣领型#v领", "summary": "假如不能将矜贵的面料做现代风格的设计表达,那么有如此的传统桑蚕丝面料,光华就永远只在过去时。如般下落的小翻折领,的态度和,呈现出的v字领,正如远古神话的“”传统的的肩线,下落式样的肩袖处理,与改良式西装立领的,休闲懒,又好像穿了的老衬衫一般调皮整个廓形为的三角形,适宜搭配哈伦裤或者阔腿裤。"} +{"content": "类型#裙*风格#知性*风格#青春*图案#蝴蝶结*裙下摆#开叉*裙长#连衣裙*裙领型#v领*裙衣门襟#系带", "summary": "这款连衣裙走的是青春时尚的风格路线,尽显出女性的精致细腻的一面,衬托出女性的端庄知性的一面。采用了v领的领口设计,十分利索干净。搭配腰部的蝴蝶结系带,突显出新鲜有趣的一面。裙摆的两侧开叉剪裁,更是体现出你的华丽优雅的气质。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*衣样式#衬衫*衣门襟#套头", "summary": "这款衬衫的看似普通实则颇有几分个性,宽松的版型也一样显瘦遮肉,是属于套头式的版型设计,大大的v字形领口,修饰脸型,前短后长的下摆设计,背后更是设有开叉的元素,这样看起来不会很单调。"} +{"content": "类型#上衣*材质#网纱*衣样式#卫衣*衣样式#外套", "summary": "此款做了黑、白两色,使用质量偏好的网纱和网眼;网纱飘逸动感,一般会搭配一些卫衣罗马面料,会中和掉网纱的仙;此款将网纱和网眼结合,既不会显得过于轻浮,也不会过于硬朗,吸人眼球,给人视觉冲击。偏薄,透风,可以用作春秋的薄外套。"} +{"content": "类型#裙*版型#显瘦*图案#条纹*图案#印花*裙型#一步裙*裙腰型#高腰*裙长#半身裙*裙款式#不规则", "summary": "造型感十足的一款半身裙,经典的过膝长度,尽显矜持优雅。时髦的高腰设计,采用了松紧设计,显瘦又方便穿脱。抢眼的条纹印花,更是充满时尚活力;裙摆处做了不规则的设计,让你的每一步都摇曳生姿,展现与众不同的自己。"} +{"content": "类型#上衣*版型#宽松*风格#简约*风格#ol*风格#休闲*衣样式#衬衫", "summary": "简约到极致的一款长款的宽松结构衬衫,几乎可以做连衣裙穿着,简约中带着些甜美,洒脱里又包含率性。很干净的白,很轻柔的面料,有些薄透的视觉感。很好搭配的款式,休闲还有ol风格。"} +{"content": "类型#裙*颜色#纯色*风格#简约*图案#纯色*裙长#连衣裙*裙袖长#短袖*裙领型#立领*裙款式#抽绳*裙款式#抽褶", "summary": "BRAND这款别致大方的纯色连衣裙,时尚又气质的立领设计,穿着美观又气质,更显立体大方美。腰间的抽绳设计,大方的褶皱处理,勾勒纤细的腰部曲线。大方的短袖造型,简约又方便,修饰优美的手臂曲线,时尚又大方。"} +{"content": "类型#裙*颜色#白色*图案#碎花*图案#印花*裙下摆#荷叶边*裙袖型#喇叭袖*裙款式#拼接", "summary": "深绿色的底色上点缀白色的碎花印花,仿佛绽放在草地上的花朵,为了迎接春天到来。v型的领口既修饰脸型又能露出精致的锁骨。喇叭袖的半袖甜美活泼,还能衬托手臂的纤细。荷叶边的裙摆拼接俏皮减龄。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*颜色#黑白*图案#波点*衣样式#衬衫", "summary": "波点一直都是复古风的标配元素,尤其是黑白波点的搭配特别的耐看,看似简简单单,穿上身却特别气质。直身的宽松衬衣版型,不挑身材,即使受力也不会觉得紧绷,舒适感上佳。面料的垂性很好,上身利落显瘦,很显精神。"} +{"content": "类型#上衣*版型#显瘦*材质#西装面料*颜色#纯色*风格#休闲*图案#纯色*衣样式#风衣", "summary": "这款西装裤选用了柔软细腻的西装面料,具有平整透气的上身效果,垂感非常好。整体光洁的布料,缝制有起皱效果,视觉效果上具有很好的显瘦作用。九分裤直筒的版型,休闲慵懒,带有一丝优雅气质,外搭一件纯色的风衣,还能引来不少异性的回头率~"} +{"content": "类型#上衣*版型#宽松*颜色#白色*风格#复古*风格#简约*图案#复古*图案#波点*图案#线条*衣样式#衬衫*衣款式#抽褶", "summary": 
"宽松的波点衬衣,散发着浓浓的法式复古风情,系带领的设计显得十分灵动。修饰颈部线条,肩部褶皱设计个性简约,修饰形体。波点面料经典简约,低调的咖色十分的特别雅致。白色的波点一扫沉闷的感觉,白色底色干净利落简约。"} +{"content": "类型#裙*材质#牛仔布*图案#蝴蝶结*裙型#牛仔裙*裙长#连衣裙", "summary": "颜值很高的一款单鞋,中性的帅气与优雅的女人味并存。大小刚好的蝴蝶结装饰在简洁的一字带上,将柔美的气息注入其中。不管是搭配牛仔或连衣裙上身都很显气质。"} +{"content": "类型#裙*风格#复古*风格#文艺*图案#复古*图案#刺绣*裙型#百褶*裙款式#拼接*裙款式#钉珠", "summary": "这款丝绵上衣上有着精致的花朵刺绣装饰着,v型的领口设计看上去别致又显档次,裙摆与袖口处都是百褶样式的,穿在身上格外地飘逸吸睛。复古的钉珠与金线拼接,细节处的设计满满都是品味,让你轻松演绎复古的文艺范儿。"} +{"content": "类型#上衣*颜色#酒红色*风格#性感*图案#线条*衣样式#衬衫*衣领型#v领*衣款式#抽褶*衣款式#荷叶边", "summary": "这是一款女人味十足的衬衫,线条流畅的v领,修饰颈部曲线,展现性感锁骨。腰部收紧设计,做了层层褶皱,更具层次设计感,同时提高了腰线,形成了自然的荷叶边下摆,优雅十足。而颜色更是选择了神秘而魅惑的酒红色,将女性魅力完全展现。"} +{"content": "类型#裤*版型#宽松*材质#纤维*风格#运动*图案#格子", "summary": "采用的聚酯纤维面料,具有柔软细滑触感和透气散热特性,让你在行走运动的时候舒适。宽松的版型,不会影响肢体自由,让伸展灵活自在。格纹的加入,轻盈立体,具有强的层次感。侧边的织带,柔和裤子的硬朗,突显柔情。"} +{"content": "类型#上衣*版型#宽松*风格#休闲*衣样式#卫衣*衣领型#圆领*衣长#短款*衣袖型#落肩袖", "summary": "以短款宽松直筒廓形剪裁的运动衫,带有卫衣风格,上身舒适自由。休闲圆领设计,上身舒适百搭。加上慵懒落肩袖型,带来轻松休闲范儿,正面配有光亮印字装饰,精致美观而不失设计感。"} +{"content": "类型#上衣*版型#显瘦*材质#牛仔布*风格#青春*图案#字母*图案#文字*图案#印花*图案#撞色*衣样式#卫衣", "summary": "这款卫衣展现出了青春原有的活泼与生机感。非常醒目的撞色设计,搭配上立体字母印花,带来了非常年轻的学院气息。而在搭配上也非常的简单,一条牛仔裤一双小白鞋,就可以青春故事了。整体是修身的款型,能够凸显出好的身材来,彰显十足品味。"} +{"content": "类型#裙*风格#复古*图案#复古*图案#线条*裙下摆#垂坠*裙领型#立领*裙款式#腰带*裙款式#螺纹*裙款式#亮片*裙款式#木耳边", "summary": "螺纹立领的设计,复古韵味十足,衬托脖颈修长,使你赚足回头率。以串珠与亮片钉制成的花朵图案,点缀在前襟,光影重重散发出璀璨的光芒,巧妙的吸引眼球,衬托出端庄时髦的气场。腰间搭配一条双层的pu腰带,合身的裁剪设计,束出纤细的小蛮腰。蜿蜒的木耳边点缀在裙边,在垂顺流畅的线条里添一些立体挺括的视觉效果,只需微风拂过,更显蹁跹的步伐。"} +{"content": "类型#上衣*版型#宽松*颜色#纯色*图案#纯色*衣样式#衬衫*衣领型#立领", "summary": "非常素雅的纯色衬衫,宽松版型,穿在身上的,有一种不可比拟的仙气。立领设计,打造立体效果,增加层次感,袖子可以翻起来穿,装点你的成熟气质更加显得干练。"} +{"content": "类型#上衣*版型#显瘦*风格#知性*图案#线条*衣样式#衬衫*衣领型#圆领", "summary": "气质干练知性的优雅气质,这款衬衫连衣裙当仁不让,精巧的小圆领,气质有型的衬托出柔美的,而合体直筒的裙身廓形,彰显出利落之感,同时能够藏起多余的肉肉。搭配直身线条的提花纹理,显瘦的视觉感更是妥妥的。"} +{"content": "类型#上衣*颜色#黑色*颜色#灰色*颜色#浅灰色*风格#复古*风格#简约*图案#复古*图案#线条*衣样式#西装*衣领型#西装领*衣门襟#双排扣", "summary": "两款小众的浅灰和珊瑚灰色调,有别于传统黑色西装的沉闷老气,在正式而严肃的干练商务感中凸显一丝轻盈从容的自在风范,却同时保留了大气利落的女性魅力。简约的版型设计,线条干净利落,完美表达极简主义,上身提升精气神更有气场。精致的西装领型,更显独特气质。复古的双排扣,增添丝丝文静书卷气质。"} +{"content": "类型#裙*颜色#纯色*风格#淑女*风格#简约*图案#纯色*图案#刺绣", "summary": "名副其实的淑女裙,缤纷的彩绣图案赋予其几丝民族风情,精致且令人惊艳。绣花灵活而生动的装饰上半身,如百花盛放般绚烂,叫人欣赏不够。裙摆则以纯色来演绎,进行简约的碰撞,视觉感浪漫唯美。"} +{"content": "类型#上衣*风格#通勤*风格#日系*风格#复古*图案#碎花*图案#复古*图案#线条*衣样式#衬衫*衣领型#polo领*衣领型#翻领", "summary": "这款衬衫采用极具通勤风的polo领设计,甜美的碎花点缀,就是一份精美的艺术品,偏日系风的颜色,天然纯洁无瑕,让肌肤自由呼吸,自然的色彩和翻领的设计为我们的衣身整体增不少,绝对是人群中的焦点。立体廓形简洁干净,复古的花纹图案,衬衫前后线条张弛有度。"} +{"content": "类型#上衣*风格#街头*风格#运动*风格#清新*衣样式#外套*衣领型#立领*衣款式#松紧带", "summary": "有别于传统的防晒衣,这款外套兼具实用性和潮酷感。帅气的立领,内置隐形衣帽,可根据造型需要随心变换。衣身简洁清新的英文字母大大丰富细节看点,同时带出街头玩趣感。可调节性直筒衣袖,兼具实用功能性与高街时髦感。衣摆处收弹力橡筋,轻松勾勒出活力运动风尚。"} +{"content": "类型#裙*版型#显瘦*风格#文艺*风格#知性*风格#清新*图案#线条*裙款式#腰带", "summary": "这款长款的裙装,可以是小清新的纯情旖旎,可以是闲适悠然的知性文艺。搭配一条腰带也可以拉长身材比例,长度也刚好包裹住大腿中相对较粗的地方,露出修长大腿,展现优雅迷人的气质。符合人体,勾勒身体线条,上身显瘦,显精神!"} +{"content": "类型#裤*版型#显瘦*材质#混纺*裤长#短裤*裤款式#流苏", "summary": "双色纱线掺入金丝混纺而成的粗花呢面料,无论是从颜色还是质感上看,都透露出满满的优雅气息。腰部以及裤脚处浓密的流苏装点,配合裤身双排珍珠扣,为小香风的短裤更添精细美好的小细节。腰腹部做了收省的版型,自然贴合腰部曲线,平腹显瘦的效果显著,配合逐渐变宽的裤脚,衬得双腿更为纤细,上身轻松穿出大长腿即视感。"} +{"content": "类型#裤*材质#丝绒*颜色#白色*颜色#粉色*风格#运动*风格#休闲*风格#清新*图案#条纹*图案#线条*裤长#九分裤*裤长#长裤*裤款式#拼接", "summary": "粉色丝绒面料制成的一条运动风格的休闲长裤,不过是九分的裤长设计,上身穿着可以露出纤细的脚踝,而且帮助你拉长腿部线条。因为是丝绒,而且用了清新少女的粉色,散发出好看的光泽感。侧面是白色的竖条纹进行拼接,丰富整体层次感,张扬出穿搭的青春活力。"} +{"content": "类型#裙*图案#蝴蝶结*裙款式#不规则", "summary": "度假味道浓浓的不规则上衣,马上让人想夏威夷,肩头的蝴蝶结俏皮可爱,两件套的设计弥补了细节单调的缺憾。单穿搭配打底裤,随风流动的飘逸裙摆,自然流露出女人味。"} +{"content": "类型#裙*图案#蝴蝶结*裙款式#不规则", "summary": "上半身的大蝴蝶结设计,为你的可爱加分,蝴蝶结的可随意性,使你有更多的搭配方式,不再一种;3d立体的剪裁方式,保证裙子贴合每个人的曲线,凸显你的小蛮腰;下裙的不规则剪裁,使你不缺乏灵动个性十足。"} +{"content": "类型#裙*风格#性感*图案#波点*裙长#连衣裙*裙衣长#中长款*裙款式#收腰", "summary": 
"这款连衣裙本身就有十足的独特风韵。腰部收腰的版型设计,完美修饰比例,打造性感的女性风采。独特的波点花纹设计,将衣身的独特魅力演绎的淋漓尽致。中长款的适度衣身,给与足够的舒适度。下摆的廓型设计,自然的流露了一种公主版的古典优雅气质。给您十足的穿衣享受。"} +{"content": "类型#裙*颜色#深蓝色*风格#简约*图案#植物*图案#线条*图案#印花*裙腰型#中腰*裙领型#圆领*裙款式#拼接*裙款式#抽褶", "summary": "精致的植物印花设计,以深蓝色为底,再加上红白的植物印花布满裙身,上身效果非常好,突显出女性的知性美。时尚简约的圆领,修饰纤细柔美的颈部线条,同时衬托的脸型更显娇小,彰显自信优雅的女性气质。舒适的中腰款式,恰到好处的褶皱处理,和不漏痕迹的拼接,呈现出曼妙窈窕的女性身姿。"} +{"content": "类型#上衣*材质#丝绒*风格#复古*风格#运动*图案#复古*图案#撞色*衣样式#外套*衣款式#拼接", "summary": "非常运动的一款丝绒拼接外套,无论是可竖可翻的领口,还是蓝白的撞色,都轻松的打造出青春活力的运动风尚。不仅带来出挑的视觉效果,还可以为你增添时尚的氛围。尤其是拼接的丝绒面料,不仅仅是崇尚复古的情调,更容易彰显出穿衣者的高尚格调,在运动中彰显格调的非凡。"} +{"content": "类型#裙*版型#显瘦*风格#潮*风格#性感*图案#条纹*裙长#连衣裙*裙衣长#短款", "summary": "这款连衣裙短款设计,加上修身的款式显瘦显腿长。红白蓝的配色经典时尚,配合条纹更衬潮流气息,腰间结饰装饰,精致俏皮,小小露腰更添小性感。"} +{"content": "类型#上衣*版型#宽松*材质#蕾丝*颜色#白色*风格#性感*图案#蕾丝*衣样式#衬衫*衣袖型#落肩袖*衣款式#拼接", "summary": "以白色为主体色调的衬衣本身看起来就充斥着几分温油又淡雅的味道,更能轻松带出实穿性与。偏宽松的版型设计配合落肩袖型的设计可以修饰肩部曲线,而且还能勾勒出慵懒随性范儿,对于微胖的小仙女也是敲友好的选择。蕾丝拼接的袖子设计可以增加灵动韵味,也能轻松勾勒出性感的女人味精致。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#蕾丝*颜色#粉色*图案#线条*图案#蕾丝*裙袖型#泡泡袖*裙款式#勾花镂空*裙款式#收腰", "summary": "充满小女人风韵的粉色打造而成的裙身带着一些纯洁甜美的感觉扑面而来。全身上精致的蕾丝镂空提花,又一种美轮美奂的艺术感,也散发出了十分柔美的感觉。宽松的泡泡袖口若隐若现之间显露出白皙的肌肤,同时也修饰了手臂的线条,更显纤细。腰间的松紧收腰轻松显瘦,让裙摆呈现出一种仙气满满的感觉,婆娑摇曳之间展现雅致风范。"} +{"content": "类型#裙*风格#淑女*风格#文艺*图案#植物*图案#线条*图案#印花*裙下摆#荷叶边", "summary": "轻薄柔软的面料辅以植物花卉印花元素,碰撞出时髦与文艺气息。甜美的荷叶边俏皮可爱又不失淑女韵味,露腰的版型刚好秀位置,释放优美的线条,体现女性优雅魅力。"} +{"content": "类型#上衣*材质#棉*风格#复古*风格#文艺*风格#清新*图案#复古*图案#刺绣*衣样式#衬衫*衣领型#圆领*衣长#短款*衣袖型#落肩袖*衣袖型#灯笼袖", "summary": "这款衬衣选用纯棉的面料手感柔软,上身穿着透气亲肤。精致的圆领,勾勒锁骨线条美,突显女性柔美的气质。两侧肩部花朵刺绣点缀,散发出清新文艺气息。落肩灯笼袖,藏肉又不失个性,举手投足之间,尽显女性甜美优雅的复古气质。衣身短款的设计,在视觉上提高腰线,优化身材的比例,让你轻松穿出曼妙身姿。"} +{"content": "类型#上衣*颜色#白色*风格#淑女*图案#线条*衣样式#衬衫*衣样式#马甲*衣袖长#无袖*衣款式#绑带*衣款式#抽绳*衣款式#木耳边*衣款式#荷叶边", "summary": "以清浅的白色调来演绎的一款无袖马甲衬衫,袖口处以垂直至下摆的荷叶边点缀而成,修饰整体繁复精致的细节感,同时增添甜美淑女的韵味。领口处采用抽绳绑带的设计,可自主选择收紧,进而营造成甜美细碎的木耳边廓形,也可自然平铺修饰脖颈线条,设计贴心实用。"} +{"content": "类型#裙*版型#显瘦*颜色#蓝色*风格#文艺*风格#清新*风格#性感*裙长#连衣裙*裙款式#勾花镂空*裙款式#松紧带*裙款式#收腰", "summary": "这款连衣裙采用纯净的蓝色调,集清新与文艺为一体,让你轻松穿出别致风情。落肩的设计,显瘦之余更将性感撩人的气质展现出来。松紧带收腰设计,轻松勾勒出你曼妙的身姿让人眼前一亮。衣身胸口及裙摆,做了镂空的设计,别致又新颖,炒鸡美腻噢。"} +{"content": "类型#裙*版型#宽松*风格#文艺*裙下摆#压褶*裙长#连衣裙", "summary": "将娇甜可人的粉,与洋气热烈的大红完美碰撞,便将连衣裙突兀的视觉冲击感带出来了,一下子就捕获了别人的聚焦视线;前襟特意打造的压褶细节感很强,配合清雅脱俗的提花装点,瞬间将潜在的女性柔情通通展现;富有包容性的宽松廓形,让身材稍胖的妹纸,也能穿出苗条的感觉,行走间文艺气质十足。"} +{"content": "类型#上衣*版型#显瘦*颜色#纯色*风格#文艺*风格#简约*图案#纯色*图案#撞色*衣样式#衬衫*衣领型#小立领*衣袖长#长袖*衣款式#纽扣", "summary": "这款衬衣区别于一般的纯色长袖衬衫,这款衬衫采用撞色压线设计,使得衬衫简约而不失优雅。领口采用了经典的小立领设计,搭配撞线点缀时尚简约又十分抓人眼球,袖口处的纽扣点缀,给人一种别致的文艺气质。衬衣下摆巧妙的贴合体型并且修饰腰腹部,穿着舒适又显瘦。"} +{"content": "类型#裤*版型#显瘦*材质#网纱*风格#潮*裤长#七分裤*裤款式#纱网", "summary": "以优雅的的两件套设计,拥有丰富的感,打造时髦新潮的款式。飘逸的系带网纱设计,散发仙气十足的气息,塑造仙女的气质。修身显瘦的七分裤设计,凸显出傲人的身材,整体设计别具摩登感。"} +{"content": "类型#裤*版型#显瘦*颜色#红色*颜色#纯色*颜色#酒红色*风格#复古*图案#纯色*图案#复古*裤口#微喇裤", "summary": "基底色进行设计,少了大红色的张扬,偏暗的色调更加内敛、大气,尽显洋气的复古气息。衣身立体式的裁剪,呈现原始的流畅线条美;裤身正中的折痕,避免纯色的单一,与微喇叭的裤脚相呼应,上身带来显高显瘦的视觉效果。"} +{"content": "类型#裙*材质#针织*风格#简约*图案#线条*图案#印花*裙下摆#开叉*裙长#连衣裙*裙领型#圆领*裙款式#拼接", "summary": "它是针织上衣与印花裙拼接组合的连衣裙。圆领的设计简约大方,修饰漂亮脸型,凸显优美颈部线条,显优雅温婉气质。裙子采用可调节式肩带设计,高矮胖瘦身型都可轻松驾驭,开叉下摆让行走更轻松自如。"} +{"content": "类型#上衣*图案#线条*衣样式#西装*衣领型#翻领*衣款式#抽褶*衣款式#收腰", "summary": "为优雅女性量身打造的收腰连衣裙,融合了经典的西装翻领,颇有些毫不费力的法式BRAND。收腰扣带装饰强调曼妙的腰肢曲线,也在无形中拉长腿部线条,衣身的褶皱肌理是点睛之笔,给行走间灵动的裙摆带来更多张力动感。"} +{"content": "类型#上衣*版型#宽松*颜色#卡其色*风格#休闲*图案#撞色*衣样式#棒球服*衣领型#圆领", "summary": "拥有对美好生活的愿景注入设计的品牌徐洛淇,此番汲取棒球服的设计理念始终贯穿整体衣身;通过经典圆领款式,结合利落的宽松版型,进一步提升整体的休闲随性氛围;透过撞色的圆圈标志缀于胸前,完全吻合棒球服的底蕴;搭配一条卡其色的休闲哈伦裤,轻松在上驰骋。"} +{"content": "类型#裤*图案#线条*裤型#直筒裤*裤款式#口袋*裤款式#纽扣", "summary": "采用腰部的纽扣造型设计,让我们穿脱更加的方便。直筒裤的剪裁设计,更好的修饰腿部线条感。衣身两侧的口袋,更方便我们收纳物品。选用的面料舒适。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*颜色#纯色*图案#纯色*裤型#直筒裤*裤型#阔腿裤*裤腰型#高腰", "summary": 
"这款阔腿裤做了一个纯色的设计,平时搭配起来更加的百搭,不会挑剔任何风格,都可以穿出不错的质感。宽松的直筒裤版型,对于身材的包容性,穿着起来也会更加的舒适。显得腿部笔直的同时也更加显瘦,高腰的设计更能提高腰线的位置,更加显腿长。"} +{"content": "类型#上衣*颜色#纯色*图案#纯色*图案#蝴蝶结*衣样式#衬衫*衣门襟#拉链*衣门襟#一粒扣*衣款式#拉链*衣款式#飘带", "summary": "甄选质感细腻真丝面料的这件衬衫,虽是纯色的配色,但其融入蝴蝶结飘带之后的它,整体气质可是提升了不少。而后又以皎洁珍珠点缀蝴蝶结,如此更体现出优雅和轻奢,轻松告别普通款的沉闷。而无论是一粒扣袖口还是后领拉链设计,都是为了让衬衫更方便穿脱一些。"} +{"content": "类型#裤*版型#宽松*颜色#黑色*图案#字母*图案#文字*裤长#九分裤*裤款式#拼接*裤款式#口袋*裤款式#抽绳*裤腰型#松紧腰*裤口#小脚", "summary": "黑色的九分运动裤,宽松的版型,让腿部活动自如,毫无束缚感。九分裤的设计搭配束脚。上身更显挺括感,松紧腰搭配抽绳的设计穿脱方便,两侧的口袋。零钱小物,实用方便,裤腿的个性拼接字母丰富了,裤子的设计与层次感使之不再单调。"} +{"content": "类型#上衣*版型#显瘦*颜色#红色*风格#复古*风格#简约*风格#青春*图案#复古*衣样式#外套*衣款式#绑带", "summary": "袖口绑带元素和大红色的外套合并在一起穿上它又是一个活脱脱的时髦!营造出帅气又甜美的时髦气息。这款外套是春秋的款哦~适合现在的天气穿,简约的版型复古潮范,洋溢着青春的活力气息,重点是显高显瘦,便于展现迷人大长腿。面料很柔软舒适,怎么搭配着穿都很舒服。"} +{"content": "类型#上衣*版型#宽松*颜色#黑色*颜色#纯色*风格#休闲*图案#纯色*图案#印花*图案#撞色*衣样式#polo*衣领型#翻领*衣款式#拼接", "summary": "ralphlauren休闲polo衫,黑色彰显稳重感。肩部与侧身撞色拼接,显示出独特巧妙。打破纯色的沉闷,时髦感立显。胸前商标图案印花,别致生动,极富吸引力。翻领设计,气质出众。领口的系扣设计,穿脱更方便。宽松的版型,穿着随意放松,优雅不失风度。"} +{"content": "类型#裙*版型#宽松*材质#棉*颜色#白色*颜色#蓝色*风格#清新*图案#条纹*图案#印花*裙长#连衣裙*裙袖长#短袖*裙款式#腰带", "summary": "白色和灰蓝色竖条纹连衣裙,非常清新,恰到好处的宽松短袖,很好的遮盖粗壮手臂。可调节同色系腰带装饰,凸显腰身柔美曲线。趣味印花图案贴布装饰,打破单调沉闷,俏皮可爱。纯棉亲肤面料,穿着挺括有型又不失柔软感。所以穿在身上显得很有青春活力。"} +{"content": "类型#上衣*材质#纤维*颜色#绿色*风格#文艺*风格#休闲*衣样式#西装", "summary": "BRAND这款西装采用绿色的主调进行渲染,营造出休闲文艺的氛围;整体甄选聚酯纤维面料打造,不具有延展性,不会变形,并持久保持其平整度;再加以经典的平驳领裁剪及双扣设计,彰显出儒雅绅士的风范;辅以背部下摆开叉处理,丰富层次感的同时,透露出几分随性韵味。"} +{"content": "类型#裤*颜色#纯色*风格#简约*风格#休闲*图案#纯色*裤款式#口袋*裤款式#抽褶*裤腰型#松紧腰", "summary": "褶皱型的松紧裤腰设计,弹性适中没有拘束感,又很是方便宝贝穿脱。纯色净面的衣身裁剪的简约大方,没有加入过多花哨的修饰,带来很多样的搭配性能,两侧的斜插口袋干净利落,裤脚裁剪的大小合适,上身后休闲有范。"} +{"content": "类型#上衣*风格#复古*图案#复古*衣样式#卫衣*衣长#短款*衣门襟#拉链*衣门襟#套头*衣款式#拉链*衣款式#连帽", "summary": "短款的版型打造,更显干练利落,连帽卫衣套头穿起来肆意张扬,慵懒随性范儿十足,胸前圆环拉链设计,增加衣身层次感。整个的色彩搭配复古优雅,既有甜美俏皮的韵味,又彰显熟女的成熟气质。"} +{"content": "类型#裤*风格#淑女*风格#清新*图案#格子*图案#波点*裤款式#口袋*裤口#翻折", "summary": "清新的天蓝色格纹图案让孩子的纯真内涵绽放而来,宽边的弹性腰带呵护宝贝的娇小身型,加持裤腿处可以翻边的波点内里,打造出甜美无比的淑女范儿,整体看起来既能吸引众人目光,又不乏柔和温婉的内涵。侧边的口袋让孩子小手伸入打造时尚造型,结实耐磨的面料让裤身充满舒适柔软的品质感。"} +{"content": "类型#上衣*图案#条纹*图案#印花*图案#撞色*衣样式#棒球服*衣样式#外套*衣袖型#罗纹袖口", "summary": "这件棒球服外套采用了活力满满的趣味印花点缀,让人眼前一亮,打造出别样的时尚气质。领口和袖口均有撞色条纹罗纹收口,富有整体造型感,穿着尽显品质。"} +{"content": "类型#裙*版型#显瘦*颜色#粉色*风格#青春*裙长#连衣裙*裙款式#勾花镂空", "summary": "亮粉色比普通的粉色更加萝莉,这款连衣裙以亮粉打底,上身后青春又甜美。结合通身的镂空勾花,更添优雅和仙气,而在视觉上也是轻松营造了精致轻奢观感。另外,连衣裙为修身版型,将腰身衬托的越发纤细,身线更加玲珑曼妙。"} +{"content": "类型#上衣*风格#休闲*图案#撞色*衣样式#衬衫*衣领型#一字领*衣袖型#喇叭袖*衣款式#腰带", "summary": "一件浪漫的衬衫连衣裙,万物复苏的春天可以尝试下这种风格,会让你有意想不到的效果。撞色的荷叶喇叭袖,是整件的亮点。采用休闲感的府绸衬衫料,又不会过分的甜腻。近两年大热的封宽腰带,把腰身包裹的玲珑有致。一字肩但很巧妙地遮住了手臂较粗的地方,刚好露出上方锁骨。"} +{"content": "类型#上衣*版型#宽松*风格#简约*风格#休闲*图案#卡通*图案#字母*图案#文字*图案#刺绣*衣样式#卫衣*衣领型#圆领", "summary": "经典的圆领卫衣款型,简约休闲的设计风格,呈现出十足的利落感,同时自然的版型也营造了舒适的上身感受,捎带宽松的样式彰显出动感十足的年轻风采。衣身上的卡通恐龙图案,个性吸睛,充满了俏皮可爱的感觉,提升整款设计的美观度,同时让人更加减龄。字母刺绣的点缀丰富设计效果,立体别致的字母融入了渐变色的元素,凸显出独特的风采。"} +{"content": "类型#上衣*材质#蚕丝*图案#蝴蝶结*图案#波点*图案#撞色*衣样式#衬衫*衣门襟#系带", "summary": "圆润规律的撞色波点充斥在整件衬衫上,释放出活泼的生动力。其后顺应领口的放下稍微下移,立体的蝴蝶结系带点缀其中,以抢眼吸睛的方式为主体进行修饰美化。同时轻柔顺滑的真丝也融入其中,好像要以一种浪漫温柔的形式表达出对你的与爱意。"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*风格#简约*风格#潮*裙型#背带裙*裙型#包臀裙*裙下摆#开叉*裙衣门襟#拉链*裙款式#拉链*裙款式#对称", "summary": "纯黑色调的交叉型背带裙,简约而不简单,让你的搭配随性不挑身材,尽显潮流与时尚,塑造纤细身姿。拉链衣门襟,方便穿脱,搭配两侧对称方形插袋,做工精细,缝合紧密,保持外观整体美感,实用而显大气感。包型设计,讲究束腰显瘦遮肉,打造翘臀诱惑,提升你的女人味儿,彰显出诱惑与吸引力。裙摆开叉设计增添灵活性,减少束缚,让你的走姿更显优雅与端庄。"} +{"content": "类型#裙*材质#网纱*颜色#黑白*风格#简约*图案#波点*图案#刺绣*图案#撞色*裙下摆#荷叶边*裙下摆#垂坠*裙款式#拼接*裙款式#露肩*裙款式#抽褶", "summary": "裙身重工刺绣点缀,细腻的纱质,剔透朦胧的质地,邂逅精致的刺绣,亲们都知道在这样的薄纱面料上面要做刺绣有,做到,立体逼真的枝叶蜿蜒大气,简约黑白撞色,雅致的色调给人含蓄温雅的感觉,上身衬的人特别有气质;肩部拼接透视网纱,看着就给人很干净清爽的感觉,立体精致的刺绣波点设计,灵动俏皮,很是吸睛;拼接荷叶边,层次丰富,自然垂坠挺括的荷叶边,优美均匀的褶皱,配着露肩设计,尽显灵动优雅。"} +{"content": "类型#裙*版型#显瘦*颜色#纯色*风格#复古*图案#纯色*图案#复古*裙款式#抽褶*裙款式#收腰", "summary": 
"简洁大方的纯色裙身,加上经典百搭的伞型裙摆,时尚百搭还能给人以满满的复古典雅的气质范。经典褶皱收腰处理,显瘦又显高,小粗腿又能轻松自然的穿着起来,完全不挑身材,优雅大气给人以十足的气质感。"} +{"content": "类型#上衣*版型#立体剪裁*材质#针织*衣样式#毛衣", "summary": "此款针织毛衣采用舒适细腻面料,穿着舒适。3d立体剪裁版型,彰显男人魅力。缝线紧密结实,凸显优质品质。包边下摆,防风保暖。"} +{"content": "类型#裙*颜色#纯色*图案#纯色*裙下摆#花边*裙腰型#高腰*裙长#连衣裙*裙袖型#喇叭袖*裙款式#纽扣*裙款式#收腰", "summary": "本款喇叭袖纯色连衣裙,细边皮带高腰分割收腰,上身效果更加高挑纤瘦。领口立体感的半高花边点缀,更加显得有趣并精致。喇叭袖口花边装饰,使裙子具有更加独特的质感。单侧单排纽扣点缀细节,整体展现优雅气质。"} +{"content": "类型#上衣*版型#宽松*颜色#黑色*风格#简约*风格#休闲*衣样式#卫衣*衣门襟#套头*衣款式#罗纹", "summary": "此款套头卫衣,采用个性十足的布贴图案装饰,尽显宝贝的阳光帅气。简约大方的领口设计,配以肩部的开扣设计,让穿脱变得方便简单。罗纹的袖口与下摆,穿着舒适服帖。优雅的纯黑色色调,休闲百搭,宽松舒适的版型设计,穿着无束缚感。"} +{"content": "类型#裤*颜色#黑白*图案#条纹*图案#撞色", "summary": "这款休闲裤裤脚侧面撞色条纹设计,使得视觉具有延伸感,更显双腿修长笔直。而经典的红黑白撞色设计,让你轻松穿出干练清爽的感觉。"} +{"content": "类型#裤*版型#宽松*材质#水洗*风格#简约*风格#朋克*风格#摇滚*图案#音乐*裤款式#破洞*裤款式#口袋", "summary": "裤子的设计是金属朋克气息,有着一股子摇滚的意味很是个性了可以说。简约宽松的版型很好的修饰了曲线。口袋下面有着金属鸡眼和穿绳设计,又精致又能体现出细节感。破洞水洗元素带来放荡不羁感。"} +{"content": "类型#裙*风格#青春*裙腰型#高腰*裙长#短裙*裙款式#绑带", "summary": "这条高腰短裙,看似是短裙,其实内里还做了短裤的设计,时髦感强却又不会显得太浮夸。高腰的廓形设计,能够使得腿部看起来更加的修长。裙身做了钉扣的绑带设计,散发着青春甜美的气息。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*风格#简约*裤款式#不规则*裤口#毛边*裤口#小脚", "summary": "这款fiveplus不规则毛边牛仔裤,简约的修身小脚版型设计,能够更好的修饰腿部轮廓,显瘦之中更能展现出修长笔直的大长腿。同时融合毛边元素装饰,增添设计感,彰显时髦。"} +{"content": "类型#上衣*风格#清新*图案#碎花*衣样式#衫*衣款式#荷叶边", "summary": "荷叶边的在是这款碎花衫的设计亮点。层层叠叠的荷叶边晕染出了浪漫的情调,为衣身注入了小清新的美感,穿起来能将少女的娇美感衬托出来。除此之外,荷叶边还有遮肉的效果,它能在视觉上打造出好身材。"} +{"content": "类型#裙*材质#网纱*图案#刺绣*裙腰型#高腰*裙长#半身裙", "summary": "独特的网纱刺绣设计,更加俏皮可爱;过膝的长度设计,带来满满的安全感;时尚高腰腰型,塑造优美的身材曲线;半身裙是每个季节都不会过时的时尚元素,能够恬静地成为这个夏天不折不扣的主角,百搭的特性可以包含不同的风格单品,个性的网纱面料亦会让你的时髦变得轻而易举。"} +{"content": "类型#裙*材质#网纱*图案#刺绣*裙腰型#高腰*裙长#半身裙", "summary": "活泼可爱的少女,夏季总会选择一件网纱半身裙,这件高腰刺绣网纱半身裙,裙头选择松紧有致的腰部设计,不拘束的同时更收敛身形。裙身选择中国古典风雅的星月刺绣,尽显少女活力十足,而裙外层则选择轻盈剔透的网纱设计,展现少女灵动活泼的个性。"} +{"content": "类型#裤*颜色#黑色*裤款式#亮片", "summary": "一款blingbling的质感,就是最大设计亮点。整个衣身都是blingbling的亮片,一不小心就被亮,非常的耀眼。又是黑色系的,光是看着就觉得。"} +{"content": "类型#上衣*材质#牛仔布*材质#针织*风格#休闲*风格#潮*衣样式#卫衣*衣款式#连帽", "summary": "一款时尚有范的牛仔裤,无论从款式还是面料来说,都非常的满意,保证你穿了还想穿,款式很百搭,三季好穿,能陪你多个春夏秋。早春搭配一款之前的马海毛针织或休闲连帽卫衣,配上这款新潮的牛仔裤,下摆开叉,时尚不羁的感觉,露出一点白皙肌肤,时髦带点潮感,使得穿上后气质更时尚。"} +{"content": "类型#裙*材质#网纱*风格#宫廷*风格#高贵*图案#刺绣*图案#印花*裙长#连衣裙", "summary": "像的春风从指尖,伴着丝丝惬意,这便是这款连衣裙给人的感受。朦胧的刺绣网纱领口还原法国宫廷的经典look,透着浪漫,也不失温柔,铺陈在印面上的工笔画印花,唤醒最古典的中国风情。不一样的元素碰撞,同样的高贵灵秀。"} +{"content": "类型#上衣*风格#休闲*风格#青春*风格#潮*风格#性感*图案#线条*衣样式#衬衫*衣领型#翻领*衣款式#勾花镂空", "summary": "这款男士衬衣在裁剪上紧跟人体身材科学设计的步伐,按照黄金比例裁剪,贴合男士身材,显示出每个人不一样的魅力风格,商务休闲二者流行元素合二为一,显示不一样的潮流风尚。镂空的设计,又使得增添了性感的魅力,小翻领的设计,修饰颈部线条。"} +{"content": "类型#上衣*风格#清新*风格#性感*衣样式#衬衫*衣领型#一字领*衣袖型#喇叭袖", "summary": "一字肩的设计使得美丽的香肩得以展现出来,尽显造型独特并无形中流露出性感的气息。结合浪漫甜美的喇叭袖设计,使得这件衬衫性感却又不至于妖娆媚俗,反而演绎出清新脱俗的小仙女气质,举手投足众生沉醉在你出众的魅力之中而无法自拔。"} +{"content": "类型#上衣*版型#显瘦*风格#英伦*图案#撞色*衣样式#马甲*衣款式#纽扣", "summary": "品牌的这款正装马甲采用商务修身的裁剪设计,提花的纹理图案设计更显精致感。深色的英伦配色,尖角斜下摆的版型裁剪,腰部调节扣设计。单排撞色纽扣门襟,各个细节款型都在彰显优质的品质感以及英伦绅士的穿搭风格。"} +{"content": "类型#裙*风格#清新*图案#碎花*图案#线条*裙长#连衣裙*裙领型#v领*裙袖型#喇叭袖*裙衣门襟#系带", "summary": "魅力女人味,还是要从连衣裙开始诉说起来~美观的中长连衣裙,它有着时尚系带v领,样式简洁大方,更加的富有女人味的感觉。而清新浪漫的碎花元素,分布在这款连衣裙的全身,样式优雅,最显清新感觉。加上这款它还有着精致的喇叭袖设计,在修饰了你手臂线条后,尽显出一种独特和浪漫。"} +{"content": "类型#裙*版型#宽松*材质#牛仔布*材质#水洗*裙型#牛仔裙", "summary": "这款裤子是饱和度较低的蓝调牛仔配合水洗做旧效果,帅气的很低调,版型不要太赞,耐穿耐脏耐摩擦。宽松的裤型顺便把腿型不完美的问题一并解决,这个膝盖洞,透气"} +{"content": "类型#裙*图案#蝴蝶结*裙领型#西装领*裙领型#翻领*裙款式#腰带*裙款式#抽褶", "summary": "沿用西装领的样式设计的时尚翻领,凸显气质。腰带采用金属圈作为装饰,迎合当下时尚风潮,BRAND风十足。选用梭织面料,紧密挺硬,保持裙型。腰部蝴蝶结系扣增添甜美气息。裙身自然褶皱,增添灵动性。"} +{"content": "类型#上衣*风格#文艺*风格#清新*衣样式#衬衫*衣袖型#喇叭袖*衣款式#荷叶边", "summary": "不同于普通白衬衫,这款衬衫很有设计感。温婉清新的喇叭袖设计,展现优雅气质的同时更能展现纤细的手臂。而别致的荷叶边下摆设计,带来不少灵动飘逸的感觉,浓浓的文艺气息。同时,这款衬衫特意选用柔软亲肤的面料,提升了不少穿着体验感,并且立体的剪裁工艺让这件衬衫看起来更加有质感,有气质。"} +{"content": 
"类型#裙*版型#显瘦*材质#牛仔布*风格#复古*风格#简约*风格#潮*图案#卡通*图案#复古*图案#线条*图案#刺绣*裙型#a字*裙型#牛仔裙*裙长#短裙*裙衣门襟#拉链*裙款式#口袋*裙款式#拉链", "summary": "妖精的口袋的这款牛仔短裙,经典a字版型,面料挺括、线条流畅简约。微微弹力,飘逸下摆,穿着显瘦自然。后置隐形拉链的融合,方便又摩登。复古的卡通刺绣图案,简约中不失灵动,超级吸睛,让你走到哪里都是潮流中心。"} +{"content": "类型#裤*版型#显瘦*裤长#九分裤*裤腰型#中腰", "summary": "这一款裤子给人的感觉就是很经典大气,如同西装裤一样非常的有气质,也显得比较的正式,我们可以在一些隆重的场合穿上它。九分裤的长度非常的适中,穿上可以完美的修饰出我们的脚踝。中腰的设计则很修身显瘦,很好的衬托出我们的完美身材。"} +{"content": "类型#上衣*版型#宽松*图案#线条*图案#撞色*衣样式#冲锋衣*衣领型#小立领", "summary": "这款冲锋衣采用薄款的面料,触感细腻,让你轻松兼顾温度与风度。衣身采用个性撞色的设计,不仅能穿出时髦感,更显皮肤白皙。小立领的设计很拉风,轻松凸显纤细的颈部线条。宽松的版型适合各种身型的妹纸穿着。"} +{"content": "类型#裙*版型#显瘦*颜色#纯色*风格#通勤*风格#简约*风格#知性*图案#纯色*图案#线条*裙长#连衣裙*裙袖长#长袖*裙款式#对称", "summary": "来自时尚品牌菲梦伊的一款通勤款的连衣裙,简约纯色设计,干净清爽。3d立体修身剪裁,贴合身形,突出纤细腰肢和挺翘饱满的臀部曲线,勾勒女性曼妙身材曲线。经典大方的反驳领设计,给脖颈自在活动空间,简约知性。长袖的设计,修饰手臂线条,彰显女性的温婉含蓄,对称衣扣装饰,丰富衣身结构,更具时尚感。"} +{"content": "类型#裤*版型#宽松*风格#清新*裤长#八分裤*裤型#直筒裤*裤型#阔腿裤*裤款式#拼接*裤款式#不对称", "summary": "这款阔腿裤的面料色调,怀旧又带了点清新禁欲系。款式设计上运用不对称的手法,在一边的裤筒上,用同色系不同纹理的面料拼接插入,而另一边也做了一条相呼应的细边出芽。给裤身制造一个不对称的视觉效果,和谐融入不显突兀。宽松的直筒八分裤型,裤脚的折边让穿着更显随性。"} +{"content": "类型#裙*版型#宽松*风格#休闲*裙下摆#压褶*裙领型#娃娃领*裙款式#口袋*裙款式#纽扣", "summary": "这件睡裙在背部采用了压褶设计,加上甜美的娃娃领,具有减龄的效果,带你重回童真时代。纽扣的门襟开合,缔造出干练自信的女性气场。宽松舒适的版型包容性强,隐藏身材小秘密,尽享惬意休闲时光。胸前的口袋设计,增添视觉层次感,可放置随身小物,方便实用。"} +{"content": "类型#裙*材质#蕾丝*颜色#黑色*图案#蕾丝*裙下摆#垂坠*裙长#连衣裙*裙袖长#无袖*裙领型#圆领*裙款式#拼接", "summary": "日着在今年的夏装中,继续延续了蕾丝所代表的女性柔美一面的设计元素。在这一款连衣裙里,大胆地将黑色的蕾丝面料附着在土黄色的棉布上,以无规则的车线工艺将两块面料进行了拼接,打造出了极度自然的垂坠感。同时,无袖搭配圆领的版型设计,更是将优美的脖颈以及臂部曲线展露无遗,凸显出了女性姣好的身材。"} +{"content": "类型#上衣*图案#线条*衣样式#衬衫*衣样式#风衣*衣领型#v领*衣款式#不规则*衣款式#荷叶边", "summary": "这件衬衫领口的设计是半开的小v领结合不规则的荷叶边设计,整体设计感十足,这样的领口勾勒出颈部线条和锁骨都很柔美。袖子部分的设计是可拆卸的风衣式袖子设计,时尚方便又能凹造型,是你的最佳选择。"} +{"content": "类型#上衣*材质#针织*风格#复古*风格#简约*风格#休闲*风格#青春*图案#字母*图案#文字*图案#复古*图案#撞色*衣样式#卫衣*衣领型#圆领*衣门襟#套头*衣款式#亮片", "summary": "圆领式的针织卫衣,套头的款型不仅方便了日常的穿脱,也展现出简约休闲的设计风格。加上微微的卷边效果以及局部的磨白处理,刻意做旧的感觉,让这款设计充满了复古的情怀,打造出个性又洒脱的女性气质。衣身上亮片字母的点缀,融入了明亮的撞色元素,十分吸人眼球,同时展现出极具现代感的青春风采,让人绽放出活力感。也成为了整款设计的点睛之处。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*裙下摆#花边*裙长#连衣裙*裙款式#不对称", "summary": "这款连衣裙精选优质的面料,不仅手感舒适,穿在身上还非常亲肤透气。宽松的款式穿脱方便的同时还非常显瘦。花边的设计也是一大亮点,使这条裙子更加甜美可爱。不对称下摆的设计也是时髦又别致。"} +{"content": "类型#裙*裙领型#v领*裙领型#翻领*裙款式#露肩*裙款式#收腰", "summary": "衣领在传统的v领上有了新的创新,加上了翻领设计,成熟之余又不失活泼可爱。并且衣领弹性大,可轻松实现正常v领和露肩两种穿法。腰部进行了收腰设计,尽显腰身。蓬松的裙摆可遮挡一些女性臀部的不足,打造完美身材。"} +{"content": "类型#裙*裙领型#v领*裙领型#翻领*裙款式#露肩*裙款式#收腰", "summary": "衣领在传统的v领上有了新的创新,加上了翻领设计,成熟之余又不失活泼可爱。并且衣领弹性度高,可轻松实现正常v领和露肩两种穿法转换。腰部进行了收腰设计,尽显腰身。蓬松的裙摆可遮挡一些女性臀部的不足,打造完美身材。"} +{"content": "类型#裙*裙型#a字*裙腰型#高腰*裙长#连衣裙*裙款式#绑带", "summary": "这款连衣裙最吸引的亮点就是它独特的腰封设计,显腰细max。加上高腰的版型,瞬间拉长全身比例,凸显高挑身材。细长的绑带设计,帅气中又凹足造型。a字型的裙摆敲不挑人穿,菇凉们可以轻松驾驭哦。"} +{"content": "类型#上衣*风格#文艺*风格#简约*风格#清新*衣样式#衬衫*衣领型#小立领*衣款式#口袋*衣款式#绑带", "summary": "衬衫和阔腿裤的组合,简约的搭配方式,但并不简单。上衣采用小立领半开襟设计,简约利落,前两个口袋设计,带来层次感,整体的风琴百褶,尽显女性的婀娜多姿,阔腿裤设计。加上时尚的绑带,利落范,还有开叉的半裙。飘逸清新又文艺。"} +{"content": "类型#裙*材质#蕾丝*材质#纤维*图案#条纹*图案#蝴蝶结*图案#蕾丝*裙长#连衣裙*裙款式#拼接", "summary": "一款舒适有型的聚酯纤维连衣裙;富有特色的拼接设计,将条纹与蕾丝相碰撞,融合出一种充满浪漫的设计感,使时尚的着装拥有了浪漫与柔美的气息;简洁有型的长领角,随性而系的蝴蝶结则凸显俏皮与可爱,垂下不系则显个性的韵味。"} +{"content": "类型#裙*风格#简约*风格#青春*风格#性感*图案#条纹*裙下摆#花边*裙长#连衣裙*裙袖长#短袖*裙领型#圆领*裙款式#露肩", "summary": "一款清爽又减龄的条纹连衣裙,衣身黄白条纹设计,显得青春而靓丽,上身衬肤色又显年轻;简约的花边圆领,散发出甜美可爱的气质;露肩短袖设计,满足穿者的性感小心机。"} +{"content": "类型#裤*风格#复古*风格#嘻哈*图案#复古*图案#撞色*裤款式#拼接*裤口#毛边", "summary": "前身复古的毛边点缀,趣味风十足,一下子提高了造型感,上身倍感玩味俏皮,让你很容易穿出青春活力女孩气息。嘻哈感十足的裤腿带来富有灵动感的穿着效果,结合整体的撞色和拼接设计,透着几分小女生的乖巧气息。"} +{"content": "类型#裙*颜色#浅色*图案#条纹*图案#线条*裙下摆#压褶*裙领型#一字领*裙款式#拼接", "summary": "肩部、袖口都采用条纹拼接,增加层次感,又营造出一字领的错觉,显得精致温婉。浅色线条勾勒裙边,让轮廓感立显,优雅大气。精致的压褶处理,让视觉效果更丰满,有不错修饰胸型的作用,让人心动。"} +{"content": "类型#裙*风格#街头*图案#条纹*图案#拼色*图案#线条*图案#撞色*裙长#连衣裙*裙领型#v领*裙款式#不对称*裙款式#对称", "summary": 
"这款对称撞色连衣裙,大胆玩转不对称的艺术美。左右对称拼色两件的设计,瞬间让你摆脱路人甲的穿搭造型,成为街头最出彩的。斜条纹的设计,更是摆脱纯色调的呆板无趣,让你穿出腔调感来。v字领口设计,更是完美凸显颈部的纤长线条。"} +{"content": "类型#上衣*颜色#黑色*图案#印花*图案#撞色*衣样式#衬衫", "summary": "个性时尚的衬衫采用了纯黑色的色调设计,整体打造了摩登随性的自由风格,凸显个性时尚的质感。结合了魔术扣的立体印花,在黑色的映衬之下,撞色使得更有摩登时尚的质感,打造活力穿着。"} +{"content": "类型#裙*版型#显瘦*裙款式#松紧带", "summary": "腰部是松紧带设计对身材的包容性很好。高矮胖瘦的姑娘都可以用,显高显瘦的效果不止一点点,另外这款裙子也很适合旅行时穿,飘逸柔美充满异域风情的味道。"} +{"content": "类型#裙*版型#显瘦*裙腰型#高腰*裙长#连衣裙*裙袖长#七分袖*裙袖型#喇叭袖*裙款式#勾花镂空", "summary": "修身款型的连衣裙是衬托曼妙身段的好帮手,高腰的设计将腰部的纤细修饰的更加迷人。尤其是衣袖处的别致设计,带着喇叭袖的造型将手臂修饰的更加纤细,搭配上七分袖的独特为穿戴之人更添优雅之感,从此告别令人尴尬的赘肉。若隐若现的镂空为整件衣衫增添了一抹成熟女人的妩媚之情。"} +{"content": "类型#裙*版型#显瘦*裙下摆#荷叶边*裙衣门襟#排扣", "summary": "一款仿佛礼服般的裙子。修身显瘦的款式衬托出端庄迷人的身姿,腰间处理让纤细的小蛮腰呈现盈盈一握之态。胸前荷叶边的点缀为整体增添了几分随心所欲的美,领口设计衬托颈部曲线的同时展示女性魅力。袖口三排扣从细节中彰显质感。"} +{"content": "类型#上衣*版型#显瘦*颜色#纯色*颜色#浅蓝色*风格#淑女*风格#文艺*风格#民族风*风格#清新*风格#中国风*图案#纯色*图案#蝴蝶结*图案#刺绣*衣样式#风衣*衣样式#西装*衣领型#西装领*衣领型#翻领*衣袖型#喇叭袖*衣门襟#系带*衣款式#腰带", "summary": "气质柔雅的一款纯色风衣,清浅素雅的浅蓝色基调清新脱俗有点不食人间烟火的味道,为整体文艺风格奠定了很好的基调。惹眼的细节莫过于西装翻领上的民族风刺绣花朵,古典的中国风美韵油然而生,把西装领原有的干练洒脱掩盖了起来。腰身修身的腰带系成了甜美俏皮的蝴蝶结,与两侧系带喇叭袖的造型相互呼应,轻松打造出乖巧恬静的淑女形象。"} +{"content": "类型#裙*风格#复古*风格#简约*图案#复古*图案#撞色*裙下摆#压褶*裙长#连衣裙", "summary": "简约不简单的一款连衣裙,裙身复古大气的花纹,好穿又衬气质!融入撞色设计,鲜明的层次感,美的清晰立体。轻松穿出惹眼视觉感!前片压褶设计,精致有型,褶线自然垂落,飘逸,带来一丝柔美韵味。"} +{"content": "类型#上衣*颜色#浅色*衣样式#衬衫*衣门襟#系带*衣款式#不对称*衣款式#抽褶", "summary": "浅色色系的衬衫是春意盎然的代表色系,清爽又整洁,非常时尚的设计。左边的系带使得右边出现褶皱感,使得整件衬衫变得非常有质感,衬衫下摆的不对称,更是设计师赋予它独特的魅力,时尚感强烈。增加了衬衫的,同时侧边开叉,更显女性腰部曲线魅力,非常完美的一件设计作品。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#针织*颜色#黑色*风格#简约*风格#清新*图案#线条*图案#撞色*裙长#连衣裙*裙衣长#中长款", "summary": "这款针织连衣裙,是经典的黑色款,衣身的撞色线条时髦摩登,出街实力吸睛!衣身领口吃采用经典黑、白、红撞色,清新靓丽的大v字设计,糅合低调、浪漫与热情,彰显摩登的时尚感。中长款的宽松款式,落肩的设计轮廓勾勒简约裙装,轻松修饰身材小秘密,俏丽又显瘦!"} +{"content": "类型#上衣*图案#字母*图案#文字*图案#线条*图案#刺绣*图案#撞色*衣样式#卫衣", "summary": "由近年来大火的时装品牌dsquared2出品的卫衣,采用数字、字母logo标识和树叶图案作为点缀,以黑黄撞色的刺绣手法呈现,线条分明且具有立体感,经得起全方位的推敲与。"} +{"content": "类型#上衣*版型#宽松*风格#复古*风格#休闲*图案#复古*衣样式#外套*衣样式#西装*衣领型#西装领*衣门襟#系带*衣款式#收腰", "summary": "充满港式复古风味的西装外套,上身是宽松的西装下身则是休闲的阔腿裤。上身的西装采用的经典的西装领结构,能够衬托出女性精致的脸庞。腰间还加有一根系带装饰,可以随意调节进行收腰处理,勾勒出女性的细腰凸显出女人味。下身宽松的高腰裤型能够拉长下半身的视觉比例。"} +{"content": "类型#裙*颜色#纯色*风格#青春*风格#潮*图案#纯色*裙长#半身裙*裙款式#拼接*裙款式#不规则", "summary": "此款拼接贴布半身裙,采用不规则设计,搭配拼接设计,层次感强,紧随潮流,时尚又个性。贴布设计,增加整体设计感,尽显女性甜美气质。纯色设计,经典百搭,青春减龄,优雅又魅力。"} +{"content": "类型#上衣*材质#针织*衣样式#开衫*衣领型#圆领*衣袖型#罗纹袖口*衣款式#罗纹", "summary": "可爱甜美的棉线针织开衫,手感细腻软糯,是很实用的搭配单品,多种颜色可选,伴随宝贝靓丽出行。适合身形轮廓的剪裁,给身体留出足够的活动空间,自由不拘束。经典的罗纹圆领,织带包边,有效避免摩擦颈部肌肤,且防止拉扯不易变形。有弹性的罗纹袖口,宝贝穿着不易上下滑落,保暖又舒适。"} +{"content": "类型#上衣*版型#显瘦*颜色#白色*风格#知性*图案#字母*图案#文字*衣样式#衬衫*衣领型#娃娃领*衣款式#拼接", "summary": "干练大方的衬衫连衣裙,经典的黑白色拼接,知性优雅。娃娃领的设计,多了几分俏皮感,彰显甜美可爱的少女气息。门襟和袖口的拼接相呼应,时髦大气。修身的版型,束出纤细的腰身,凸显柔美的女性曲线。简洁的a字裙摆,衬出迷人的大长腿。整条裙子点缀彩色爱心,碰撞胸口的字母设计,趣味性十足。"} +{"content": "类型#裙*材质#网纱*材质#蕾丝*图案#蕾丝*裙衣门襟#拉链*裙款式#拉链", "summary": "活力四射的夏天穿搭,少不了一件蕾丝吊带裙。这件网纱蕾丝吊带裙两件套,内层选用精巧可爱的吊带裙,在蕾丝的映衬下,更衬托女性的优雅情调,两肩及袖口处采用质感十足的金属拉链装饰,在女性的柔美上,增添一份女性的独立性格。而腰身则选用精致动人的暗纹设计,隐隐显示女性专属的古典美。"} +{"content": "类型#上衣*材质#蕾丝*风格#简约*风格#清新*图案#蕾丝*衣样式#衫*衣领型#立领*衣袖长#短袖*衣款式#盘扣", "summary": "一款旗袍上衣立领短袖蕾丝衫,立领的设计非常能够彰显个人的气质。搭配盘扣的设计,透着中国风情,更显端庄大方。蕾丝面料的选择,优雅而又别致,女人味十足。纯白的设计,简约而又清新,可塑性很强,可以根据需要搭配任何颜色。"} +{"content": "类型#裤*版型#显瘦*材质#蕾丝*图案#蕾丝*裤长#连体裤*裤款式#拼接", "summary": "连体裤在上衣的部分采用了蕾丝拼接的设计,小小的披肩遮住上身,若隐若现的手臂十足的有朦胧美。而裤子的地方也特意做高了腰线,收褶的设计更是能够将腰部衬托的更加纤细,很好地起到显高显瘦的作用。灰白两种颜色做搭配也是十分的清新自然,仙女气十足。"} +{"content": "类型#裤*材质#蕾丝*颜色#蓝色*风格#清新*图案#蕾丝*裤长#连体裤*裤型#阔腿裤*裤型#连衣裤*裤款式#抽褶*裤腰型#高腰", "summary": "连衣裤的受追捧程度,BRAND也按捺不住加入了此款式的,采用通身的蕾丝质地,打造出一款又美又仙的扮靓单品。蓝色调的渲染,带来了湛蓝天空的清新纯净感;两侧的褶皱花边造型,演绎蝴蝶般的飘飘然姿态;高腰线和阔腿的剪裁,修饰双腿,立显高挑身姿。"} +{"content": "类型#裙*风格#简约*风格#知性*风格#高贵*裙下摆#毛边*裙长#连衣裙*裙领型#圆领*裙款式#亮片", "summary": 
"此款连衣裙硬挺厚实,细腻的颗粒,凹凸的触感,质感饱满,带来品质体验;把女性之美的璀璨夺目亮片融入其中,摩登与优雅并存,绽放你的美丽与智慧光芒;加上毛边的沿边点缀,简约时尚,削弱原本普通连衣裙的平庸,注入不安分玩转因子,散发出知性优雅的超强气场,摩登轻奢潮品,优雅不失时尚感;精致剪裁,注重每一个细节处理,简约的圆领,凸显优雅气质,勾勒婀娜身姿,犹如一件精致的艺术品,尽显高贵女性韵味。"} +{"content": "类型#上衣*风格#简约*衣样式#毛衣*衣领型#半高领*衣袖长#短袖", "summary": "春天到了,你的上衣准备好了吗?这款炒鸡减龄的毛衣,百搭又时髦,设计了优雅的半高领造型,打破常规,新颖简约不失时髦度,让基本款有亮点。以及短袖的裁剪,特别利落,减少造型的臃肿感,且百搭又有范。"} +{"content": "类型#上衣*版型#显瘦*风格#欧美*图案#几何*图案#线条*图案#印花*衣样式#衬衫*衣款式#荷叶边", "summary": "欧美个性几何衬衫搭配包臀半裙,给人带来视觉上的享受。精美印花的点缀,充满着浪漫的气息。干净利落的剪裁打造流畅的线条,修身的上衣塑造完美身形,荷叶边裙的设计很精巧,细节也做得很到位,上身显瘦。"} +{"content": "类型#上衣*颜色#黑色*风格#民族风*风格#休闲*图案#格子*图案#线条*衣样式#外套*衣领型#翻领*衣款式#口袋", "summary": "这款外套领部采用小翻领的设计,显得干净利落。以粗花呢格纹面料制作而成,展现出满满的小香风~衣身四个方形口袋的设计,集美观性与实用性于一体~并以流畅的黑色线条点缀,增加了整体的层次感。下摆和衣袖处做了流苏的设计,增添几分民族风的感觉,休闲范十足。"} +{"content": "类型#裙*材质#网纱*材质#蕾丝*风格#淑女*风格#居家*图案#蕾丝*裙长#连衣裙*裙款式#吊带", "summary": "一款优雅的连衣裙,将女人气质十足的体现了出来,蕾丝的小吊带图案饱满,亲肤不扎手,也不用担心起球。里面的连衣裙甜美感十足,网纱的裙摆别有一番淑女名媛风。袖子做了一点微喇效果,增加了甜美的感觉。连衣裙面料亲肤舒适柔软,质感也灰常的好。冬天搭大衣、棉服都灰常的有韵味,逛街,居家,上班,旅行穿都很合适。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#清新*图案#条纹*衣样式#衬衫", "summary": "这款衬衫,衣身采用桔红色条纹设计,视觉上很清新的颜色,很显肤色,显白。条纹也会让身体更加纤细。小宽松的版型,能更更加凸显身材,遮肉显瘦,彰显女性柔美的身姿。上身更显气质,处处弥漫着时尚的气息。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#清新*图案#条纹*衣样式#衬衫", "summary": "春天是属于衬衫的,一款清新的条纹衬衫,碰撞出属于这个时节的清爽利落感,纵向的条纹不仅显得帅气而且能增加视觉上的显瘦效果。前短后长的设计别具一格,在增加设计感的同时超级显腿长,宽松的衬衫版型显瘦显慵懒。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#清新*图案#条纹*衣样式#衬衫", "summary": "这款衬衣,的清新蓝白条纹十分给人自然干净的感觉,大方时尚条纹还有显瘦的功能。胸口的设计很特别,胸口的门襟铜扣,十分有质感,一看就很有品质,有着点睛的作用。宽松的版型,遮肉功能十分强大更显强调。"} +{"content": "类型#裙*材质#雪纺*裙型#大裙摆*裙腰型#高腰*裙长#连衣裙*裙款式#露肩*裙款式#不规则", "summary": "这条连衣裙飘逸轻柔的雪纺材质加上不规则露肩的设计,一举一动之间透露着十足的仙女气质,甜美迷人。高腰的大裙摆在视觉上拉长了双腿,更显大气时尚。纤细的肩带和露肩的设计,更是“”心机”地完美修饰脸型和锁骨。"} +{"content": "类型#裤*材质#丝绒*风格#性感*裤长#短裤*裤款式#钉珠*裤腰型#松紧腰", "summary": "钉珠丝绒短裤精选优质高档丝绒面料,轻盈顺滑,细腻绵软。清爽的松紧腰型设计提升腰线,彰显性感小蛮腰和修长大长腿。甜美高档的手工缝制钉珠提升高档品质,彰显个性的同时又有一份优雅和光泽在其中。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*颜色#黑白*颜色#淡蓝色*风格#清新*图案#格子*图案#线条*裤长#七分裤*裤型#阔腿裤", "summary": "格纹元素在服饰中的运用是永不过时的,融合了经典黑白小方格的图案又创新加入了淡蓝色,别出心裁的设计让这条阔腿裤有了与众不同的辨识度。宽松的阔腿裤也能很好的修饰腿部线条,过膝七分长度露出白皙小腿更加显瘦;时尚单品的穿搭充满了美式田园风格的清新雅致,带来泛着甜美气息的春夏风格。"} +{"content": "类型#上衣*版型#宽松*材质#蚕丝*风格#性感*衣样式#针织衫*衣样式#毛衣*衣袖型#蝙蝠袖*衣门襟#套头*衣款式#勾花镂空", "summary": "一款蝙蝠袖撕破勾花毛衣女套头宽松针织衫,非常女人味一款设计,里面搭配一条真丝裙子,满满高级感。蝙蝠袖的设计,让肩膀自由不拘束。撕破钩花的设计,透着淡淡的性感。宽松版型的设计,慵懒随性。"} +{"content": "类型#上衣*版型#宽松*颜色#宝蓝色*风格#休闲*衣样式#卫衣", "summary": "适合装点春日的三色卫衣,通过把握色彩的饱和与明暗度,使其散发出暗藏的满满活力,柠檬黄、宝蓝色和紫外光,实用多样风格穿搭。而前襟处的英文字母点缀恰当,宽松的oversize版型更显洒脱,丰富从整体到细节的质感。在搭配舒适的面料,结合休闲与时髦理念,带来亲肤体验的同时也简洁大方。"} +{"content": "类型#裙*颜色#纯色*风格#潮*图案#纯色*图案#刺绣*裙款式#抽褶", "summary": "衣身采取个性化的绣花点缀,增添时尚的韵味,让你举手投足间尽显妩媚的气息。自然的褶皱裙摆,极具潮流的味道,尽情演绎你专属的摩登气息。雅致的纯色系列,更好的为你勾勒出酷帅的气质。"} +{"content": "类型#裙*版型#宽松*颜色#绿色*图案#线条*裙长#连衣裙*裙领型#圆领", "summary": "这款中长款式的连衣裙,经典的圆领设计,显得利落大方不失优雅,能够在穿着时增添女性的个人气质。精致的荷叶裙摆设计,让宽松的衣着轻松贴身,修饰腰部线条更添纤细之感。选用绿色调装饰,极具摩登时尚气。"} +{"content": "类型#上衣*颜色#黑色*风格#简约*图案#线条*衣样式#外套*衣样式#西装*衣领型#翻领*衣袖长#长袖*衣门襟#双排扣*衣款式#亮片", "summary": "这款翻领长袖西装外套,线条利落,版型挺阔,通身采用黑色面料,再融入亮片的设计,简约大气自带女王气场,潮酷有型。胸前采用v字西装大翻领设计,运用面料本身高级的光泽感,增添了造型的层次感与立体感,更加衬托出奢华的贵族气质。金属双排扣的设计,给整体暗色调加入了亮点,极具装饰美。"} +{"content": "类型#裤*版型#宽松*版型#立体剪裁*裤型#阔腿裤*裤款式#抽绳*裤款式#抽褶", "summary": "了一个冬季的赘肉,更是不能在春季暴露,一条阔腿裤就是帮你隐藏下半身缺陷的完美利器。腿短腿胖?这都是不存在的问题。宽松的廓形设计,遮盖住腿部赘肉,更能修饰o/x腿型。腰带也十分精致,阔腿裤搭配同色系的抽绳,在抽绳末端还设计了金属装饰,使绳子的垂坠感更佳。尤为突出的当属臀部剪裁,立体剪裁,形成圆润弧度,上身完美贴合,减少了臀部褶皱的出现,使臀型更佳完美。"} +{"content": "类型#上衣*材质#棉*颜色#黑色*风格#运动*风格#休闲*图案#字母*图案#文字*衣样式#衫*衣款式#罗纹", "summary": "这件休闲衫采用黑色作为主色调,非常酷炫时尚。胸前是品牌的logo图案,造型是流畅的手写体字字母,给人的感觉非常潇洒自然。这件休闲衫采用的是纯棉面料,贴身穿着也会非常舒服。领口和袖子采用的是罗纹面料,增强了防风保暖的效果,同时运动的时候也会非常方便。"} +{"content": "类型#裤*风格#休闲*裤长#长裤", "summary": 
"百搭而凸显都市时尚情结的长裤。自然的版型挺括而修饰身材,深黑色的面料,更加适宜各种商务和休闲场合的搭配。腰部的串珠点缀,修饰出华丽精致的时尚品味,面料具备弹力,上身体验舒爽轻柔。"} +{"content": "类型#裤*风格#休闲*裤长#长裤", "summary": "背带长裤打破以往的精致剪裁,结合带有垂坠质感的面料,凸显的是一种慵懒的休闲韵味。腰部的交叉系带设计,收腰的同时也使得更加贴合腰部,视觉上更显腰部的纤瘦和背部的笔直。两条大大的裤腿,自然落下,行走方便自如,没有束缚感的同时更显慵懒的浪漫情怀。"} +{"content": "类型#裤*版型#显瘦*图案#条纹*图案#印花*图案#撞色*裤型#直筒裤*裤款式#拼接*裤款式#口袋*裤款式#不规则", "summary": "印花图案很个性,上身很容易穿出范,而且,真丝面料对色彩的表现力非常好,印染的清晰形象,这非常难得!领口、袖口、下摆,撞色条纹设计轻松点亮视觉,很有气质,不规则下摆,特别有腔调感!无形中拉长身形,倍显高挑!裤子,直筒廓形,修饰腿型,藏肉显瘦前口袋设计,也增添了实用性!裤腿两侧撞色拼接,倍添出~"} +{"content": "类型#上衣*风格#简约*图案#线条*衣样式#西装*衣领型#西装领*衣门襟#一粒扣*衣款式#拼接", "summary": "这款西装设计了经典的西装领设计,有着干净简约的线条,轻松烘托出强大的气场能量。门襟采用一粒扣开合的造型,精致新颖,让心情都觉得活跃起来。袖口融入撞布的拼接,营造出两件套的效果,打破打掉,却增添一份穿衣的风格。"} +{"content": "类型#上衣*版型#宽松*风格#性感*图案#撞色*衣样式#卫衣", "summary": "此款加大型卫衣,宽松舒适。袖子上个性的平面图案为自主开发设计。平面感图案在袖子上排列设计,是今年的设计手法。,做了撞色的设计处理。整个款式个性感十足,彰显设计感。"} +{"content": "类型#裙*版型#宽松*图案#植物*图案#印花*裙长#连衣裙*裙款式#不规则", "summary": "款式简洁的连衣裙,翠绿色的植物印花,能很好的衬托出干净白皙的肤色,同时大面积的满印设计,打破夏日的炎热感,给人的心灵注入清爽与活力。宽松的裙摆营造出丰富的层次感,尽显小女孩的优雅甜美。不规则的下摆,像是随意的剪裁却也不失设计感,个性有趣。"} +{"content": "类型#裙*材质#蕾丝*颜色#纯色*风格#性感*图案#纯色*图案#蕾丝*裙腰型#高腰*裙长#连衣裙*裙领型#v领*裙衣门襟#系带*裙款式#勾花镂空", "summary": "这款纯色的连衣裙采用了蕾丝镂空的设计,性感而不会过于暴露,立体的蕾丝花纹同样作为装饰,若隐若现地展示白皙的肌肤,而高腰部位同样设计了镂空,展示腰线,大胆而吸睛。领口处采用了v领配合上小系带,灵动飘逸,举手投足之间散发甜美气质。"} +{"content": "类型#裙*材质#蕾丝*图案#线条*图案#蕾丝*裙型#百褶*裙长#连衣裙*裙领型#圆领*裙衣门襟#系带*裙款式#拼接", "summary": "一款充满着满满女人味的连衣裙,蕾丝面料的拼接增强了衣身的档次感,同时衬托出女性优雅的气质。基础圆领修饰出颈部线条,腰间系带凸显出女性纤细的小蛮腰,玲珑有致的曼妙身材彰显出来。百褶裙摆富有灵动飘逸韵味。"} +{"content": "类型#上衣*风格#潮*图案#格子*图案#撞色*衣样式#衬衫*衣款式#口袋*衣款式#纽扣", "summary": "以时尚界中永不落伍的格纹点缀于这件衬衫之上,使其保留着很率性的特色。还通过撞色效果加以点缀,令视觉冲击力更突出。而俏皮的弧形衣摆则展现出活力特色,令趣味性提高了许多。使其虽然是经典的纽扣门襟,却能衬出令人眼前一亮的潮流范。再加上对衬口袋的存在,就演绎出中性风BRAND。"} +{"content": "类型#裙*版型#宽松*图案#线条*裙腰型#自然腰*裙长#半身裙*裙款式#绑带", "summary": "这款中长款式的半身裙,别具一格的腰部蝴蝶系绑带设计,元气活力少女心爆棚,也多了几分趣味,穿着时刻轻松减龄更添气质。精致的自然腰版型设计,让宽松的衣着轻松贴身,修饰腰部线条更添纤细之感。选用深绿色调装饰,极具摩登时尚气。"} +{"content": "类型#裙*颜色#蓝色*风格#复古*风格#简约*图案#蝴蝶结*图案#复古*裙长#连衣裙*裙衣门襟#系带*裙款式#木耳边", "summary": "难得温柔的连衣裙,让看到的人便觉得舒适,优美的木耳花边小立领,温柔的弧度缱绻,娇俏可人的小女人味,蝴蝶结系带,一丝复古甜美,让裙子变得生动,而简约舒适的版型更让整体加分,淡淡的蓝色不管是在春天还是在夏天都能给人一种舒适的感觉不会。碎花纹,时髦的元素组合,赋予整款时尚度,同时透着不精心的美。"} +{"content": "类型#上衣*衣样式#衬衫*衣长#短款", "summary": "这款半裙采用a字版型剪裁,上身有效修饰腰部,勾勒出完美的身材曲线,更显女人味。短款的版型,受众于人群,从视觉上拉长下身比例,更显身高。绵羊皮面料处理,手感软糯,保留肉感,舒适细腻。上身可搭配衬衫,时髦又俏皮。"} +{"content": "类型#上衣*风格#复古*图案#蝴蝶结*图案#复古*图案#撞色*衣样式#衬衫*衣领型#立领", "summary": "新颖别致的立领设计,蝴蝶结的造型,打破年龄界限,很是甜美减龄;领口以及袖口的撞色镶边,镶边,别致新颖让人眼前一亮,洋气时髦,加上蝴蝶结的设计,与撞色镶边配合的很是默契,魅力吸睛;同时结合简单正式的衬衫版型,简直就是复古与现代美感的融合,为平淡无奇的衬衫带来了意想不到的时尚感。"} +{"content": "类型#上衣*材质#棉*风格#街头*图案#印花*衣样式#衬衫", "summary": "这件夏威夷衬衫采用了全棉贡缎面料,质感柔软透气,带来舒适的穿着体验。图案由深浅不一的芭蕉叶子组合,大面积印花尽显街头少年的叛逆不羁,尽显夏日的朝气活力。"} +{"content": "类型#裙*颜色#纯色*图案#纯色*图案#花色*裙长#连衣裙*裙领型#圆领", "summary": "这一款端庄的连衣裙,最满意于经典的圆领设计,进一步修饰了柔情的小丽人气质。配合纯色的花色图案印制,处处流露着柔美的都市丽人气质。"} +{"content": "类型#裙*材质#羊毛*图案#线条*裙长#短裙*裙长#连衣裙", "summary": "连衣裙采用80%的羊毛面料材质制作,具有吸水性高和保暖性强以及耐用性好和手感柔软而富有弹性等优质性能。可谓是时尚和保暖兼具的单品哟。短裙的裙长设计,不仅能给人带来满满的拉长腿部线条的视觉冲击感。同时又可谓是给小个子女生显高挑带来了好消息呐。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*图案#线条*图案#刺绣*裤型#直筒裤*裤款式#口袋", "summary": "本款牛仔裤选择了直筒的修身版型,剪裁流畅自然,让整体更显利落清爽。通过简洁细致的线条凸显了整体的修身效果。同时在臀部口袋处加入了精美的刺绣图案设计,让这款时尚单品更具美观性和实用性,让您出行更加自信。"} +{"content": "类型#上衣*材质#羊毛*图案#蝴蝶结*衣样式#针织衫*衣款式#波浪", "summary": "蝴蝶结如诗如画,打破针织衫的规整沉着,加一抹轻盈的灵动风姿。这款针织衫在领口和袖口的地方分别设计成波浪花边状,展现女性出浪漫的情怀;而别致的蝴蝶结装饰,为整体增添一丝甜美俏皮感;采用羊毛面料,手感舒适有着良好的保暖性。"} +{"content": "类型#裤*风格#运动*风格#休闲*图案#线条*图案#刺绣*裤款式#螺纹*裤款式#抽绳*裤口#小脚", "summary": "腰部抽绳设计运动休闲感十足,并且与裤脚的螺纹束口上下呼应,可以很好地修饰腿部线条。抽绳腰头,对身材的包容性很大,穿着自由,随心而不拘泥。最后左腿处的刺绣logo,在运动裤休闲舒适的基础上,又增添了时尚个性。面料柔软舒适,品质面料柔软不易起球、不易变形、不易褪色,有一种亲肤的顺滑干,提升良好穿着感受。"} +{"content": "类型#裙*风格#淑女*图案#蝴蝶结*裙型#网纱裙*裙下摆#层叠*裙腰型#高腰", "summary": 
"选用高品质的面料为裙身的载体,立体的蝴蝶结装点在高腰衔接处的后身,一抹少女的优雅气息扑面而来。加以蓬松层叠的网纱裙摆,拥有错综复杂的叠加设计,让整体视觉层次更显立体,演绎出唯美时尚的画面效果,彰显满满的淑女范儿。"} +{"content": "类型#裙*材质#蕾丝*风格#性感*图案#字母*图案#文字*图案#印花*图案#蕾丝*裙款式#拼接", "summary": "BRAND的这款睡衣裙,设计师将蕾丝元素运用到衣面设计中,拼接的蕾丝花纹遍布裙面,营造出隐隐的透视感,身型曲线若隐若现,将优雅与性感相结合。肩带上配以字母的印花设计,为整体增添时尚气息,可自由调节松紧度,给人自由的穿着体验感。面料主材质为彩棉,给人亲肤舒适感。"} +{"content": "类型#上衣*版型#宽松*颜色#纯色*图案#纯色*衣样式#卫衣", "summary": "卫衣是纯色款式,非常日常易搭配。微微宽松的版型,对身材不自信的女孩也不用担心。胸前的英文logo设计,非常有个性,很有时尚态度。"} +{"content": "类型#裙*颜色#白色*颜色#黑色*颜色#黑白*图案#印花*裙型#直筒裙", "summary": "以白色为主打色,再加上袖子的黑色,黑白搭配不显单调,加之3d立体玫瑰印花,使裙子显的立体,玫瑰代表着热情,就好像在表达着女性的青春活力,绽放自己的魅力。直筒的裙型,修饰腿部,展现女性独特的身材比例,更能拉长腿部,显高显身材。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*颜色#黑色*裙型#直筒裙*裙长#连衣裙*裙款式#腰带", "summary": "黑色连衣裙是每个小仙女都必备的气质单品,黑色天生自带神秘的美感,加之宽松直筒版型剪裁,营造出视觉显瘦的效果,让各种身材的mm都能穿出自信的时尚态度。没有过多繁杂的装饰,一条宽腰带别在腰间,勾勒出纤细的腰身,给宽松的连衣裙增添一抹精致的女人味,极简更具高雅气质魅力。"} +{"content": "类型#裙*材质#蕾丝*风格#青春*图案#蕾丝*裙型#a字*裙腰型#高腰*裙长#连衣裙*裙领型#圆领*裙袖型#喇叭袖", "summary": "这款蕾丝连衣裙的版型非常容易驾驭,高腰a字的版型穿起来很显苗条和高挑,睫毛蕾丝圆领非常的精致,显得脖颈更加迷人。袖子呢是今年很流行的喇叭袖,很浪漫仙美的感觉,让你举手之间显得气质更加出众。"} +{"content": "类型#上衣*版型#显瘦*风格#复古*风格#知性*图案#复古*衣样式#衬衫*衣领型#翻领", "summary": "这款衬衫,精致的小尖衬衫翻领设计,衬托小v脸型。散发出女性温婉知性的美感,蓬松的袖子设计。显瘦又可爱俏皮,舒适性自是不用说。既清爽自然又复古时兴,观感超棒,让你清凉整个夏日。"} +{"content": "类型#上衣*材质#牛仔布*颜色#黑色*风格#复古*风格#潮*图案#复古*衣样式#西装", "summary": "还记得《王牌》中的帅气吗?传统牛津皮鞋给人非常儒雅老成的印象,就连这种原本的青年穿上黑色西装和牛津皮鞋,也变成绅士!然而平时总不能一直穿着牛津皮鞋出入各种场合,这个时候一双牛津皮鞋改版的复古板鞋就是你的需求了。它儒雅,能搭配正装而且更显年轻;它略微潮流,能跟休闲裤牛仔裤出街装备!"} +{"content": "类型#裙*颜色#粉色*图案#字母*图案#文字*裙长#短裙", "summary": "简简单单的字母t,想呈现什么样的风格就看你自己发挥了。和各种短裤短裙都,如果遇到不会搭配的下装不妨和这件搭配试一试。在如果你已经有了米色的t,推荐可以再入一件橡皮粉色,来给你的衣橱增加一些可搭配的色彩。"} +{"content": "类型#裙*版型#宽松*图案#条纹*裙下摆#垂坠*裙长#连衣裙*裙款式#拼接*裙款式#木耳边*裙款式#抽褶", "summary": "连衣裙做了长版的样式,配合着宽松的剪裁,垂坠感很强,裙摆上还加入了褶皱元素,更加易于带来极强的灵动气息,还能巧妙的掩盖身材小缺点。而裙身下方还选用了条纹拼接,木耳边环绕一周,尽显时尚魅力,同时也打破了纯色调的单一感,层次性油然而出,动人无比。"} +{"content": "类型#上衣*版型#显瘦*颜色#绿色*风格#文艺*风格#青春*风格#清新*图案#格子*衣样式#衬衫*衣袖型#蝙蝠袖*衣款式#拼接", "summary": "衬衫的百搭性能是非常强的,已然成为了人手必备的单品。这款衬衫选用绿色的格纹拼接,既凸显文艺小清新,又散发出青春的活力,适合学院风穿搭。而蝙蝠袖的设计,不仅起到了显瘦的作用,还增添了慵懒的感觉。再加上收褶的下摆设计,提升了整体的造型感。"} +{"content": "类型#裙*材质#针织*颜色#白色*风格#性感*裙长#连衣裙*裙款式#拼接", "summary": "BRAND以深灰色为主基调打造的这款针织连衣裙,整体采用了假两件的剪裁设计,带来较为慵懒且性感的穿着美感。毛衣裙在领口拼接了白色的背心,形成了半遮半掩的穿着美感,尽显都市女性摩登且炫酷的穿着效果,是非常出彩的选择。"} +{"content": "类型#裙*材质#雪纺*风格#高贵*裙下摆#垂坠*裙领型#一字领*裙袖型#喇叭袖*裙衣门襟#系带", "summary": "这条裙子一眼看过去就能带来甜美的即视感,优雅的一字领设计看上去十分的精致高贵。这样一条裙子穿在身上显得十分的少女,像是捧在手里的棉花糖。半透的喇叭袖设计看上去十分的飘逸,雪纺的面料使得袖子更加有垂坠感,有更好的视觉观感。胸前的系带穿绳设计可以有多种穿法,更加百搭多变。这样一条裙子既有少女的甜美又有轻熟女的娇俏。"} +{"content": "类型#裤*版型#显瘦*材质#蕾丝*风格#潮*风格#性感*图案#撞色*图案#蕾丝*裤型#哈伦裤*裤款式#拼接*裤口#小脚", "summary": "玛玛绨的一款港风风格的哈伦裤,采用的是束脚的版型,非常的显瘦,而且让你的双腿显得也更加的笔直。蕾丝透视的设计,让你的腿部的肌肤若隐若现,时刻都透露着性感,而且还能凸显你的女人味。融入了撞色拼接的元素,非常亮眼的颜色,更凸显了潮流的气息。"} +{"content": "类型#上衣*版型#宽松*风格#复古*风格#休闲*图案#格子*图案#复古*衣样式#衬衫*衣样式#外套*衣领型#翻领*衣袖型#收口", "summary": "复古怀旧风的格纹衬衫,配色很别致,出街都基本不会撞衫。简洁的小翻领,修饰脸型显精神都不在话下。整体的设计比较宽松,也可以当做小外套穿着,袖口做了收口,看起来更加利落,两个贴袋装饰。帅气休闲。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#棉*图案#线条*衣样式#衬衫", "summary": "这款衬衫的面料采用纯棉面料,面料手感柔软顺滑,穿着舒适亲肤。版型做了宽松版型设计,藏肉显瘦,适合多数mm驾驭。衣袖直筒设计,修饰纤细手臂线条,下摆包边设计,走线工整,凸显品质精致做工。"} +{"content": "类型#裤*颜色#黑色*图案#线条*裤长#短裤*裤款式#口袋", "summary": "裤子侧边的黑色五角星图案是她最大的亮点,几乎没有人想到在短裤的侧面还可以有这样的设计,真心赞。盾形的后口袋设计,完美提升臀部线条,这样就既有美腿又有翘臀啦。"} +{"content": "类型#裤*颜色#黑色*图案#线条*裤长#短裤*裤款式#口袋", "summary": "这款短裤采用纯黑色的色调,演绎出成熟女性的优雅与沉稳。廓形的剪裁设计,让这款短裤更加有型有范儿。裤脚处的荷叶边造型,更显俏皮可爱的少女气息,也为裤身增添了更多的层次感,丰富了整体的视觉感受。裤身两侧的斜插口袋,兼具美观性与实用性。臀部的尾带设计,将腰臀间的曲线展现的更加凹凸有致,打造曼妙的身形线条。"} +{"content": "类型#上衣*颜色#纯色*图案#条纹*图案#纯色*图案#撞色*衣样式#卫衣*衣袖长#短袖*衣门襟#拉链*衣门襟#套头*衣款式#拼接*衣款式#拉链*衣款式#连帽", "summary": "舒适套头的短袖连帽卫衣,标准版型穿着更加的合身。纯色的衣身精选舒适面料,就算贴身穿着也无妨。衣身两侧下摆处的撞色竖条纹拼接,并辅以金属拉链开叉装饰,开合随意,更有修饰身形,点睛的功效。"} +{"content": 
"类型#上衣*材质#蚕丝*风格#通勤*风格#运动*风格#青春*图案#条纹*图案#印花*衣样式#衬衫*衣款式#松紧带", "summary": "青色大印花两件套,通勤感又不失流行,拉长的衬衫领是它的小小心情,袖口红蓝白条纹与青色的冲突,更让整套视觉明亮。裙子用松紧带,弹性十足,胖瘦都毫不费劲,适合各类提醒。蚕丝面料舒适而不臃肿,运动风明快,却又带了小女人的气质。"} +{"content": "类型#裙*版型#显瘦*版型#h*材质#蚕丝*颜色#蓝色*风格#复古*风格#文艺*风格#简约*风格#知性*图案#复古*图案#印花*裙下摆#开叉*裙长#长裙*裙袖长#五分袖*裙领型#圆领*裙衣门襟#排扣", "summary": "桑蚕丝印花轻柔飘逸,色彩艳丽,彰显高端品质。h型长裙包容身材,端庄优雅又显瘦,用彩色排扣点缀裙摆开叉处,增添复古的文艺风范,露出蓝色双层裙摆,更显浪漫妩媚。圆领口和五分袖都是最简单的款式,简约中流露知性气质。"} +{"content": "类型#上衣*版型#宽松*颜色#黄色*风格#简约*图案#条纹*图案#蝴蝶结*衣样式#衬衫*衣袖型#喇叭袖*衣门襟#系带", "summary": "一款简约而不简单的衬衫,宽松合身的款式穿着舒适又不挑身材,看习惯了各种白条或者蓝白条纹的衬衫,暖黄色的条纹是不是给你眼前一亮的感觉呢?袖口小喇叭袖与领口上的蝴蝶结系带相呼应,满满的甜美少女感,穿起来也很减龄。"} +{"content": "类型#上衣*衣样式#风衣*衣门襟#拉链*衣款式#口袋*衣款式#拉链*衣款式#对称", "summary": "弹力束口袖口以及衣摆设计,松紧适宜,穿着舒适无束缚,加强防风保暖。侧身斜插口袋设计,对称美观,且实用方便。ykk拉链门襟设计,拉合顺畅且穿脱更方便。轻便耐磨风衣面料,亲肤透气,防风防晒。"} +{"content": "类型#裙*风格#复古*风格#文艺*图案#复古*图案#撞色*裙型#a字*裙领型#v领*裙款式#吊带*裙款式#收腰", "summary": "收腰的小吊带,能够轻松秀出纤细腰肢,下摆呈a字散开,带来几分俏皮甜美感,长度是到小腿的位置,这个长度最显气质,不长不短刚刚好。领口的撞色勾边,很容易就能吸引视线,搭配v领,更有一股复古气息。一款比较偏轻熟风的吊带裙,文艺又不显幼稚。"} +{"content": "类型#裙*图案#蝴蝶结*图案#线条*裙型#a字*裙长#连衣裙*裙款式#绑带*裙款式#收腰", "summary": "连衣裙属于非常好穿的x廓形,这样的线条对于身材不会有太大要求,立体收腰的设计,还会让腰肢看起来更为纤瘦。腰节以下的位置做成了散开的a字摆的样式,中袖的长度也是比较温婉的,袖口附带有绑带蝴蝶结,凸显年轻俏皮的味道。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*颜色#浅蓝色*裤腰型#高腰*裤口#毛边", "summary": "浅蓝的色调迎合夏日明朗的氛围,糅合进牛仔裤间透着一丝丝潇洒大气的味道。简洁的款式摒弃浮华,象征着积极与纯粹的生活态度。高腰的裤型与a字版型相互配合,既能打造黄金身材比例又能巧妙地遮肉显瘦。另外在其裤头与裤腿出运用毛边处理,透着一丝丝不甘于平庸的个性,单调与乏味感瞬间一扫而空,宣扬潇洒不羁的活力。"} +{"content": "类型#裙*材质#蚕丝*图案#印花*图案#撞色*裙下摆#垂坠", "summary": "桑蚕丝是真丝中的翘楚,它能带给女性舒适亲肤的穿着体验,并具有轻盈垂坠的特性,作为高级时装的面料,也是高定晚礼服的时尚宠儿。这款真丝吊带裙充分诠释了桑蚕丝面料的高级之感,轻盈度和垂感都很赞,面料泛着淡淡光泽,夏季有出色的清爽与透气性能。印花图案总是具有化腐朽为神奇的力量,裙身的撞色印花增加几分轻奢味道,也让单品变得生动活泼起来,充满古典雅致之感。"} +{"content": "类型#上衣*版型#宽松*颜色#纯色*风格#简约*风格#休闲*图案#纯色*图案#线条*图案#撞色*衣样式#卫衣*衣袖型#落肩袖*衣款式#抽绳*衣款式#连帽", "summary": "简约又休闲的连帽卫衣版型,搭配上宽松的版型剪裁,瞬间穿出时尚慵懒气质。个性俏皮的落肩袖设计,修饰肩部线条,抹去了肩膀的硬朗感。撞色勾边和撞色抽绳的设计,打破了纯色的单调感,带去丰富的视觉层次感。"} +{"content": "类型#裙*风格#文艺*风格#知性*图案#条纹*图案#线条*图案#撞色*裙型#衬衫裙*裙领型#翻领", "summary": "透气清爽的色织麻衬衫裙,自带回归自然的恬静感,平添几分知性文艺风。整体大廓型的设计,让身体不受束缚的同时,结合撞色的条纹元素,带着时尚的律动感,赋予艺术气息。经典的小翻领,线条硬朗立体,契合裙身,尽显知性温柔,从而呈现出落落大方的穿着感。"} +{"content": "类型#裙*风格#清新*图案#卡通*图案#印花*裙长#连衣裙*裙袖长#无袖", "summary": "让您在夏日穿出清新范的连衣裙。淡雅的色系搭配精美的卡通印花,雕琢出格外唯美的时尚情结。采用欧根纱材质工艺,面料轻柔细腻,无袖背心式设计,上身效果更加清爽通透。"} +{"content": "类型#裤*材质#棉*材质#牛仔布*颜色#纯色*图案#纯色*图案#线条*裤口#毛边*裤口#微喇裤", "summary": "这款来自milibam的儿童牛仔休闲裤,甄选棉质牛仔面料,柔软滑糯,带来轻盈舒适的穿着体验;弹力伸缩裤头设计,可以根据宝宝腰身自由调节,舒适自在;裤脚结合毛边喇叭裤的版型设计,颇具时代感,且拉长腿部线条;甜美的纯色色调,在视觉上享受的同时,更于优雅气质中彰显出高街范,妥妥的小时髦精。"} +{"content": "类型#上衣*材质#蚕丝*风格#欧美*风格#潮*图案#线条*图案#刺绣*衣样式#衬衫*衣领型#圆领*衣袖型#灯笼袖*衣袖型#喇叭袖*衣款式#钉珠*衣款式#荷叶边", "summary": "经典的圆领设计,修饰颈部线条,时尚的绣花设计,尽显甜美可爱。欧美气质a字裙,真丝工艺打造,舒适的圆领喇叭袖,精湛的钉珠有档次,穿着优雅美丽。优雅的衬衫领口结合荷叶边修饰,干练灯笼袖,搭配一步裙,穿着气质大方,简单大方的款式设计。尽显满满的潮流气息,更有个性。"} +{"content": "类型#裙*风格#运动*图案#条纹*图案#撞色*裙长#长裙*裙款式#不对称*裙款式#波浪", "summary": "以剪裁和面料的变化,丰富你的衣橱。这款女子半身长裙,以不对称的下摆线剪裁和腰身波浪般的裙褶设计,为你的运动单品添加更多女人味。采用柔软的绉绸面料打造,舒适时尚。裙侧缀撞色三条纹。"} +{"content": "类型#裙*版型#显瘦*图案#条纹*图案#蝴蝶结*图案#撞色*裙型#a字*裙领型#v领*裙款式#绑带", "summary": "后背蝴蝶结绑带设计甜美可爱,裙身撞色条纹设计个性十足,经典的条纹元素看呢,v领设计修饰精致脸型,a字裙摆遮肉显瘦。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#性感*图案#蝴蝶结*衣样式#衬衫*衣领型#一字领*衣袖型#喇叭袖", "summary": "这一款衬衫一字领设计,露出肩部性感迷人。微微宽松的廓形,上身包容性比较强,遮肉显瘦中凸显美妙身姿。时尚的蝴蝶结装饰,甜美俏皮自然减龄。加上精致喇叭袖,轻盈大方随风摇曳。"} +{"content": "类型#上衣*版型#宽松*材质#网纱*衣样式#卫衣*衣袖型#落肩袖*衣款式#拼接", "summary": "卫衣真是春日里出门必备单品,而这款拼接的卫衣更是能驾驭多种风格场合,你看那网纱拼接的肩部,是不是很有特色呢?而且舒适好穿的落肩袖宽松舒适,上身自由自在不受拘束。"} +{"content": "类型#裙*材质#雪纺*颜色#黄色*风格#简约*风格#性感*裙长#连衣裙*裙领型#圆领*裙款式#不规则", "summary": "柔和的黄色能给人带来一股暖意,轻盈的雪纺面料柔软的手感让你浑身散发女人味。时尚简约的大圆领设计,让整体看起来更有活力,穿着可爱减龄;衣袖还采用了透视面料制作,能够修饰出纤细手臂,突显性感魅力;不规则的下摆设计,给整体增添了立体层次,穿着起来也十分灵动飘逸。穿上这款连衣裙,走在大街上绝对回头率满满哟。"} +{"content": "类型#裙*风格#知性*图案#蝴蝶结*裙型#一步裙", "summary": 
"很气质有范的版型让人穿起来干净舒服,就算小个子女孩也能驾驭的,穿上仍具有一样美美的气质。很有夏日感色系,搭配肩部设计很特别,不夸张但也能体现你的个人品味,知性又浪漫。腰部蝴蝶结的装饰结合一步裙摆的设计凸显身材高挑,勾勒出曼妙的身姿,打造属于自己的优雅,提高众人回头率。"} +{"content": "类型#上衣*材质#针织*风格#休闲*衣样式#开衫*衣门襟#单排扣*衣款式#罗纹", "summary": "针织开衫是都市女性表达出素雅的单品,诠释出一番淡淡的惬意风情,也玩转出一股端庄大气的即视感。单排扣装饰的门襟,注入一股法式优雅的情怀,加之珍珠扣的高级感,显得时髦又摩登,立体的袋盖点缀,平添出素雅的休闲风情,罗纹针织的收尾,增添一股淡淡的惬意风情。"} +{"content": "类型#裙*风格#淑女*风格#休闲*裙长#连衣裙*裙衣长#中长款*裙领型#圆领", "summary": "中长款款式的气质款式连衣裙,中长款的版型女性穿在身上具有十足的休闲魅力与时尚的个性,极具女性十足的优雅气息。圆领的设计,能够凸显出女性的颈部长度。还能够很好的修饰出女性的淑女气息。"} +{"content": "类型#裙*材质#牛仔布*风格#简约*风格#青春*图案#卡通*图案#刺绣*裙型#牛仔裙*裙衣门襟#拉链*裙款式#破洞*裙款式#纽扣*裙款式#拉链", "summary": "牛仔裤是衣橱里一年四季都不可或缺的时尚单品,这款就是比较简约版型的,整体优选的牛仔棉弹面料更是提升了穿着的舒适感,腰身的纽扣以及隐形拉链,美观也满足穿脱自如。精美的卡通刺绣更是演绎了无限童趣感,裤身的破洞更是凸显出青春专属的时尚与个性。"} +{"content": "类型#裙*材质#蕾丝*材质#雪纺*风格#性感*图案#刺绣*图案#蕾丝*裙长#半身裙*裙款式#拼接*裙款式#勾花镂空", "summary": "让您穿出小心机的雪纺小衫。深黑色的面料,两侧肩部采用蕾丝刺绣拼接工艺,展现出格外动人的肤质。后背的镂空式设计,更让您留下迷人性感的背影。搭配一条半身裙就能气质满满。"} +{"content": "类型#裤*材质#牛仔布*材质#混纺*材质#纤维*图案#线条*裤腰型#高腰", "summary": "这一款牛仔裤采用了棉纤维与聚酯纤维等的混纺面料,裤身具有良好的弹性,上身没有紧绷不适感,行走起来舒适自在。裤身的裁剪相当精巧,根据亚洲女性的独特曲线而裁剪出来的线条,能够更好的修饰腿部线条,结合着高腰的版型,勾勒优美曲线。"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*裙腰型#高腰*裙长#半身裙*裙款式#口袋*裙款式#纽扣", "summary": "黑色是很典雅深沉的,既能够修身显瘦还能够衬托皮肤的白皙让你看起来更加的充满魅力。半身裙的长度恰到好处,搭配高腰的版型可以说是很时尚又充满美感了。金属纽扣的装饰,就如同沙漠里面的一般让裙子充满了生命的活力,粗线条的重工点缀增加了立体感和层次感,口袋的设计可以说很贴心的,能够放一些贴身的物品。"} +{"content": "类型#上衣*版型#宽松*颜色#黑色*风格#简约*风格#运动*图案#撞色*衣样式#外套*衣款式#拼接*衣款式#连帽", "summary": "针对运动训练而设计的一款外套,立体宽松的版型剪裁,活动自如不受束缚。整体以黑色做为基调,连帽与后背辅以撞色网眼面料加以拼接,简约之中搭出活力潮范儿。"} +{"content": "类型#裙*材质#网纱*颜色#粉色*风格#复古*风格#宫廷*风格#性感*图案#复古*图案#刺绣*裙长#长裙*裙长#连衣裙*裙款式#拼接", "summary": "网纱拼接长裙,网纱的点缀设计,舒适亲肤,但是又具有一点挺括性。粉色的甜美,延续女孩对梦幻色彩的定义,整体是运用各种几何图形绘制出精细图案,绣花精致的点缀,带来复古宫廷感,演绎浪漫优雅气质。上半身的设计,略带性感味道。单层网纱的设计,不会显得过于膨胀。柔美的仙女色调,朦胧轻盈带着舒适的质感,用在连衣裙温柔甜美。"} +{"content": "类型#裙*颜色#红色*图案#线条*裙下摆#荷叶边*裙领型#v领", "summary": "用红色来衬托白皙的肌肤,增加在人群中的瞩目度,轻薄的材质冰凉亲肤,即使是在炎热的沙滩,也能最大限度的透气散热。v领设计拉长脖颈线条,彰显气质,荷叶边袖口在沙滩微风的轻抚下,飘逸温柔。腰间松紧设计适合不同身形的女孩子,安全的裤裙有效避免了走光的尴尬。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*图案#线条*衣样式#卫衣*衣袖型#泡泡袖", "summary": "春日必备的卫衣,宽松的bf版型,上身舒适修身,遮掩住不必要的肉肉,让你的身材更为纤细玲珑。经典的圆形领口,柔化颈部的线条,修饰下巴的弧线,脸蛋立显精致有型。蓬松的泡泡袖,赋予人惊艳的视觉感,更显轻盈灵动,萌动不失调皮可爱。袖口处的紧实织法,简洁工整凸显质感,让你轻盈又利落哦~"} +{"content": "类型#上衣*颜色#白色*风格#淑女*风格#简约*风格#清新*衣样式#衬衫*衣袖型#插肩袖*衣袖型#喇叭袖*衣款式#绑带", "summary": "淑女风的衬衫,白色调的晕染,简约大气,清新脱俗,衬托白皙脸庞的同时提升女性的高雅气质。小v的领口,荷叶花边与绑带修饰,遮缮肩膀宽度,并融入甜美风情。插肩袖型与喇叭袖摆,方便活动,突显优雅浪漫。"} +{"content": "类型#裙*风格#淑女*图案#刺绣*裙长#连衣裙*裙款式#勾花镂空", "summary": "气质与舒适兼顾的一款连衣裙,因融入刺绣镂空设计,而尽显优雅迷人。小鸟刺绣图案精美逼真,除了透露着甜美大气的淑女气质,还兼备减龄效果。除了栩栩如生的刺绣图案之外,其细腻的针脚也透露出精湛的制作工艺。"} +{"content": "类型#上衣*风格#英伦*衣样式#外套*衣样式#西装", "summary": "这是一款充满英伦典雅气息的西装外套。采用精美挺括的华达呢面料,倾斜的纹理使得视觉与触感更为耐看与饱满。高温定型工艺处理,令经典的版型更加焕发光彩与强大的气场。肩部特别添加适度垫肩,令肩部的廓形更为立体时尚,从而凸显高挑修长的身形。采用英伦沿袭的织带方式装饰衣领与衣袋,同时搭配徽章造型,更加展现干练、自信的气质。"} +{"content": "类型#裙*版型#显瘦*材质#牛仔布*风格#简约*图案#线条*裙型#牛仔裙*裙型#直筒裙", "summary": "小直筒牛仔裤,上宽下窄的线条,流畅又简约,视觉修饰腿型和宽胯,显瘦实力满分。优质弹力牛仔,妥帖包裹身体曲线,持久回弹,久穿膝盖部位也不易。后腰小v带设计,时尚有趣,增添精致细节看点,还能从视觉臀部范围有所缩小。"} +{"content": "类型#裙*风格#通勤*风格#淑女*风格#文艺*风格#简约*图案#格子*裙长#连衣裙*裙衣长#常规*裙袖长#长袖*裙领型#v领*裙领型#翻领*裙衣门襟#系带*裙款式#拼接*裙款式#不规则", "summary": "这款长袖系带连衣裙,可以说是细节之处都是小亮点了。领口部分运用了常规通勤的翻领样式,形成了一个小v领,可以很好地展现锁骨的曲线。值得一提的是,腰部与袖口处相互呼应的格纹拼接设计,很有文艺淑女的气质,简约中透露着女性的优雅。还有不规则的碎褶裙摆,浪漫又随性不拖沓。"} +{"content": "类型#裙*风格#清新*图案#碎花*图案#线条*裙下摆#压褶*裙长#连衣裙*裙款式#钉珠", "summary": "这款碎花款式的连衣裙,碎花的设计显得很清新优雅,船型领的设计,穿脱方便,前襟压褶的制作,线条整齐自然减龄。最大的亮点就是腰间的手工钉珠的制作,非常的靓丽。"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*颜色#黑白*风格#通勤*风格#简约*风格#知性*图案#条纹*裙衣门襟#单排扣*裙款式#腰带", "summary": "经典简约黑色,自有通勤典雅属性,而两肩和袖口,以及腰带的黑白条纹点缀,让裙身散发出律动的的吸睛魅力。此外,小立领和门襟处的单排扣则带有点中山装的影子,内敛沉稳,气质笃定。加上x版型的修身显瘦性能,凸显出女性优雅端庄的知性气质。"} +{"content": "类型#裙*风格#街头*风格#性感*裙型#蓬蓬裙*裙款式#拼接*裙款式#勾花镂空*裙款式#收腰", "summary": 
"裙摆拼接波浪形镂空花纹,增加整体的时髦度和设计感,看上去挺括立体。为单调的裙摆增添几分酷帅的街头感,蓬蓬的裙摆更具廓形,上身很显精神。收腰的效果真的很棒,束出小蛮腰。很性感,拉高腰线,不压个子不挑人穿。"} +{"content": "类型#裙*材质#丝绒*材质#纤维*裙腰型#高腰*裙长#半身裙*裙衣门襟#拉链*裙款式#拼接*裙款式#拉链", "summary": "一款舒适有型的聚酯纤维半身裙;别具一格的高腰设计,采用了丝绒作为拼接,贴合腰线并衬出纤腰,使曼妙身姿穿搭出柔美小女人的味道;侧腰处的隐形拉链设计,便于穿脱又不影响裙身的外观,穿着更显大方得体。"} +{"content": "类型#上衣*材质#针织*风格#淑女*风格#高贵*衣样式#毛衣*衣领型#半高领", "summary": "来自的这件针织毛衣领口采用半高领的花边设计,可以起到减龄的作用,同时又显的更加淑女。结合胸前的奢华镶嵌设计让这件针织上更加贵气,因此更能使穿着者更加的有气场且高贵。"} +{"content": "类型#裙*风格#街头*风格#简约*裙长#连衣裙", "summary": "作为时尚界的一股清流,时尚连衣裙得到了都市女性的一致认可和青睐,它的设计风格简约,迎合了都市女性的审美追求。这款时尚连衣裙在设计上没有多余的修饰,但是优质面料无疑是一抹亮色,它让这款连衣裙可以自然而然的在街头脱颖而出,轻松打造出高端路线,对于都市女性来说,简直再好不过。"} +{"content": "类型#裙*材质#网纱*颜色#白色*裙款式#吊带", "summary": "裙子采用两件套设计,内里为长款吊带,外罩选用珍珠点缀的网纱,精致的做工与纯洁的白色相得益彰,素雅纯洁仙气十足~网纱若隐若现的视觉感受与吊带元素提升了整个人成熟的气质。"} +{"content": "类型#上衣*版型#显瘦*风格#休闲*图案#卡通*衣样式#卫衣*衣款式#螺纹", "summary": "卫衣既可以很时尚也能够带来接头的炫酷风格,这款却是满满的少女俏皮感,一身的版型是外廓样式的。穿着起来不挑身材也更是显瘦,经典的圆形螺纹领口,贴合颈部更方便穿脱。胸前的可爱卡通图案还富有童趣感,衣摆还带有开叉设计。休闲同时也提升了洋气感。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#街头*风格#工装*图案#线条*衣样式#外套*衣袖型#插肩袖*衣款式#口袋", "summary": "这件时尚帅气的工装外套,穿着舒适更显个性。宽松版型,不挑身材不挑人,遮肉显瘦很百搭;口袋设计,立体有型,很有设计感;插肩袖设计,修饰手臂线条的同时,更显慵懒宽松的街头感。"} +{"content": "类型#裙*材质#蕾丝*风格#性感*图案#线条*图案#蕾丝*裙腰型#高腰*裙领型#一字领", "summary": "半透一字肩设计的这款婚纱,再饰以立体花朵点缀其中,打造甜美公主范儿的同时,也更能凸显性感锁骨线条。而高腰加上的裙型,上身则更衬优雅女神气质。水溶蕾丝面料的选用,更加彰显出了奢华品质感。"} +{"content": "类型#裙*图案#印花*裙衣长#中长款", "summary": "这款卫衣袖口处的假两件设计,增添层次感,一件出门也能穿出不一样的~帽檐廓形的袖口设计,增添整体的有趣成分更加亮点。oversize版型,松松垮垮,配以中长款打造,配上长靴不要太洋气~加上胸口及袖口的个性印花尽显青春活力气息~"} +{"content": "类型#上衣*材质#棉*风格#运动*风格#休闲*风格#潮*风格#嘻哈*图案#刺绣*衣样式#外套", "summary": "这款小外套选用棉质面料,使版型挺阔立体,上身更加有形。英文字母刺绣,增添潮流时尚感,打造出休闲运动风。细节袢的大袖扣点缀,设计新颖独特,低调中又不失利落帅气,上身嘻哈范十足。"} +{"content": "类型#上衣*版型#宽松*颜色#红色*衣样式#衬衫*衣领型#翻领*衣款式#钉珠", "summary": "让您穿出热情洋溢的名媛范,是这款衬衫带给您的惊喜。亮眼的红色系面料,工整的小翻领,适合各种场合穿搭。领口两侧的钉珠工艺非常别致,呈现出精美的时尚品味。自然宽松版型,可以驾驭各种身材。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*裤腰型#中腰", "summary": "具有弹性的牛仔裤,在穿着的时候更加注重舒适性。设计师在大腿处采用了磨白的痕迹,打造出了时尚的立体效果,看上去更加显瘦。而中腰的版型能够保护腰腹部,防止寒风的入侵,也更加具有时尚感,实用的性能大大上升。"} +{"content": "类型#上衣*材质#针织*风格#简约*图案#线条*衣样式#针织衫*衣款式#勾花镂空", "summary": "这是一款初见简约,再看惊艳的针织衫,它的美,如雏菊,静美极佳。纯白色的衣身,色调大气简约,更好搭配服饰;针织的纹理诠释,竖线的线条更显身姿修长纤细;精妙之处就在于衣身加设了嵌花,精美的小花朵,圆孔镂空,就像一朵朵雏菊般,优雅泛清香。"} +{"content": "类型#裙*版型#显瘦*风格#知性*风格#中国风*图案#亮面*裙袖型#灯笼袖", "summary": "这款齐腰短外套,采用中西结合的造型设计,融入中国风的盘口元素,展现女性温婉知性魅力。精选提花亮面面料,明暗有致的反光效果,使整体立体感十足。衣身部分,鼓起的衣身遮肉显瘦。齐腰的衣长处理,提高腰线,衬出大长腿。的优美弧线,美化脖颈曲线。束口灯笼袖的设计,修饰手臂,亮眼抢镜。"} +{"content": "类型#上衣*风格#街头*风格#复古*图案#条纹*图案#复古*图案#线条*衣样式#衬衫*衣款式#口袋*衣款式#不对称", "summary": "这款衬衫采用了经典的条纹元素,彰显出鲜明的美式复古街头气息,能够打造出雅痞的绅士气度。前幅分别采用了不对称的口袋设计,营造出鲜明的层次感,又具有收纳的作用。弧形的下摆,剪裁比较流畅,可修饰出臀部的线条。"} +{"content": "类型#裤*裤型#灯笼裤*裤型#阔腿裤*裤型#背带裤*裤款式#绑带", "summary": "背带裤带着童年的记忆,总是让人对它爱不,是减龄的王牌单品,更是打造百变造型的小心机设计。以阔腿裤版型剪裁打造,实力修饰腿型,完美适配各种身材。藏在裤脚的设计小心机瞬间你的心,个性的绑带设计让阔腿裤一秒变成灯笼裤,俏皮感十足又百变,满足你对百搭的需求。"} +{"content": "类型#裙*颜色#纯色*风格#通勤*图案#纯色*图案#线条*裙长#连衣裙*裙领型#圆领*裙袖型#蝙蝠袖*裙衣门襟#系带", "summary": "通勤百搭的一款纯色连衣裙,经典时尚的圆领设计,勾勒出优美的颈部线条,结合领口个性的系带装饰,稍微为整体的造型增添几分设计感,显得美观而大方。袖子采用精致的蝙蝠袖,宽宽松松的廓形,能够很好的美化双臂线条,展现温婉优雅的气质。"} +{"content": "类型#裤*材质#天丝*材质#牛仔布*颜色#纯色*风格#复古*图案#纯色*图案#复古*裤款式#口袋*裤口#小脚", "summary": "选取天丝材质以其与生俱来的柔和光泽度与细腻质感,释放出慵懒复古味道。以基础简洁剪裁碰撞纯色设计,尤为时尚大气。简单而不失雅致的纯色衣身,传递出现代人追求便捷,舍简的生活方式,深受大众青睐。胸前口袋设计,在与衣身相同颜色渲染下轻轻点缀,低调添注时尚细节。排扣的设计绅士优雅,简易搭配,可搭配休闲裤、牛仔裤、束脚裤等。"} +{"content": "类型#裤*材质#丝绒*裤长#连体裤*裤款式#亮丝*裤款式#流苏", "summary": "颇具高级名媛气质的一款丝绒连体裤,采用的是质感极好的亮丝丝绒材质,泛着满满的光泽感,让人看一眼便能感受到它的高级感,又带来一种别致的华丽时尚气息;袖口处的羽毛流苏装饰亮眼吸睛,凸显出女性个性的一面;还有那收腰系带的设计,轻松便可勾勒出女性的苗条身姿。"} +{"content": "类型#裤*风格#简约*风格#清新*裤长#短裤", "summary": "本品采用简约短小的短裤造型,适合在炎热夏季穿着使用,能够带来清凉舒适的体验。短小的款式还可以实现修饰流畅腿型的效果,能穿出大长腿。采用清新的颜色款式,充满时尚气质,满足自由搭配的需求。"} +{"content": "类型#裙*版型#显瘦*裙型#百褶*裙长#连衣裙*裙领型#立领*裙款式#纽扣*裙款式#收腰", 
"summary": "BRAND这款百褶领连衣裙采用了百褶立领设计,显得十分俏皮可爱。后背带有点缀纽扣开合,方便人们穿脱。腰部带有收腰装饰,显瘦。"} +{"content": "类型#上衣*版型#显瘦*颜色#黑白*图案#条纹*图案#线条*衣样式#针织衫*衣领型#翻领", "summary": "十分适合早春穿搭的一款针织衫。经典的黑白竖条纹设计,视觉显瘦效果很好。小翻领设计,能够很好底修饰脖颈线条。特别定制的双层钩针设计,使得整件服饰的保暖性提升一个等级,但是又不会显得过分臃肿。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#棉*风格#青春*裙下摆#弧形*裙款式#腰带", "summary": "选用客供高支棉面料,织面平滑细腻,质感挺括,上身有型感。宽松的廓形,搭配上一款颜色艳丽的腰带,不仅可以修身突出腰部曲线,更为裙装增加设计亮点,凸显女性的青春美丽。弧形裙摆剪裁配合前短后长的设计,丰富层次增添一份时尚感。"} +{"content": "类型#裙*材质#网纱*材质#蕾丝*图案#蕾丝*裙长#长裙*裙长#半身裙*裙款式#拼接*裙款式#勾花镂空", "summary": "网纱拼接半身裙精选优质网纱质地轻盈、触感柔韧、垂感自然飘逸。拼接精致的镂空蕾丝,轻薄柔软。精美的花型纹路多变、清晰细致。花瓣状边缘采用纤细的睫毛收尾,更添层次美感!清爽半身长裙展现出十足女人味。"} +{"content": "类型#裙*颜色#黑白*风格#简约*图案#线条*裙款式#拼接", "summary": "这款鞋子采用牛皮的面料,细腻柔软散发着淡淡的光泽感。纯黑白拼接的设计,展现出简约主义的时髦腔调。方头的设计尽显优雅,平底款式让你行走更加舒适。嘻嘻的扣带的设计,视觉上拉长腿部线条。无论是搭配裙装或是裤子,出街美翻天妥妥的啦。"} +{"content": "类型#裙*材质#网纱*颜色#金色*图案#蝴蝶结*图案#刺绣*图案#撞色*裙长#连衣裙*裙款式#收腰", "summary": "在裙身上做立体珠片堆砌的星星,和小星星形成对比的同时更显得格外的精致,小女人梦寐以求的连衣裙也~在领口及裙摆上用撞色的定制金色网纱做镶边,网纱上特别的刺绣上了撞色的金色珠片,更为贴心的激光镶边,不会散口不说,细节感也做得到位,u型的大领口上系蝴蝶结,甜美一瞬间就出来了,微收腰的版型,裙摆却是散口的a型,甜美,更多的是女人味."} +{"content": "类型#裙*风格#简约*风格#知性*风格#性感*图案#印花*裙长#连衣裙*裙衣长#中长款*裙领型#v领", "summary": "这件简约而不简单的中长款印花连衣裙穿着很显气质,它的设计很用心。设计师采用了经典的v领设计融入衣身,露出精致的锁骨。给人性感而不失优雅的感觉,让你轻松打造出知性迷人的轻熟女韵味,而且v领的融入还能很好的修饰脸型,穿上它让你显得魅力十足。"} +{"content": "类型#裤*版型#显瘦*颜色#粉色*颜色#灰色*风格#运动*图案#线条*裤腰型#高腰*裤口#微喇裤", "summary": "这款瑜伽服采用粉色和灰色的搭配,上身后显瘦效果极佳且丰富你的运动生活,精选柔软透气性的面料,拥有极佳的弹力感。裤子腰部高腰的设计,再配以微喇叭版型的裤脚,修饰了双腿线条的同时让肌肉得到放松。"} +{"content": "类型#裙*颜色#灰色*风格#淑女*风格#清新*裙长#连衣裙*裙款式#腰带", "summary": "这款连衣裙采用了高级灰色的色调,优雅的色彩让整件连衣裙看上去满满的淑女风,优雅大方,立体的剪裁。给人清新可爱的视觉效果,可拆卸腰带处理,新颖时尚,甜美减龄,体现对时尚的追求,灯笼中袖的设计更是别具一格。"} +{"content": "类型#上衣*材质#蚕丝*风格#简约*衣样式#衬衫*衣长#短款*衣款式#拼接*衣款式#荷叶边", "summary": "一款简约的短款衬衫,融入细腻顺滑的真丝材质后,赋予衣身新的魅力,高雅恬静的气质尽显;直筒的衣身廓形巧妙的修饰身材曲线,包容性很好。圆润的领口显得经典又大方,袖口拼接上飘逸的荷叶边,行走起来灵动而柔美,边缘小巧的收褶也更添立体质感。"} +{"content": "类型#裙*风格#清新*图案#条纹*裙下摆#垂坠*裙长#连衣裙*裙领型#翻领*裙袖型#插肩袖*裙款式#拼接", "summary": "充满了少女气息的一款连衣裙,气质小翻领设计衬出小巧五官,蓝白条纹裙身清新减龄同时提升整体时尚度和优雅气质,自然垂坠的裙摆带来无限浪漫情怀和灵动美,插肩袖拼接打造挺括肩型,起到修饰肩型的效果,没有肩宽限制,任何身材都能驾驭。"} +{"content": "类型#上衣*风格#街头*风格#潮*图案#刺绣*衣样式#卫衣", "summary": "卫衣最大的亮点在于胸前新潮独特的logo刺绣,不同于BRAND一贯的设计往事,在其中加入了不同色系的c字logo,令视觉上的层次效果更加分明饱满。而帽子一侧精致的BRAND点缀,强调了双方品牌的联名身份,展现出不羁而时尚的街头气息。"} +{"content": "类型#上衣*风格#街头*风格#潮*图案#刺绣*衣样式#卫衣", "summary": "此款BRAND卫衣,采用经典的帽衫款式,胸口和袖口缀有低调精致的刺绣logo,为衣身增添街头风味和潮人魅力。且搭配柔软面料,内里有加绒设计,手感细腻,带来保暖舒适的衣着感受。此外,俏皮袋鼠兜不仅方便放置物品,同时彰显前卫的潮流风范。"} +{"content": "类型#裙*版型#显瘦*风格#淑女*风格#清新*风格#性感*裙领型#圆领*裙袖型#喇叭袖*裙衣门襟#系带", "summary": "简洁的圆领设计,显露出迷人性感脖颈,增添几分娇俏动人。挂脖式系带,丰富了层次感,带着些许的柔美俏皮。修身a摆裙型,结合及膝的长度,遮掩不完美的大腿达到显瘦的效果,唯美浪漫间透着女人柔美气息。喇叭袖凸显甜美气质,更为时尚平添一股秀气,让清新淑女韵味展露无疑。"} +{"content": "类型#裙*颜色#粉色*图案#碎花*裙下摆#开叉*裙长#连衣裙*裙袖长#七分袖*裙款式#腰带", "summary": "这款粉色连衣裙精选柔软的面料,上身后能修饰身材曲线,且裙摆的垂落感十足,营造满满的仙女形象。七分袖的设计搭配袖口的小开叉,小巧的展现了可爱俏皮的气质,腰间的碎花腰带,提升你的气质。"} +{"content": "类型#上衣*版型#宽松*材质#棉*风格#休闲*图案#条纹*衣样式#衬衫*衣款式#纽扣", "summary": "衬衫采用纯棉材质手感细腻,亲肤透气性好穿着更舒适,条纹设计色彩清爽干净,活性印染色彩牢固不褪色。厚度适中穿着舒适,宽松版型设计更显休闲自在。干练的衬衫领设计,更显宝贝精神有朝气,精致品牌纽扣富有光泽,细节处更彰显品质。衣摆前短后长圆弧设计,更添设计感更显灵动,两侧心机小开叉,增添一丝甜美活力。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*材质#棉*材质#水洗*风格#休闲*裤款式#口袋", "summary": "选取优质的水洗棉勾勒版型,柔软亲肤的触感,带来舒适透气的穿着体验。引用宽松的廓形版型设计,带来遮肉显瘦的穿搭效果。精致的口袋装饰着裤身,平添了几分休闲随性的气息。个性的圆环金属扣装饰着裤身,呈现出炫酷的时髦感。"} +{"content": "类型#裙*材质#棉*图案#格子*裙领型#翻领*裙款式#收腰", "summary": "一款无论什么场合都能驾驭的美裙。精致的小翻领,立体又有型,精气神儿很饱满。纹的设计,柔美又浪漫。经典的对格剪裁,彰显出精湛的工艺与高端品质感。100%客供棉,爽滑柔软又亲肤。收腰的款式,勾勒出纤细的曼妙身姿,气质非凡。"} +{"content": "类型#裙*风格#复古*风格#简约*图案#复古*图案#刺绣*裙长#连衣裙", "summary": "这款连衣裙选用素雅的有机水面料,纹理富有立体感,复古又不失时尚。亦红亦紫的藕荷色衣身,更能展现女性沉静如水的温婉恬静,再以做工精细的刺绣加以点缀,让其华丽感瞬间提升。以面料的“简”去映衬刺绣的“繁”,努力在简约与繁复中寻求一种平衡。"} +{"content": "类型#裤*风格#休闲*风格#潮*风格#工装*图案#字母*图案#文字*裤长#短裤*裤款式#口袋*裤款式#不对称*裤款式#飘带", "summary": 
"休闲短裤在左右两边的裤腿都设计上分明利落的工装口袋,塑造出帅气的工装版型,将男士硬朗的气场凸显。一侧口袋的翻盖装饰字母贴标,营造不对称的时尚感,两边裤脚还装饰飘带,摇曳出随性的味道,突出细节设计的个性和潮流。"} +{"content": "类型#裙*版型#显瘦*颜色#深色*图案#印花*裙下摆#花边*裙长#连衣裙*裙款式#不对称", "summary": "此款连衣裙选用优质面料打造,上身穿着舒适度爆棚。独特印花,立体饱满,工艺精湛。不对称的肩部设计,打破传统设计的单调感,更显时尚与活力。花边点缀,为整体注入甜美气息,少女感十足。搭配上深色腰封,有效拔高腰线,轻松优化身材比例,显高显瘦。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#针织*颜色#纯色*风格#简约*风格#休闲*图案#纯色*图案#线条*衣样式#开衫", "summary": "一件上身非常显瘦的针织开衫,轻薄垂顺的款式,无门禁,宽松休闲,自在随性,非常时尚百搭,显慵懒气质。后背做了一个立体的剪裁处理,削弱肩部线条,在视觉上就能显瘦十斤,还有一点的效果。简约的纯色设计,简单不单调,两边还有开衩处理,精致立体,更显苗条身形。"} +{"content": "类型#裙*风格#性感*图案#线条*裙下摆#花边*裙下摆#垂坠*裙腰型#高腰*裙领型#立领*裙款式#拼接*裙款式#勾花镂空*裙款式#钉珠", "summary": "镂空花边立领设计,勾勒出优美的脖颈线条,衬的人气质不俗,胸前镂空花纹拼接,加上精致的钉珠点缀,婉约朦胧却并不会显得过于暴露,带来刚刚好的性感情调。裙身上立体的花朵装饰,流露出几分婉约风情。高腰设计,自然垂坠的裙摆,裙长过膝至小腿,有很好的修饰作用,举手投足间尽显柔美气质。"} +{"content": "类型#裙*版型#显瘦*材质#雪纺*风格#复古*图案#碎花*图案#复古*裙款式#拼接", "summary": "很别致的一款碎花雪纺裙,选择温柔到骨子里的色调。带着一丝复古的气息,非常的耐看而且很好搭配。胸下的细褶拼接提高了腰线的位置,视觉上显高显瘦,轻松打造出大长腿的即视感,丰富的层次感充满着浪漫的气息,上身效果非常的轻盈优雅。"} +{"content": "类型#裙*版型#宽松*风格#复古*风格#简约*图案#蝴蝶结*图案#复古*图案#印花*裙型#a字*裙下摆#垂坠*裙款式#腰带", "summary": "这款裙子采用小a字版型,宽松舒适,视觉上提升腰线。慵懒随性的蝴蝶结腰带,增添几分甜美俏皮气息。复古简约方格纹印花,经典百搭不落俗套。垂坠面料与搭片款式的巧妙结合,增加整体层次感,更具独特气质。"} +{"content": "类型#上衣*材质#牛仔布*颜色#白色*风格#休闲*图案#线条*衣样式#外套*衣样式#西装*衣款式#拼接", "summary": "这款外套拥有硬挺的牛仔面料,诠释了西装的版型。设计师运用拼接手法,为西装加入了一丝趣味和休闲感。白色的线条更是勾勒出整体的效果。"} +{"content": "类型#裙*材质#棉*裙长#连衣裙*裙袖长#无袖*裙领型#圆领*裙衣门襟#系带", "summary": "此款连衣裙采用百分百的纯棉面料制作,具有极好的吸湿透气性,上身以后格外亲肤舒适。时尚大气的圆领,能够更好的贴合颈部,凸显女性的落落大方。利落简洁的无袖版型,清爽自在,特别适合夏季穿着。腰部贴心的系带设计,更是可以帮助你塑造纤细腰身。"} +{"content": "类型#上衣*图案#条纹*衣样式#衬衫*衣长#常规", "summary": "咋一看好像只是常规蓝白条纹衬衫,其实袖口处有着不经意的小亮点,打结的设计多了一些趣味性,使这件衬衫不会显得单板。领口大v设计,修饰脸型有瘦脸的效果。"} +{"content": "类型#上衣*颜色#宝蓝色*风格#淑女*风格#复古*风格#清新*图案#复古*图案#线条*图案#印花*图案#撞色*衣样式#衬衫", "summary": "撞色衬衫领拼接裙身,突出复古精巧的设计感,衬显出优美的天鹅颈,修饰精致脸型,上身穿着更显简单大方。个性花边剪裁的宝蓝色印花裙身,鲜明的色彩碰撞,给人活力又明快的俏皮感,凸显出女生清新柔美的气质,看起来纯净又优雅。高腰a字版型剪裁,使腰身的线条看起来更加自然、柔和,完美遮掩腿部线条和臀围的缺点,帮助穿衣者保持应有的淑女风范。"} +{"content": "类型#上衣*材质#蚕丝*风格#宫廷*风格#青春*衣样式#衬衫", "summary": "这款处处流露高贵典雅的桑蚕丝衬衣,尽显女性的优雅魅力,给人端庄大气的宫廷范儿,设计感以及质感完全不输大牌。衣袖采用透视真丝材质,舒适透气触感柔顺,给你清爽的体验。微喇的袖口尽显甜美可人,成熟中含有青春的气息。"} +{"content": "类型#裙*材质#牛仔布*裙长#半身裙*裙款式#破洞", "summary": "BRAND的这样一条别致迷人的牛仔裤设计感极好,彰显出你的独特韵味,让你吸睛十足更洒脱。它的别致半裙设计,甜美万分,同时带给你满满的层次感,让你大气十足。破洞的设计更是高端洒脱,想找别致韵味。"} +{"content": "类型#上衣*版型#显瘦*风格#复古*风格#知性*图案#复古*衣样式#风衣*衣样式#外套*衣门襟#系带*衣门襟#双排扣", "summary": "经典款式的双排扣风衣外套,搭配上色泽柔和的米色调,呈现出了更为复古知性的优雅感。系带设计的融入,让风衣可以更好的贴合多样化的身材,穿出更合身、也更显瘦的视觉效果。将它穿在身上,既优雅又不失气场,让你尽显时尚。"} +{"content": "类型#裙*版型#宽松*颜色#白色*图案#印花*图案#撞色*裙型#百褶*裙长#连衣裙*裙款式#收腰", "summary": "这款连衣裙的主体采用藏青水果印花图案,肩部和下摆加入白色布料,形成时尚的撞色效果。腰部采用收腰的设计造型,能够轻松打造流畅的身段。下摆是宽松的百褶裙摆,活动灵动自然,也能够体现细节美感。"} +{"content": "类型#上衣*版型#显瘦*版型#立体剪裁*图案#刺绣*衣样式#西装*衣长#常规*衣款式#钉珠*衣款式#亮片", "summary": "不同于常规西装的一板一眼,这套西装凸显出了令人惊艳的设计感。翻驳领挺括有型,加入了亮片钉珠的点缀,优雅中折射出唯美光芒。修身剪裁贴合身躯,勾勒出纤细腰姿更显精工的立体剪裁。通体刺绣考究精致,出立体花型堪称艺术,提升整体品相。"} +{"content": "类型#裙*颜色#白色*风格#复古*风格#文艺*风格#知性*风格#清新*图案#复古*裙型#直筒裙*裙长#连衣裙*裙领型#v领*裙衣门襟#系带*裙款式#流苏", "summary": "清新典雅的直筒连衣裙,经典时尚,营造出靓丽形象。清爽的白色基调,更好的凸显白皙肤色,营造文艺气质。精致优雅的提花纹理,充满浪漫知性的文雅气质。经典大气的v领样式,活力可爱,搭配个性系带设计,充满与众不同的个性魅力。点缀复古流苏,营造出飘逸灵动的丽人形象。"} +{"content": "类型#裙*风格#文艺*图案#格子*裙型#蛋糕*裙型#背带裙*裙下摆#层叠*裙长#连衣裙", "summary": "这款独具学院风格的连衣裙采用背带的版型设计,穿搭起来更具减龄的效果,让你充满少女的气息。格纹的BRAND图案修饰其中,衬托出满满的格调,文艺气息脱颖而出,蛋糕裙效果的层叠裙摆造型又很有立体感,看起来很有活力,穿起来更加年轻。"} +{"content": "类型#上衣*图案#文字*图案#印花*衣样式#卫衣", "summary": "这款卫衣是以当下热门的元素作为设计题材,胸前印花采用高清数码印花贴布工艺,呈现出逼真的视觉效果。左侧文字设计配以右侧的宇航员,轻松点出主题,两侧袖子特色贴标装饰,让整体视觉效果更为丰富时尚。"} +{"content": "类型#上衣*颜色#紫色*风格#性感*衣样式#衬衫*衣袖长#长袖*衣款式#露肩", "summary": "欧时力新款的长袖衬衫实在是太凸显女性的魅力光彩了。独特的长袖和性感露肩设计,走到哪里都让人忍不住望一眼,性感时髦的设计会让人爱不释手。搭配亮眼的紫色使得衬衫焕发光彩,加上精选优质柔和面料,亲肤无刺激,舒适不起球,轻松驾驭各种场合。"} +{"content": "类型#上衣*风格#简约*衣样式#针织衫*衣款式#对称", "summary": 
"针织衫洋气舒适又保暖,是秋冬两季应该选择的一种服饰。这款针织衫设计上非常用心,时髦又透露着典雅风,而且针线很密集,给人一种高大上的感觉,袖口,领口,相同颜色的线,看上去非常简约大气,又有一种对称的美感。"} +{"content": "类型#裙*材质#雪纺*图案#碎花*图案#线条*裙下摆#荷叶边*裙长#连衣裙*裙领型#v领*裙袖型#荷叶袖", "summary": "这条唯美浪漫的雪纺碎花连衣裙,穿着舒适更显气质。v领设计,修饰脖颈部线条,更显修长白皙;荷叶边裙摆设计,甜美浪漫更显层次感与设计感;荷叶袖设计,修饰手臂线条,更显纤细。"} +{"content": "类型#裙*裙型#背带裙*裙型#牛仔裙*裙型#铅笔裙*裙型#直筒裙*裙腰型#高腰", "summary": "这款牛仔裙,背带的设计充满青春活力,减龄效果max。裙摆则选择的是直筒铅笔裙的版式,罕见而又不突兀的搭配让人眼前一亮,更是增添了优雅的魅力。下摆开叉设计,行走起来更加自如。高腰的版式在视觉上更是有显高的效果。"} +{"content": "类型#上衣*版型#显瘦*颜色#黑色*颜色#红色*图案#线条*衣样式#雪纺衫*衣领型#小立领*衣长#短款*衣款式#木耳边", "summary": "这套裙装是经典的红黑配色,非常的百搭时尚。上衣的红色雪纺衫,短款修身更显瘦。精致的木耳花边小立领,凸显脖颈纤长优美的线条感。袖身的木耳花边,和甜美的灯笼袖口,彰显活力甜美的气质。下身的黑色半身裙,高腰a字的版型更显瘦。这套裙装不论是上班还是约会,都非常的夺目吸睛!"} +{"content": "类型#上衣*版型#显瘦*风格#淑女*图案#线条*图案#刺绣*衣样式#卫衣*衣领型#圆领*衣长#短款", "summary": "这款淑女风卫衣,采用圆领的设计,加上条纹的装饰,修饰颈部柔美线条的同时,且丰富视觉美观。衣身精美的绣花,彰显女性的几分典雅气质。短款直筒的版型,遮掩女性身材的不足,上身毫无束缚感又显瘦。此外,品质的面料,给你带来贴身舒适的穿着体验。"} +{"content": "类型#裤*颜色#绿色*图案#卡通*图案#字母*图案#文字*图案#撞色*裤型#哈伦裤*裤款式#口袋", "summary": "绿色的裤身,充满了活力阳刚之气。撞色的弹力裤头,松弛有度,穿着舒适不勒小肚子,还方便了孩子们自由的穿脱。腰后侧字母图案修饰,醒目亮眼,丰富整体的视觉感。两侧对称的假口袋造型配以卡通图案修饰,栩栩如生,构成了一幅妙趣横生的画面。后侧还设计了字母与图案,可爱充满童趣,完美的彰显了男孩子的活泼天真。再加上哈伦的版型设计,让这款裤子的时尚一级。"} +{"content": "类型#裤*材质#牛仔布*风格#潮*图案#字母*图案#文字*图案#撞色*裤长#五分裤*裤型#直筒裤", "summary": "该款牛仔裤采用五分款型设计,夏季穿着清凉舒适。简洁的直筒裤脚走线均匀细密,不易脱线和变形,结合裤腿处的撞色字母logo,成功体现出裤子的品牌魅力,时尚显潮流感。"} +{"content": "类型#裙*风格#简约*裙型#百褶*裙长#连衣裙*裙领型#圆领", "summary": "23区的这款连衣裙,经典的圆领,既能修饰脸型小巧,还能凸显出颈部的纤细,皮扣的设计,不仅能随意转换造型,还能给人一种抢眼的视觉感。百褶的款式,简约又不失时尚,还很好的丰富了整体的立体感。"} +{"content": "类型#裙*材质#蚕丝*颜色#红色*图案#抽象*图案#线条*图案#印花*裙腰型#高腰*裙长#连衣裙*裙袖长#七分袖*裙款式#木耳边", "summary": "连衣裙抽象的印花融入大自然的元素,清爽自然,赋予创新的艺术韵味,优雅之美。流畅的廓形,做了高腰线的处理,拉伸视觉比例,勾勒曼妙迷人的身姿。七分袖的修饰纤细的手臂线条,木耳花边的袖口柔美大方,平添一份浪漫女人味。绚丽的红色与清爽的桑蚕丝浑然天成,质地清透柔滑。踩上一双单鞋,摇曳的步伐,美得不可收拾。"} +{"content": "类型#裤*图案#动物*图案#刺绣*裤型#背带裤", "summary": "绣工精美的动物刺绣图案,形态栩栩如生,非常软萌可爱。腰间装饰扣袢,做工精细,对称整齐,于细节彰显高端品质。实用贴袋平整,裁剪利落,增添背带裤的层次感。重工车线,等距,简单的工艺透出精工细作的品质。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#羊毛*风格#ol*风格#性感*图案#线条*裙长#连衣裙*裙领型#圆领*裙款式#收腰", "summary": "美丽家这件ol风连衣裙,宽松大圆领设计,轻松彰显脖颈线条,透出性感吸睛魅力。裙面采用羊毛面料制作,呈现出的毛呢材质兼具手感与质感,上身亲肤舒适。宽松的伞摆设计,上身立显活力气质。修身收腰版型设计,上身舒适不紧绷,轻松展现身材曲线。"} +{"content": "类型#裤*风格#性感*图案#蝴蝶结*裤长#九分裤*裤款式#绑带*裤口#微喇裤", "summary": "这款的设计亮点在裤脚。利用了绑带的多变化,可以绑成各种造型。超简单的就是绑个蝴蝶结了,增添了裤子的时髦裤。而且让裤脚处略带喇叭式,更加显高挑了。九分的款式,露出脚踝既有几分小性感,也更显高挑了。"} +{"content": "类型#裙*风格#休闲*图案#撞色*裙型#大裙摆*裙腰型#松紧腰*裙长#半身裙", "summary": "一款休闲百搭的半身裙,采用富有弹力的优质面料,给宝贝带来舒适的穿着体验。撞色的松紧腰设计,不仅方便穿脱,还温柔的呵护着腰部肌肤,品牌的图案装饰,增添一丝小俏皮。宽大的裙摆设计,让宝贝穿着清爽舒适。"} +{"content": "类型#上衣*图案#字母*图案#文字*图案#印花*衣样式#卫衣*衣袖型#收口*衣款式#破洞", "summary": "肩袖处有破洞的设计,顺着破洞处的缝线别出心裁,不同于以往的卫衣,就是很好看又很显随意的感觉,立马年轻活力起来。袖口和下摆都做了收口的设计,柔软舒适穿着体验感很好,后背的字母印花打破整体的单调性~"} +{"content": "类型#上衣*颜色#白色*风格#青春*风格#职场*图案#蝴蝶结*衣样式#衬衫*衣款式#拼接", "summary": "白色衬衫是职场女性不可缺少的经典单品,如果你没有更好的设计,fendas可以为你提供多一种选择。这件衬衫在袖口的位置用甜美的蝴蝶结装饰,展现出女孩青春活泼的一面。并且用不同的材质拼接,丰富了视觉效果。"} +{"content": "类型#上衣*图案#刺绣*衣样式#衬衫*衣领型#翻领*衣款式#拼接*衣款式#口袋", "summary": "袖子的拼接设计是这款衬衫的亮点之处,轻松显不同,穿着更容易凹凸个性魅力。经典的翻领,很好衬托气质优雅大方,显颈脖修长。口袋装饰提升整体的丰富性,绣花点缀,体现细节设计,彰显与众不同。"} +{"content": "类型#裙*材质#网纱*颜色#浅蓝色*风格#复古*风格#文艺*风格#清新*风格#性感*图案#复古*图案#线条*图案#印花*裙型#大裙摆*裙长#连衣裙", "summary": "这款独具仙女气息的连衣裙采用浅蓝色作为主基调设计,穿搭起来更具清新文艺感,结合大气的印花图案修饰其中,带来更具复古典雅的韵味。网纱的半透明材质更具性感的味道,大裙摆的线条悠扬而有型,轻松增加端庄优雅的女人味。"} +{"content": "类型#裙*版型#显瘦*材质#针织*风格#性感*裙型#a字*裙款式#螺纹*裙款式#纽扣", "summary": "含蓄细腻柔软的螺纹针织,温暖有型,穿上之后成就你的性感女神。双排手工缝制纽扣,非常有特点,视觉上显瘦,小众显品味,出街不易撞衫。优雅a字版型,演绎名媛风格,既满足了基本款的百搭又兼具了时髦。"} +{"content": "类型#上衣*材质#雪纺*图案#蝴蝶结*衣样式#衬衫*衣款式#口袋*衣款式#飘带", "summary": "精选上等的雪纺材质打造的衬衫,更加呼应夏天酷热的气氛,时刻为你提供一个舒爽干净的穿着环境,轻松应对尴尬的夏季。领口处的飘带装饰突出丰富的立体层次,不论是打成蝴蝶结,还是随意着都很有腔调感。口袋剪裁打造满满细节亮点。"} +{"content": "类型#裙*颜色#红色*风格#青春*裙型#背带裙*裙长#连衣裙*裙衣门襟#单排扣*裙款式#口袋*裙款式#腰带*裙款式#对称*裙款式#收腰", "summary": 
"背带式的连衣裙,增添了青春俏皮的女性气息,上身起到了减龄的效果,展现出女性美好灵动的风采。腰部一根腰带的搭配,提升了整款设计的美感,在一定程度上起到了收腰的效果。单排扣的设计,美观又实用,同时彰显出优雅大气的女性风采。红色色调十分显白,美丽动人,凸显出精致的女人味。对称的口袋设计,发挥了实用效果。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#混纺*风格#街头*衣样式#衬衫*衣袖型#收口", "summary": "这款哈伦裤是松紧腰头,穿上的舒适感超好,适合各种身材的宝宝们。裤脚有收口的设计,让裤子显得比较立体,在视觉上打造显瘦的效果,而且也带上了几分酷酷的街头味道。裤子的材料是混纺的,比较类似衬衫的那种爽滑度,宽松舒适。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*材质#牛仔布*风格#街头*风格#青春*裤型#阔腿裤*裤款式#拼接*裤款式#纽扣*裤款式#流苏*裤腰型#高腰", "summary": "BRAND的这款牛仔裤俏皮而灵动的拼接裤脚设计,配合上独特的流苏缀边装饰处理,既带来一份不羁叛逆的街头潮感,又能诠释出你时髦玩味的个性风采。而其简洁利落的高腰剪裁,配合上唯美雅致的斑斓纽扣点缀设计,锦上添花的好效果自不必说,还能张扬出你活力满满的青春动感气息。再加上它宽松自在的阔腿裤型,藏肉显瘦的同时也能绽放出你洒脱率性的自在风姿。"} +{"content": "类型#上衣*版型#宽松*材质#棉*风格#性感*图案#条纹*衣样式#衬衫*衣领型#v领", "summary": "蓝白的条纹从远处看仿佛就是一条靓丽的风景线,设计师将其融进衬衣的设计当中,并使用宽松的v领版型,微露锁骨的同时。将小性感的气息散发出来,精致又不失时尚感。而细腻的棉质面料的加入更是增添了舒适的体感,让穿搭更加时尚。"} +{"content": "类型#裤*材质#棉*颜色#黑色*颜色#卡其色*裤型#直筒裤", "summary": "这款休闲裤选用特别舒适的棉质面料,加入氨纶带了弹性,穿着无束缚感。直筒立体版型,穿着挺括硬朗,非常利落。锥形裤的款式,让你气质倍增,使身材很明显,搭配上不拘束。有黑色和卡其色两款可选,都是裤子中的经典配色。"} +{"content": "类型#裤*材质#棉*材质#混纺*颜色#米白色*风格#清新*裤长#短裤*裤腰型#高腰", "summary": "avivaonearth以清新、淡雅的米白色为主基调打造的这款短裤,整体采用了高腰的剪裁设计配合短款的裤腿设计,带来较为显身材且轻便舒适的穿着效果。设计师为这款短裤了苎麻和棉的对半混纺效果,兼顾麻的干爽和棉的亲肤,是非常好穿的单品。"} +{"content": "类型#裙*风格#性感*图案#植物*裙长#长裙", "summary": "领口的松紧设计可以视觉上收紧脖颈,营造完美天鹅颈,优雅迷人,同时腰间的松紧设计收紧腰身,轻松露出小蛮腰。性感撩人,二者相互呼应,打造出完美比例。优雅长裙是是女神们的最爱,仙气十足。大片花卉让人仿若置身花海,远远望去像是花中仙子,沉迷其中。"} +{"content": "类型#裙*材质#蕾丝*风格#知性*风格#性感*图案#蕾丝*裙型#小黑裙*裙领型#圆领*裙款式#拼接*裙款式#勾花镂空", "summary": "好像小黑裙总会给人一种很神秘很妩媚的感觉,裙身采用秀气的圆领设计,贴合颈部,凸显知性优雅,展现女性的天鹅颈。以及领口设计了小镂空的裁剪,微露肌肤,平添了不少性感韵味。肩部采用蕾丝的拼接,显得甜美带洋气的气息。"} +{"content": "类型#裤*颜色#黑色*裤口#小脚", "summary": "个性时尚的休闲裤采用了纯黑色的色调设计,纯黑色的色调,打造时尚摩登的风格凸显的随性自然的特点。束脚裤的版型设计,展示了最具个性时尚的风格魅力,精湛的可口可乐,凸现时尚,摩登。"} +{"content": "类型#裤*颜色#黑色*裤口#小脚", "summary": "经典的黑色小脚裤,多少条都不嫌多,也不用担心搭配出错的问题。厚度适中的面料,不管是稍有寒意的春季还是酷热的夏季,它都是那么的柔软透气。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*材质#牛仔布*风格#简约*裤长#长裤*裤款式#破洞", "summary": "非常春日穿搭的一款基础风格的牛仔长裤。简约的版型,对腿型没有过分的限制,更加宽松舒适。磨毛的裤脚设计,能够凸显出脚踝部分,更加显瘦哦。个性的破洞造型,展现出青春活力之感。"} +{"content": "类型#裤*材质#羊毛*颜色#灰色*风格#知性*裤型#阔腿裤*裤款式#流苏", "summary": "浪漫优雅的麻灰色系,是让人一眼就会爱上的颜色,隶属于不暖的色系,在视觉上给人舒服自然的感觉,着实百搭。裤型采用经典的阔腿裤型,搭配具有丰富质感的羊毛材质,整体风格利落有型,散发着知性的女神范。裤身周围加以精致的钩工流苏边点缀,颇具层次感。"} +{"content": "类型#裙*材质#蚕丝*材质#蕾丝*风格#复古*风格#性感*图案#复古*图案#线条*图案#印花*图案#蕾丝*裙下摆#开叉*裙领型#v领*裙衣门襟#暗扣*裙款式#拼接*裙款式#勾花镂空", "summary": "复古时尚的印花元素让你仿佛闯入了乱花迷人眼的花海,飘逸柔软的真丝材质带来舒适的穿着体验,开衩的裙摆设计,行走间流露出温婉迷人的风情。时尚的v领以暗扣闭合,透露出柔美的颈部线条,衬托出娇小的脸型,腰间唯美的镂空蕾丝拼接,隐约透肉的质感展现出性感曼妙的腰线,是衣身的一大亮点。"} +{"content": "类型#裙*材质#雪纺*裙长#连衣裙*裙领型#v领*裙袖型#喇叭袖*裙衣门襟#系带", "summary": "女人和雪纺仿佛天生就有一种不解之缘。一见钟情,倾心。这款雪纺连衣裙设计了优雅的系带v领造型,精致之余显格外与众不同,且予以肩颈万种风情,还让胸型更饱满。七分喇叭袖的造型,举手间灵动柔美,由内而外散发着小女人味。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#棉*裙长#连衣裙*裙领型#娃娃领*裙款式#拼接", "summary": "这款甄选进口棉并经过细腻的剪裁和拼接工艺,实现了超宽松的廓形,没有了修身版型的这款连衣裙依然可以让我们穿出瘦瘦的身材,同时还能在视觉上将腹部赘肉隐藏住。此外该款连衣裙采用娃娃领造型,它能给我们带来减龄效果。"} +{"content": "类型#裤*材质#牛仔布*风格#街头*风格#休闲*裤长#九分裤*裤型#直筒裤*裤腰型#中腰", "summary": "九分裤的版型加上中腰的裤型设计,给人一种别样舒适的休闲感。直筒版型修饰腿型,上身之后秒变笔直大长腿。中腰的版式设计,给裤子的穿有丰富的选择性,百搭而又时尚。小亮点在于裤边的磨边装饰,带有个性气息,街头风格十足。牛仔面料硬挺时尚,穿着上身透气性好。"} +{"content": "类型#裙*版型#显瘦*风格#中国风*图案#植物*裙型#花苞裙*裙型#包臀裙*裙款式#盘扣", "summary": "中国风盛行的时代,衣橱里怎能少得了优雅的改良式旗袍裙呢?缤纷浪漫的花卉,装点在裙面上瞬间就吸引了别人的目光,浑身上下都充斥着无穷东方韵味;斜襟如意花苞盘扣,将女性的温柔感全带出来了,配合修身包臀的廓形,女性该凸该凹的好身材都能完美呈现,俨然一副古雅端庄的小家碧玉的模样。"} +{"content": "类型#裙*材质#蕾丝*图案#条纹*图案#蕾丝*裙款式#勾花镂空*裙款式#收腰", "summary": "这件裙子的颜色本身就够惹眼了,所以在鞋子、包包和其他配饰上不用太费心,简单些就好。竖条纹的设计让身材更加修长,肩部和裙摆的镂空蕾丝,给人优雅朦胧的感觉。收腰的设计不会凸显小腹,还能显出傲人的身材。"} +{"content": "类型#裤*版型#显瘦*风格#街头*风格#休闲*裤长#短裤*裤款式#口袋*裤款式#不规则", "summary": "虽然夏天还没到,但是短裤好囤起来啦,毕竟这么好看的!这款带点民族风情,还蛮有特色的,在一众短裤里很容易出彩版型是有点a字的,裤腿微宽,不会有束缚感,也能显瘦显腿腰头口袋都做了明缉线装饰,视觉上很有立体感裤口不规则磨破撕边效果随性不羁,休闲街头感很浓"} +{"content": "类型#裤*版型#显瘦*风格#街头*风格#休闲*裤长#短裤*裤款式#口袋*裤款式#不规则", 
"summary": "虽然夏天还没到。但是短裤好囤起来啦,毕竟这么好看的,这款带点民族风情,还蛮有特色的。在一众短裤里很容易出彩,版型是有点a字的,裤腿微宽,不会有束缚感。也能显瘦显腿细,门襟腰头口袋都做了明缉线装饰,视觉上很有立体感。裤口不规则磨破撕边效果随性不羁,休闲街头感很浓。"} +{"content": "类型#裤*版型#显瘦*风格#街头*风格#休闲*裤长#短裤*裤款式#口袋*裤款式#不规则", "summary": "虽然夏天还没到,但是短裤好囤起来啦,毕竟这么好看的!这款带点民族风情,还蛮有特色的,在一众短裤里很容易出彩版型是有点a字的,裤腿微宽,不会有束缚感,也能显瘦显腿腰头口袋都做了明缉线装饰,视觉上很有立体感裤口不规则磨破撕边效果随性不羁,休闲街头感很浓!"} +{"content": "类型#上衣*颜色#纯色*风格#清新*图案#纯色*衣样式#衬衫*衣领型#一字领", "summary": "“一抹,一曲”的风格在这款衬衣上悄然体现,一字肩简洁大方,轻松勾勒肩部的迷人。花苞袖的设计扩大了优雅的气息,举手投足都是满满的温柔感。清新感十足的纯色打造,给人一种极为雅致的视觉感,好似“犹抱琵琶半遮面”。"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*颜色#红色*裙长#连衣裙*裙衣门襟#双排扣*裙款式#绑带", "summary": "炎炎的夏日我们想要赶走阳光带来的,选择这件连衣裙是个不错选择。大红色的衣身配色活力十足,让你轻松地回到年轻时代。同时这种色彩也可以,让你的肌肤看起来白嫩具有光泽。腰间应用的系扣绑带,具有很好的显瘦显高效果。精致无比的黑色双排扣,体现出大牌的做工。"} +{"content": "类型#上衣*图案#线条*图案#撞色*衣样式#针织衫*衣样式#开衫*衣袖长#长袖*衣袖型#落肩袖*衣门襟#单排扣*衣款式#拼接", "summary": "众所周知,春季是针织衫的专属季节,采用冰丝面料的开衫,不会到来过热的穿着触感,能够轻松应付早晚的温差。落肩长袖的拼接,美化了肩臂的线条感,单排扣门襟的装饰,可敞开or合并来穿,各有一番味道,撞色锁边元素,去除单一,营造出立体的层次效果。"} +{"content": "类型#裙*裙型#牛仔裙*裙型#包臀裙*裙下摆#开叉", "summary": "此款牛仔裙大亮点,第一就是它包臀设计,能完美的展现女性的翘臀,在细节处勾勒出女性婀娜曼妙的体态。二是它的开叉设计,行走时能看到大腿曲线若隐若现,增添了神秘感。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#纤维*风格#淑女*风格#复古*风格#文艺*图案#复古*图案#印花*裙衣长#中长款*裙衣门襟#系带*裙款式#不规则", "summary": "时尚印花图案,优雅又大气,聚酯纤维面料,手感柔软舒适,经典复古又永不过时,时而甜美少女时而文艺淑女;素致淡雅的色彩搭配,中长款的版型,修身与宽松的合理搭配,温柔而又显气质;裙摆的不规则裁剪,使得与众不同有个性,腰间系带又恰当的显瘦。"} +{"content": "类型#上衣*风格#街头*风格#青春*图案#条纹*衣样式#衬衫*衣款式#口袋", "summary": ",慢慢的气味回暖。这时候穿件帅气的衬衫,在合适不过啦。想在的街头脱颖而出,经典的条纹衬衫是个不错的选择哦,经典是在时代之后还能流行于大街小巷中,简单明了的规整条纹,简单的口袋做了很好的修饰版型的作用,喜欢的千万别错过哈!"} +{"content": "类型#裤*版型#宽松*版型#显瘦*风格#文艺*图案#刺绣*裤型#阔腿裤*裤腰型#松紧腰", "summary": "松紧一字领搭配上荷叶边的设计,增添了甜美俏皮的灵动感,露出锁骨和肩头,更感情调。刺绣花朵设计带来文艺浪漫的气息,宽松的阔腿裤遮肉显瘦,很有垂顺感。"} +{"content": "类型#裤*版型#显瘦*颜色#纯色*风格#性感*图案#纯色*裤长#长裤*裤腰型#高腰", "summary": "为避免纯色设计太过单调,e在这款长裤侧边加入拼纱丰富层次给人视觉上的惊艳感,小透性感且颇为个性!而高腰紧身版型修身效果佳,勾勒长腿并可遮肚收腰、提臀显瘦,一展苗条身姿轻松穿出气质范,不仅是瑜伽装也是潮感满满的穿搭单品。"} +{"content": "类型#裙*风格#民族风*风格#性感*图案#印花*裙腰型#高腰*裙款式#吊带", "summary": "灵动裙摆设计,展现出优美的律动感,同时也带来了民族风的味道,带有不羁风情。面料舒适柔软,穿着无拘束。高腰设计突显美丽曲线,在视觉上提高了腰线。吊带设计,十分的性感,加倍吸睛。上有印花图案点缀,尤为精致美观。"} +{"content": "类型#裙*版型#显瘦*材质#雪纺*风格#青春*图案#碎花*裙长#连衣裙*裙衣门襟#系带*裙款式#木耳边", "summary": "这一款雪纺连衣裙精致的木耳边领,精致俏皮特别出彩,利落的裁剪,塑造出迷人身段,给人恰到好处的视觉效果,显瘦的同时彰显高挑身姿,加上领口系带,塑造造型特别灵动。碎花装饰,青春减龄丰富视觉。"} +{"content": "类型#裤*颜色#黑色*裤款式#口袋*裤口#毛边", "summary": "裤装以简单的黑色打底,凸显出服装的百搭属性,以军事风为主体设计,更能呼应主题,展现出裤装散发出来的男人味。两侧的立体口袋装饰,不仅美观还很实用,让裤装的视觉装饰更加饱满。另外裤脚处的毛边设计也是充满了时尚的小心机。"} +{"content": "类型#上衣*版型#显瘦*图案#条纹*图案#蝴蝶结*衣样式#衬衫*衣领型#v领*衣袖型#堆堆袖*衣款式#腰带*衣款式#抽褶", "summary": "极具学院风的一款衬衫,前后v领的设计,既能勾勒出迷人的天鹅颈,又能衬托出娇俏的小脸。时髦的褶皱堆堆袖,打破基础款的单一更具层次感。同色系的蝴蝶结腰带,不仅能够修饰身形,还能诠释出个性腔调。大热的条纹元素,结合开叉的衣摆,视觉上更显瘦显气质。"} +{"content": "类型#上衣*颜色#黑色*衣样式#外套*衣领型#翻领*衣款式#口袋", "summary": "这是一款经典的黑色西服外套,版型看上去没有特别的设计,但就是因为经典,才更受欢迎。帅气的翻领设计让整个人显得更有气质,还带有两个翻盖的口袋,在起到装饰作用的同时也十分便利。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*颜色#黑色*裙腰型#高腰*裙袖型#喇叭袖", "summary": "这款孕妇裙采用黑色的主色,黑色有视觉显瘦的效果。宽松的领口将脖颈修饰的更加修长。喇叭袖的设计可以遮挡手臂的问题。高腰的版型是为了不让凸起的小肚子有紧绷难受的感觉。"} +{"content": "类型#裙*版型#显瘦*版型#h*图案#波点*图案#印花*裙下摆#花边*裙长#连衣裙*裙领型#圆领", "summary": "这款由itmichaa推出的连衣裙,修身h版型设计,穿搭上身有着显瘦的效果,适合各种身材穿搭。花边圆领的设计,个性又时髦,又能巧妙的修饰出小巧迷人的脸型。衣身通体以波点印花图案点缀,时尚而新颖,也为衣身带来了丰富的视觉看点。"} +{"content": "类型#上衣*版型#宽松*衣样式#风衣*衣长#短款*衣袖型#落肩袖*衣门襟#系带*衣款式#收腰", "summary": "这件风衣的最大好处在于短款且收腰的设计,非常适合小个子女生。短款的版型能够拉高腰身显腿长,宽松的版型非常遮肉,腰身加上收腰系带的小细节使整体不会臃肿,反而会使腰间看起来更加纤细。落肩袖很适合肩宽的妹子,可以很好的在视觉上削弱肩宽。"} +{"content": "类型#裙*颜色#红色*裙下摆#开叉*裙下摆#垂坠*裙衣门襟#系带", "summary": "独特的材质选择,让裙身具有很好的垂坠感,讲女性优美的身体曲线展现出来,采用了系带与开叉的设计,不仅表现出现代女性的干脆利落,同时还流露出女性特有的妩媚味道。纯正而大气的红色,十分的吸睛,展现浓浓女王范儿~"} +{"content": "类型#裙*版型#显瘦*颜色#红色*风格#简约*图案#印花*裙腰型#高腰*裙长#连衣裙*裙款式#拼接*裙款式#飘带", "summary": "简约而彰显个性的一款连衣裙,纯白色系面料,一侧的红色拼接设计,宛如一条浪漫的飘带坠入眼帘。个性的扇子印花图案,更点缀出满满的诗意。高腰显瘦版型,轻松穿出高挑曼妙的身段。"} 
+{"content": "类型#裤*风格#简约*图案#线条*裤款式#口袋*裤口#小脚", "summary": "这是一款简约百搭的束脚裤,穿搭实用性高,让你赚足回头率。两边的斜插口袋,可放随身物品,解放你的双手,也是街边凹造型的好帮手;束脚裤型,拉长腿部线条让你变身长腿欧巴。"} +{"content": "类型#裙*图案#线条*图案#印花*裙长#连衣裙", "summary": "见惯了摩登的都市女郎,不妨来点独具匠心的民族风情。此款连衣裙,采用传统交领改良设计,颈部的优美线条瞬间凸显出来,女人味儿十足。斜襟与袖口的仿珠扣装饰,非常有古典风味。醒目的印花加上明亮的颜色,色泽的碰撞形成了一大冲击感,低调奢华。"} +{"content": "类型#上衣*风格#休闲*图案#刺绣*衣样式#衬衫*衣袖型#落肩袖", "summary": "当刺绣遇上了衬衫,让你的穿着美到;时尚刺绣落肩袖衬衫,休闲的版型设计,优雅有气质,配上非常有特色的花朵刺绣图案,会给你的穿搭带来不一样的时尚范儿~"} +{"content": "类型#裙*风格#复古*风格#文艺*图案#格子*图案#复古*裙型#a字*裙下摆#垂坠*裙腰型#高腰*裙衣门襟#单排扣", "summary": "手感光滑柔软的面料,拥有极佳的垂坠感。清爽简洁的a字廓形,剪裁利落,上身优雅大方,洋溢着满满的文艺气息。贴心的高腰设计,轻松勾勒腰线,凸显纤细腰肢。腰部的处理,巧妙的塑造了裙身的层次变化,增强裙身立体廓形感。前门襟处采用单排扣固定,精致的格纹圆扣,带来浓浓的复古学院风,有趣又独特,颇为时尚减龄。"} +{"content": "类型#上衣*风格#知性*图案#线条*衣样式#衬衫*衣领型#v领*衣款式#绑带", "summary": "采用柔软的粘纤面料制作而成,带来亲肤透气的舒适感。个性的衬衫领结合v领的造型设计,别致的绑带缠绕着领口,修饰了颈部线条,同时凸显出干练知性的气质。精美的双层饰边装饰着前襟,尽显甜美浪漫气息。飘逸的百褶裙摆,摇曳出优雅迷人的身姿。"} +{"content": "类型#上衣*材质#蕾丝*风格#性感*图案#蕾丝*衣样式#外套*衣样式#西装*衣款式#吊带", "summary": "吊带同样配置了内衬,所以如果想单独穿的话,一点问题都没有,得体还带那么一些小性感。胸襟的地方,做了睫毛边的蕾丝来点缀,这个细节满分,如果搭配西装外套,或者其它的单品,在胸前恰当露出来,是超巧妙又的性感穿法。"} +{"content": "类型#裙*材质#丝绒*颜色#绿色*图案#波点*图案#印花*裙下摆#荷叶边*裙下摆#花边*裙长#长裙*裙款式#拼接", "summary": "这件连衣长裙多处都运用了荷叶边的拼接设计,肩头处添加花边点缀,更加凸显女性温柔典雅气质,而腰部的荷叶边正好能够勾勒修饰腰部曲线,裙摆处则是让裙装更具灵动飘逸感。波点印花元素的融入,又让裙子有了另一番俏皮可爱的气息。采用高级靓丽的绿色丝绒面料,看着摸着柔软而有质感。"} +{"content": "类型#裙*风格#简约*图案#蝴蝶结*裙长#连衣裙*裙衣门襟#系带", "summary": "素雅而又简约的连衣裙,在腰间缝制上了两条系带,起到了画龙点睛的作用。两根带子既可以系在一起,塑造成一个蝴蝶结的造型,又可以自由的放在两边,打造出慵懒随性的风格。这种独特的设计不仅能够起到装饰性的美观作用,同时还能凸显出穿着者腰间曼妙曲线,令人回味无穷。"} +{"content": "类型#裙*颜色#白色*风格#清新*裙袖长#无袖*裙领型#翻领*裙款式#拼接", "summary": "白色的裙子第一眼就容易让人移不开眼,纯洁乖巧的气息可以想象,上身绝对是甜美清新的完诠释。还有小巧的翻领设计,特别显可爱,还能修饰出优美的颈部曲线。加上无袖,露出纤细白皙的手臂,是不是很诱人呢?最重要还有肩部和腰部的拼接处理,十分具有特色。"} +{"content": "类型#裤*风格#运动*风格#休闲*图案#线条*裤腰型#松紧腰*裤口#小脚", "summary": "这款裤子,结合了休闲与运动两种风格,彰显了不一般的帅气感。腰部的松紧绳设计,不仅不挑人,也显得十分时尚,束脚的设计,避免了整体过于臃肿的尴尬场面,而完美的修饰了腿部线条,衬托得人更加修长。"} +{"content": "类型#裙*风格#青春*风格#性感*裙型#百褶*裙型#包臀裙*裙下摆#花边*裙款式#抽褶", "summary": "可爱与性感兼并的双面性时尚。这一款包臀裙,阳光黄的颜色非常特别,不止显白,而且能给你添加不少温柔的气质。膝盖以上的长度能显腿长而包臀的设计,能完美展现你的身材比例,极具女人味。让人心动的是裙身的花边,充满个性,适合轻熟女。经典的褶皱元素,化作甜美含蓄的百褶,显得青春又优雅,让人不由得回忆起那个校服裙摆飞扬的年代。"} +{"content": "类型#裙*图案#印花*裙下摆#压褶*裙腰型#高腰*裙腰型#中腰*裙腰型#松紧腰*裙长#半身裙", "summary": "这款印花半裙是经过高温压褶的,不易变形,印花有种凹凸的层次美感,内敛含蓄。是上身会微微撑开的a型,整洁有序的褶裥营造出挺括的廓形,并伴随着你的步履摇曳舞动,为设计注入了几分灵动气息。经过高温定型的褶裥,富有层次感,弹力松紧腰,好调节高度,高腰、中腰不受限制,可以根据自己的喜好来。"} +{"content": "类型#裙*版型#宽松*颜色#黑色*风格#复古*风格#性感*图案#复古*图案#波点*裙下摆#荷叶边*裙腰型#高腰*裙衣门襟#拉链*裙款式#拼接*裙款式#拉链", "summary": "复古的波点一直都是不败的经典,想来大家都有目共睹。柔美流畅的荷叶边与经典复古的波点相结合,性感有魅力,黑色的高腰裙设计,凸显出腰身的优美曲线,腰间无松紧,平整的腰部显得质感很好。而且在侧面拼接了隐形拉链用于日常穿脱,方便又贴心整体的版型偏宽松,行走之中带来足够的自由感。"} +{"content": "类型#上衣*图案#印花*衣样式#衬衫*衣款式#不规则", "summary": "精致干练的翻边衬衫领设计,简洁大方,勾勒出干练气质。衣身不规则涂鸦印花,错落有致,凸显时尚活力感。弧形下摆裁剪,优美的曲线营造出优雅气质,不规则的摆动,灵动飘逸又不失轻松随性的风范。"} +{"content": "类型#上衣*图案#印花*衣样式#衬衫*衣款式#不规则", "summary": "很养眼的一款衬衫设计,丰富色彩的大胆碰撞,轻松带来一场精彩的视觉盛宴。不规则印花图案装饰,带来时尚混搭魅力,轻盈面料剪裁,舒适透气又不怕透。"} +{"content": "类型#裤*颜色#灰色*风格#高贵*图案#线条*裤长#短裤*裤腰型#高腰", "summary": "高贵的灰色,把女人独有的优雅大气气质发挥到了极致,带来无与伦比的时尚魅力。圆领精致美丽,展现颈部线条,让你更显端庄优雅,娇俏的荷叶边袖点缀珍珠装饰,增添温婉柔美气质,下装高腰短裤a字版型设计,让你秒变大长腿。"} +{"content": "类型#裙*版型#显瘦*风格#性感*裙下摆#荷叶边*裙下摆#压褶*裙长#连衣裙*裙领型#一字领*裙款式#钉珠", "summary": "这一款连衣裙精致一字领的设计,韵味迷人性感出彩,精挑细选的布料软糯细腻,贴身穿着很舒适,体验度也是不一般。钉珠荷叶边的装饰,气质优雅随风摇曳。加上重工压褶,包容显瘦做工精致。"} +{"content": "类型#裙*版型#宽松*颜色#纯色*风格#性感*图案#纯色*裙型#衬衫裙*裙款式#露肩*裙款式#不规则", "summary": "这是一件充满慵懒与个性之感的衬衫裙,整体采用宽松版型加上不规则剪裁,随性中又带着慵懒气息,尤其是露肩设计展露骨干肩部,轻松展现出女性性感之味。纯色色调更是与夏季搭配适宜,简简单单但是充满纯粹感。同时设计师又采用了以上含材质,不仅舒适亲肤,而且十分吸汗,即便是炎炎夏日也不会感到粘腻。"} +{"content": "类型#裙*材质#网纱*颜色#纯色*风格#性感*图案#纯色*图案#线条*裙下摆#荷叶边*裙长#连衣裙*裙领型#一字领*裙款式#拼接", "summary": "这款纯色的连衣裙中长款式穿着更显飘逸性,采用了性感的一字领设计,展示出白皙的颈部肌肤,带来一丝亮点,女人味十足。采用了拼接荷叶边的设计,富有层次感,上身不失有型。下摆处的拼接网纱设计,若隐若现地展示出优美的腿部线条。"} 
+{"content": "类型#裤*图案#条纹*裤款式#口袋*裤款式#纽扣*裤腰型#松紧腰", "summary": "布满衣身的竖型条纹,显出女性甜美可爱的形象。松紧的裤头设计,出行穿脱更加的便捷。对称的贴布口袋设计,方便存放随身的小物品,兼具美观性与实用性。精致的单排纽扣进行开合,出行穿脱更加的简单,提升出行的便捷性。"} +{"content": "类型#裙*版型#显瘦*材质#牛仔布*颜色#纯色*颜色#浅蓝色*风格#简约*风格#休闲*风格#潮*图案#纯色*图案#线条*裙型#牛仔裙*裙型#直筒裙*裙下摆#毛边*裙腰型#高腰*裙款式#纽扣*裙款式#不规则", "summary": "经久不衰的牛仔元素,由一抹浅蓝色渲染,简约的纯色,休闲大方。修身的直筒版型高腰提臀的设计拉长双腿的比例,显高显瘦。结合了纽扣门襟的设计,方便穿脱。简单流畅的线条,巧妙的修饰腿型。搭配运动鞋休闲鞋都能轻易焕发青春活力气息。同时此款裤身采用磨破做旧处理,时尚前卫,尽显潮流风范。裤脚处也运用了不规则毛边的装饰,更显慵懒随性。裤腿的缝补拼布设计别出心裁,设计感十足。"} +{"content": "类型#裙*材质#蚕丝*图案#印花*裙下摆#压褶*裙袖长#七分袖*裙领型#圆领", "summary": "轻盈飘逸的真丝材质,手感丝滑,上身仿若无物,带给你有如婴儿肌肤般的细腻触感。简洁的圆领搭配同样简洁的七分袖,进一步提升干练利落的气质。前幅的压褶设计很是别致,于细节中彰显前卫的设计感。精致的印花点缀裙身,带来春风拂面般的清凉感。"} +{"content": "类型#裤*版型#宽松*材质#亚麻*颜色#纯色*风格#清新*图案#纯色", "summary": "裤裤的设计走的极简路线,或明亮甜美或沉稳优雅的纯色,都衬托出宝贝乖巧干净的气质,同时也百搭夏季清凉的上衣。a字的宽松版型,很好地修饰宝贝的腿型,结合亲肤透气的亚麻面料,穿着飘逸,走路带风,给人清新、格调十足的感觉。"} +{"content": "类型#裙*版型#宽松*风格#复古*风格#简约*图案#复古*图案#刺绣*裙长#连衣裙*裙袖长#七分袖*裙领型#圆领", "summary": "简约而唯美的一款连衣裙。纯白色系搭配起来更加游刃有余。精美的刺绣工艺,呈现出复古而别致的艺术效果。圆领七分袖设计,让您的仪态更显优雅从容。长款宽松版型,轻松穿出高挑曼妙的身段。"} +{"content": "类型#上衣*版型#宽松*图案#线条*图案#印花*衣样式#衬衫*衣领型#圆领*衣款式#不对称", "summary": "萌趣印花点缀整体衣身,元气满满的衬衫让你在春天活力四射。气质的小半圆领设计,巧妙的勾勒你的脖颈线条,在视觉上增添高挑出众的气质。略微宽松的款式设计,巧妙的遮住你的肉肉,形成修长的线条美感,营造一身高级的慵懒感。不对称下摆的设计,增添了整体造型的层次感,巧妙的展现你独特的个性与魅力。"} +{"content": "类型#上衣*版型#宽松*材质#棉*风格#青春*风格#清新*图案#条纹*衣样式#卫衣*衣长#常规", "summary": "这款卫衣打破了常规的单调款型,宽松的假两件版型,凸显出了层次感,穿着随意不受约束。清新的条纹图案修饰,洋溢着满满的青春气息,选用优质的棉质材质制成,柔软亲肤,给你带来舒适的穿着体验。"} +{"content": "类型#裙*版型#宽松*裙长#连衣裙*裙款式#抽绳*裙款式#连帽", "summary": "一款非常时髦的连衣裙,采用网布连帽设计,很适合炎热的夏季。抽绳则用带有淡淡光泽感的铜氨丝打造,做工精细又不显夸张。将面料进行打孔艺术排序,呈现出极致的质朴形象。宽松的a型廓形,很好的包容身材,给予一定的活动空间,让人不自觉充满着活力。"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*图案#线条*裙腰型#高腰*裙袖长#无袖*裙领型#高领*裙款式#钉珠", "summary": "这款礼服裙手感厚实优质感,穿着上身,笔挺有型,简洁干练。以深邃的墨黑色作为基调,搭配小高领的无袖设计,仿佛拥有黑天鹅般的优雅气质。肩部的剪裁配合颈部线条特别修饰上半身,高腰处增添钉珠装饰,高腰的修饰能力,拉长腿部线条感。精致小巧的视觉上又非常显瘦。"} +{"content": "类型#裤*风格#潮*图案#线条*裤长#短裤", "summary": "此款短裤的另外一个设计亮点在于它采用简洁的涂鸦线条,勾画出个性活力的天使之翼图案装饰,提升层次和视觉效果。展现现代潮流风。"} +{"content": "类型#上衣*版型#宽松*风格#日系*风格#简约*衣样式#卫衣", "summary": "做旧工艺的使用,赋予了这款日系卫衣更多几分的韵味感,使得简约的它,拥有更为丰富的设计层次。它的版型带着几分宽松,却不会让人觉得肥大,上身之后可以更好的衬托出你随性、潇洒的气质,为你的男性魅力加分不少。"} +{"content": "类型#上衣*材质#棉*风格#简约*衣样式#衬衫*衣领型#翻领*衣款式#拼接", "summary": "法式的简约版型,衬衫的袖口开叉处做了拼接设计,让衬衫整体看上去更有细节感,整件衬衫最吸睛的就是领口,翻领也是做了全包边,突出了衬衫的时髦和高级感,面料选用100%高支棉,是最抗皱不透的,有一定的挺括度,不易皱,不透光。"} +{"content": "类型#裙*版型#宽松*颜色#黑色*风格#宫廷*裙衣长#常规*裙款式#木耳边", "summary": ",几乎是两个常规的宽度,显得手臂纤细修长。木耳花边领+袖,与宽松的衣身组合,真的有种欧式宫廷内衫的既视感,精致中透露出慵懒气质。还有黑色点点提花,规律点缀,多一份少女感。"} +{"content": "类型#上衣*版型#立体剪裁*风格#简约*衣样式#衬衫*衣领型#翻领", "summary": "简约衬衫,经典衬衫版型,遵循布料肌理。立体剪裁,以翻领明门襟的经典造型、配合曲摆的现代人性化裁减,相得益彰,舒适的面料搭配精致缝纫线使成衣领型自然舒展、缝线部位平服工整、牢固耐磨,单穿或者内搭都非常好看。"} +{"content": "类型#裙*颜色#深蓝色*风格#复古*风格#高贵*图案#复古*图案#印花*裙长#连衣裙*裙款式#不规则*裙款式#收腰", "summary": "带有复古BRAND风的一款连衣裙,以深蓝色的基调质地,营造出高贵、优雅的气质。结合收腰的版型剪裁,让腰部曲线更显立体,增加女性魅惑。不规则的裙摆设计有弧度感,可增加视觉层次感。精美的印花质地于裙身,更显优雅。"} +{"content": "类型#上衣*风格#淑女*风格#复古*风格#宫廷*风格#高贵*图案#蝴蝶结*图案#复古*衣样式#衬衫*衣领型#立领*衣袖型#喇叭袖", "summary": "带有浓郁欧式古典宫廷风的气息,这款衬衫造型优雅又高贵。立领的立领再加上蝴蝶结装饰,无比的淑女。胸前有花边的造型,更显娇美可人。复古的喇叭袖处理,典雅的气质尽显。搭配一条小裙子,简直可以直接去拍少女了。"} +{"content": "类型#裙*版型#显瘦*颜色#粉色*风格#性感*图案#蝴蝶结*裙长#连衣裙*裙款式#绑带*裙款式#吊带*裙款式#露肩", "summary": "柔和的粉色连衣裙,轻松凸显少女气息,又显肤色白皙。吊带露肩,凸显迷人小性感,结合绑带蝴蝶结,提升甜美可爱感,波浪式的伞型裙摆。穿着显瘦又具时髦感。"} +{"content": "类型#上衣*衣样式#衬衫*衣领型#翻领*衣袖长#短袖*衣门襟#单排扣*衣款式#口袋", "summary": "经典的衬衫翻领利落有型,展现女性柔美脖颈同时,凸显出端庄大气的气质。直筒版型配合短袖设计,优雅利落有着很好的包容性,遮掩不完美的身形,从容间透着随性的慵懒气息。从领口一直延伸至裙摆的单排扣,落落大方提升衬衫裙的时尚度,带着些许复古风韵味。两侧弧形斜插口袋,方便实用靠上的位置让插袋姿势更具气场。"} +{"content": "类型#上衣*图案#线条*衣样式#卫衣*衣款式#连帽", "summary": "卫衣在简练版型的基础上结合了连帽的造型设计,不仅具有一定的保暖效果,同时对颈部也起到一定的修饰效果,展现出细线的脖子线条。在帽檐的边沿利用包边与缝合的线条,体现出细节的质感与柔软的触感,巧妙的避开了僵硬的触摸感,细心呵护着脖子的肌肤。"} +{"content": 
"类型#裙*版型#显瘦*裙型#牛仔裙*裙腰型#高腰*裙款式#拼接*裙款式#腰带*裙款式#不规则*裙款式#收腰", "summary": "高腰拼接牛仔裙摆,精致的方扣腰带收腰设计,圈出纤细腰身,气质显瘦;不规则的下摆造型,长短错落,很有层次感,更加凸显了纤细的双腿,视觉上妥妥显瘦。"} +{"content": "类型#上衣*材质#牛仔布*颜色#白色*颜色#黑色*颜色#黑白*风格#街头*风格#休闲*图案#印花*衣样式#外套*衣款式#绑带", "summary": "牛仔外套是街头常见的存在,休闲街头似乎是它与生俱来的魅力。而白色牛仔的出现,注定是街头抢镜的,那渲染着的黑色印花,经典的黑白碰撞,时尚火花,同时将休闲的外套平添了一层优雅的韵味。交叉绑带的设计,出现在身前,设计感的视角,给人焕然一新的感受,独特的它,给你避免撞衫的!"} +{"content": "类型#裙*风格#文艺*风格#性感*图案#印花*裙下摆#花边*裙长#连衣裙*裙袖型#荷叶袖*裙衣门襟#系带*裙款式#勾花镂空", "summary": "印花连衣裙真的是好穿且实用又时髦。衬上文艺格调的花边小立领造型,简洁利落,打造随性优雅的气质。以及领口采用镂空与系带设计,小小透出的肌肤,平添了不少性感韵味。唯美的双侧荷叶袖,特别显气质,而且显手臂纤细。"} +{"content": "类型#裙*颜色#黑色*风格#潮*风格#性感*裙长#连衣裙*裙袖长#长袖*裙款式#勾花镂空*裙款式#飘带", "summary": "这款太平鸟黑色长袖两件套连衣裙,透视的两件套设计,满足叠穿的潮流,更加性感魅惑。同时后背镂空设计,微露美背,尽显迷人魅力。搭配飘带设计,增添细节感,彰显时髦。"} +{"content": "类型#上衣*版型#显瘦*颜色#黑色*风格#青春*图案#线条*衣样式#衬衫*衣领型#v领*衣款式#绑带", "summary": "好喜欢这款衬衫纯白的颜色,轻松穿出优雅妩媚的感觉。袖口的翻边样式,可以说是相当惹人喜爱。黑色腰封绑带,时髦流行设计,纤细腰身线条,v领袖子翻折,下摆u型弧度,高挑显瘦女人!"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#棉*风格#知性*图案#印花*衣样式#衬衫*衣袖长#九分袖", "summary": "显瘦十足的v字延伸衬衫领设计,备显知性温婉的同时将少女气息演绎到了极致。合身的版型设计修身且不紧绷,宽松却不松垮。趣味感十足的个性印花设计,让整体层次感丰富不显单调。纯棉印花的面料设计,亲肤舒适透气效果也是极佳。干练利落的九分袖设计,更加清爽。"} +{"content": "类型#上衣*风格#街头*风格#嘻哈*图案#字母*图案#文字*衣样式#冲锋衣*衣样式#风衣*衣款式#拼接*衣款式#腰带*衣款式#连帽", "summary": "从经典的冲锋衣风格为灵感来源,搭配近来大火的街头嘻哈风范,营造出叛逆而显潮感的风衣版型。胸口搭配拼接图片与字母logo,彰显个性品味之余,更添层次美感。连帽版型配合斜开腰带,舒适保暖贴合身躯。配合贴布袖子设计,于细节中把握时尚腔调。"} +{"content": "类型#裙*材质#网纱*风格#性感*图案#格子*图案#刺绣*图案#撞色*裙领型#一字领*裙款式#拼接*裙款式#腰带*裙款式#收腰", "summary": "肩部网纱拼接,微透雪肤,婉约朦胧,刚刚好的性感情调。配上精致的刺绣图案,上身洋气又不显单调。拼接一字肩格纹裙设计,浪漫雅致的格纹图案,给人一种优雅娴静的知性感,肩部的翻折边设计,修饰肩线。撞色腰带收腰设计,吸睛同时,腰线也收的恰到好处,立体伞摆裙型,更显纤细腰身。"} +{"content": "类型#裙*风格#休闲*风格#潮*风格#性感*图案#线条*裙型#a字*裙下摆#开叉*裙下摆#毛边*裙腰型#高腰*裙长#半身裙*裙款式#口袋", "summary": "半身裙是采用a字型前面开叉的设计风格,不仅有个性时尚还能在走动间露出纤细的双腿,不失性感又潮流。裙底的毛边,显得随性慵懒又自在,高腰的裙身,把身材比例拉长。流畅干净的线条,和裙身两侧的大口袋,可以装点随身携带物品,方便实用又美观,自带休闲气质。"} +{"content": "类型#上衣*版型#显瘦*颜色#黑色*风格#简约*衣样式#衬衫*衣领型#尖领*衣款式#纽扣", "summary": "白衬衫是大多商务男士的标配,那么相比之下。黑色的衬衫则显得更加出众醒目,尖领的裁剪,利落大方的同时却又不失干练气场。同色的纽扣,彰显简约的时尚美,修身的版型,让你健美的身躯完美呈现,上身不自觉释放出几分高冷的气息,更显得魅力无穷。"} +{"content": "类型#上衣*材质#针织*颜色#纯色*风格#简约*图案#纯色*图案#撞色*衣样式#外套*衣领型#小立领*衣门襟#一粒扣*衣款式#对称", "summary": "这款针织外套采用了纯色的做工,穿着简约精致,针织的领口采用饿了简约的小立领领型,并在领口做了处理。搭配简约的一粒扣门襟设计,穿着显得干净利落。后背采用了拼布工艺形成撞色的效果,丰富你的视觉感官。衣身还加入了对称的贴袋,兼具美观与实用性。"} +{"content": "类型#裤*材质#牛仔布*材质#棉麻*材质#混纺*风格#运动*图案#印花*裤腰型#松紧腰", "summary": "时尚易搭的儿童过膝牛仔裤,logo印花的图案装饰,打破版型的沉闷感,时尚更富有活力,打造小潮童造型。宽幅的松紧腰带,平整有弹性,舒适不勒皮肤,孩童穿着活动自如。优选棉麻的混纺面料,具有极佳的透气效果,穿着活动不会有潮湿闷热感,非常适合孩子爱运动的本质。"} +{"content": "类型#裙*风格#知性*图案#条纹*图案#印花*图案#撞色*裙长#连衣裙*裙袖长#五分袖*裙领型#polo领*裙款式#拼接", "summary": "淡雅的条纹印花连衣裙,采用时尚的polo领,领子的撞色设计,可以更好的修饰柔美的颈部,彰显率性干练的气质。新颖独特的条纹印花,横竖条纹拼接使用,碰撞出别致的视觉效果,呈现出知性优雅的女性韵味。利落的五分袖设计,恰到好处的露出白皙的手臂,体现出温婉的女性气息。"} +{"content": "类型#上衣*版型#显瘦*风格#清新*图案#线条*衣样式#马甲*衣样式#西装*衣领型#翻领*衣袖长#无袖*衣款式#腰带*衣款式#对称*衣款式#收腰", "summary": "西装式的翻领领口,在最后设计成对称的样式,搭配单边开叉缺口的设计带来满满的个性与俏皮,轻松展现出干练却摩登的时尚气场。无袖的马甲款式露出了手臂的线条,在视觉上显得整个人更为利落帅气,散发出整洁清新的优雅气质。腰间加入了腰带进行收腰,所以不会看起来拖沓,突出了腰身曲线的设计更显女人味。长款过膝的长度可以修饰臀部和腿部的线条,上身更为显瘦。"} +{"content": "类型#裙*材质#牛仔布*风格#休闲*图案#撞色*裙型#背带裙*裙型#牛仔裙*裙型#小黑裙", "summary": "强推一条带你切换不同风格的背带裙,第一眼就会爱上的必入单品!背带裙一直是大热经典的宠爱单品哟,这款用的深牛仔配色设计,低调的同时非常百搭休闲。设计可以调节肩带长度的版型设计,无论高矮都可以完美驾驭酷炫又有型。随便一双同色系小黑靴再撞色一下吸晴满满出街!"} +{"content": "类型#上衣*颜色#纯色*风格#简约*风格#休闲*图案#纯色*图案#蝴蝶结*衣样式#衬衫*衣长#短款*衣袖型#灯笼袖*衣门襟#系带", "summary": "这件衬衫版型的上衣,简约的设计,显露穿着的大气感。蝴蝶结系带的领口设计,是整件服饰的焦点所在,显露出穿着的满满个性,与洋气的穿着。灯笼袖的设计,轻松遮挡了腰部的赘肉。纯色的色调,休闲时尚,还很百搭,让日常的生活中,展现出穿着的多变感。短款的上衣,无论是搭配裙子,还是裤子,都很百搭有个性。"} +{"content": "类型#裙*风格#淑女*风格#青春*裙长#连衣裙*裙领型#立领", "summary": "初次见到连衣裙就被它独特的立领所吸引,日字圆扣将领边的裁片系起,宛如颈间搭配的一条丝巾,尽显温婉柔美的淑女气质,再加上扣环明亮的金属光泽,带来青春灵动的时尚气息。立领温柔的围裹着脖颈,能够起到拉伸颈部曲线的视觉效果,展示出修长天鹅颈。"} +{"content": "类型#上衣*风格#青春*图案#刺绣*衣样式#棒球服*衣领型#立领*衣长#短款*衣门襟#单排扣*衣款式#螺纹", 
"summary": "棒球服在近几年来极其流行,螺纹立领的设计加上精致的单排扣点缀,让你无论是敞开还是闭合穿着都极其好看;在衣衣上加入了精美的绣花图案点缀,既增添了一份美感,又释放出女性的优雅魅力;而短款的设计干净利落,穿出女性的那份帅气时尚感。"} +{"content": "类型#裙*风格#复古*风格#青春*图案#复古*图案#线条*图案#刺绣*裙长#连衣裙*裙领型#翻领*裙款式#收腰", "summary": "这一款连衣裙看起来公主风十足,翻领的线条流畅,显得整个人很有气质,而领部上面的精致刺绣散发着甜美的气息,有着青春减龄的效果。收腰放摆的廓裙摆看起来很蓬松,轻松藏肉。复古的提花面料带来优异的质感,更加分哦。"} +{"content": "类型#裙*风格#复古*风格#性感*图案#豹纹*图案#复古*裙下摆#开叉*裙长#连衣裙*裙领型#v领", "summary": "本款连衣裙较适合成熟女性,深v领的设计,尽显撩人的性感魅力,喇叭扇形的衣袖设计,不显约束感。经典的豹纹花纹带来怀旧复古风尚,尽显摩登女郎范。下摆处开叉的设计,使走起路来裙裾飘飘,修长美腿若隐若现。"} +{"content": "类型#裤*版型#宽松*材质#棉*颜色#红色*风格#街头*图案#撞色*裤长#短裤*裤款式#抽绳*裤腰型#松紧腰", "summary": "这款来自品牌太平鸟与可口可乐联合跨界合作的男装中短裤精选百分百的纯棉面料,轻薄舒适的质地贴身穿着更加干爽透气。整体采用宽松的版型,抽绳设计的松紧腰带轻便自在不束缚腰部,裤腿处加入ola的撞色胶印,搭配大红色的基底色调,尽显街头风格的雅痞个性。"} +{"content": "类型#裙*材质#棉*颜色#白色*颜色#藏蓝色*风格#清新*图案#碎花*图案#线条*裙下摆#花边*裙领型#翻领*裙衣门襟#单排扣", "summary": "精选优质的纯棉面料,让裙子穿着后更加亲肤舒适。精致的小翻领设计,有效修饰颈部优美线条,裙身铺陈小碎花图案,带来清新文雅的味道,在藏蓝色的色调映衬下,更显得别致优雅。单排扣的设计加上一边白色花边的装饰,让线条更加明朗,展现设计美感。"} +{"content": "类型#裙*图案#圆点*图案#条纹*图案#蝴蝶结*裙长#连衣裙", "summary": "布满圆点元素的连衣裙,尽显青春活力的感觉。经典的圆形领口,衬托孩子的颈部曲线,穿着起来也不会有束缚感。领口处还添加了条纹边,以及抢眼的蝴蝶结装饰,轻松打造甜美公主范儿。"} +{"content": "类型#裙*材质#网纱*材质#雪纺*风格#性感*图案#波点*图案#线条*裙型#a字*裙长#连衣裙*裙领型#v领", "summary": "很有女人味的一款两件套连衣裙,若隐若现的网纱搭配顺滑的雪纺,袖口那一部分将手臂的肉肉都藏起来,增添了波点元素,上身满满的可爱气息。吊带裙是深v的设计,带着点的小性感,a字裙摆才将利落,能很好的修饰身材线条。"} +{"content": "类型#上衣*版型#宽松*颜色#白色*颜色#黑色*颜色#姜黄色*图案#撞色*衣样式#卫衣", "summary": "卫衣加连衣裙的两件套设计,上衣是黑色和姜黄色的卫衣,搭配白色的裙摆,层次感十分丰富,而且很有撞色的时髦感。宽松的卫衣廓形剪裁,oversize造型,轻松打造出娇小的气质。裙摆的斜切设计,长短不一的错落感,非常有设计感,灵动飘逸的同时,很有优雅魅力。"} +{"content": "类型#上衣*材质#棉*材质#混纺*风格#休闲*图案#几何*衣样式#卫衣", "summary": "休闲的卫衣是宝贝们最喜爱的时尚单品啦!棉混纺面料柔软舒适,保暖效果特别好,微收的袖口和下摆设计也可以为宝贝抵挡风寒呢!夸张的几何图案设计跟萌萌哒的有一拼哦!"} +{"content": "类型#上衣*材质#针织*颜色#黑色*风格#简约*风格#休闲*衣样式#卫衣*衣款式#拼接*衣款式#连帽", "summary": "带有帅气酷炫的风格的针织卫衣,既能展现你潇洒休闲的风格,又能衬托出别具一格的魅力。经典的黑色色调,带来不可多得的神秘气息,展现深沉内敛的性格,简约的连帽设计,衬托轻松随性的风格,带来慵懒的气质,个性的袖子拼接,凸显另类独特的魅力。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*风格#复古*风格#高贵*风格#性感*图案#复古*裙领型#v领", "summary": "宽松的款式设计有效遮住小赘肉,舒适显瘦。靓丽的提花图案点缀于裙身,展现出一股复古优雅情怀。v领的设计不仅可以选择显露出性感脖颈,亦可以选择轻松搭配一件内衬,实用又时尚。后领装饰了品牌logo,让裙子更显高贵品质。"} +{"content": "类型#上衣*风格#通勤*图案#菱形*图案#印花*衣样式#衬衫*衣门襟#系带", "summary": "将衬衫与裙装结合,优雅之余不乏柔美气息,是通勤的不错选择。这款裙子采取了唯美印花,活力时装的菱形图案,优雅中透露着一种浪漫气息。配上贝壳扣单排门襟,简洁利落,流转出的五彩光泽为整衣装点精致感。腰部系带勾勒出纤细腰姿,使身姿更显窈窕玉立。"} +{"content": "类型#上衣*颜色#红色*风格#欧美*风格#清新*风格#性感*图案#格子*衣样式#衬衫*衣领型#v领*衣袖长#七分袖", "summary": "精选优质面料,打造轻薄凉爽的衬衫,上身更显欧美气质。采用经典的红色格子,不仅显甜美风格,还十分衬托白皙的肤色。七分袖长刚刚好,露出纤细的手腕,在荷叶花边袖口里,更加显小清新。v领的设计很好的修饰了颈部曲线,还能隐约看见性感的锁骨。"} +{"content": "类型#裙*材质#蕾丝*颜色#白色*图案#蕾丝*裙型#a字*裙下摆#花边*裙款式#镶钻*裙款式#勾花镂空", "summary": "每一个心中有公主梦的小仙女,都期待有一条属于自己的蕾丝裙,细腻的触感,凹凸有致的镂空花朵,恰到好处的凸显出女性的甜美优雅感。内里是加了白色的内衬,丝毫不用担心走光的危险。镶钻珍珠花边丰富了层次感,立体的a字型裙摆,更是摇曳动人。"} +{"content": "类型#上衣*颜色#黑色*颜色#裸色*风格#文艺*图案#创意*图案#撞色*衣样式#衬衫*衣款式#纽扣", "summary": "波浪形开合边缘,成为衬衫一大亮点,古典文艺创意感满分;配有黑色纽扣,与裸色衬衫撞色搭配,塑造强烈的视觉冲击力;袖口配有纽扣,可随意转换不同造型。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*材质#羊毛*颜色#黑色*风格#文艺*图案#线条*图案#刺绣*裙型#大裙摆*裙腰型#高腰*裙领型#圆领*裙款式#抽褶*裙款式#收腰", "summary": "简单经典的小圆领修饰脖颈线条,宽松版型裁剪,上身有余量,穿着轻松自在。高腰剪裁,往里收,自然的收腰效果衬托出身体线条的纤细。腰间褶皱形成大大的裙摆,随着步伐飘逸而动感。气质显瘦的黑色调,羊毛材质,表面立体感的绣花更添文艺气息。"} +{"content": "类型#裙*材质#棉*颜色#浅蓝色*风格#青春*图案#环保*裙下摆#荷叶边*裙下摆#垂坠*裙领型#圆领*裙款式#拼接", "summary": "柔软环保的纯棉面料织造,亲肤透气,素雅恬淡的浅蓝色,饰以精致的花型图案装饰,带来青春甜美的少女气息。经典圆领,简洁的袖型,修饰手臂,更显纤细。荷叶边拼接腰身,丰富层次,流畅的版型轮廓,浪漫垂坠的裙摆,尽显飘逸灵动韵味。"} +{"content": "类型#上衣*版型#宽松*风格#复古*风格#宫廷*风格#休闲*图案#复古*衣样式#卫衣*衣袖型#落肩袖*衣袖型#灯笼袖", "summary": "衣身选有质感的卫衣面料,自带休闲气息,但挺廓而有型。宽松的设计,慵懒不随意,配合落肩灯笼袖的设计,富有复古的宫廷优雅气息。"} +{"content": "类型#上衣*版型#宽松*材质#牛仔布*衣样式#衬衫", "summary": "万物复苏的季节,穿上一款宽松版型的衬衫出门游玩吧。兔耳朵领子的造型,彰显了品牌的质感,同时显得洋气十足。搭配经典的牛仔蓝底色,可谓是充满了魅力。加上胸前的小口袋设计,让人感受到品牌对细节的执着。"} +{"content": "类型#上衣*材质#羊毛*材质#羊绒*风格#简约*图案#条纹*图案#撞色*衣样式#针织衫*衣样式#毛衣*衣领型#一字领*衣款式#拼接", "summary": 
"这款羊绒针织衫,整体都采用了撞色条纹,在夏天显得十分出挑。羊毛的材质温暖舒适,十分亲肤,不易变形十分耐穿。百搭舒适版型,拒绝臃肿,一件就能搞定整个冬天。经典的一字领设计,领口采用了撞色的拼接,让整款毛衣看上去十分简约大方,是夏天必备的一款毛衣,展现高档品质。"} +{"content": "类型#裤*版型#宽松*风格#工装*图案#线条*裤款式#口袋*裤款式#螺纹*裤款式#拉链*裤款式#抽绳*裤款式#松紧带", "summary": "弹力的橡筋腰头,以同色系的抽绳装饰,能够自由灵活的调节。裤脚采用了螺纹的收紧,与宽松的裤型结合起来,能够更好的修饰出腿部的线条。裤子的两侧搭配了对称的大口袋装饰,腰部还有拉链的口袋,彰显出浓郁的工装气息。"} +{"content": "类型#裤*版型#显瘦*风格#简约*图案#线条*裤款式#纽扣", "summary": "整体线条简洁流畅,静静地诠释着简约主义的魅力。利落的翻领,简洁的纽扣门襟,上身就是很利索又潇洒的感觉。落肩袖的设计,柔化了肩部线条,也将优雅和慵懒表现的恰到好处。搭配裤子,显瘦的版型,流畅的剪裁线条,上身正好是修身不紧绷的尺度,显瘦惬意,恰到好处的裤长,优雅利落,时髦都市腔调,大爱。"} +{"content": "类型#上衣*版型#宽松*材质#棉*风格#休闲*衣样式#外套", "summary": "在日常的休闲当中,一款棉质的外套可是少不了的~它以简洁的样式出现,穿搭起来更显百搭和气质的感觉。而且比较宽松的版型,对于身材一点也不挑,加上熟练的缝纫车工平整走线,更是能展现出整款的品质和衣型。再配上那棉质的面料,使得它在春日中穿搭,更显舒适的感觉。"} +{"content": "类型#上衣*材质#亚麻*风格#职场*图案#线条*衣样式#西装*衣领型#翻领*衣袖长#长袖*衣袖型#落肩袖*衣款式#口袋", "summary": "柔软不失挺括的亚麻面料舒适有质感,展现优雅不做作的自然美。帅气的西装翻领设计,展露出女性柔美的颈部线条,塑造优美天鹅颈。胸前和胯骨的大翻盖口袋装饰,给人耳目一新的感觉,彰显都市女性的干练职场范儿。而慵懒的落肩长袖,又不会使整体过于正式,为你送上丝丝温暖。"} +{"content": "类型#裙*版型#显瘦*版型#立体剪裁*材质#蕾丝*风格#宫廷*图案#蕾丝*裙下摆#荷叶边*裙下摆#花边*裙长#连衣裙*裙领型#v领*裙款式#拼接", "summary": "连衣裙延续了一贯的版型,修身的立体剪裁上身之后可以提高腰线更显身材。拼接的睫毛蕾丝设计在视觉上很吸睛,凸显出个性不羁的感觉不会太过沉闷。领口很有宫廷气质的花边再加上深v设计看起来很精致,荷叶边的喇叭袖子上身之后有微微的透视感不会太过沉闷。裙身的提花图案不会太过花哨,很有少女气质。刚刚好盖过屁股的长度也不压个子,很适合大多数人穿着。"} +{"content": "类型#裙*颜色#黑色*风格#高贵*风格#清新*裙型#蛋糕*裙下摆#层叠", "summary": "繁复而美好的层叠设计让这款蛋糕裙有着清新而温婉的少女气息。浅粉的配色是永不过时的少女梦想,甜蜜而唯美。黑色的配色则有着高贵低调的韵味,是贵族小姐的冷艳美感。飘逸的下摆柔美缱绻,温柔动人。"} +{"content": "类型#裤*版型#显瘦*材质#羊毛*颜色#黑色*裤腰型#高腰*裤口#微喇裤", "summary": "羊毛呢面料质地挺括垂感极佳,纯黑色极简百搭,微喇叭设计极好的修饰腿型很显小腿纤细,裤脚珍珠点缀女人味儿十足,非常适合春秋季节穿搭。高腰版型极好的在视觉上延伸腿部比例,显高又显瘦~"} +{"content": "类型#裙*图案#蝴蝶结*裙下摆#荷叶边*裙腰型#高腰*裙长#连衣裙*裙领型#v领*裙款式#抽褶*裙款式#飘带", "summary": "不论是约会还是聚餐,你都需要一件漂亮的连衣裙加持。本款采用的是v领的裁剪,搭配飘带的设计,随意系成一个蝴蝶结,甜美度up。立体的褶皱荷叶边装饰裙身,带出柔美温婉的女性魅力。高腰线的设计,修饰出完美的身材曲线。"} +{"content": "类型#上衣*颜色#黄色*风格#复古*风格#简约*图案#复古*衣样式#衬衫*衣款式#收腰", "summary": "这是一款简约复古的衬衫,细节处做了精致珍珠扣装饰,提升整体质感十足。收腰处的设计,正好凸显腰身。选用黄色设计,衬托你白皙的肌肤,让你轻松驾驭。"} +{"content": "类型#上衣*材质#丝绒*风格#复古*图案#复古*衣样式#卫衣*衣款式#绑带*衣款式#收腰", "summary": "一款正反两穿交叉绑带卫衣,非常时髦小众的一款设计,很容易让你成为人群的焦点。正反两穿的设计,可以变换不同的风格,更显新颖而又别致。交叉绑带的设计,透着几分复古的气息,同时起到了一定收腰的作用。丝绒面料的选择,带着别致的光泽感,时髦感万分。"} +{"content": "类型#上衣*版型#宽松*风格#简约*图案#条纹*图案#刺绣*图案#撞色*衣样式#马甲*衣袖长#无袖", "summary": "雅致的竖条纹被剪裁成帅气的马甲形式,利落的无袖赋予圆润的弧度,精湛平滑的车缝线展现着细节的魅力,宽松舒适让手臂活动自如不会产生拘束感。前襟在绣花印章的点缀下,增添了一份软萌的风范,配搭上简约撞色的小口袋,色彩的而对比更显俏皮而不单调。"} +{"content": "类型#裙*风格#宫廷*图案#印花*裙长#长裙*裙款式#吊带", "summary": "以未来的视角,丝绸之路,从出发,寻找唯美壁画。吊带裙的正反都印有丝绸海景印花,充满着风情。吊带装饰有扣,可以调节长短,也非常别致。唯美长裙,显示出西方宫廷装细节,高贵典雅。"} +{"content": "类型#裙*版型#显瘦*风格#知性*图案#线条*图案#刺绣*裙长#连衣裙*裙领型#圆领", "summary": "落落大方的时尚连衣裙,应用了重工刺绣的精美图案,整体焕发着令人心驰神往的迷人魅力。简洁裁剪的圆领曲线,修饰颈部的柔美线条,更衬托出端庄的知性气质,修身的精美版型流线,展现女性的优雅风姿。打造夏季连衣裙的新时尚风度。"} +{"content": "类型#裤*风格#简约*裤型#直筒裤*裤款式#不规则*裤口#毛边", "summary": "简约舒适的直筒裤是妹纸的百搭单品,穿上呈现出自由灵动的气息。裤口不规则毛边处理可以将整体修饰的更加俏皮和个性,加之腰部不规则设计与下部相呼应,着实吸睛。"} +{"content": "类型#裙*材质#蚕丝*风格#清新*图案#植物*图案#印花*裙型#直筒裙*裙长#连衣裙", "summary": "一款彰显清新雅致干的时尚连衣裙,植物印花设计格外别致,呈现出唯美的艺术美感,面料采用真丝材质,细腻轻柔,上身体验舒爽顺滑。长款直筒版型,轻松穿出高挑曼妙的身段。"} +{"content": "类型#上衣*材质#牛仔布*风格#清新*图案#印花*衣样式#外套*衣款式#破洞*衣款式#绑带", "summary": "这一款牛仔外套精美印花点缀其上,看起来特别的有美感,衬得妹子们更显清新脱俗。特别是时尚的破洞装饰,时尚个性凸显不羁。加上精致绑带的装饰,错落有致随风摇曳。这个时节穿,自然就把与众不同美丽突显出来"} +{"content": "类型#上衣*版型#显瘦*图案#撞色*衣样式#马甲*衣样式#外套*衣领型#翻领", "summary": "采用修身的版型设计而成的一款马甲外套,上身穿起来更加贴合女性的身材曲线感,达到更加显瘦的效果。领口珠翻领的剪裁方式,搭配上撞色的图案,更显一种时髦气息。"} +{"content": "类型#裤*颜色#白色*图案#条纹*图案#蝴蝶结*图案#撞色*裤长#短裤*裤款式#勾花镂空*裤腰型#高腰", "summary": "一字肩设计的上衣结合红白的撞色条纹样式,彰显出时尚活力感,上身效果显眼又吸睛。领口还装饰着系带的蝴蝶结,为整衣增添了些许活泼俏皮的感觉。雅致的方形镂空样式很显气质,搭配高腰的白色短裤,打造热情与时尚的夏日穿搭。"} +{"content": "类型#裙*版型#宽松*风格#简约*裙型#直筒裙*裙长#连衣裙*裙领型#圆领*裙衣门襟#套头*裙款式#拼接", "summary": 
"连衣裙看似简约,实际上有着很强的设计感。宽松直筒版型包容性强,不管是什么身材的女性穿在身上都十分有型。圆领套头设计简约实用,裙身上的拼接设计给人亮眼时尚的感觉,优质面料柔软舒适,保暖性透气性极佳又十分亲肤。"} +{"content": "类型#裙*材质#蕾丝*颜色#黑色*风格#性感*图案#蕾丝*裙长#连衣裙*裙款式#拼接*裙款式#收腰", "summary": "BRAND带来的这款连衣裙,选用经典的黑色系为基调,展现出女性成熟大方的气质,独特的收腰设计,修饰腰部的曲线,秀出曼妙迷人的身姿。加之肩部的蕾丝拼接点缀,打破单调,肩部若隐若现的朦胧感,尽显性感魅惑格调;以及贴心的内衬加持,无需担心走光,透露出女士的知性美。"} +{"content": "类型#裤*版型#显瘦*材质#混纺*风格#青春*图案#线条*裤腰型#松紧腰", "summary": "自带弹力的混纺面料,有一定的厚度,保暖的同时又不会臃肿,萝卜裤版型直挺有型。轻松改变腿部线条,松紧腰舒服又方便穿脱,不挑身材。藏肉显瘦这件事交给它,再加上今年流行的小蜜蜂元素,穿上身绝对时髦有型。"} +{"content": "类型#裤*风格#休闲*图案#条纹*图案#线条*裤长#七分裤*裤腰型#松紧腰", "summary": "商务休闲风的裤型设计,百搭时尚,凸显出女性的干练气质,适合各种场合穿着。竖条纹的设计,以及七分裤的版型,拉伸腿部线条,展现出女性修长美腿。贴心的松紧裤腰的设计,提升穿着方便舒适度,甄选优质面料,纹理大方,手感柔和亲肤。"} +{"content": "类型#上衣*图案#拼色*衣样式#外套*衣领型#方领*衣长#短款*衣款式#口袋", "summary": "这款配色鲜明亮眼的短款外套,真的让人爱不释手呢,一眼就被吸引住了!方领的设计,很好的修饰脸型,凸显时尚干练的气质。衣身拼色设计,超级吸睛,饱和度也非常高,同时也多了几分俏皮感。两个设计不一的实用口袋,有种特别的美感!"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*颜色#金色*图案#条纹*裙长#连衣裙*裙衣门襟#拉链*裙款式#不对称*裙款式#拉链*裙款式#抽褶", "summary": "此款为及连衣裙,不对称的设计,更显瘦显高,翻折式活片打褶领口,圆型肩带。采用立裁处理的腰部褶皱感,黑色与金色经典条纹,侧缝有隐形拉链便于穿脱。"} +{"content": "类型#上衣*版型#宽松*风格#休闲*衣样式#针织衫*衣领型#v领*衣袖型#灯笼袖", "summary": "对于微胖的妹纸来说,oversize版型的针织衫很实用。尤其是这种轻薄的款式,穿起来自由随性而不紧绷。再加上宽松的灯笼袖袖型,修饰了双臂多余的肉肉,令整个人看起来更加苗条。而简洁的v领领口设计,则起到了修饰脸型的作用,同时也令针织衫穿起来更加休闲轻松。"} +{"content": "类型#裙*颜色#白色*风格#休闲*风格#清新*图案#线条*裙下摆#花边*裙长#连衣裙*裙领型#圆领*裙款式#对称", "summary": "massimodutti这款连衣裙,采用粘纤面料,给予孩童柔软舒适的穿着体验。以流畅的线条勾勒圆领版型,柔和了轮廓,打造休闲活力风格。将白色作为主底色,清新脱俗,闪亮的树叶随风飘落在衣面,俏皮又不失趣味性,让人眼前一亮。对称的花边加持,展现孩童的甜美与可爱。"} +{"content": "类型#上衣*颜色#黑色*风格#简约*衣样式#衫", "summary": "BRAND的这款polo衫采用简约素气的纯黑色呈现而成,你在搭配中十分的简易轻松,同时又显得成熟稳重,经典的polo领让肩部看起来挺括有型,精致细腻的做工搭配上舒适柔软的面料,穿着减少摩擦同时又速干排汗。"} +{"content": "类型#裙*颜色#白色*风格#性感*裙款式#露肩", "summary": "女生衣柜里面必不可少的一条裙子就是白色的裙子,但是单调的纯白从来都不会是有时尚嗅觉的女孩的首选。这样一条蓝白混搭的小香风裙子十分完美的诠释了所有女孩子心里面白色裙子该有的样子。个性的旗袍领口看上去十分的别致出色,很有中国古典的优雅感觉。侧边的露肩设计看起来无比的性感优雅,也不会太过于暴露。"} +{"content": "类型#裙*材质#牛仔布*裙型#牛仔裙*裙下摆#毛边*裙腰型#高腰", "summary": "牛仔裤有着硬朗潇洒的态度,又融合了俏皮活力的一面。这款裤子的牛仔蓝与局部磨白拥有微妙和谐的美感。小高腰的设计,真的很显腿长。大大的裤筒带来许多味道,也是对腿的释放,穿起来很有范。裤脚毛边设计,才不至于显得沉闷呆板,又提升了时尚感。"} +{"content": "类型#上衣*版型#宽松*风格#休闲*图案#条纹*衣样式#衬衫*衣袖型#落肩袖", "summary": "红与白的细细密密条纹在衬衣上看起来很有些随性迷人的感觉,更能轻松修饰出柔美身线的诱惑。偏宽松的版型设计配合小落肩袖型的设计可以修饰肩部曲线,而且整体看起来都充斥着几分慵懒随性范儿,休闲气息十足。下摆处做了开叉设计可以带出动感随性范儿,更能修饰纤美腰线诱惑。"} +{"content": "类型#上衣*版型#宽松*风格#休闲*图案#条纹*衣样式#衬衫*衣袖型#落肩袖", "summary": "衬衣可以说是日常穿搭中最为实穿的单品了,无论是怎么样搭配都能有不错的感觉。而条纹更是经典款的元素,分分钟就能带出俏皮又随性的味道,更具灵动迷人的休闲范儿。偏宽松的版型设计配合落肩袖型的设计可以修饰肩部曲线,更能带出慵懒随性范儿,对于微胖的小仙女也是敲友好。"} +{"content": "类型#裙*材质#雪纺*风格#简约*风格#性感*裙下摆#开叉*裙下摆#荷叶边*裙长#连衣裙*裙领型#圆领*裙款式#拼接*裙款式#吊带", "summary": "这款连衣裙,穿着十分浪漫。吊带设计加圆领,简约又大气时尚。腰间荷叶边拼接,浪漫又唯美。裙摆前开叉设计,增添性感味道。雪纺的面料,还带着丝丝飘逸感。出街或海边都十分好搭配,行走间优雅又浪"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#蚕丝*风格#复古*风格#简约*图案#抽象*图案#复古*图案#线条*图案#印花*衣样式#衬衫", "summary": "重磅复古抽象真丝印花衬衫,宽松的版式注定在显瘦的同时多了一分随性,袖子也是的样子,设计感无需很多的点缀一点点执着也就够了。优雅的版型让整个线条更简约,对身材也没那么大限制。天然舒适的亲肤材料、平坦,线条顺畅。露出锁骨肌肤,柔美和帅气并存。"} +{"content": "类型#上衣*风格#英伦*风格#复古*图案#格子*图案#复古*图案#线条*图案#撞色*衣样式#风衣*衣领型#翻领*衣款式#腰带*衣款式#收腰", "summary": "大翻领的款式设计,让风衣带有别样的质感,精致唯美中独具特色。长款的版型能够拉伸身体线条,将率性和洒脱演绎得淋漓尽致,腰间以自然的收腰样式点缀撞色的腰带,勾勒出玲珑有形的曲线,细腻的格纹元素,衬托出复古的英伦风。"} +{"content": "类型#裙*版型#宽松*图案#碎花*裙型#百褶*裙长#长裙*裙款式#木耳边", "summary": "孕妇碎花长裙,优雅的百褶裙摆,让你孕期行走在时尚与气质之间。宽松的裙摆穿着非常舒服,腰部可以包容孕肚。小碎花元素的添加,木耳边的设计,更时尚。"} +{"content": "类型#裙*版型#宽松*版型#h*材质#蕾丝*风格#性感*图案#蕾丝*裙袖型#喇叭袖", "summary": "通体蕾丝裙,天生魅惑而浪漫,微透的肌理,时髦性感紧紧跟随,藏不住的优雅气质;尽显柔美的喇叭袖,优雅中带着利落的气韵,舒适中带着都市女性的儒雅,很凸显气质哦;简洁而流畅的h型廓型,利落包裹身材,宽松的设计,有效遮掩腰腹肉肉,体现自由的穿衣态度。"} +{"content": "类型#裙*颜色#驼色*风格#清新*图案#格子*裙型#鱼尾裙*裙下摆#荷叶边*裙长#连衣裙", "summary": "怡人的春季,怎能少了一件浪漫轻盈的连衣裙,奶驼色的面料,散发着一阵迷人的温柔气息,清新的格纹,让整个季节都明朗起来。荷叶边的点缀浪漫唯美,下摆是鱼尾裙设计,旋转着柔和的阳光里,彰显灵动的气质,既有减龄效果,又不会显得幼稚。"} +{"content": 
"类型#裙*材质#棉*颜色#黑白*风格#休闲*图案#字母*图案#文字*图案#线条*图案#印花*裙领型#圆领*裙衣门襟#套头", "summary": "整件服装以黑白素色为主基调,加以前襟的人像头像以及字母印花,瞬间打破了单调的沉闷感。使得休闲的衣衫也充满了辨识度,时髦之中透着个性。圆领套头的款式,百搭实穿。裁剪利落的线条搭配上精选的棉面料,兼顾舒适与美观特性。"} +{"content": "类型#上衣*材质#棉*图案#线条*衣样式#polo", "summary": "以简洁流畅线条构筑紧身轮廓,贴合身形不紧绷。polo衫领设计,修饰脸型同时穿着舒适。以纯白色调打造衣身,奠定百搭实穿特性。采用优质面料打造,含丰富棉成分,穿着舒适亲肤。"} +{"content": "类型#裙*版型#显瘦*材质#丝绒*材质#蕾丝*材质#纤维*风格#街头*风格#复古*风格#清新*风格#性感*图案#复古*图案#蕾丝*裙型#抹胸裙*裙长#连衣裙", "summary": "透视蕾丝立体连衣裙,流畅简洁剪裁工艺,精致浪漫演绎神秘而性感诱惑,微透露朦胧的视觉感观。聚酯纤维面料,设计了抹胸连衣裙,细腻柔软有弹性,修饰腰线显高显瘦,凸显迷人身材比例曲线。清新复古的丝绒材质,彰显时尚年轻,充满青春活力。街头出行清新靓丽迷人,落落大方妩媚动人。"} +{"content": "类型#裙*版型#显瘦*图案#线条*裙型#大裙摆*裙袖型#灯笼袖", "summary": "这条裙子最大的亮点在于灯笼袖的设计,有效遮住手臂的赘肉,打造纤纤玉臂。双层大裙摆设计,上身摇曳生姿,飘逸唯美。简洁腰设计,修饰腰部纤细线条,修身显瘦。"} +{"content": "类型#裙*版型#宽松*材质#棉*材质#牛仔布*风格#潮*图案#撞色*裙型#背带裙*裙型#牛仔裙*裙长#短裙", "summary": "背带牛仔短裙是最彰显潮流风尚的单品元素之一,采用棉质面料精心打造,柔软舒适给女孩儿的肌肤贴心的呵护。撞色背带造型个性吸睛,融入levi's元素与后部皮印章相呼应,展现出强烈的品牌质感。胸前的按扣闭合设计,美观的同时使穿脱更加方便。腰部置入松紧皮筋,贴合身体领穿着更服帖。宽松的裙摆十分有型,走线均匀平整,着在身上大方又有型。"} +{"content": "类型#裙*版型#显瘦*材质#涤纶*颜色#焦糖色*风格#简约*裙下摆#层叠*裙衣门襟#系带", "summary": "面料选用羊皮绒面革,里衬选用100%涤纶,舒适有质感。采用围裹设计,腰部可调节系带,可以,所以适合各种腰围和高度。运用了时尚——焦糖色,简约到没有多余的装饰,简单搭配就能散发出优雅气质。层叠式裙摆具有层次感,及膝的长度更能显高显瘦。"} +{"content": "类型#上衣*版型#宽松*风格#复古*图案#复古*衣样式#衬衫*衣袖型#灯笼袖*衣款式#腰带", "summary": "以质地饱满的乱麻材质为主,具有良好的透气性与垂坠性,穿着自然的同时还不会轻易起球。长长的同款腰带勾勒出女性柔美曲线的同时,还能为自身气质增添一丝飘逸灵动感。宽松的灯笼袖设计,更让这款衬衫有了现代时尚与复古风情。"} +{"content": "类型#裙*材质#网纱*风格#性感*裙型#蛋糕*裙型#网纱裙*裙下摆#层叠*裙款式#木耳边", "summary": "轻盈蓬松的网纱裙,性感中又略带点小女人情怀,朋友聚会或是参加婚礼都是不错的选择。外层网纱层次分明,可爱的小木耳边元素,层层叠叠的蛋糕裙,特别有少女心,精心挑选的柔软细网纱,不易勾丝还能时刻保持飘逸感,整身都是小细节的网纱裙,满足你的薄纱情结。"} +{"content": "类型#裙*图案#撞色*裙下摆#花边*裙腰型#高腰*裙长#连衣裙*裙袖长#无袖*裙款式#木耳边", "summary": "合体无袖连衣裙设计,甜美优雅更显公主气质。领口与袖口木耳花边造型,捏褶均匀细致更显宝贝甜美可爱,撞色锁边工艺亮眼醒目,不易脱线更显品质。满印小花朵设计俏皮可爱。高腰线花边装饰,视觉拉长身材比例更显身材高挑。裙摆宽敞设计便于行走活动,飘逸灵动美观大方。"} +{"content": "类型#裤*版型#宽松*风格#简约*图案#线条*裤长#连体裤*裤型#阔腿裤", "summary": "这是一款非常简约气质耐看的连体裤装。吊带v领的设计,露出好看的锁骨线条,修饰美丽的鹅颈,小心思满满。衣身设计到腰部以上是贴身的,下身腿宽松设计,很好的把身形扬长避短,显露了纤纤细腿。"} +{"content": "类型#裙*材质#棉*风格#复古*风格#文艺*风格#清新*图案#复古*裙长#连衣裙", "summary": "原创汉元素品牌重回出品的这款连衣裙,纯净的白与淡雅的绿相互映衬,散发出沁人心脾的清新味道,将少女的欲语还休诉说尽致;优选轻薄的棉质面料制作裙身,上身轻盈舒适;重工手绣花朵点缀,尽显复古文艺气息~"} +{"content": "类型#上衣*风格#性感*衣样式#针织衫*衣款式#拼接*衣款式#勾花镂空*衣款式#绑带", "summary": "精致优雅的针织衫,性感不失优雅的镂空设计,富有层次感与设计感,彰显女性柔情优雅气质。下摆的交叉绑带设计,飘逸且灵动,点缀整体造型,显得丰富多彩。镂空与布料的拼接设计,令人眼前一亮,凸显女性身材曲线,打造完美的黄金比例。"} +{"content": "类型#裙*裙腰型#高腰*裙长#短裙*裙衣长#短款*裙款式#纽扣", "summary": "这一款短裙裤高腰设计,提升腰线自然显高。同时精挑细选的布料自带弹力,贴合身形勾勒曲线,对身形的包容性很好。加上别致纽扣的装饰,增添看点特别出彩。精致短款,衬托身材凸显腿长。"} +{"content": "类型#裙*颜色#绿色*图案#印花*裙长#半身裙*裙长#连衣裙*裙款式#纽扣", "summary": "夏天看到好的颜色的印花总是忍不住要做成成品出来,各种半身裙,连衣裙,湖绿色的底色,搭配着细小的桃花,一路桃花相伴,心情都是美丽的。领口的桃花纽扣的小设计显出了与众不同的小独特小细节中凸显品质感~"} +{"content": "类型#裙*图案#刺绣*裙型#小黑裙*裙领型#翻领*裙款式#拼接", "summary": "一款优雅气质的小黑裙,让人尽显自身魅力,体验非凡感觉。裙身满幅太阳花刺绣,不仅让人的心情变得明媚如春,更丰富了层次,彰显了立体感觉,增添了许多清新自然的氛围。欧根纱拼接翻领,让裙子变得更挺阔,更有型,促使人的身材在变得高挑挺拔的同时也显得精神阳光。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*风格#性感*图案#撞色*图案#蕾丝*裙长#连衣裙*裙领型#圆领*裙袖型#喇叭袖", "summary": "此款连衣裙经典圆领设计,拉长颈部比例,衬托出锁骨的精致,使得脸部轮廓柔和自然,展现女性魅力;时尚撞色+喇叭袖,穿出不一样的韵味,蕾丝花瓣状简洁利落,减龄更灵动飘逸浪漫,随性的下摆伴随着微风肆意摆动的瞬间,成就了女性特有的名媛魅力,更有女人味,以活泼的韵味打造立体的视觉效果;优雅性感的透视设计,时尚显瘦,凸显身材曲线,尽显女性曼妙的身姿,提升女性柔美的气质。"} +{"content": "类型#上衣*颜色#纯色*风格#街头*图案#纯色*图案#线条*衣样式#外套*衣领型#立领*衣长#短款", "summary": "这件很时尚的纯色短款外套穿着很有范儿,它的设计很用心。设计师采用了经典的立领设计融入衣身,给人很酷酷的感觉。让你轻松打造出个性帅气的街头风气息,而且立领的融入还能很好的修饰脖颈线条,让你的脖颈看上去更加的纤细迷人。"} +{"content": "类型#裤*版型#显瘦*风格#性感*裤腰型#高腰*裤口#微喇裤", "summary": "这一款休闲裤时尚的高腰设计,提升腰线自然显高,精挑细选的面料,手感柔软舒适亲肤,有筋骨有弹力而且挺括有型。时尚的微喇裤型,轻松遮肉自然显瘦。加上包臀设计,性感迷人女人味足。"} +{"content": "类型#裙*颜色#黑色*风格#复古*风格#潮*图案#复古*图案#印花*裙型#百褶*裙下摆#垂坠", "summary": "黑色的裙身并没有带来沉闷乏味的印象,七格格的设计师用生动形象的印花装点裙身,反而显得活泼减龄。它采用轻盈垂坠的雪纺纱制成,更是在行走间带来了灵动和飘逸浪漫。裙子的下摆带有顺应复古潮流的百褶,无形中加大了裙身。"} +{"content": 
"类型#上衣*风格#文艺*风格#知性*图案#蝴蝶结*图案#线条*衣样式#马甲*衣样式#风衣*衣样式#外套*衣领型#翻领*衣袖长#无袖*衣款式#腰带", "summary": "知性文艺风的风衣马甲外套。帅气的翻领造型,线条流畅,拉长脖颈线条。无袖设计,清爽利落,显手臂细长。无门襟造型,金属质感的圆环设计。腰间系上腰带,系上蝴蝶结,个性十足。"} +{"content": "类型#裤*风格#简约*裤长#连体裤*裤型#直筒裤*裤型#背带裤*裤款式#纽扣*裤口#翻折", "summary": "当活力美少女遇上减龄背带裤完美的衬托出小美女的俏皮与可爱,背带前方有纽扣设计,不光是为了方便穿脱还能避免连体裤上厕所的尴尬。背带后面的裤腰增添打褶设计,让裤腰具备一定的弹性,无论哪种身形的小美女穿着都不会紧绷还能增添甜美感。简约的直筒裤腿支持翻折,让减龄连体裤超流感大增。"} +{"content": "类型#裙*版型#显瘦*风格#潮*裙长#连衣裙", "summary": "拥有这件连衣裙立马让你变身精致的猪猪女孩。斗篷的设计酷炫又紧跟潮流,水溶钩花的设计个性新颖,修身舒适的版型展现你的曼妙身姿。"} +{"content": "类型#裤*版型#显瘦*材质#棉*颜色#纯色*颜色#军绿色*风格#街头*风格#简约*图案#纯色*裤腰型#高腰", "summary": "这款休闲裤的设计简约而不简单,高腰修身的版型,勾勒出了腿部优美的曲线,拉高了腰际线。军绿色的纯色裤面,彰显出了简约美感,尽显帅气的街头风范,选用优质的棉质材质制成,充满了质感,给你带来舒适的穿着体验。"} +{"content": "类型#裙*颜色#黑色*风格#性感*图案#线条*裙型#小黑裙*裙领型#v领*裙款式#收腰", "summary": "这款小黑裙采用纯黑色的设计风格,展现出经典大气的女性魅力,同时给人一种成熟优雅的时尚质感。交叠式的v领造型不仅将上身的线条修饰的更加凹凸有致,还起到很好的修饰脸型的作用,同时还带来更多的性感风情。自然的收腰设计,将女性柔美的身材曲线展现的淋漓尽致,打造有型有范的熟女魅力。"} +{"content": "类型#上衣*版型#宽松*风格#简约*图案#线条*衣样式#风衣*衣长#中长款*衣款式#抽绳", "summary": "作为一件中长款设计的风衣,显得有垂感。领口的设计较大气,很有个性。袖子部分则比较宽松,新颖别致,质感十足,置彰显卓越品质。肩型流畅,摩登有范儿。袖口看起来颇有造型,线条流畅做工精湛,上身优雅有型。同时,下摆位置设计了抽绳,较为宽松,方便活动。色系的搭配简约而不简单。"} +{"content": "类型#上衣*材质#针织*衣样式#毛衣*衣款式#不规则", "summary": "这件针织毛衣的亮点就在底摆上,前短后长的不规则底摆,增添了可爱俏皮的气息,穿在身上更有时尚的感觉。工整平滑的底摆锁边工艺,确保了针织不会出现脱线的情况,保证了衣衣的质量上乘。光滑弹性极佳的面料让你有更好的上身体验。"} +{"content": "类型#上衣*材质#针织*衣样式#毛衣*衣款式#不规则", "summary": "这款连衣裙设计。外搭一款暖心针织毛衣,加上的版型。搭配不规则的开叉下摆设计,丰富整体造型,还能让你感受到贴心的小温暖,内搭裙款不埋没骨子里的时尚。"} +{"content": "类型#裙*版型#宽松*风格#性感*裙长#连衣裙*裙领型#v领*裙款式#勾花镂空*裙款式#收腰", "summary": "柔软的面料让这款长款连衣裙穿着舒适,同时搭配宽松版型,更带了份随性慵懒的气质。然后加上镂空和v领设计的领口,更多了份性感气息,而收腰设计的运用,凸显曲线,更多了份优雅气质,再搭配豆沙红底色,非常显白,也非常显气质。"} +{"content": "类型#裙*裙长#连衣裙*裙衣长#中长款*裙款式#腰带", "summary": "连衣裙的长度被成中长款的样式,从而不会显露出肉感很足的臀部与大腿根,气质上更为优雅。整体选用真丝面料打造,其与舒适性十分的卓越,中长款式腰带,轻松勾勒出窈窕身姿。"} +{"content": "类型#裤*材质#雪纺*颜色#白色*颜色#纯色*风格#简约*风格#知性*风格#休闲*风格#性感*图案#纯色*裤长#九分裤*裤款式#勾花镂空*裤口#小脚", "summary": "浪漫轻盈的雪纺上衣,肩部以及腰部的镂空设计打破呆板,层次丰富,增添灵动温婉的气息。深v领口,在拉长脖颈曲线的同时又能展现精致迷人的锁骨,流露不经意的小性感。知性气质的白色九分小脚裤,简约的纯色设计尽显时尚休闲气息。"} +{"content": "类型#裙*材质#牛仔布*颜色#黑色*风格#潮*裙型#a字*裙型#牛仔裙*裙款式#口袋", "summary": "俏皮的a字廓形搭配挺括硬朗的洗水牛仔,穿又实力减龄,同时露出美好的腿部肌肤,让时尚潮人们钟爱不已。黑色毛球装饰在口袋与裙摆上,丰富其层次感之余格外吸睛夺目。舒适的一字腰头,巧妙勾勒出纤细腰肢。"} +{"content": "类型#裙*版型#显瘦*颜色#深色*风格#性感*图案#线条*图案#印花*裙长#连衣裙*裙款式#吊带*裙款式#不规则", "summary": "这款连衣裙是性感迷人的吊带款式,上身后凸显脖颈纤美的线条感!深色的混合印花,更显肤色白皙亮丽。中长的修身款式,对身材的包容性更大,上身遮肉显瘦,更显高挑纤瘦的美感。不规则的下摆剪裁,走动间轻盈飘逸,更显时髦灵动的气质感,出街实力吸睛!"} +{"content": "类型#上衣*版型#显瘦*风格#通勤*风格#休闲*图案#条纹*图案#线条*衣样式#西装*衣门襟#一粒扣", "summary": "竖条纹打破了沉闷感拉长了线条视觉显高挑,修身的剪裁简洁干练,jing典一粒扣版自然有型,将西装的休闲和精致诠释的刚刚好,通勤和休闲都有时髦的意味。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙型#百褶*裙下摆#层叠*裙长#连衣裙*裙袖型#泡泡袖", "summary": "这条复古风的连衣裙是经典的女神款,谁穿上都会非常好看,首先是袖口设计,非常典型的泡泡袖,甜美可爱,再配合整条袖口都是纯蕾丝,这份可爱中又多了一丝的浪漫和温柔。裙摆是多层的,旋转起来,有层层叠叠的百褶效果,追求名人效果的女孩儿必备。"} +{"content": "类型#裙*材质#网纱*风格#知性*风格#高贵*风格#清新*图案#刺绣*裙长#连衣裙*裙领型#圆领*裙款式#收腰", "summary": "透露着清新淡雅,高贵知性,端庄的气质连衣裙,适合每个美眉。它经典的圆领设计,修饰了女性细长的脖颈。腰身处收腰的裁剪,不仅拉伸身材比例,更凸显了高挑的身姿。网纱刺绣的设计,把女性的高贵端庄,典雅知性气质展现的淋漓精致。"} +{"content": "类型#上衣*版型#宽松*材质#雪纺*风格#清新*衣样式#雪纺衫*衣袖长#七分袖*衣款式#木耳边", "summary": "这款甜美气质的雪纺衫,精选优质的雪纺料,柔顺亲肤,穿着舒适。宽松版型,包容各种身材,舒适自然。木耳边点缀领口,浪漫不失气质。七分袖设计,端庄大气,有女人味,采用当前大热的马卡龙色。清新时尚,造型甜美时尚,又百搭。"} +{"content": "类型#裙*颜色#白色*图案#线条*裙下摆#荷叶边*裙下摆#层叠", "summary": "白色的裙身让你如一样圣洁,打造出高冷的女神形象。细细的肩带大方地展露出骨感的肩颈线条,不忍转移视线。尽管采用的颜色较为单一,但是不会显得乏味和沉闷。层叠的荷叶边不仅甜美,还能令扁平的胸部变得丰满。"} +{"content": "类型#裙*材质#牛仔布*颜色#蓝色*风格#文艺*风格#休闲*风格#清新*图案#字母*图案#文字*图案#印花*图案#撞色*裙型#背带裙*裙型#牛仔裙*裙长#连衣裙*裙款式#口袋", "summary": "这款连衣裙采用自然的牛仔蓝色,展现出年轻一族随性自由的生活态度,同时也为衣身带来一丝清新的文艺范儿。撞色字母印花的背带设计,为这款连衣裙带来了全新的活力,也让它的色彩搭配更加丰富,展现出别样的休闲色彩。前襟的口袋造型,不仅将休闲的色彩渲染的更加浓郁,也起到了很好的实用功能。"} +{"content": "类型#裤*版型#宽松*风格#通勤*风格#简约*图案#格子*图案#线条*图案#印花*裤长#长裤*裤型#直筒裤", "summary": 
"带着点小帅气感的一件长裤。经典的格纹印花点缀,搭配上直筒款式设计,简约线条,满满通勤风时尚感。裤身相对宽松,对大多数身型友好,且包容腿部线条,显得双腿线条笔直。"} +{"content": "类型#裤*版型#宽松*版型#立体剪裁*材质#牛仔布*裤款式#拼接*裤款式#不对称", "summary": "采用高品质牛仔面料,立体剪裁而成。舒适宽松的版型,小裤脚修饰腿型效果更棒。两侧不对称的磨破设计很有新意,背面明辑线拼接细节更具未来感。"} +{"content": "类型#上衣*版型#显瘦*材质#棉*风格#复古*风格#清新*图案#格子*图案#复古*衣样式#衬衫*衣袖型#泡泡袖*衣门襟#系带*衣款式#拼接", "summary": "复古雅致的格纹衬衫,采用100%纯棉面料打造而成,柔软透气,穿感舒适自如。系带修饰领口,增添飘逸律动感,清新减龄。胸口处的毛边拼接设计,增添立体视效,更显别致细节处理。复古泡泡袖型,轻松修饰手臂曲线,遮肉显瘦。"} +{"content": "类型#上衣*版型#宽松*材质#针织*风格#文艺*风格#简约*风格#休闲*图案#条纹*图案#撞色*衣样式#开衫*衣领型#圆领*衣长#短款*衣门襟#单排扣", "summary": "一款休闲减龄的条纹针织开衫,宽松的短款版型,点缀撞色的条纹,上身舒适而不单调,散发出减龄的文艺气质;简约的小圆领结合门襟单排扣,经典而大气,上身大方而不落俗。"} +{"content": "类型#裙*版型#显瘦*材质#羊绒*风格#简约*风格#清新*裙型#包臀裙*裙领型#圆领*裙领型#v领", "summary": "上衣简约圆领,搭配裙子的v领设计,修饰出脖颈的纤细曲线美。显瘦包臀的裙型,完美勾勒出身形的曼妙迷人,甜美浪漫的纯色调,清新而又格外的减龄。使用优质的羊绒面料,触感舒适柔软,温和亲肤,尽显品质做工不凡。"} +{"content": "类型#上衣*材质#蚕丝*颜色#粉色*风格#清新*衣样式#衬衫", "summary": "百搭而张显小清新风格的衬衫,淡雅的粉色系,排扣贴袋设计格外工整,衬托出优雅的仪态。活动式袖袢的设计更加彰显随性洒脱的特质。优良桑蚕丝材质工艺,上身效果轻柔舒爽。"} +{"content": "类型#裙*材质#牛仔布*颜色#纯色*图案#纯色*裙型#a字*裙型#牛仔裙*裙腰型#高腰*裙长#半身裙", "summary": "极简的纯色牛仔半裙,面料和版型上就很有看点,上身效果出乎意料。经典的a字半裙,用棉弹牛仔面料打造,配合腰部立体收省,能全面包裹并收紧腰腹的赘肉,轻松穿出纤细的小蛮腰,自高腰处延伸出来的裙摆,刚刚好遮住较粗的大腿部位,结合恰到好处的三分裙长,轻松穿出迷人大长腿。"} +{"content": "类型#裙*图案#拼色*图案#线条*裙长#半身裙*裙领型#一字领*裙袖型#喇叭袖*裙款式#木耳边", "summary": "这款裙子分为上下两个部分。上身采用一字领设计,很好的展现了脖颈及肩膀处的优美线条,落落大方更显穿戴者的优雅气质。喇叭袖的设计有效的修饰的手臂的赘肉,搭配上木耳边的装饰,让上身更具层次感。下身的半裙则是采用拼色设计,让裙子不显单调,更有设计感。"} +{"content": "类型#裙*材质#网纱*风格#简约*风格#性感*图案#线条*图案#撞色*裙长#连衣裙*裙领型#翻领*裙款式#拼接*裙款式#腰带*裙款式#抽褶", "summary": "网纱连衣裙透露着浪漫的女人味,裙身别致的撞色拼接格调呈现新颖的时髦感,精致的发酵着性感和优雅。简约的翻领修饰颈部线条,门襟一排扣装饰闭合,若隐若现的轻薄质感,弥漫着神秘的气息,恰如其分地诉说一半唯美一半优雅的情怀,袖口抽褶工艺呈现微喇叭的袖口,腰间配弹力腰带。下垂自然的裙摆轻盈飘逸,传递着冬日的浪漫。"} +{"content": "类型#上衣*颜色#粉色*风格#简约*衣样式#衬衫", "summary": "BRAND的这款睡衣,看起来很普通。但是只有你穿上身之后,你才能感受到这款家居服的魅力。棕色的走线设计,恰到好处地修饰了温馨的粉色,让整个人看起来十分简约时尚。而领口采用衬衫领的设计,勾勒出你迷人的颈部曲线。"} +{"content": "类型#上衣*风格#简约*风格#潮*图案#线条*图案#印花*衣样式#毛衣*衣领型#圆领*衣款式#勾花镂空", "summary": "简约与个性巧妙结合是这款毛衣的亮眼之处,线条流畅的圆领设计,轻松勾勒肩部与颈部轮廓。成衣袖口别致的镂空设计,别致新颖又潮流时尚,立体印花的点缀,又给其平添了一分酷雅气息。然后浑身散发出潮流之感,舒适的用料,将不凡格调发挥地淋漓尽致。"} +{"content": "类型#裙*颜色#白色*颜色#紫色*颜色#纯色*图案#纯色*图案#渐变*裙型#百褶", "summary": "这一款比起纯色的百褶裙,更显精致与高级。运用了渐变色的设计,由紫色一直渐变到白色,给人一种妩媚动人的感觉。结合百褶的裙摆,穿出优雅和飘逸感,让你的气质变得更加与众不同。"} +{"content": "类型#上衣*风格#复古*图案#复古*图案#撞色*衣样式#外套*衣款式#连帽", "summary": "BRAND塔卡沙品牌的这款男士连帽外套,采用了经典个性的撞色设计,搭配复古的色彩,更具时尚感。领口v字设计,视觉上起到瘦脸的效果,更加精致有型。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*颜色#红色*风格#高贵*图案#线条*图案#蕾丝*裙型#a字*裙型#包臀裙*裙型#鱼尾裙*裙下摆#花边*裙腰型#高腰*裙款式#不规则", "summary": "有着显肤白红色调的半身包臀裙,珍珠蕾丝花边装饰丰富整体立体层次感,又彰显优雅高贵气质,不规则鱼尾下摆设计增添几分灵动轻盈感,高腰a字版型轻松拉长腿部线条,上身显瘦又显高。"} +{"content": "类型#裙*版型#宽松*材质#蕾丝*图案#拼色*图案#线条*图案#蕾丝*裙型#直筒裙*裙下摆#压褶*裙长#连衣裙*裙领型#v领*裙款式#拼接*裙款式#口袋", "summary": "本款连衣裙采用了宽松宽松直筒的版型,大大的v字领口,能大大的修饰女性的颈部线条,侧边超级实用的口袋,方便又百搭。加上领口处拼接的睫毛蕾丝,妩媚又迷人。下摆的同色系拼色压褶设计更是设计感十足,让人情不自禁的想要拥有它。"} +{"content": "类型#裙*风格#知性*图案#线条*裙型#鱼尾裙*裙长#连衣裙", "summary": "这是一款将传统与现代完美结合的一款连衣裙,受了传统旗袍的启发,保留了旗袍元素中的知性与优雅,小半立旗袍领,很好的修饰了颈部线条,显得温婉典雅。裙摆采用了鱼尾的样式,走起路来更是摇曳生姿,充满女人味。"} +{"content": "类型#裙*版型#显瘦*图案#格子*裙袖型#喇叭袖*裙款式#腰带*裙款式#收腰", "summary": "时尚达人都喜爱的格子吊带裙两件套,个性优雅不过时,适合任何体型的人群穿着。遮肉显瘦,百搭经典的上上之选。时尚喇叭袖袖口设计,修饰手臂曲线,彰显优雅名媛气质。同色系腰带收腰设计,修身显瘦,凸显身材,整齐衣摆设计,修饰身形,彰显妙曼身姿。"} +{"content": "类型#上衣*版型#宽松*风格#街头*风格#简约*图案#字母*图案#文字*衣样式#风衣*衣样式#外套*衣长#中长款", "summary": "采用了帅气的棒球领口制作而成的风衣外套自带一股潇洒随性的魅力,简约的风格有着落落大方的魅力,完美诠释着街头的潮女风范。宽松的中长款版型剪裁搭配上内敛低调的色系。充满着简洁明快的减龄魔力。而衣身下摆处的字母装饰时髦有腔调,有一种趣味的感觉,却又不会过于张扬,既丰富整体细节的同时又能让衣身变得更加亮眼,轻松赚足回头率。"} +{"content": "类型#上衣*版型#宽松*风格#街头*风格#简约*图案#字母*图案#文字*衣样式#风衣*衣样式#外套*衣长#中长款", "summary": "采用了帅气的棒球领口制作而成的风衣外套自带潇洒随性的魅力,简约的风格有着落落大方的魅力,完美诠释着街头的潮女风范。宽松的中长款版型剪裁搭配上内敛低调的色系。充满着简洁明快的减龄魔力。而衣身下摆处的字母装饰时髦有腔调,有一种趣味的感觉,却又不会过于张扬,既丰富整体细节的同时又能让衣身变得更加亮眼。"} +{"content": 
"类型#上衣*风格#简约*图案#抽象*图案#印花*衣样式#衬衫*衣领型#立领*衣领型#翻领*衣袖长#长袖*衣门襟#单排扣", "summary": "这款来自BRAND旗下精心推出的男士长袖衬衫,前幅利用简约的抽象印花图案修饰,增添整体的时尚气质,又具有别样的迷人气质。经典的立领翻领领口,立体感十足,也让衣物廓形更明晰。时髦的单排扣衣襟,穿脱很便利,展露出温文尔雅的气息,做工与剪裁属于一流。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*风格#复古*图案#复古*裙长#连衣裙*裙款式#盘扣", "summary": "宽松板式设计的连衣裙,是非常好驾驭的款式,遮肉显瘦,又不失灵动的俏丽之感。小立领和斜襟盘扣的中式设计元素,使得本款连衣裙,在一众连衣裙中脱颖而出,上身后既有女性温柔甜美的一面,又不失复古的端庄优雅。"} +{"content": "类型#上衣*材质#网纱*风格#复古*图案#蝴蝶结*图案#复古*衣样式#卫衣*衣袖型#落肩袖*衣款式#螺纹*衣款式#抽褶", "summary": "此款卫衣可谓是足够吸引眼球。粉粉的色调,足够的甜美可爱,加上网纱的领子与蝴蝶结的装饰点缀,又带点复古的俏皮,让人心动不已。落肩袖的设计,修饰肩部曲线。与略微褶皱的袖子的结合,高效地遮住手臂的赘肉。经典的螺纹袖口,舒适度极佳。"} +{"content": "类型#裙*颜色#蓝色*颜色#黄色*风格#简约*风格#潮*图案#植物*裙袖长#五分袖*裙领型#圆领*裙袖型#喇叭袖", "summary": "这条裙子的配色非常舒服,蓝色为底色,黄色的浪漫花卉平铺其上,和谐又雅致,而且衬得人的皮肤白皙清透。它的袖子非常有趣,喇叭袖的设计也是正好迎合了当下的潮流,五分袖又恰到好处地隐藏了胳膊的肉肉。领口选择大方经典的圆领领口,整体非常简约,上身柔美典雅。"} +{"content": "类型#上衣*版型#显瘦*风格#淑女*衣样式#风衣*衣领型#翻领*衣门襟#双排扣*衣款式#腰带", "summary": "该款风衣采用修身版型,上身后有一种女神范儿,在此同时翻领设计还能为你添上一丝淑女气质。双排扣加上腰带看上去非常有气质,符合欧巴心目中女神的穿衣风格。风衣的材质薄厚适度,适合春季过渡穿着,这样既能保证温度也能显示出我们完美的身材。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#清新*图案#条纹*图案#蝴蝶结*衣样式#衬衫*衣袖型#喇叭袖*衣款式#拼接", "summary": "今年的衬衫款,宽松的廓形设计清新宜人的纹饰,总能给人带来清凉舒适的感觉。这款小清新喇叭袖条纹衬衫裙就非常适合学生穿着,细条纹轻盈显瘦,融入拼接元素及蝴蝶结喇叭袖更多细节,可爱甜美。"} +{"content": "类型#裙*材质#丝绒*材质#雪纺*风格#清新*裙型#大裙摆*裙袖型#喇叭袖*裙款式#拼接*裙款式#抽褶", "summary": "雪纺面料具备了亲肤透气的穿着效果,更好地满足了早春的时尚氛围,诠释出舒适自在的穿着体验。腰间运用丝绒织带的拼接,塑造出完美的身材比例。喇叭袖的加持,提升一股淡雅的清新格调。加之褶皱的腰身,蔓延出风情万种的大摆裙,打造出灵动撩人的魅力。"} +{"content": "类型#裙*颜色#白色*图案#条纹*图案#线条*图案#刺绣*裙型#a字*裙款式#拼接*裙款式#不规则", "summary": "蓝白条纹相间的裙身,清爽得如同清冽的,颠覆传统的条纹裙身设计款式,横条纹的上身,显得女孩十分安静,而下摆采用不规则的竖条纹拼接,结合a字下摆版型,瞬间让视觉效果灵动了起来,女孩子显得更加有活力。胸前又以彩色的刺绣线条点缀,完美地结合白色条纹,构成了一幅下雨图,浪漫新颖,又增添了裙身层次亮点。"} +{"content": "类型#裙*材质#蕾丝*风格#潮*图案#撞色*图案#蕾丝*裙下摆#层叠*裙腰型#高腰*裙长#连衣裙*裙款式#拼接", "summary": "这款蕾丝连衣裙,有着粉嫩的柔美色调,衬的肌肤粉嫩白皙,穿上身减龄就是分分钟的事。蕾丝的面料,上身有种层叠的繁复感,轻透的同时又有妩媚优雅的风情,更显窈窕动人。拼接的撞色设计,让整体的层次更为丰富多彩,有种个性时尚的潮流范儿,新颖吸睛。高腰的设计,很好的拉长了身材的比例,收束出纤细的腰身,让你更为玲珑窈窕。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*风格#性感*图案#刺绣*图案#蕾丝*裙型#大裙摆*裙下摆#花边*裙腰型#高腰*裙长#连衣裙*裙领型#v领", "summary": "浪漫仙气的蕾丝连衣裙,v字领口露出迷人的锁骨,衬托纤细小脸;领口周围花边的点缀,轻奢甜美的同时又带点小性感。衣身精致的水溶蕾丝,刺绣的图案低调而华美。修身高腰的裁剪拉长身形,展现女子的曼妙身姿。长长的大裙摆随着步伐摆动,让人心动,满足你对夏日的浪漫幻想。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*颜色#粉色*风格#复古*图案#复古*图案#蕾丝*裙型#a字*裙型#大裙摆*裙长#连衣裙*裙款式#拼接*裙款式#收腰", "summary": "这是一款粉色拼接蕾丝的连衣裙,通过大裙摆表达出十分强烈的仙女感,a字型的版型,十分显瘦收腰,拉长双腿的比例。展现满满的复古甜美感。"} +{"content": "类型#裤*版型#宽松*材质#棉*材质#牛仔布*图案#环保*裤型#直筒裤*裤型#背带裤", "summary": "这款来自米妮哈鲁牛仔背带裤,精选100%纯棉面料制造,手感柔软细腻穿着舒适合身。表面采用环保活性印染,无甲醛荧光剂残留,安全温和的质地守护孩子身体健康。宽松的直筒裤版型包容性较好,可以藏住肉肉凸显孩子的大长腿。"} +{"content": "类型#裤*材质#牛仔布*材质#蕾丝*颜色#黑色*风格#性感*图案#蕾丝*裤长#短裤", "summary": "牛仔短裤,黑色调,散发出性感撩人的迷人风情;简洁的腰头,显得腰线特别好看;裤脚边沿顺着蕾丝睫毛花边,微微蜿蜒的弧线婉约浪漫,整套look更显得精致了几分。"} +{"content": "类型#上衣*颜色#绿色*图案#格子*图案#线条*衣样式#衬衫*衣领型#翻领*衣款式#收腰", "summary": "较为少见的绿色格纹元素,彰显了与众不同的设计韵味。干练衬衫翻领修饰颈部线条,衬托娇小脸型。腰部收腰剪裁,勾勒出曼妙的小蛮腰曲线,裙身做了撞绿色的裙摆插片,丰富视觉看点。"} +{"content": "类型#裤*材质#牛仔布*裤款式#拉链*裤腰型#高腰*裤口#毛边*裤口#小脚", "summary": "这款来自的毛边牛仔裤,高腰的设计,能有效提臀瘦身,塑造迷人身姿。门襟拉链设计,开合顺滑,穿脱便捷。小脚的设计,紧紧包裹双腿,显得腿部纤细又修长。毛边裤脚设计,时尚洋气,穿起来潮范儿十足。"} +{"content": "类型#裤*材质#牛仔布*裤款式#拉链*裤腰型#高腰*裤口#毛边*裤口#小脚", "summary": "这款来自paige的小脚牛仔裤,高腰的设计,能有效遮盖腹部赘肉,轻松塑造纤细小蛮腰。门襟拉链设计,开合顺滑,穿脱便捷。小脚的设计,紧紧包裹双腿,显得腿部纤细又修长。毛边裤脚设计,个性时尚,穿起来潮范儿十足。"} +{"content": "类型#裤*版型#宽松*裤长#九分裤*裤型#直筒裤", "summary": "来自于的这条宽松褶位九分裤,采用的是宽松直筒的版型设计,这种版型的包容性极佳,不仅不挑身材,还能起到一定的修饰不完美腿型效果,而前腰的打褶细节,则使得裤子更为立体。再添加可拆卸织带进行点缀,轻松就能凹出自主造型。"} +{"content": "类型#裤*版型#宽松*裤长#九分裤*裤型#直筒裤", "summary": "这一款裤子腰部串珠的装饰,增添看点自然出彩,立体的裁剪,符合女性的身材曲线,轻松的勾勒出女性曼妙的身材。加上九分的裤长,精致优雅气质不凡。直筒宽松,不挑身材包容性强。"} +{"content": "类型#裙*材质#蕾丝*图案#蝴蝶结*图案#蕾丝*裙下摆#开叉*裙长#连衣裙*裙领型#圆领", "summary": "一款优美的圆领连衣裙,穿起来优雅又时尚。蕾丝花朵装饰衣身,像整个人漫步在花海,充满浪漫气息;侧边开叉蝴蝶结,优雅灵动很显大气,上身减龄又衬气质。"} +{"content": 
"类型#上衣*版型#显瘦*风格#休闲*风格#清新*图案#条纹*衣样式#衬衫*衣领型#圆领*衣款式#拼接*衣款式#荷叶边", "summary": "这款衬衫很有女人味,特意选用柔软亲肤的面料,带来舒适的穿着体验。同时清新的蓝白条纹设计,在视觉上让你更显瘦。并且休闲的圆领设计更能展现你优雅的脖颈曲线美。让人眼前一亮的是,领口和袖子拼接的荷叶边设计,整体打造了优雅柔美的气质,更能衬托出女性魅力。"} +{"content": "类型#裤*风格#清新*裤型#阔腿裤", "summary": "连体的阔腿裤版型十分抢眼,既凸显了女性上身的颈部曲线,同时也很好的修饰出腿部曲线,最适合那些腿型并不好看的女性穿搭,体现出设计师独特的设计理念和新奇的设计想法。更值得一提的是,这款阔腿裤选择清新的颜色设计,而是选择的是端庄但不张扬的藏青色设计,在放肆的格调中注入一点稳重气息。"} +{"content": "类型#裙*裙型#a字*裙长#半身裙*裙款式#抽褶", "summary": "明明是经典的半身裙,在融入了整齐排列的图案后形成了奇妙的视觉效果,增添了一丝高雅的韵味。上窄下宽的a字版型既修饰了腰身也展现了十足优雅的形象,结合随性的褶皱设计,增强了层次感也更凸显优越品质。"} +{"content": "类型#裙*裙型#a字*裙长#半身裙*裙款式#抽褶", "summary": "本款半身裙采用了棉布的挺括面料,在亲夫吸汗的同时又具备良好的舒适感,大大的a字版型少女感十足,配上30到褶皱的复杂工艺,加上大片的翻飞花朵,炎炎夏日,裙摆随着微风起舞,好一个吸睛的场面!这款设计感十足的半裙,小仙女必须要拿下呀"} +{"content": "类型#上衣*材质#牛仔布*材质#水洗*颜色#浅蓝色*衣样式#外套*衣长#短款", "summary": "早春的时候,没有一件牛仔外套,怎么能算是过春天呢?其实在每年的开春之际,这也是让大家烦恼的一件事,今天种草给大家的这件牛仔外套,简直不要太赞哦!属于短款的版型,相信我不用多说,这种短版型是特别显高的,刚好腰下一点,连水洗的浅蓝色也是符合了早春的气息,超赞的!闭眼入是没有问题的。"} +{"content": "类型#上衣*风格#休闲*图案#骷髅*图案#印花*衣样式#卫衣*衣款式#连帽", "summary": "卫衣两边袖筒的logo图案,结合后幅大骷髅印花,彰显品牌别致的设计理念,极具个人特点。连帽设计尽显无限休闲感,带来舒适惬意的穿着体验。"} +{"content": "类型#裤*版型#宽松*风格#青春*风格#性感*图案#创意*裤腰型#高腰*裤口#毛边", "summary": "气质高腰设计,轻松拉长腿部的比例,尽显身姿的高挑动人。创意与个性十足的毛边设计,展示出青春的不羁与时尚。宽松超短裤的设计,带有几分俏皮的小性感,更迷人。"} +{"content": "类型#上衣*颜色#粉色*风格#街头*风格#性感*衣样式#卫衣*衣样式#开衫*衣样式#外套*衣门襟#系带*衣款式#口袋*衣款式#连帽", "summary": "这是定制的一款开衫外套,选用的是独特的橘粉色,既能很好地显肤白,又给你满满的少女感,而且有一种温暖的感觉。集开衫和卫衣两种衣款于一身,带给你两种风格的完美结合。连帽与系带设计,具有街头随性感。口袋设计,方便而美观。"} +{"content": "类型#裙*风格#性感*图案#格子*图案#线条*裙长#连衣裙*裙袖型#喇叭袖*裙款式#勾花镂空*裙款式#木耳边*裙款式#收腰", "summary": "格子元素的连衣裙,既低调内敛,又精致时髦,充分展现出女性风情。考究裁剪的版式,结合收腰的设计,自然贴合身形,展现出纤细迷人的曲线。双肩的镂空造型,凸显出了精致的锁骨线条,平添几分性感的小女人味。裙身多处的木耳边的点缀,呼应柔美的喇叭袖型,缱绻浪漫,尽显甜美少女风。"} +{"content": "类型#裙*裙型#a字*裙腰型#高腰*裙长#半身裙*裙衣门襟#拉链*裙款式#拉链", "summary": "这款半身裙采用高腰的a字版型设计,拉高了腰际线,尽显身材纤细,亮色的漆皮材质,带来了很好的吸睛效果。优质的五金拉链,充满了质感,在细节中彰显出了品质,选用进口的羊皮制成,给你带来舒适的穿着体验。"} +{"content": "类型#上衣*版型#显瘦*风格#职场*衣样式#衬衫*衣样式#西装", "summary": "西装对于职场人士而言一定是必备的单品,更何况这样一款剪裁优良,版型合身,质感高级的设计。选用纯黑的色调,十分的经典,同时可塑性极强,搭配种颜色的衬衫都是非常不错的选择。而修身的版型,更是很好的勾勒出美好的身材曲线,优雅而又气质。"} +{"content": "类型#裙*版型#宽松*风格#休闲*风格#潮*裙长#半身裙*裙款式#拼接", "summary": "时尚加分的拼接设计,在视觉上丰富了服装的层次感。把不同图案、面料组合在一起,碰撞出个性抢眼的休闲风。让你秒变不折不扣的潮流,走在之中别提多吸引人了。十分宽松的裤腿设计,从远处看,有种半身裙的既视感。恰到好处的隐藏住略显不完美的腿型,既时髦又飘逸,好感度up。"} +{"content": "类型#上衣*图案#条纹*图案#蝴蝶结*衣样式#衬衫*衣袖长#长袖*衣袖型#衬衫袖", "summary": "面料条纹清晰,光泽柔和,具有良好的透气性,上身立体挺括。衬衫袖型为连衣袖,丰富衬衫结构,细节的时尚新颖。淡雅的颜色搭配基础的直筒版型,前幅横向分割融入连身长袖,领部蝴蝶结玩转细节心机,记住衬衫三颗扣,达人没错了。"} +{"content": "类型#裙*颜色#蓝色*风格#复古*风格#知性*图案#复古*裙长#连衣裙", "summary": "想要简洁而知性的感觉,那你不妨试试素雅的色调。你看那清浅的蓝色调,配合在一款中长的连衣裙上,总是能展现出十足的魅力。而那复古的旗袍领设计,穿搭起来可以让你更显优雅范儿。配上婀娜的身线和那波浪式的裙身,让你的身材更俏,从而展现出优雅美观的女人味来。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*风格#街头*裤长#九分裤*裤口#毛边", "summary": "牛仔裤不得不说,的确是衣橱百搭小能手,不知道怎么搭配的时候来一条这样的九分牛仔裤怎么穿都合适,上身真的超级显瘦。九分裤长裤脚撕边,露出精致的脚踝,多了几分时髦感,区别常规的千篇一律,裤脚随性的毛边,演绎着摩登街头感的雅痞气息。"} +{"content": "类型#裙*图案#动物*裙型#a字*裙型#大裙摆*裙下摆#荷叶边*裙长#连衣裙*裙款式#木耳边*裙款式#收腰", "summary": "来自东北动物明星的原创“史前壁画”,描绘一个万物初开,烂漫的年代。而图案,这一回用的是仙味儿连衣裙呈现,上身后帮姑娘们悄咪咪减龄哟。双层的荷叶边元素丰富连衣裙的层次效果,使得裙子变得更加飘逸灵动,领口的木耳边轻盈而甜美,呼应荷叶边喇叭袖口,尽显女孩子柔美动人姿态,结合收腰a字大裙摆,尽显仙气十足的少女魅力。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#简约*衣样式#卫衣*衣领型#圆领*衣款式#拼接*衣款式#不规则", "summary": "不规则的设计和卫衣的搭配得很好,袖子部分也进行了拼接,不再稚气之余又添加了几分乖巧。不规则的下摆显高又显瘦,走起路来飘逸感十足。经典的圆领设计,修饰脸型不挑人穿。整体简约大气,剪裁干净利落,小宽松的长款版型,遮肉又显瘦。"} +{"content": "类型#上衣*版型#宽松*材质#牛仔布*风格#青春*衣样式#外套*衣袖型#落肩袖", "summary": "春暖花开,又是一年牛仔季,已经忍不住要给你们出各种好看的牛仔外套啦,这一款彰显着骨子里的青春个性,零星点缀的毛边处理,让牛仔帅气不减;宽松版型+落肩袖的设计率性不羁,上身各种自由随性and显小减龄"} +{"content": "类型#上衣*版型#宽松*材质#牛仔布*风格#青春*衣样式#外套*衣袖型#落肩袖", "summary": "春暖花开,又是一年牛仔季,已经忍不住要给你们出各种好看的牛仔外套啦,这一款彰显着骨子里的青春个性,零星点缀的毛边处理,让牛仔帅气不减;宽松版型+落肩袖的设计率性不羁,上身各种自由随性and显小减龄。"} +{"content": 
"类型#裤*颜色#白色*风格#运动*风格#青春*风格#潮*图案#条纹*图案#线条*裤型#直筒裤*裤口#开叉", "summary": "直筒裤裤型上身舒适不束缚,裤身线条流畅,垂感很好,易凹造型;两侧条纹装饰紧跟校服裤潮流,释放时尚运动风气息,青春动感,有着瞬间减龄的;裤脚上点缀了白色的英文字母,一点就能抓住眼球,醒目吸睛;适当的裤脚开叉设计让整条裤子远离刻板,前卫个性。"} +{"content": "类型#裙*颜色#纯色*图案#纯色*裙衣长#中长款*裙衣门襟#系带", "summary": "纯色的色彩使用,时尚百搭,出行更加的便捷。腰间系带的工艺,轻松修饰女性完美身姿,塑造女性完美腰型。半袖的制作工艺,让穿着活动更感轻松自由,体验畅快无拘束的穿着效果。中长款的裙型制作,拉长女性身形比例,视觉显身高。"} +{"content": "类型#上衣*图案#线条*图案#撞色*衣样式#卫衣*衣款式#抽绳*衣款式#连帽", "summary": "这款连帽卫衣让人一眼就会爱上,独特撞色工艺搭配经典连帽抽绳版型,下摆部分的走线将主体分成上下两部分,将男性率性洒脱不羁爱自由的风格展现出来,又不会显得过分花哨。独特的假两件设计修饰手臂部分的线条,衬托出穿着者修长挺拔的身形"} +{"content": "类型#上衣*版型#显瘦*材质#针织*颜色#裸色*风格#淑女*风格#潮*风格#性感*图案#撞色*衣样式#针织衫*衣样式#外套*衣领型#圆领*衣袖型#喇叭袖", "summary": "早秋时节,针织衫实用保暖又百搭。整体色调呈现裸色,干净简洁贴近肤色更加实用。小圆领的款式剪裁经典又百搭,方便搭配各种外套。撞色的喇叭袖设计结合了当下的潮流元素,体现女性手腕处的性感,彰显淑女气质,独特的细节设计,引人瞩目。整体款式为紧身款修身显瘦,采用天然针织面料,寒冷的冬季作为内搭十分柔软舒适,保暖性能强。"} +{"content": "类型#裤*风格#复古*图案#复古*图案#波点*裤型#阔腿裤*裤腰型#高腰", "summary": "阔腿裤加上超长裤腿的设计,走路,气场十足,很。垂感很足的面料,行走间飘逸灵动。黑底白点的波点元素,尽显复古优雅,女人味十足。高腰的设计,更是塑造胸以下都是腿的既视感。"} +{"content": "类型#裙*材质#牛仔布*裙款式#不对称", "summary": "这款具有时尚色彩的短外套。不对称的门襟设计别具用心,半个领子的设计也是个性突出,穿起来很潮。牛仔面料看起来质地很好,板正有型。"} +{"content": "类型#裙*材质#绸缎*风格#性感*裙型#抹胸裙*裙衣门襟#系带*裙款式#勾花镂空", "summary": "色彩靓丽的绸缎纹路,带来与众不同的视觉冲击,即便是同款亦有不同色呈现,可谓独一无二。人性化的可拆卸挂脖,后背系带设计,双重既能让你抹胸撩人又能系带风情万种,内侧贴心的防滑硅胶是你穿抹胸的安全保障。而泳裤的侧边镂空式设计,除了性感之外也是展示自我的小秘诀。"} +{"content": "类型#裙*版型#显瘦*裙型#一步裙*裙下摆#开叉*裙长#连衣裙*裙袖型#花瓣袖*裙衣门襟#拉链*裙款式#拉链", "summary": "这款连衣裙采用修身的版型设计,搭配精致的剪裁设计,结合一步裙款式,展现出玲珑曼妙的曲线,更为显瘦。而后中开叉设计,穿着给身体更多的活动量,行动更为方便。加上花瓣袖设计,更为优雅浪漫。搭配拉链设计,穿脱更为方便。"} +{"content": "类型#裤*风格#性感*图案#卡通*图案#印花*裤长#短裤", "summary": "夏季不只是空调呆着,更重要的让身体全方位凉爽。这款工字露脐吊带衫,带着bf风的3d卡通印花,艳丽的色彩和更具和谐感,让你舒适感倍增,穿上配上短裤妥妥的性感风。"} +{"content": "类型#裙*版型#宽松*版型#立体剪裁*材质#蕾丝*风格#宫廷*图案#线条*图案#蕾丝*裙型#花苞裙*裙型#抹胸裙*裙长#连衣裙*裙领型#一字领*裙袖型#泡泡袖", "summary": "它像是夏天里的一朵含苞待放的花苞裙,宽松的抹胸一字领设计,这种版型属于谁穿谁好看的那种,恰到好处的展现迷人的锁骨和肩部线条而。蕾丝的优雅与浑然天成的甜美感,也衬托得无比细腻。立体裁剪独有的柔美素雅,更是仙气满满,让你回头率爆表。加上气场强大的宫廷泡泡袖设计,搭配七种不同花型,造就了这件充满生命力的连衣裙。"} +{"content": "类型#上衣*版型#宽松*风格#复古*风格#简约*风格#休闲*图案#复古*衣样式#卫衣*衣门襟#系带", "summary": "对卫衣情有独钟的妹子可不要错过这款趣味十足的卫衣哦!点缀之上的可爱装饰让人眼前一亮,同时充满了妙趣横生的画面感。再搭配复古的咖啡色,打造出甜美不乏个性的特点。还独特的采用了后襟系带设计,富有满满的设计感,再结合简约宽松的廓形,打造出时尚休闲的穿搭风格。"} +{"content": "类型#上衣*颜色#白色*风格#简约*风格#清新*图案#线条*衣领型#翻领", "summary": "穿上这款白衬衣,会是耳目一新的感觉,它清丽而不失优雅,端庄而不失纯真,茉莉白色清新的空气一样,让人百看不厌。是那种咋见欢喜,也惊艳的感觉。端庄的小翻领,加上衣襟简约干练的线条"} +{"content": "类型#上衣*风格#清新*图案#刺绣*衣样式#卫衣*衣袖型#灯笼袖", "summary": "法式浪漫小清新的卫衣,真的很适合爱美的小仙女。BRAND的这款灯笼袖卫衣格外的洋气,清新配色的小鹿图案绣花,精美的展现在胸前,增加了十足的俏皮感,穿着凸显出满满的仙气。"} +{"content": "类型#裙*材质#雪纺*风格#性感*图案#刺绣*裙长#连衣裙", "summary": "这条花朵刺绣雪纺连衣裙领口部分采用优雅的系带领口设计,可以根据自己的喜好随意调节,让小性感呼之欲出。精致的花朵刺绣装饰,是整条裙子的亮点所在,精湛的刺绣工艺让花朵看起来栩栩如生,增添了裙子的生动气息。另外裙子选用轻盈的雪纺面料,打造出浪漫飘逸的感觉。"} +{"content": "类型#上衣*版型#宽松*图案#人物*图案#字母*图案#文字*图案#线条*衣样式#风衣*衣领型#翻领", "summary": "宽松廓形的风衣,大大的版型,帅气又有型;简洁精干的翻领设计,修饰线条又凸显气质;光滑抗寒的面料,薄薄一层却可以抵挡春日寒意。背后的人物图案,前身的字母装饰。减龄又显活力。松紧袖口的设计,活动方便又实穿。"} +{"content": "类型#上衣*版型#显瘦*图案#条纹*图案#线条*衣样式#衬衫*衣袖长#长袖", "summary": "基础款的条纹衬衫,圆形的领口修饰精美的颈部线条,使得脸型也变得更加的小巧精致,长袖的设计贴合手臂的线条时,手臂更加的纤细活动自如。修身的版型穿在身上贴合身体线条,选用优质的面料,舒适柔软而又亲肤,适合贴身穿着"} +{"content": "类型#裤*材质#牛仔布*颜色#深蓝色*风格#清新*裤型#直筒裤*裤型#阔腿裤*裤腰型#高腰", "summary": "来自BRAND的牛仔裤采用深蓝色造型,耐脏的同时也不容易哦。高腰的款式结合荷叶边的加持,将身材比例拉长,再多的肉肉也能秀出小蛮腰。直筒阔腿的裤型,遮盖腿部小秘密,轻松显笔直腿型,走路间飘逸自在,既能扮酷又能散发清新哟。"} +{"content": "类型#裤*版型#显瘦*裤长#九分裤*裤型#哈伦裤", "summary": "这条裤子比一般的哈伦裤,来的更加的显瘦。来自于它腰头的褶裥和省道收割,该收的地方收,该放的地方放。上身特别的藏肉,然后来自于带点慵懒味道的宽大裤筒,遮肉,,很显腿直!简直是梨型身材妹子的福音。配合九分的裤长,穿起来更显腿长不说还很帅。"} +{"content": "类型#裙*颜色#粉色*风格#淑女*图案#印花*裙型#百褶*裙腰型#高腰*裙衣门襟#系带", "summary": "过膝至脚踝的长度将双腿都隐藏在裙身当中,再搭配上高腰的设计,展现出曼妙高挑的身材。淡粉色的款式更能衬托出甜美的气质,小巧的印花图案还增添了淑女感,也有减龄的效果。v领搭配系带的设计更显乖巧,风琴百褶的裙摆富有层次,轻盈流动。"} +{"content": "类型#裙*颜色#深蓝色*裙型#蓬蓬裙*裙长#连衣裙", "summary": 
"采用金属色绣线将棱角分明的星星刻画出来,以毫无规律的姿态摆放在裙身。在和深蓝色基调相融之间,营造出一种夜空繁星的感觉。让这款蓬蓬纱连衣裙看起来梦幻又浪漫,并且给人一种高调轻奢的感觉,穿起来气质又迷人。"} +{"content": "类型#裙*版型#宽松*材质#蕾丝*颜色#粉色*风格#文艺*图案#蕾丝*裙下摆#荷叶边*裙长#连衣裙*裙领型#圆领*裙款式#木耳边", "summary": "飞扬的夏季,少不了蕾丝裙的踪影。这件粉色文艺荷叶边连衣裙,采用简单大方的圆领设计,修饰脖颈,微露一丝性感美。袖口采用宽松舒适的荷叶边,修饰手臂,又迎合整件衣裙的飘逸感。腰间以精巧可爱的木耳边递进,在宽松上又增添一丝格调美。"} +{"content": "类型#上衣*版型#宽松*风格#休闲*风格#清新*图案#印花*衣样式#卫衣*衣款式#连帽", "summary": "BRAND带来的这款印花大贴袋卫衣,衣身采用常见的连帽款式设计,轻松减龄,上身穿出清新少女感。宽松的版型裁剪,舒适不紧勒,带来休闲洒脱的惬意。衣身正面的logo印花点缀,装饰衣身的同时,提升了品牌辨识度。"} +{"content": "类型#裤*版型#宽松*图案#线条*裤型#阔腿裤*裤款式#口袋*裤款式#拉链*裤腰型#高腰", "summary": "珂莱蒂尔高腰阔腿裤采用宽边高腰腰头设计,可完全包裹腰腹部,可隐藏小赘肉,展现纤细腰部线条。拉链顺滑易拉,穿更方便、更轻松。两侧对称斜插口袋设计,可方便插手的同时,还可放置随身携带的小物品,美观又实用。宽松的裤腿设计,可隐藏腿部肉肉,更加凸显腰部曲线。内里走线工整,精致工艺,彰显品质。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#运动*风格#休闲*风格#青春*衣样式#卫衣*衣样式#外套", "summary": "卫衣款裁剪的外套,延续了一贯的休闲运动风,倍显青春的动感减龄范儿。整体宽松的廓形裁剪,营造出了舒适的上身感,富余的空间维度,赋予了衣身修身百搭的实穿属性。看点满满的帽带摆脱了以往的平庸造型感,宽织带点缀的小猫图案,结合的金属环扣,尽显潮酷个性。"} +{"content": "类型#裤*材质#牛仔布*风格#性感", "summary": "3x1的这款牛仔裤采用浅白的牛仔面料为裤身材质,其柔然的手感和细腻的质地,在穿着舒适的同时,透露着清纯甜美的个性气质。除此之外,流畅的裤身剪裁将性感的腿部曲线彰显的淋漓尽致,不失为一款随性出街的必备单品。"} +{"content": "类型#裤*图案#线条*裤长#九分裤*裤型#阔腿裤*裤款式#拼接*裤款式#口袋*裤腰型#高腰", "summary": "这条九分休闲裤选择百搭的色系,可以随性穿搭。高腰的裁剪和阔腿的版型,拉长你的双腿线条,缔造黄金比例,九分的裤腿露出纤细的脚踝,不会使下半身显得臃肿。裤腿处的设计舍弃了单调的侧面单裤缝,而是选择了裤脚两侧拼接,使腿部看起来更加纤细。后侧的一字口袋,提升臀线,塑造臀部凹凸有致。"} +{"content": "类型#裤*材质#羊毛*裤口#卷边", "summary": "选用高品质精细羊毛面料,质地轻薄细腻,贴身丝毫没有。柔韧而不失挺括度,能极好的撑起极简的裤型,上身立体有型。腰部和脚口融入金线织带装饰,打造出吸睛的细节看点,令整体不再单调。卷边的裤脚造型,更有设计感,也能丰富裤身层次,提升上身的时髦度。"} +{"content": "类型#裙*颜色#深蓝色*风格#复古*风格#性感*图案#条纹*图案#复古*图案#撞色*裙长#连衣裙*裙款式#吊带*裙款式#不规则", "summary": "深蓝的连衣裙色调沉稳而不显浮夸,设计师仅以复古的条纹勾勒,加以蓝白的撞色点缀,在丰富细节亮点之余,也更具吸睛抢眼的时髦个性。而吊带的设计,让你肩部的肌肤些微裸露,不经意间散发的性感气息将你迷人的风情演绎的淋漓尽致,搭配上不规则的下摆设计,优雅天成。"} +{"content": "类型#裙*颜色#黑色*裙型#衬衫裙*裙下摆#荷叶边*裙款式#露肩", "summary": "以荷叶边装饰整体裙身,在肩部开口的露肩设计,可谓是点睛之笔。微露肉的设计,给人以甜美温柔的印象。黑色竖纹的元素装饰,更为衬衫裙平添亮点,轻松满足icon多种风格搭配所需。"} +{"content": "类型#裙*版型#宽松*风格#青春*风格#清新*风格#性感*图案#刺绣*裙长#长裙*裙袖长#五分袖*裙衣门襟#系带", "summary": "清新飘逸的长裙设计,融合五分袖款式,突出独特个性的时尚气息。性感的内衬及若隐若现外衫设计,加上精致小心机的胸前刺绣设计,彰显出青春活泼的气质。颈后系带设计,突出个性,彰显特色。宽松舒适的款型设计,张扬出女性轻柔飘逸的气质,加上精致的裁剪工艺,体现出女性独有的柔和美好。"} +{"content": "类型#上衣*材质#棉*材质#牛仔布*衣样式#外套", "summary": "一款柔软耐穿的棉质牛仔外套,独具一格的打褶毛边袖口,更显别致优雅。衣身的明线装饰更是增添了整体的造型。养眼的天蓝色让你在这个春日充满了活力。"} +{"content": "类型#上衣*材质#棉*材质#牛仔布*衣样式#外套", "summary": "牛仔外套经典不败,质量要经得起考验。这款牛仔外套采用全棉牛仔布制作而成,织物坚牢,抗皱性能卓越,而且具有较好的吸湿性和透气性,上身后自然舒适。同时棉料为主的牛仔布,防也不错,非常适合应对的早春温度。"} +{"content": "类型#上衣*材质#棉*材质#牛仔布*衣样式#外套", "summary": "这款牛仔外套采用了棉质牛仔面料制作而成,棉质牛仔面料拥有非常柔软舒适的手感,质感亲肤透气。而且棉质牛仔面料不会褪色和变形,经久耐穿。"} +{"content": "类型#裙*版型#宽松*材质#牛仔布*风格#文艺*风格#青春*图案#字母*图案#文字*裙型#牛仔裙*裙下摆#荷叶边", "summary": "自在宽松的牛仔裙,利用编制肩带连接,营造出文艺乖巧的少女气质。自然做旧的牛仔面料,配以字母元素点缀,提升造型时尚度更显潮女独有的朝气活力。配以荷叶边点缀,让你在青春俏皮的同时更多几分小女人的妩媚感。"} +{"content": "类型#上衣*颜色#金色*图案#蝴蝶结*衣样式#外套*衣门襟#系带*衣款式#纽扣", "summary": "长款毛呢外套最大的设计亮点在于衣身的金色纽扣设计,非常精致大气。领口蝴蝶结系带设计,尽显优雅美观,衣身外兜设计,增加立体层次感。"} +{"content": "类型#裙*材质#网纱*颜色#纯色*风格#性感*图案#纯色*裙型#百褶*裙长#连衣裙*裙领型#圆领*裙衣门襟#拉链*裙款式#拼接*裙款式#拉链", "summary": "纯色的连衣裙采用了圆领设计,配合上肩部的百褶斗篷设计,层次感强,还可以遮掩肩部的肉肉。采用了隐形拉链的设计,简洁又不会破坏整体的设计感,同时在穿脱时更加方便,领口处的透视网纱拼接更是性感而又不会过于暴露。"} +{"content": "类型#裙*颜色#黑色*风格#高贵*风格#性感*裙下摆#层叠*裙款式#亮丝*裙款式#拼接*裙款式#吊带", "summary": "运用经典的黑色调,打造出性感的女人味,蔓延出骨子里的柔美气息,配合细吊带的娇俏感,发挥出妩媚动人的吸引力。惹眼又吸睛的亮丝面料,诠释出高大上的都市情调,衬出不一样的诱惑气息,加之层叠拼接的裙摆,洋溢出几许浪漫的风情,正好诠释出高贵大方的名媛气场。"} +{"content": "类型#裙*材质#网纱*材质#蕾丝*风格#性感*图案#线条*图案#蕾丝*裙长#连衣裙*裙领型#半高领*裙款式#木耳边*裙款式#收腰", "summary": "这件连衣裙可谓完美演绎了不是人间烟火的高雅气质。木耳边与轻盈网纱带来别致女人味。睫毛蕾丝勾勒的半高领,修饰修长的脖颈线条。花型蕾丝元素,微微露出脖颈肌肤与手臂线条,带着若隐若现的性感。合体的收腰版型勾勒出曼妙的身材曲线。蕾丝元素下摆,行走间更具柔美风情。"} +{"content": "类型#裙*版型#宽松*颜色#红色*颜色#蓝色*颜色#粉色*风格#性感*图案#撞色*裙长#连衣裙*裙领型#v领*裙袖型#荷叶袖*裙款式#绑带", "summary": 
"一款极具设计感的连衣裙。红色、粉色、蓝色的撞色设计,小v领加绑带的设计和荷叶袖的设计,微微性感中带着一丝乖巧。舒适的材质搭配宽松的版型,上身非常舒适。"} +{"content": "类型#裙*颜色#黄色*图案#蝴蝶结*裙长#连衣裙*裙款式#绑带", "summary": "这款来自百伶妈妈孕妇连衣裙,无论是款式还是品质都无可挑剔。领口以绑带蝴蝶结装饰,丰富了整个裙面尽显甜美浪漫的情怀,而且明黄色的蝴蝶结还能让裙子更耀眼夺目。裙面采用高品质面料制造,手感柔软细滑穿着舒适,此外面料悬垂性较好上身效果超棒。"} +{"content": "类型#上衣*颜色#黑白*风格#青春*图案#卡通*图案#条纹*衣样式#卫衣*衣款式#连帽", "summary": "穿搭中不可或缺的自然是连帽卫衣,光是版型就能体现出女性追求的随性自在感,倘若想在清凉的天气中穿着。这款条纹连帽卫衣相信会适合你,经典的黑白条纹上映衬着卡通图案的造型,带来青春中的童真趣味感,将减龄活泼的个性透露出来,举手投足都会超有气质!"} +{"content": "类型#上衣*风格#清新*风格#性感*图案#蝴蝶结*衣样式#衬衫*衣款式#拼接*衣款式#纽扣*衣款式#荷叶边", "summary": "此款BRAND荷叶边衬衫采用多种材料拼接打造而成,质感细腻柔和,上身效果极佳,温暖而舒适;经典的荷叶边袖子处理,传递出优雅清新的气质;脖颈间的蝴蝶结装饰,呈现出一派甜美可爱的女性气息;精致的纽扣点缀,彰显新颖个性感。"} +{"content": "类型#裙*版型#显瘦*材质#牛仔布*材质#水洗*风格#复古*风格#民族风*风格#性感*图案#复古*图案#刺绣*裙型#a字*裙型#牛仔裙*裙腰型#高腰*裙领型#v领", "summary": "这款带着水洗的做旧质感的牛仔连衣有着独特的迷人味道。v领的设计,显露出迷人性感的脖颈与锁骨,显瘦显脸小,上身气质又女人味十足哦。从v领延续下来的门襟一直到裙摆,以包扣点缀,加上扭结的高腰设计,时髦感爆棚哦。右边袖口和背后的民族风精致绣花,更让整体展现复古的优雅与恬静。a字裙摆,灵动飘逸。"} +{"content": "类型#裤*版型#显瘦*材质#涤纶*风格#性感*裤型#阔腿裤*裤款式#绑带*裤口#开叉", "summary": "一款舒适有型的涤纶阔腿裤;富有心机感的腰部,将自然打褶与同色系绑带相结合,系上则贴合腰线并收腹显瘦,为纤腰的展现赋予了柔美的韵味;裤脚处的开叉设计,裸露美腿并凸显性感的魅力,使自信的漫步间洋溢着个性洒脱的时髦范。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#蕾丝*图案#蕾丝*衣样式#外套*衣门襟#无扣", "summary": "这款蕾丝外套,采用宽松版型,完美包容各种身材。显瘦不臃肿,精选优质蕾丝料,柔软亲肤,穿着舒适。蕾丝花纹装饰全身,温柔细腻,彰显优雅女人味。无扣设计,方便穿着之余,轻松打造个人气场,走路带风,整体时尚显气质。"} +{"content": "类型#裙*版型#显瘦*风格#性感*图案#线条*裙型#抹胸裙*裙型#大裙摆*裙腰型#高腰*裙长#连衣裙*裙款式#波浪", "summary": "性感的抹胸设计,加上修身的版型,使得连衣裙能很好的展现出女性优美的上身线条,增添几分妩媚感。高腰大裙摆设计,使连衣裙上身效果十分显腿长,让女性的身材更加完美。裙身的波浪纹设计,如的浪花,灵动而又美丽。"} +{"content": "类型#裙*图案#渐变*裙下摆#荷叶边*裙长#连衣裙*裙袖长#长袖*裙款式#绑带*裙款式#收腰", "summary": "这款连衣裙将精致的品位展现的淋漓尽致,衣襟,长袖及裙摆由荷叶边细心装点,打造出动感的飘逸,上身后随着举止舞动,与众不同。收腰的绑带完美将腰身展现,体态尽显。渐变的色系使得裙装变得有光泽,灵气十足。"} +{"content": "类型#裙*颜色#白色*颜色#红色*风格#复古*风格#青春*风格#清新*图案#蝴蝶结*图案#复古*图案#线条*图案#印花*裙腰型#高腰*裙长#短裙", "summary": "上衣铺满了红色印花图案,且图案多样,丰富靓丽,清新优雅。红色蝴蝶结,在优雅的基础上增加了一丝俏皮可爱,更显青春活泼,喇叭袖口,繁复多样,显出复古基调,时尚大方。白色高腰短裙,拉长腿部线条,让身材比例更加完美,整个人青春优雅。"} +{"content": "类型#裙*材质#蚕丝*材质#蕾丝*图案#蕾丝*裙长#连衣裙*裙款式#勾花镂空", "summary": "连衣裙,将浪漫的镂空与蕾丝相如何是女神,衣橱不可缺少的单品。经典的白,打造的薄纱真丝裙,让你用最轻松的方式展现优雅的气质。"} +{"content": "类型#上衣*图案#印花*图案#撞色*衣样式#衬衫*衣袖长#长袖", "summary": "基础的长袖衬衫设计,是以简单的直筒版型设计,让你穿着多一点舒适大气的感觉。并且个性的细碎印花设计,配上时尚的撞色设计,让视觉多一点清爽效果。而顺滑的真丝面料,可以为你的穿着,带来多一点的亲肤舒适效果。"} +{"content": "类型#裙*材质#牛仔布*风格#复古*风格#简约*图案#复古*裙型#牛仔裙*裙下摆#开叉*裙腰型#高腰*裙长#半身裙*裙衣门襟#拉链*裙款式#拉链", "summary": "这款帅气的牛仔半身裙,采用了时尚的金属外拉链,简约实用,还给裙身增添了一份视觉层次。经典的高腰版型,视觉上拉长腿部比例。裙身微磨白工艺,展现了几分怀旧复古的韵味。开叉下摆,美观又个性,使行走更加舒适自由。"} +{"content": "类型#上衣*风格#日系*风格#简约*风格#工装*衣样式#卫衣*衣款式#口袋*衣款式#抽绳*衣款式#抽褶*衣款式#连帽", "summary": "外穿内搭皆可的廓形卫衣,采用简约日系工装风格,前片大口袋多包袋设计,可随身携带多个小物品。搭配个性的抽绳连帽,稍稍提高领口设计,具有防风保暖的作用。帽子和袖子融入褶皱设计,让版型更加立体有型。"} +{"content": "类型#上衣*材质#牛仔布*风格#街头*风格#运动*风格#潮*衣样式#卫衣*衣款式#破洞", "summary": "当破洞式的潮流从牛仔裤延伸到卫衣,它的潮流毫不逊色于破洞牛仔裤。这款运动卫衣运用破坏性的设计手法,在手肘处剪开两道口子,营造出感,让原本沉闷的卫衣增添了些许街头的感,让你潮的。"} +{"content": "类型#上衣*版型#显瘦*风格#简约*风格#朋克*衣样式#外套*衣样式#西装*衣款式#腰带", "summary": "都市女性必备的时尚单品就是西装外套了,利落的裁剪简约而又大气,挺括的面料质地更加凸显气场;微微修身的阔形完美的展露出优美的身姿,及臀的长度,展露出修长的双腿,倍显高挑,加上独特的圆环打孔腰带装饰,增添几分不羁的朋克范。"} +{"content": "类型#裙*版型#显瘦*版型#立体剪裁*材质#雪纺*图案#碎花*图案#线条*裙下摆#开叉*裙衣门襟#系带*裙款式#不规则*裙款式#收腰", "summary": "气质名媛碎花雪纺,透气舒爽。不规则裙型设计,修身有型,飘逸的裙摆落落大方。贴心的系带收腰又能凸显身材,造型简单洋气。开叉设计,也方便了行走,增添了更棒的时尚看点,上身还多了高挑感。整条裙子都是立体裁剪线条,版型非常流畅。"} +{"content": "类型#上衣*材质#针织*材质#网纱*颜色#纯色*风格#潮*风格#性感*图案#纯色*图案#拼色*图案#线条*衣样式#开衫*衣领型#v领*衣款式#拼接", "summary": "这款针织开衫,区别于以往纯色的开衫,采用独特的拼色设计,潮流感十足。大大的v领设计,拉长了颈部线条,非常显脸小。袖口一圈网纱拼接,给人以朦胧感,随着风随意摆动,性感迷人。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*颜色#深色*风格#街头*图案#字母*图案#条纹*图案#文字*裤型#直筒裤*裤型#阔腿裤", "summary": "sibyl以显瘦的深色基调打造的这款裤子,整体采用了直筒的阔腿裤剪裁,带来较为宽松且舒适的穿着效果。设计师为这款裤子的侧边点缀了红白条纹,配合字母的提花效果,显得炫酷且出彩,是充满了街头气息的裤装穿着之选。"} +{"content": "类型#上衣*风格#复古*风格#潮*图案#文字*图案#复古*图案#撞色*衣样式#卫衣", "summary": 
"由具有独树一帜设计风格的杜嘉班纳品牌出品的卫衣,采用经典人物形象点缀;以复古涂鸦元素为背景,轻松塑造出帅气新潮的型男造型;加之辅以撞色数字标识,显示出独特的年代感。"} +{"content": "类型#裤*材质#牛仔布*材质#水洗*风格#复古*风格#简约*图案#复古*图案#色块*裤型#直筒裤", "summary": "来自BRAND的这款牛仔裤,采用舒适的棉弹牛仔布料制作,柔软亲肤,穿着舒适;经典简约的直筒裁剪方式,勾勒出男士的硬朗身形;经过做旧的水洗技术处理之后,呈现的深浅色块显得独特而又个性,既营造出复古不羁的质感,又带来了率性时髦的穿搭体验。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*风格#复古*图案#格子*图案#复古*裙型#a字*裙下摆#花边*裙领型#立领", "summary": "一款假两件格子a字裙,假两件的设计,更显新颖别致。格子的设计,更添几分复古气息,花边立领的设计,甜美又灵动。宽松的版型设计,慵懒随性,同时起到了很好的遮肉显瘦的效果。"} +{"content": "类型#上衣*颜色#黑色*颜色#灰色*图案#印花*衣样式#卫衣*衣款式#连帽", "summary": "BRAND带来的这款连帽卫衣,以黑色为基调,就连前胸与后背的印花,也选用哑光黑与灰色妆点,营造风范。本次将三角进行了立体化处理,前胸与后背遥相呼应,彰显品牌一贯的lessismore的设计理念。"} +{"content": "类型#上衣*材质#针织*衣样式#开衫*衣门襟#拉链*衣款式#拼接*衣款式#拉链", "summary": "采用弹力的针织面料,柔软舒适松紧适宜。拉链开衫设计,方便穿脱时尚前卫。肩部网面拼接,透气吸汗。腰身两侧可开拉链设计,内里为网面拼接和肩部相互呼应,有整体感,造型多变。后颈部用心的小细节设计,可调节式织带显独特个性。"} +{"content": "类型#上衣*版型#宽松*颜色#黑白*颜色#绿色*风格#简约*风格#潮*图案#字母*图案#文字*图案#印花*衣样式#卫衣*衣款式#连帽", "summary": "自带潮流个性的连帽卫衣可以说是人人必备,所以炉品这次满足需求。出了这么一款连帽卫衣,采用宽松版型,让其穿着舒适的同时更具随性潮范儿。然后采用基础的黑白灰三款底色以及少见的豆绿色,轻松适应多样风格穿搭。再加上胸前的字母印花,简约不失个性,轻松穿着独特潮范儿。"} +{"content": "类型#上衣*风格#民族风*图案#线条*衣样式#外套*衣领型#圆领", "summary": "经典的圆领设计,衬托脸部线条,民族花纹织带装点,散发别样的异域气息,在设计过程中。通过民族风图案全新设计,在袖管、衣襟进行连贯的点缀。串联起来的线条更显外套的利练之外,也赋予其独特的韵味,雅致而典雅的风格,让你在对自己的全新探索中收获更完美的自己。"} +{"content": "类型#上衣*版型#显瘦*颜色#深色*风格#休闲*图案#刺绣*衣样式#卫衣*衣领型#圆领*衣款式#螺纹", "summary": "一件卫衣,让你舒适度过微凉的春天。这件圆领卫衣,螺纹圆领,照顾到了穿着的舒适性,深色系配色更显瘦有质感。挚爱的立体花鸟刺绣,精致的设计感,更显时尚品味。休闲的卫衣版型,适合有个性的你穿着!"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*风格#复古*风格#性感*图案#复古*图案#线条*裤款式#口袋*裤腰型#高腰*裤口#开叉*裤口#微喇裤", "summary": "这是一款修身版型的牛仔裤,流畅的裁剪线条,打造出的开衩设计,带有轻柔飘逸的质感,让你在帅气与优雅中随意的切换,还不失性感魅力。经典的喇叭裤造型,演绎复古时尚。高腰的设计,凸显腰线的位置,视觉上秒变大长腿。对称性的斜插口袋,方便放置物品。"} +{"content": "类型#裤*材质#棉*材质#牛仔布*材质#混纺*裤款式#破洞", "summary": "设计很赞的一款磨白破洞牛仔裤,选用棉混纺面料制作而成,呈现出帅气有型的上身效果,走在街上格外的亮眼。个性的磨白工艺搭配破洞设计,带给你非同寻常的穿着体验。"} +{"content": "类型#裤*版型#宽松*材质#混纺*裤型#阔腿裤", "summary": "阔腿裤腿视觉上有着一气呵成的流畅感,留有宽松的余量,行走间自在如风。高密实的棉涤混纺面料,布料平整挺括度好,穿着时自然板正有型。立身剪裁上身收腰提臀,修饰身型的同时不会显胯宽。带有几分时光的,百搭不挑人。"} +{"content": "类型#裤*材质#牛仔布*颜色#浅蓝色*图案#线条*裤长#九分裤*裤型#直筒裤", "summary": "这款极具帅气男友力的牛仔裤,以浅蓝色调渲染,更加的百搭时尚,同时展现了青春活力感。直筒的裤型设计结合九分裤长剪裁,能够更好的修饰腿部的不足。拉长腿部线条的比例,显露出大长腿,补丁磨旧的设计,增添设计看点,展现帅气个性。"} +{"content": "类型#上衣*材质#雪纺*颜色#黑色*颜色#黄色*风格#休闲*图案#印花*衣样式#衬衫*衣领型#立领*衣款式#绑带*衣款式#不规则", "summary": "一件兼具休闲风和甜美气质的雪纺衬衫。明亮的黄色象征着少女的满满朝气,加上黑色的不规则印花星星点点地分布在垂坠感极好的面料上,顿时蔓延出几分慵懒风情。领口的部分别出心裁,把绑带和立领结合在一起,简直仙气十足!"} +{"content": "类型#裤*版型#宽松*材质#牛仔布*颜色#黑色*风格#休闲*图案#线条*裤长#短裤*裤型#阔腿裤*裤腰型#高腰", "summary": "这款短裤采用黑色的牛仔风格,展现出自然随性的休闲韵味,也流露出女性内心的优雅与大气。宽松的阔腿裤型,将双腿的线条修饰的更加纤瘦有型。微微翻卷的裤边,流露出强烈的个性色彩。自然的高腰版型,将双腿拉伸的更加修长,从而打造高挑动人的身材曲线。"} +{"content": "类型#裤*版型#显瘦*裤型#直筒裤*裤款式#纽扣*裤腰型#高腰*裤口#微喇裤", "summary": "都是拖地的直筒裤型,但这条是有几长度的微喇叭高腰,腰围有两个纽扣,可以把小腹收起来,遮肉显瘦不挑身材。中高腰的版型穿着不会束缚,布料是西裤的那种,春夏秋冬都可以穿,不会热而且保暖,两种颜色,带着时装又冷酷的感觉。"} +{"content": "类型#上衣*版型#显瘦*风格#休闲*图案#条纹*图案#线条*衣样式#衬衫*衣款式#拼接*衣款式#腰带*衣款式#不对称", "summary": "这一款带有设计感的衬衣,穿在身上凸显女性的个性和时尚感。它采用了修身的版型,剪裁得体利落,能够贴合身体线条,带来非常合身的穿着效果。衣身采用条纹的拼接,粗细横竖条纹,搭配不对称的下摆,带来时髦的美式休闲风,而且方便凹造型。腰部搭配腰带,凸显腰身。"} +{"content": "类型#裙*版型#显瘦*材质#蕾丝*图案#蕾丝*裙型#鱼尾裙*裙下摆#荷叶边*裙长#连衣裙*裙袖型#喇叭袖*裙款式#拼接", "summary": "精美的蕾丝连衣裙拼接了双层喇叭袖,富有立体的层次美感。举手投足间也给人一抹灵动性,带有甜美减龄的气息,很有时尚感。同时,其鱼尾摆的设计能勾勒出美好的身形曲线,女人味十足。再加上,拼接的荷叶边使得裙身更丰富,有飘逸浪漫的感觉。而整款蕾丝连衣裙在穿着上很显瘦,又衬气质。"} +{"content": "类型#上衣*版型#宽松*颜色#黑色*风格#性感*衣样式#衬衫*衣领型#立领", "summary": "纯黑色颜色使用的这一件较为宽松版型设计的衬衫最大的设计亮点在于衣身上面采用的立领款式的设计哦,这样的款式的设计使得整一件衬衫给人一种很时髦很有个性感的气息,让人一眼看过去就很是喜欢。"} +{"content": "类型#上衣*图案#撞色*衣样式#衬衫*衣样式#外套*衣袖长#长袖*衣款式#拼接*衣款式#口袋*衣款式#纽扣", "summary": "黯哑沉稳的配色有素净的感觉,撞色为视觉上提升亮点。面料的拼接营造假两件的效果,像是长袖衬衫外套着背心,富有层次感,立体的口袋与纽扣设计提炼假两件的细节,看上去更加逼真。背心部分的面料挺括感强,使着装干练挺拔。"} +{"content": "类型#裤*版型#宽松*风格#复古*风格#简约*风格#休闲*图案#复古*图案#印花*裤长#短裤", "summary": 
"BRAND的这款短裤以极具有民族特色的波西米亚风印花为主,复古典雅,轻松凸显出优雅的气质和出众的时尚品味,经典的腰头设计,舒适亲肤。简约而不,大气的宽松版型,包容性很好,穿起来不仅很显轻盈。还洋溢着轻松休闲的气息。"} +{"content": "类型#裙*版型#立体剪裁*风格#简约*图案#线条*裙型#大裙摆*裙衣门襟#拉链*裙款式#口袋*裙款式#拉链*裙款式#收腰", "summary": "简约又气质大方的onepiece~收腰设计可以很好的勾勒出腰部线条大大的裙摆侧面有设计口袋用料足足的立体剪裁细细品味也不乏小细节感背后的拉链设计细节感很好"} +{"content": "类型#上衣*版型#宽松*风格#复古*风格#休闲*图案#复古*衣样式#衬衫*衣样式#外套*衣款式#口袋", "summary": "宽松休闲的bf款型,可以当作衬衫穿着更可以是一件开春小外套。缝接处用明线装点,增加细节,视觉上显得格外生动有趣。袖口扣子设计,俏皮又不失个性,结合胸前口袋的装饰,率性利落,带出浓郁复古的中性气息,帅气又拉风!"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "以淡淡的天蓝色打造的这款连衣裙,整体采用了长款的剪裁设计,配合飘逸的薄纱,显得较为灵动且迷人。裙子为四分袖的剪裁,结合朦胧的蕾丝花纹,呈现出较为精美且大气的穿着美感,是非常出彩的优质裙装单品。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "蕾丝是女人味的至佳代表元素,没有女人不爱蕾丝的,对于蕾丝连衣裙更是有无数丽人追捧和热爱着,它制造出来的气质——精致而优雅。这款连衣裙拿到手一眼就能看出其精美以及品质感。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "这款两件套连衣裙,一眼就能看到清晰的lace,视觉上有凹凸的感觉。几种不同的蕾丝花纹出现在同一件裙子上,充满新鲜感和设计撞击,孕期可以穿,产后哺乳期也方便,上身雅致有腔调,带一点俏皮的女人味。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "性感美腻的一款连衣裙,见到的第一眼就想把所有美好的形容词都用在它身上。重工水溶蕾丝裙,大朵大朵的花型,立体感超强,有筋骨,有挺度,柔韧,自然溢出女人香气,非常吸引人,非常惊艳眼球!"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "一袭优雅大气的蕾丝连衣裙!蕾丝能穿出高品质感,真心不简单!专门到大工厂ding制而来的,优美大气的蕾丝花纹,不同于市面上普通常见的那种花纹,这款很有特色!有质感有风骨,耐品耐看~小立领,柔美的睫毛边,肌肤,风雅撩人。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "蕾丝连衣裙对于女生的吸引更是加倍的,繁复精美的蕾丝天生就拥有着优雅和仙气,这种气质是深入骨髓也是无可复制的,它也似乎充满着魔力,勾住人们的眼球让所有人为之倾倒。加之蕾丝面料触感细腻光滑,更加衬托出女人独特的个人魅力。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "温柔的蕾丝连衣裙不失为小女生的必备,既能让你变得优雅十足,又充满着少女感。衣身采用淡雅的,整体透着一股子高贵典雅的感觉,瞬间提升气质。通透的蕾丝面料,为服装增添了神秘色彩,带来朦朦胧胧的浪漫魅力。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "这件时尚的连衣裙,面料采用的是蕾丝绣,工艺极其繁琐,精致又优雅。袖口与领口的睫毛边细节,增添灵动感,时尚而且不失浪漫。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "精美又迷人的蕾丝钩花,悄悄地盛开在那迷人的连衣裙之上,既有着迷人得视觉效果,又能让的宝贝展现出十足的公主范,确实很美妙哦!单单看那一抹蕾丝钩花,就展现出了熟精湛的工艺,既能够凸显不凡气质,又能缔造独一无二的美感。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "重工定织的蕾丝面料,不同的碰撞,真的让人惊艳,特别清晰的花型,片片花朵风情相连,浪漫中带奢...花纹立体精致,肌理细腻,手感柔软干爽。shou先从面料材质把确保这款蕾丝连衣裙的高品质感。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "春天到了。仙女们都换上了美美的连衣裙,经典充满魅力的蕾丝元素怎么可以少呢,梦幻精致的水溶蕾丝,带有非常甜美的减龄皱褶裙摆,满满少女心。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "美好的春夏季节,可以用一款美美的蕾丝裙来应景,它会穿出你的精致与优雅,让你美得独特与别致。这款裙子上的蕾丝,具有很好的肌理感与挺括度,层次立体丰富,仿佛雕刻上去的一样,蕾丝的存在,让连衣裙别有风情与韵味。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "唯美动人的连衣裙,穿在身上总是能够彰显出迷人的视觉效果,优雅大气个性别致,造就出了非凡的气质,显得魅力非凡。配以精美大气的蕾丝钩花设计,随时随地凸显出了迷人的视觉效果,彰显出十足的时尚魔力。优雅吸睛的蕾丝钩花还能够彰显出无与伦比的神奇身材,缔造出婀娜的美感。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "蕾丝连衣裙对于女生的吸引更是加倍的,繁复精美的蕾丝,天生拥有着优雅和仙气,这种气质是深入骨髓当中无可复制的,它也似乎充满着魔力,勾住人们的眼球,让所有人为之倾倒。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "作为小女孩喜欢的连衣裙,选用少女气质的樱花粉甜美婉约,是小公主们无法抗拒的色彩,不仅可以衬托宝贝那的脸蛋,还能展现出小女孩独有的俏皮与甜美。浪漫的蕾丝材质,让宝宝化身成为童话中的小公主。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙长#连衣裙", "summary": "这款连衣裙,运用了雾的特色理念,将其打造成朦胧般误入的仙女。蕾丝结合的裙摆,加上纯白色色彩基调,如同般高洁,穿上它,你就是神圣美丽的小仙女。"} +{"content": "类型#裙*版型#显瘦*颜色#黑色*裙长#连衣裙*裙袖长#无袖*裙领型#方领*裙款式#收腰", "summary": "非常素雅的纯黑色连衣裙,低调雅致的色彩显得朴素干净,能够很好地显衬肤色。加持无袖方领的剪裁大气利落,给人以明朗亲和的气质感。纯黑色还自带高贵典雅的格调,配上收腰的伞型裙身更是亮眼出彩。显瘦修身的收腰伞摆蓬松轻盈,让上身效果非常有型。的衣襟细节别致出挑,工整精致的包布圆扣保证了整体视感的和谐精致。"} +{"content": "类型#裙*版型#显瘦*颜色#黑白*图案#条纹*裙型#直筒裙*裙衣长#常规", "summary": "比基础款的条纹羊毛衫,更加时髦有趣!常规厚度,内搭外穿都ok,舒适实穿经典优雅的黑白条,直筒修身版,款式简洁耐看,配上细羊毛纱的高弹力,舒适,裹出曲线身材。"} +{"content": "类型#上衣*颜色#白色*颜色#纯色*颜色#绿色*风格#简约*图案#纯色*衣样式#衬衫", "summary": "此次推荐的这款衬衫,就显得有些简约大方。淡雅的纯色系版型配合整个立体廓形剪裁,使得衬衫带有一份低奢的故事感,很精致亦不失时尚。衬衫拥有抹茶绿与纯净白两款选择,酵洗的白色与的绿色,这两款衬衫的颜色表现力都带有自己的味道,淡雅值得你去细细。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*风格#复古*风格#文艺*风格#知性*风格#清新*图案#复古*裙领型#圆领", "summary": 
"宽松的版型,穿着遮肉显瘦,自然复古的色调打底,只用看一眼就感觉拥有清新的心情。精致的圆领设计,显得知性文艺。搭配长款的轻薄裙摆设计,走起路来飘逸感十足,仙气满满,突显女性端庄委婉的一面。"} +{"content": "类型#裙*版型#显瘦*图案#撞色*裙型#百褶*裙型#a字*裙腰型#高腰*裙长#半身裙*裙衣门襟#拉链*裙款式#拉链", "summary": "腰头钉扣的装饰,精致有型大方。后身拉链的设计,穿着舒适方便。青春少女款百褶裙,让你的小公举气质随便锋芒毕露。这款a字型的半身裙,减龄可以打个高分呢。高腰中带有修身效果,让你的身体看点很足,拉长身高视觉感,裙身的彩色纹理带来时兴指数。打造撞色般的效果,视觉上塑造的震动美感。"} +{"content": "类型#裙*版型#显瘦*图案#撞色*裙型#百褶*裙型#a字*裙腰型#高腰*裙长#半身裙*裙衣门襟#拉链*裙款式#拉链", "summary": "立体绒布暗花的设计,精致美观。隐形侧拉链的设计,穿脱方便。青春少女款百褶裙,让你的小公举气质随便锋芒毕露。这款a字型的半身裙,减龄可以打个高分呢。高腰中带有修身效果,让你的身体看点很足,拉长身高视觉感,裙身的彩色纹理带来时兴指数。打造撞色般的效果,视觉上塑造的震动美感。"} +{"content": "类型#上衣*材质#雪纺*风格#简约*图案#蝴蝶结*图案#线条*衣样式#衬衫*衣门襟#系带", "summary": "这条蝴蝶结衬衫裙选用的雪纺面料,上身非常细腻舒适。领口是简约的衬衫领设计,彰显独立干练的女性气质。还有巧妙的腰间系带设计,可以优化身材比例,拉长腿部线条。裙子下摆设计了双侧开叉,走起路来有若隐若现的感觉,非常轻柔浪漫。"} +{"content": "类型#裙*版型#宽松*版型#立体剪裁*风格#复古*图案#复古*图案#电影*图案#线条*裙长#连衣裙*裙衣门襟#系带", "summary": "想尝试电影里的日式和服,又担心。试试这条简化改良版,日式风复古连衣裙。面料凉爽飘逸,前后都采用放大的设计,延伸颈部线条。沉稳的网格设计,给人古风素雅的感觉。立体剪裁的宽松大袖子,十分有设计感,仿佛自来。加上经典系带,系出优雅自信。"} +{"content": "类型#上衣*图案#线条*图案#印花*衣样式#卫衣*衣款式#螺纹*衣款式#抽绳*衣款式#连帽", "summary": "来看看这件卫衣在设计上采用了经典的抽绳连帽设计,可以根据自己的需求随意的调节造型,并且修饰颈部线条,时尚大方。再加上后背的闪电大logo印花,看上去造型独特,彰显出自身的张扬个性,与浓郁的时尚色彩。并且采用弹力螺纹袖口,舒适贴肤,更加保温。"} +{"content": "类型#上衣*图案#线条*图案#印花*衣样式#卫衣*衣款式#螺纹*衣款式#抽绳*衣款式#连帽", "summary": "本款卫衣在设计上采用了经典的抽绳连帽设计,可以根据需求随意的调节造型,并且修饰颈部线条,显得时尚大方。后背的教堂贱猫印花,恶搞趣味浓郁,凸出自身的时尚品味和自身的张扬个性。弹力螺纹袖口,保证手腕活动自如。精致的袋鼠兜装饰,造型独特,美观实用。"} +{"content": "类型#裤*颜色#纯色*图案#纯色*图案#线条*裤型#阔腿裤", "summary": "纯色的配色不免会让人觉得有些单调,但这款阔腿裤只采用了一种颜色,看上去也丝毫不会显得单调。没有了色彩的交织,不过采用干脆利落剪裁工艺制作的它,亦拥有了流畅线条和挺括阔腿裤型,这样的裤子包容性强大,也非常具有魅力。"} +{"content": "类型#裙*颜色#黑色*颜色#深色*风格#性感*图案#撞色*裙腰型#高腰*裙长#连衣裙*裙款式#拼接*裙款式#腰带", "summary": "这款连衣裙最吸引人的地方就是撞色拼接的设计,不仅打破了深色裙身带来的沉闷感,还点亮了整体的造型。搭配上高腰版型,视觉上又拉伸了身材比例,尽显女性高挑个子。而且点缀上黑色的腰带,还能轻松凸显出性感的小蛮腰,女人味十足。"} +{"content": "类型#裙*裙下摆#荷叶边*裙长#连衣裙*裙款式#勾花镂空", "summary": "极具设计感的连衣裙,后背处采用了镂空处理,在腰线和两边都加了灵动的荷叶边设计,增添裙身的俏皮感,又能适当的修饰下背部的肉肉,展现曼妙身姿曲线。"} +{"content": "类型#上衣*颜色#白色*颜色#金色*图案#线条*衣样式#衬衫*衣领型#v领*衣款式#钉珠*衣款式#亮片", "summary": "衬衫纯净白色调,半开v领自然贴合,修饰迷人的脖颈线条。领子金色边装饰,顺延出长长的垂落。后背及胸前鸟儿图案,配色多彩丰富。亮片钉珠立体装点,轻松提升整体质感。"} +{"content": "类型#裤*版型#宽松*材质#混纺*材质#纤维*风格#街头*风格#简约*风格#休闲*风格#青春*风格#性感*裤口#小脚", "summary": "如今的休闲裤将经典与时尚结合,以束脚的工艺诠释着青春的率性感,同时街头感十足。这款由马克华菲推出弹性休闲束脚裤,采用经典的纤维混纺面料制成,整体的舒适度非常赞,穿着弹力十足。加之简约的小宽松版型的设计,穿着不挑身材,适合更多种身形的型男进行选择。"} +{"content": "类型#裙*材质#蕾丝*风格#性感*图案#蕾丝*裙腰型#高腰*裙长#半身裙*裙衣门襟#拉链*裙款式#拉链", "summary": "蕾丝剪裁,在下摆自然的流出须边,犹如性感的睫毛。下摆的四片式做法,让半裙在行走间充满了灵动的韵律。干净工整的高腰头设计,使得穿着效果更显高挑。侧边的隐形拉链,不露,保持着简洁大方的完整造型。"} +{"content": "类型#裤*材质#牛仔布*裤长#短裤*裤腰型#高腰*裤口#毛边", "summary": "牛仔裤总是女孩子们非常关注的裤子款式之一。高腰的牛仔短裤,可以拉长女性的身材曲线,让自己的双腿变得更完美哦。再利用毛边进行点缀,也可以彰显出自己的随性美感呀,不用担心自己穿着很普通啦。又利用防走光的设计,更可以给自己带来保守气质哦。"} +{"content": "类型#裙*风格#复古*风格#休闲*风格#潮*图案#复古*图案#撞色*裙型#百褶*裙长#连衣裙*裙款式#拼接", "summary": "可以给人眼前一亮的假两件连衣裙,裙身的上半部分为休闲随性的卫衣样式,裙身则采用了灵动飘逸的百褶裙摆拼接,毫不费力就可以穿出新潮时髦的层次感,简直就是懒癌患者们的福利款!优雅浪漫的百褶裙摆还小心机地加入了个性的撞色设计,瞬间让整体造型的色彩感更加地丰富显眼,很有复古的feel,而且还衬得人整个人很有气质,推荐啦~"} +{"content": "类型#裙*材质#蚕丝*颜色#纯色*颜色#深色*图案#纯色*图案#渐变*图案#线条*图案#刺绣", "summary": "亲肤透气的真丝皱面料,手感柔软透气性好,穿着轻盈飘逸。裙身斑驳的渐变染色设计,很适合初春时分,比深色俏丽,比纯色风韵,有着万物苏醒的生命力。的花瓣领口温婉大方,展露柔美颈脖线条。裙身精美刺绣图案,时尚美观。过膝的版型微露脚踝,端庄优雅感立显。"} +{"content": "类型#上衣*版型#宽松*材质#针织*材质#混纺*风格#清新*图案#花色*图案#线条*衣样式#开衫*衣领型#圆领*衣袖型#插肩袖", "summary": "春日里不可或缺的基础款针织开衫,是一直都偏爱的简洁清新样式。细腻亲肤的腈纶混纺针织面料,触感舒适绵糯,扑在肌肤上软软的,没有明显的扎肤感。圆领设计尽显颈部线条。古着感的花色琥珀在门襟上,插肩袖柔化肩部线条,袖管宽松适宜,给手臂留足了空间,舒展间不会有束缚感。"} +{"content": "类型#裙*风格#知性*图案#碎花*图案#风景", "summary": "碎花元素的裙装一直是春夏里最美的时装风景,它有着一种魔力,无论你的多刚强,也会被它甜美温柔的的气息包围着,仿佛世界一下子就变得温柔起来。它有熟女的优雅知性,也不失小女生的甜美俏皮,让你在唯美中度过草长莺飞的漫漫春夏时光,融合飘逸的裙摆设计忍不住雀跃,翩翩起舞。"} +{"content": "类型#上衣*版型#宽松*颜色#白色*颜色#红色*风格#复古*风格#休闲*图案#条纹*图案#复古*衣样式#衬衫*衣领型#翻领", "summary": 
"衬衫是比较百搭的单品,能够给你增添精神的气质。以经典的白色调为底,加入红色的条纹纹路在其中,具有复古海军风的时尚美感,加上细微的小小翻领设计,让你穿起来更显率性魅力。宽松的版型设计具有休闲时尚的美感。"} +{"content": "类型#裤*版型#宽松*颜色#蓝色*风格#复古*风格#简约*风格#运动*风格#潮*图案#复古*裤款式#口袋*裤款式#松紧带*裤腰型#松紧腰*裤口#小脚", "summary": "蓝色基调带来一种历史复古气息,简约时尚又潮流,加上独特的裁剪,是它看起来个性又魅力十足。口袋设计不仅美观,而且方便携带物品。松紧带松紧方便实用,适合多种体型穿着。特别的板型设计,臀部和大腿部位一般为宽松剪裁,而在小腿至裤脚部分则慢慢收窄,最大特点便是裤脚位置采用束脚设计,运动风和时尚风兼得。线口缝制细密,做工精细,整体看起来潇洒感十足。"} +{"content": "类型#裙*版型#宽松*风格#文艺*裙型#仙女裙*裙领型#娃娃领*裙款式#抽绳*裙款式#收腰", "summary": "一款优雅文艺的仙女裙,浪漫的星星印花布满裙身,穿着灵动飘逸,仿佛就是不被世俗污染的林间仙子;独特翻边娃娃领,甜美可人的造型,让每个女孩子都怦然心动;宽松的版型,即使是身材丰腴的mm也能轻松驾驭;另配有抽绳收腰,轻松勾勒出妙曼身姿。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#复古*图案#复古*衣样式#风衣*衣款式#绑带", "summary": "这款风衣是属于宽松直筒版型,因此上身显瘦又遮肉。与众不同的是,这款风衣的斜襟式衣领和腰间的绑带设计充满名媛复古范,非常的衬托气质。后背的交叉绑带装点为点睛之笔,更显优雅复古。"} +{"content": "类型#裙*颜色#藏蓝色*风格#知性*裙长#连衣裙*裙袖型#公主袖*裙款式#镶钻*裙款式#钉珠*裙款式#抽褶", "summary": "成熟优雅的藏蓝色连衣裙衬托出女性非凡的气质,加上遍布的钉珠与镶钻点缀,彰显出高级华丽的设计美感,吸睛满满。别出心裁的褶皱公主袖设计优雅大方,随着手臂摆动尽显温婉知性的迷人姿态。"} +{"content": "类型#上衣*图案#字母*图案#格子*图案#文字*图案#线条*图案#印花*图案#撞色*衣样式#衬衫*衣领型#立领*衣款式#纽扣", "summary": "这件印花立领衬衫,在洁白的底色上以撞色字母印花装饰,各字母之间以线条格子图案,使衣衫极具时尚大方的感觉,直筒的版型设计,修饰身材的同时,也能有一定的包容性,能有效的遮掩小肚上的赘肉,后背领口处开叉设计。以纽扣开合,方便穿脱。"} +{"content": "类型#裙*材质#蕾丝*颜色#白色*颜色#浅蓝色*风格#复古*风格#宫廷*图案#格子*图案#蝴蝶结*图案#复古*图案#蕾丝*裙长#连衣裙", "summary": "来自西班牙的的连衣裙,延续了欧洲宫廷的复古风格,用暗红色与浅蓝色的格纹相衬,再配以前片的白色蕾丝和优雅的蝴蝶结点缀,让古典主义与现代时尚相互碰撞,凸显出小女孩俏皮的气息,造就了优雅的小公主形象。"} +{"content": "类型#上衣*颜色#金色*风格#性感*衣样式#西装*衣领型#一字领*衣门襟#三粒扣*衣款式#吊带", "summary": "挺括有型的西装版型,萦绕出一股凌厉干练的职业范儿。为了避免强势而显得,巧妙的加入了金色三粒扣,甜美可爱,软化了硬朗外形。经典的一字肩剪裁,彰显柔美锁骨,性感撩人。加上吊带的设计,更是凸显圆滑的肩部曲线。尽显优雅气质。"} +{"content": "类型#上衣*版型#显瘦*材质#棉*材质#蚕丝*衣样式#衬衫", "summary": "衬衫采用棉和桑蚕丝组合的面料材质制作,具有吸湿性和保湿性好以及亲肤透气和手感柔软光滑等完美优点。能给穿者带来每时每刻都是好心情的美感体验哟。直筒的版型设计,不仅能给人带来满满的修身显瘦视觉冲击感。同时又是完全不挑身材不挑人穿着的哟。"} +{"content": "类型#上衣*版型#宽松*颜色#白色*颜色#红色*风格#街头*风格#复古*风格#简约*风格#休闲*图案#复古*衣样式#外套*衣门襟#拉链*衣款式#拉链", "summary": "这一季度BRAND的秋冬新款外套,依旧融合经典三道杠系列,打造十足复古优雅魅力。这款外套配色上选择了热情又带有复古韵味的红色暗纹设计,碰撞出优雅简约时尚质感,白色三道杠点缀在袖部,更为外套添加经典复古风潮。白色的拉链与胸前的三叶草logo互相呼应,整体配色不会过多,又不至于单调,是足够日常的休闲造型。宽松开襟造型搭配柔软保暖的舒适质感,实属轻松满足秋冬需要的街头造型。"} +{"content": "类型#上衣*材质#牛仔布*颜色#白色*风格#青春*图案#线条*衣样式#风衣*衣样式#外套*衣款式#口袋*衣款式#收腰", "summary": "特别适合春天的一件长款风衣外套,采用到了自带减龄感的牛仔材质,与众不同的白色加上双插口袋更是带来了挡不住的青春感,有着收腰束带的装饰更加凸显纤细的腰身线条。"} +{"content": "类型#上衣*材质#网纱*颜色#白色*衣样式#衬衫*衣领型#翻领*衣门襟#单排扣*衣款式#腰带", "summary": "白色衬衫连衣裙配上抹胸网纱裙子,两件套叠穿的方式很时髦。腰间配上腰带的设计带来时髦的气息,加上网纱下摆与白色内衬凸显丰富的层次效果,透露出白皙的肌肤,很迷人。前面配上单排扣点缀其间,开合简单,穿脱起来很方便。精致的衬衫翻领打造出小巧别致的脸蛋。"} +{"content": "类型#上衣*风格#街头*风格#复古*风格#休闲*图案#复古*图案#印花*衣样式#卫衣*衣领型#圆领*衣袖型#落肩袖", "summary": "经典的圆领设计时尚百搭,手臂两侧经典印花复古有个性。明快亮丽的色调轻松打破了沉闷,带来一丝自由愉悦的气氛。胸前的英文印花点缀着衣身,凸显出满满年轻活力氛围,释放出年轻人任性追逐时尚的态度。休闲时尚落肩袖穿着舒适,印花与卫衣的结合在视觉上呈现出时尚街头风范。"} +{"content": "类型#裙*风格#复古*风格#文艺*风格#简约*风格#清新*图案#碎花*图案#复古*图案#线条*裙型#大裙摆*裙下摆#垂坠*裙长#连衣裙*裙袖长#无袖*裙款式#收腰", "summary": "通常度假风的连衣裙总是有小仙女的,这款连衣裙更是融入了小碎花,清新之余也赋有一种文艺风,围裹样式的v型领口更是带有复古气息,简约的无袖设计还方便了手臂活动,收腰设计配以垂顺自然的大裙摆,修饰腿部线条同时也更具穿搭的灵动美感。"} +{"content": "类型#裙*版型#显瘦*颜色#粉色*风格#清新*风格#性感*裙型#直筒裙*裙长#连衣裙*裙领型#一字领", "summary": "春天来了,夏天还远吗?是时候准备一条美腻的连衣裙啦。粉色连衣裙是很多美少女一样就会爱上的单品,浅粉色不会给人过于甜腻的感觉,却又透着女孩的清新浪漫味道,很显气质。一字领的设计,大胆的露出肩颈曲线,带着几分性感,好身材显露无疑。直筒版型,肉肉女孩也可以穿哦,非常显瘦。"} +{"content": "类型#裤*材质#棉*材质#牛仔布*风格#性感*图案#字母*图案#文字*图案#拼色*裤型#直筒裤*裤款式#绑带*裤口#开叉", "summary": "一款舒适透气的棉质牛仔直筒裤;富有趣味感的裤身处,将拼色的字母绑带交叉而系,新颖别致中透着个性的韵味,为时髦的穿搭增添了俏皮的活力范;侧边对称的大开叉设计,轻松裸露出诱人的美腿,使自信的漫步中洋溢着性感帅气的风范。"} +{"content": "类型#裤*材质#纤维*裤长#九分裤*裤型#哈伦裤", "summary": "使用聚酯纤维为主的材质制作,手感舒适柔软,亲和肌肤,加入少量的粘胶纤维,缔造出色的肌肤穿着效果。腰部腰袢设计,可搭配腰带穿着,增添时尚气息。九分的裤长结合哈伦的版型,穿着舒适美观,带来简洁利落穿着感受。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#简约*衣样式#外套*衣领型#圆领*衣袖长#无袖", "summary": "圆领设计,时尚大方,优雅舒适有型。简约无袖的设计,尽显干练利落,简洁圆领设计,剪裁简洁大气,衬托出一丝禁欲气息,轻松优雅,宽松自在,遮肉显瘦,舒适还保暖,单穿或者搭配大衣外套都是经典,优质面料,舒适透气保暖,不起球不易变形,耐穿手感舒适顺畅。"} +{"content": 
"类型#裤*材质#混纺*风格#运动*风格#休闲*图案#刺绣*裤款式#抽褶", "summary": "运用了带有弹性的面料,还有着抗褶皱的功能,因为其挺括性,可以衬托出男士修长的腿部轮廓。同时,轻薄的款型带着混纺的材质制作出来,更适合于休闲运动的场合,笔挺的造型衬托出腿部的修长感觉。裤兜处的精致刺绣更是彰显出男士独有的魅力。"} +{"content": "类型#上衣*版型#显瘦*风格#运动*衣样式#卫衣*衣款式#连帽", "summary": "刚刚步入春季,温度也随之上升,这时候就需要保暖耍酷兼具的服装来衬托自己,卫衣就是最理想的首选装备。这款具有运动气息的卫衣,采用连帽设计,既能防风保暖,还能将自己率性的一面展现出来;修身的版型,除了能完美的构造身材,还能轻松搭配各种服装,那时尚不仅仅展现一面。"} +{"content": "类型#上衣*材质#蚕丝*材质#网纱*材质#蕾丝*颜色#肉色*风格#性感*图案#蕾丝*衣样式#衬衫*衣领型#v领*衣款式#拼接", "summary": "这款衬衫在布料的选择上下了一定功夫,真丝双绉布料,清透飘逸。双面细微而均匀的纹理,正是因为纹理的存在,使得布料带有一点哑光的质感,素净典雅。深v领口,搭配上门襟处的精致蕾丝,性感迷人,女人味十足。门襟处拼接的肉色密网纱,柔软亲肤,又提供了很好的私密感。袖口处的拼接蕾丝给人一种小女生的可爱感。"} +{"content": "类型#上衣*材质#蚕丝*材质#网纱*材质#蕾丝*颜色#肉色*风格#性感*图案#蕾丝*衣样式#衬衫*衣领型#v领*衣款式#拼接", "summary": "真丝材质的衬衫,结合精美绝伦的设计感,更能凸显造型感。性感的v领造型,修饰颈部曲线更纤长,更骨感。门襟处拼接肉色网纱,若隐若现的性感意味,更具致命诱惑力。领口袖口处的蕾丝拼接,充满律动美感,女人味儿十足!"} +{"content": "类型#裙*材质#羊毛*风格#文艺*裙款式#不规则", "summary": "这款裙子非常的精致,选用的是欧洲的面料,很细腻柔软的全羊毛,手感很舒适。裙摆不规则的设计也是格外的有味道,举手投足更有飘逸灵动的韵味,俏皮又减龄。文艺有质感的深灰色,行走时露出的小腿白皙又修长。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙型#a字", "summary": "版型的处理上是采用了经典的a字版型设计,通过精湛的剪裁工艺,彰显出高级的品质感。裙身表面的蕾丝花纹,精致美观,上身穿着很显气质。内衬里布的贴心设计,追求时髦设计之余更加注重贴心细节,安全防走光,兼顾细节感与设计感。"} +{"content": "类型#裙*材质#蕾丝*图案#蕾丝*裙型#a字", "summary": "这款蕾丝裙的设计亮点在于蕾丝面料以及a字裙摆的设计,采用优质蕾丝面料提升了裙子的整体品质,同时a字版型修饰身材,遮住赘肉,减龄的设计,灵动俏皮,凸显甜美气息,是非常别致的设计亮点。"} +{"content": "类型#上衣*版型#显瘦*风格#街头*风格#复古*风格#文艺*风格#清新*图案#格子*图案#复古*衣样式#衬衫*衣款式#口袋*衣款式#对称", "summary": "这款衬衫采用了清新的格纹风格,经典且不过时,能够彰显文艺复古腔调。前幅搭配了对称的翻盖口袋装饰,又以个性的贴布点缀,使得衬衫不显单调和乏味,更具街头感。衬衫采用的是修身的剪裁设计,能够契合身形轮廓,塑造出挺拔俊朗身姿。"} +{"content": "类型#上衣*颜色#绿色*风格#性感*图案#条纹*图案#线条*衣样式#针织衫*衣袖长#五分袖", "summary": "针织衫采用了绿色的条纹图案,起到视觉上的冲击对比。竖条纹图案能够从视觉上拉伸身材比例,配合心领的设计。展示性感锁骨的同时也可以起到瘦脸的效果,恰到好处的五分袖长,露出白皙修长的手部线条,举手投足之间不失去优雅。"} +{"content": "类型#上衣*材质#棉*风格#简约*图案#撞色*衣样式#衬衫*衣袖长#长袖*衣款式#拼接", "summary": "一款简约的基础款长袖衬衫,版型很时尚也很百搭,整体以简单干净的配色为基调,衣身个性的撞色拼接设计增加了时尚感,显得与众不同,更是能带来一丝丝优雅浪漫的气息。在材质上精选优质的棉面料,穿着舒适透气。"} +{"content": "类型#上衣*版型#显瘦*风格#简约*衣样式#外套*衣门襟#一粒扣*衣款式#收腰", "summary": "帅气中不乏优雅气质的一款外套。精选优质的绵羊真皮面料,质感比较柔软,中和皮衣的硬朗,凸显女性的柔美气质。时尚的翻驳领设计,简约大方,更具造型感。简约的一粒扣收腰版型,修身显瘦尽显曼妙迷人的好身材。"} +{"content": "类型#裤*版型#宽松*材质#棉麻*颜色#米白色*图案#刺绣*图案#撞色*裤型#阔腿裤", "summary": "这条裤子,无论从色彩上还是穿着感上,考虑到季节的特点,从视觉与触觉上给人以清爽与舒适的感受。质地采用柔软又吸湿的棉麻,亲肤又好穿,透气柔软,上身舒适。裤身色彩选用柔和自然的米白色系,与裤前大面积的刺绣工艺撞色,清丽和谐,看起来颜色饱满,明朗大方。宽松的阔腿造型,隐藏小肉肉的同时更加随性。"} +{"content": "类型#上衣*版型#宽松*衣样式#衬衫*衣袖长#长袖", "summary": "宽松的样式,让这款衬衫能够拥有更好的修饰效果,轻松的修饰整体的身形,搭配上细条纹的样式,打造出更为随性的个性魅力,作为整体的设计亮点。超长袖的时尚元素,能够更好的与这款衬衫进行结合,时尚更富有优雅魅力,轻松的让整体更显与众不同。"} +{"content": "类型#裤*材质#棉*风格#简约*风格#青春*风格#清新*图案#印花*裤长#短裤", "summary": "很青春时尚的一款棕榈印花短裤,简约的版型设计,上身特别的时尚个性。清新的印花设计,好穿又舒适。纯棉材质设计,亲肤舒适,透气又清爽。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#蚕丝*材质#蕾丝*风格#清新*风格#性感*图案#刺绣*图案#蕾丝*衣样式#衫*衣袖型#喇叭袖*衣袖型#收口*衣款式#拼接", "summary": "仙气轻薄透视风的蕾丝衫,星星点点的绣花图案有种隐隐约约的含蓄美,不失温油清新小性感。面料精致,上面有一丝丝的blingbling银线镶嵌。的立体真丝绉面料,轻薄丝滑,很飘逸。门襟拼接了一层水溶蕾丝花边精致感满满。宽松的版型藏肉显瘦,任意身材都能驾驭。朦胧的灯笼收口喇叭袖也很加分,甜美迷人,举手投足之间多了一些飘逸感,浪漫唯美。"} +{"content": "类型#裙*版型#宽松*材质#蕾丝*图案#刺绣*图案#蕾丝*裙型#a字*裙型#蓬蓬裙*裙下摆#花边*裙长#连衣裙", "summary": "这款新春连衣裙采用了可爱宽松的a字版型。可以轻松修饰身材上的小缺点。公主一般的蓬蓬裙体现出小女人的可爱精致感。采用了手艺精湛的刺绣工艺打造出高端华美服装品质感。同时还运用了大量的蕾丝元素并将它用花边的形式呈现出来。给人一种高贵典雅的美感。"} +{"content": "类型#裙*图案#条纹*图案#刺绣*裙下摆#荷叶边*裙长#连衣裙", "summary": "连衣裙蓝白条纹,简洁清爽,又很有生动感,美的很实在。靓丽的刺绣花朵点缀蔓延,充满灵气。与条纹来了个碰撞,让视觉充满饱和,也让裙子多了一份浪漫基调。荷叶边,刺绣花朵点缀,增强轮廓感,更显灵动。彰显了女性的魅力。"} +{"content": "类型#上衣*版型#显瘦*风格#通勤*衣样式#马甲*衣长#中长款", "summary": "马甲既是一件可以通勤上班穿,又能日常逛街的拼时髦单品。中长款的设计,这样的长度衬托的比例很好,而且藏肉显瘦,各种身形都很适穿。"} +{"content": "类型#裙*图案#卡通*裙领型#圆领*裙衣门襟#排扣", "summary": "每个小女孩都有一个童话公主梦,充满童真与梦幻的色彩。灯芯绒背心裙,精致可爱的造型,简单的圆领配上背后排扣的点缀,增添细节质感,方便穿脱。衣襟前后的小捏褶设计,端庄大方优雅不刻意。尤其的裙摆出色彩艳丽的卡通贴布绣点缀,尽显天真俏皮的可爱童趣,妥妥的小公主一枚。"} +{"content": "类型#裙*风格#简约*图案#风景*图案#线条*图案#印花*裙长#连衣裙*裙领型#圆领*裙袖型#喇叭袖*裙衣门襟#系带", 
"summary": "素雅的浅粉色在炎炎夏日给你清爽的感觉。一款别致的连衣裙带你领略美丽风景,简约的圆领系带设计衬托出女性脖颈部线条和小巧的脸型,靓丽印花的点缀散发出浪漫柔美女性风情,喇叭袖凸显温婉而又不失时尚,裙摆设计行走间尽显女性自信和魅力。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*风格#性感*衣样式#衬衫*衣领型#v领", "summary": "衬衣控们快看过来,喜欢bf风的可不要错过这款咯。特意打造成宽松的款式,可以完美遮掩你的小肥肉,显瘦效果佳。让人眼前一亮的就是v领领口的设计,展现了你的脖颈曲线美之外更透露了小性感,迷人又可爱,同时非常百搭,适合不同场合的穿搭。"} +{"content": "类型#裙*材质#牛仔布*风格#知性*风格#青春*图案#刺绣*裙型#牛仔裙*裙长#半身裙", "summary": "真心美到cry的一款牛仔半裙!一点点知性,一点点优雅感,气质宁静迷人~运用今年大热流行的蝴蝶元素,一改之前硬朗帅气的设计,增添了很多女性化元素,重工蝴蝶刺绣,每只蝴蝶大小不同,栩栩如生,刺绣非常精美!有蝴蝶翩翩起舞的即视感,蝴蝶不是直接绣上去,而是做好之后再车上去,工艺着实够复杂!"} +{"content": "类型#裤*风格#简约*风格#性感*图案#线条*裤长#连体裤*裤款式#口袋*裤口#卷边", "summary": "生机勃勃的春季自然少不了中性连体裤,帅气又不失干练。柔软透气的面料,简约的v领设计拉长脖颈线条,露出纤细锁骨,时尚迷人。卷边设计修饰腿部曲线。胸部的对称口袋设计,起到了优化胸型的作用,增添性感韵味。"} +{"content": "类型#裤*风格#潮*图案#刺绣*裤长#九分裤*裤款式#破洞*裤腰型#高腰*裤口#翻折", "summary": "花朵刺绣这一时尚元素可是说是在各大秀场频频出现,稍加点缀就很有时尚的feel,这款裤子就采取了花朵刺绣,裤腿上的花朵刺绣栩栩如生,漂亮别致,很有春天的感觉呢。除了刺绣,还有潮流的破洞设计,个性前卫。裤脚做了翻边设计,九分裤的裤长,露出脚踝,高腰的设计,视觉上拉高腰线,很显高挑哦。"} +{"content": "类型#上衣*风格#性感*图案#线条*衣样式#西装*衣领型#v领*衣领型#翻领*衣款式#腰带", "summary": "西装式的大翻领从视觉上很好的拉长了颈部线条,凸显出女性干练大气的魅力,深v领完美得展现出女性饱满的上围,散发着十足的女人味。金属搭扣的腰带装饰很好的隐藏了腹部上的赘肉,勾勒出女性纤细迷人的小蛮腰,金属搭扣避免了整体造型过于单调,瞬间提升了整体气质。下摆开叉设计更显得性感妩媚。"} +{"content": "类型#裤*版型#宽松*颜色#纯色*风格#运动*风格#休闲*图案#纯色*裤款式#绑带", "summary": "这条裤子,采用简单的宽松款式,展示出一种休闲的运动风,穿起来给人一种舒适有型的立体感;长款的设计拉长了腿部的比例,简单的纯色色系,色泽靓丽又大方,给人不一样的视觉;腰间绑带的搭配,打破了常规设计,使用起来收缩自由,实用便捷;整体衬托出满满的轻松休闲。"} +{"content": "类型#裙*版型#宽松*风格#性感*裙型#鱼尾裙*裙领型#v领*裙款式#收腰", "summary": "这款收腰设计的鱼尾裙,让你的小蛮腰完美的展现,穿起来绝对吸睛。v领秀出性感锁骨的同时缩短了上半身的视觉长度,鱼尾裙的魅力在于它宽松的下摆,展现女生优雅和婀娜的体态。"} +{"content": "类型#裙*材质#棉*材质#网纱*颜色#白色*颜色#藏蓝色*风格#清新*图案#条纹*裙下摆#层叠*裙款式#拼接", "summary": "精选纯棉弹力面料,手感细腻舒爽,吸汗透气性能,特别适宜夏季穿着。采用新颖的水手服拼接浪漫纱裙的方式,呈现出不一样的唯美清新感觉。藏蓝色与白色条纹的搭配,为整件裙子赋予了更多优雅的意义。而水手领的运用,更成为视线中最美的焦点,完全展示出女孩的大方、活泼,对于远方的美好向往之情。小飞袖轻松甜美,搭配网纱层叠裙摆更显娇俏与可爱。"} +{"content": "类型#上衣*版型#宽松*材质#天丝*材质#混纺*风格#复古*图案#复古*衣样式#衬衫*衣袖型#灯笼袖*衣款式#拼接*衣款式#纽扣", "summary": "一款柔软清凉的混纺天丝衬衫;富有心机感的前后身处,将自然捏褶作为拼接,赋予身体宽松的余量,使自在的穿搭尽显时髦的趣味感;纽扣开合的灯笼袖设计,既能轻易遮盖不足又可修饰臂部,使自信的举手投足间蔓延着优雅大气的复古范。"} +{"content": "类型#上衣*版型#宽松*材质#针织*颜色#红色*衣样式#针织衫*衣款式#勾花镂空", "summary": "ur这款针织衫运用优雅的枚红色点缀衣身,结合独特的镂空设计,露出女性上身的优美轮廓,更显优雅气质美。宽松的版型包容性极强,能够迎合大众身材,轻松塑造慵懒随性范。最后再辅以柔软细腻的针织面料,打造绝对舒适的穿着体验。"} +{"content": "类型#裙*版型#显瘦*裙型#a字*裙腰型#高腰*裙衣门襟#系带*裙款式#不对称*裙款式#不规则", "summary": "第一眼看到就很喜欢的裙子,腰部时尚的系带设计,不只是很好的装饰,还能显腰身显瘦,高腰的a字版型,上身给人胸部以下都是腿的视觉感。裙摆不规则的设计,让腿部有加长的视觉效果,斜剪裁不对称裙摆烘托柔美气质。"} +{"content": "类型#裙*版型#显瘦*裙型#a字*裙长#长裙*裙款式#不规则", "summary": "BRAND的风格以时尚不失典雅著称,会出其不意打破常规,展现个性。这款半身长裙作为代表,以不规则的剪裁,为整体营造丰富层次,行走时带来轻盈灵动感,行走间飘逸生姿,诉说娴静柔美的气质。结合它a字版型的演绎,将女性的下半身完美遮盖,显瘦效果满分。"} +{"content": "类型#裙*材质#网纱*裙型#网纱裙*裙下摆#压褶*裙长#半身裙*裙款式#拼接*裙款式#不规则", "summary": "这款半身裙不同于其他网纱裙单一的版型,采用了不规则三层拼接,每一层的网纱都做了压褶处理,更好地展示了裙子的层次感,臀部有点大的妹子穿,可以遮住臀部的肉肉,亮点是腰部的腰封。腰封的也做了压褶处理,很好的修饰了腰身。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#牛仔布*风格#休闲*风格#潮*图案#刺绣*衣样式#外套*衣长#短款", "summary": "休闲款式的经典牛仔外套,清洗时髦的牛仔蓝简洁大气,短款衣身加上宽松版型,有着极其突出的显瘦气质。衣摆的毛边彰显出随性不羁的潮流魅力,而精致的刺绣图案更有着非常好的观赏性,帅气风格中透露几分软萌的少女心。"} +{"content": "类型#上衣*材质#针织*风格#复古*图案#条纹*图案#复古*图案#撞色*衣样式#毛衣", "summary": "让人眼前一亮的是很剑走的怪异时尚,撞色的元素作为时髦的主基调,你就能从平庸的风格之中脱颖而出。换上舒适的轻软针织,质感绝佳的毛衣分分钟就你的芳心。自带复古感的红蓝条纹,趣味间隔玩转色彩艺术!"} +{"content": "类型#上衣*风格#街头*风格#简约*风格#青春*衣样式#外套*衣领型#翻领*衣长#短款*衣门襟#系带", "summary": "微凉春日中,最清晰活泼的选择,少不了一件短款外套。这件翻领系带短款外套,衣身选用简洁大方的明线设计,更显十足简约风格。衣袖选择飘逸非凡的系带设计,长长系带更突显街头少女的灵动活泼。而经典大方的短款设计,显露腰身,更显靓丽青春。"} +{"content": "类型#上衣*版型#宽松*风格#休闲*图案#线条*图案#刺绣*衣样式#毛衣*衣领型#高领*衣袖型#喇叭袖", "summary": "这款毛衣属于非常宽松的版型,穿在身上很休闲,大气,包容感很强。小高领的设计,精致可爱不挑人。喇叭袖的设计,很好的修饰手臂的线条。刺绣花朵点缀袖子,举手投足间尽显迷人的魅力。"} +{"content": "类型#裙*版型#立体剪裁*图案#线条*裙腰型#高腰*裙款式#收腰", "summary": "开口可以显得脖颈修长,显得脸小,收腰的立体剪裁可以打造高腰线,视觉上显得身材比例更好。领口圆润的弧度上身之后让脸部线条看起来更加好看。坠感很好的裙摆不会太过沉闷,让你一秒变身大长腿。"} +{"content": 
"类型#上衣*材质#棉*材质#针织*颜色#纯色*图案#纯色*衣样式#开衫", "summary": "这款纯色的开衫采用的是优质针织面料,亲肤柔软,绝对适合宝宝稚嫩的小皮肤。全棉内里设计,守护宝宝整个冬天的温暖,俏皮的彩虹图案设计加上朴素的平针造型。让整件开衫变的生动有趣,带给孩子一个快乐的穿着体验。"} +{"content": "类型#上衣*版型#显瘦*风格#简约*风格#ol*风格#职场*图案#拼色*衣样式#衬衫*衣领型#翻领*衣门襟#系带*衣门襟#单排扣*衣门襟#双排扣*衣款式#抽绳", "summary": "这款衬衫简约素雅的风格加上高级的颜色,适合职场ol的日常搭配,精致的翻领搭配上双排扣的门襟,相比一般的单排扣显得更有设计感,扣子的拼色细节,精致又减龄。腰间内嵌抽绳,前中系带,收放自如,而且可以凸显腰线,非常显瘦、显高。这个抽绳系带的设计还增加了俏皮感,使衬衫显得不那么正式。"} +{"content": "类型#裙*材质#水洗*颜色#黑白*图案#印花*裙型#百褶*裙下摆#压褶", "summary": "这款裙子上有黑白沾染水洗留下的自然印记,看起来就像是泼墨的印花图案,具备年代感,一眼看过去就觉得舒服。而裙摆带点微微的百褶压褶处理,尽享垂感,行走间,飘逸动人。"} +{"content": "类型#裙*颜色#红色*风格#知性*图案#电影*裙下摆#压褶", "summary": "自带女主光环的一套,仿佛带入了90年代电影里。醋酸手感细腻温柔融入了藕粉拼红色更是微妙,请给春天来点“洋气”裙摆的压褶恰到刚刚好,层次感也凸显了出来,配温柔的藕粉基本款上衣更显得人知性温柔。两款无论是单穿还是组合都不失特色~"} +{"content": "类型#裙*图案#刺绣*裙型#a字*裙腰型#高腰*裙长#半身裙*裙领型#v领*裙袖型#喇叭袖", "summary": "干练而展现高雅气质的时尚两件套。温婉而动人的纯白色上衣,大v字领口喇叭袖设计,百搭而彰显优雅气息。搭配的高腰a字半身裙,深黑色的面料和精美的刺绣工艺,雕琢出精致冷艳的视觉美感。"} +{"content": "类型#裙*风格#性感*裙下摆#开叉*裙下摆#荷叶边*裙长#连衣裙*裙领型#v领", "summary": "一款时髦简单的新颖连衣裙,甄选优质高级的舒适面料,上身显得尤为亲肤和自然,同时自带大牌感的光泽感,微微显露轻奢气息。加以衣身的精致v领点缀,带出小女人的性感气息。同时甜美荷叶边点缀袖口和裙摆,加以开衩的设计,平添几分优雅气息,更添层次美感。"} +{"content": "类型#裙*材质#蚕丝*图案#条纹*图案#印花*裙长#连衣裙*裙款式#腰带*裙款式#抽绳", "summary": "这款抽绳连衣裙采用蚕丝的材质,非常的轻盈,运用竖条纹的印花方式,能给人视觉上的冲击,很好的拉长身体的比例。抽绳腰带的设计,更是将女性甜美气息更好展现出来,非常的不错。"} +{"content": "类型#上衣*风格#简约*风格#青春*风格#潮*风格#性感*图案#条纹*图案#撞色*衣样式#衬衫*衣领型#圆领*衣袖长#五分袖*衣款式#拼接", "summary": "这款衬衫采用经典的条纹拼接,不仅丰富了整体层次感,还显出了女性独特的气质。实用圆领款式,包边设计,均匀工整车线,凸显服装的时尚,新潮。撞色条纹拼接袖口,青春感十足,简约不张扬,低调却很有自我格调。俏皮的五分袖设计,视觉看起来性感而又不失端庄,别具一格的袖子设计,凸显出了服装的独特性。底摆做工精致,细条纹收边,不易变形,穿着舒适。整体给人感觉亮丽青春。"} +{"content": "类型#上衣*风格#简约*风格#青春*风格#潮*风格#性感*图案#条纹*图案#撞色*衣样式#衬衫*衣领型#圆领*衣袖长#五分袖*衣款式#拼接", "summary": "这款衬衫采用经典的条纹拼接,不仅丰富了整体层次感,还显出了女性独特的气质。实用圆领款式,包边设计,均匀工整车线,凸显服装的时尚,新潮。撞色条纹拼接袖口,青春感十足,简约不张扬,低调却很有自我格调。俏皮的五分袖设计,视觉看起来性感而又不失端庄,别具一格的袖子设计,凸显出了服装的独特性。底摆做工精致,细条纹收边,不易变形,穿着舒适。整体给人感觉亮丽青春。"} +{"content": "类型#上衣*版型#立体剪裁*材质#棉*衣样式#衬衫*衣领型#翻领", "summary": "经典衬衫版型,遵循布料肌理。立体剪裁,以翻领明门襟的经典造型、配合曲摆的现代人性化裁减,相得益彰,舒适的面料搭配精致缝纫线使成衣领型自然舒展、缝线部位平服工整、牢固耐磨,整体简单素面。面料是棉质的,手感舒适耐磨,单穿或者外搭都非常好看。"} +{"content": "类型#裙*颜色#深蓝色*裙款式#亮片*裙款式#收腰", "summary": "重工亮片裙,静谧的夜空,深蓝底,亮片蝴蝶,华丽丽的,穿上给人一种神秘的奢华感,更是流露出无以复加的梦幻气息,上身仿若,两根细细的肩带道出专属于女人的似水柔情,给人清爽感受的同时传递着打动人心的美感。收腰放摆的经典廓型,美好修饰女性的身材曲线!裙摆在透视纱的笼罩下,上身有一种朦胧梦幻缥缈的美丽。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*图案#格子*裙型#大裙摆*裙型#鱼尾裙*裙长#半身裙*裙款式#不规则", "summary": "这一款半身裙不规则裁剪的设计,穿着新颖别致自然出彩。不挑人的合体微宽松版型,包容性很好,藏肉于无形穿着显瘦。特别是迷人的格子装饰,丰富视觉自然减龄。鱼尾大摆,随风摇曳韵味十足。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*颜色#灰色*风格#街头*风格#清新*图案#条纹*衣样式#卫衣*衣长#短款*衣款式#抽绳*衣款式#连帽", "summary": "这款卫衣是经典的灰色款,色调清新柔和,上身更显肤色白皙!宽松的短款设计,对身材的包容性更大,上身遮肉显瘦。舒适的连帽设计,领口采用气眼抽绳的设计,上身实力减龄。衣身下摆和袖口处,采用彩色亮丽的条纹装饰,更显时髦率性的街头范儿!"} +{"content": "类型#裙*裙下摆#开叉*裙长#连衣裙*裙袖型#插肩袖*裙款式#拼接", "summary": "这款连衣裙别出心裁的与通透的梭织拼接,打造出轻盈飘逸宛如仙女般的气场,着实是惊艳四方。半高的领型,能更好修饰颈脖曲线,插肩袖型自带闲适风范,上身倍感轻松舒适,开叉的裙摆,若有若无的透出纤细的小腿,引人无限遐想,好不诱惑,行走间更显灵动十足。"} +{"content": "类型#上衣*版型#宽松*风格#简约*图案#字母*图案#文字*图案#线条*图案#刺绣*衣样式#衬衫*衣领型#圆领*衣门襟#系带", "summary": "衬衫采用经典圆领设计,优美的领部弧形线条,唯美动人。衣身正面设计字母绣花标贴,小设计大亮点,提升衬衣层次感。宽松的衣袖,以可拆卸的圆环系带装饰,造型新颖独特。简约直筒下摆,不挑人穿着,包容性佳。"} +{"content": "类型#裙*材质#蕾丝*风格#青春*图案#蝴蝶结*图案#蕾丝*裙长#连衣裙*裙领型#v领*裙衣门襟#系带*裙款式#拼接*裙款式#抽褶*裙款式#不规则", "summary": "v领设计的这款连衣裙,再于前襟加入褶皱流行元素,打造出层次丰富的视觉效果,同时也更衬精致小巧脸型。采用钩花蕾丝拼接而成的后背,上身更是女人味十足。不规则裁剪的荷叶裙摆,以及腰部的蝴蝶结系带,灵动间又带点俏皮感。"} +{"content": "类型#裙*风格#淑女*风格#复古*风格#文艺*图案#格子*图案#复古*裙型#大裙摆*裙腰型#高腰*裙长#长裙", "summary": "裙身延用了经典的格纹图案,设计风格非常复古文艺,层次感十足。运用了高腰的设计,凸显身材比例,塑造迷人腰肢。蓬松飘逸的大摆长裙,仙气十足,穿着优雅得体,上身舒适感很好。彰显淑女气质,十分温柔甜美。"} +{"content": "类型#裤*材质#雪纺*风格#性感*图案#线条*裤长#连体裤*裤款式#松紧带*裤款式#飘带*裤腰型#高腰*裤腰型#松紧腰", "summary": 
"很性感的一件雪纺露背连体裤,给人的感觉就是仙仙的,很有灵动感!整个版型可以说是很有设计感,小高腰的设计提拉腰线,显得上下身的比例就很好看。加上腰部又是做了橡筋松紧的设计,不会限制腰的围度,穿脱起来也是很方便的。后背是做了恰到好处的露背设计,但是这个露背又不会显得太夸张,还加了一个飘带设计,更灵动飘逸,整个背部线条会显得更迷人,小性感。"} +{"content": "类型#裙*版型#显瘦*图案#条纹*裙长#连衣裙*裙领型#翻领*裙袖型#衬衫袖*裙衣门襟#系带*裙款式#衬衫式*裙款式#收腰", "summary": "衬衫式的设计让这款连衣裙展现出一种利落大方的感觉,尤其是搭配了气质小翻领和衬衫袖的设计,更加的稳重、自信,展现出都市女性的独立、干练。竖条纹的设计,不仅能够起到视觉显瘦的效果,也让裙子看起来更有立体感。腰间的系带既收腰显瘦,又提升气质。"} +{"content": "类型#上衣*版型#宽松*风格#复古*图案#格子*图案#复古*衣样式#卫衣*衣样式#毛衣", "summary": "此款阔腿裤选用优质面料,顺滑挺括,垂感度佳。宽松的裤腿设计,不,轻松修饰纤细笔直双腿。格纹元素的装点,赋予整体复古韵味,优雅大气。腰部花形穿绳,可随意调整大小,兼具实用度与美观度。无论是搭配毛衣还是卫衣,都不在话下,轻松化身时尚icon。"} +{"content": "类型#上衣*版型#显瘦*颜色#灰色*风格#青春*衣样式#外套*衣领型#半高领*衣长#短款", "summary": "小半高领今年很流行,保暖气质,很有特色的是从上到下共五层不同宽窄的竖条状,上面采用修身,下面越宽条衬出女人的大方优雅,别致款!墨灰色颜色特别适合搭配长款大衣整体有气场短款外套也可以"} +{"content": "类型#裙*颜色#黑色*风格#高贵*风格#性感*图案#线条*裙型#鱼尾裙*裙长#连衣裙*裙领型#立领*裙款式#拼接", "summary": "经典的黑色连衣裙,同时象征女性端庄优雅的品质。恰到好处的领口拼接设计,气质性感。立领的设计,时髦摩登,搭配透视的小斗篷,修饰手臂线条,又美观大方。下摆的鱼尾拼接设计,凸显女性高贵的气质,每个细节都彰显着女性妩媚动人性感十足的气质。"} +{"content": "类型#裤*材质#亚麻*颜色#米白色*风格#职场*裤长#长裤*裤型#阔腿裤", "summary": "米白色的亚麻长裤设计,米白是亚麻的精致代表色,非常的美观和大气,几乎是完美的百搭色,自带典雅气质,亚麻更是舒适和透气的最好材质。收腰设计具有时装的装饰美感,也能很好的修饰身材曲线,金属扣带来质感碰撞,阔腿裤具有职场美感,也有率性气质。"} +{"content": "类型#裤*版型#宽松*颜色#黑色*风格#简约*风格#休闲*风格#潮*图案#字母*图案#文字*图案#刺绣*裤腰型#松紧腰*裤口#小脚", "summary": "宽松舒适的黑色休闲裤与呆萌的校服类似,有了个性的扣环织带、刺绣字母边线的助攻,尽显时髦前卫的潮流范。简洁的束脚设计,营造了裤管的空荡感,让你的腿型更加纤细修长。贴合身形的松紧腰,分割精细,凸显纤细身形,更显休闲感。简约的斜袋设计,方便携带小物件,插兜街拍更有型。"} +{"content": "类型#裙*图案#碎花*裙长#连衣裙*裙衣门襟#系带*裙款式#露肩*裙款式#抽褶*裙款式#收腰", "summary": "精致大方的时尚连衣裙,运用了露肩的设计,露肩配合双层的荷叶摆,展现小女人的柔美姿态。腰部采用了褶皱的收腰设计,勾勒出俊美优雅的动感身姿。配合领口的飘逸系带,更显真挚的时尚魅力。唯美大方的碎花点缀,焕发着典雅的气息。"} +{"content": "类型#裙*颜色#粉色*图案#条纹*裙腰型#高腰*裙长#连衣裙*裙领型#v领", "summary": "连衣裙用粉色做基调,充满少女的活泼感,衬托肤色白皙细腻。v领领口设计可以修饰脸型显脸小,又展露精致锁骨,散发迷人魅力。高腰设计在视觉上拉长了腰线,凸显大长腿显人更高挑。裙身竖条纹设计,丰富视觉效果,展现女性柔美气质。"} +{"content": "类型#裙*裙下摆#花边*裙长#半身裙*裙款式#口袋*裙款式#抽褶*裙款式#对称", "summary": "设计师用花边的设计,在半身裙的后面做了两个夸张的对称口袋,设计很特别。用褶皱花边最大程度上打造视觉上的甜美感觉,并没有显得很突兀。"} +{"content": "类型#上衣*版型#宽松*材质#棉*颜色#蓝色*图案#线条*衣样式#衬衫*衣样式#衫*衣领型#翻领*衣袖型#落肩袖", "summary": "今年大火衬衫,这件衬衫的配色格外的独特,蓝色的色调配上淡淡的棕色,上身之后了撞衫的尴尬。上宽松的版型,柔化了肩部与手臂上的线条,显得纤瘦娇小。小翻领的设计,更好的修饰了颈部的线条。全棉的材质穿着更透气舒适。"} +{"content": "类型#裙*版型#显瘦*风格#休闲*风格#青春*图案#线条*裙型#a字*裙长#连衣裙*裙款式#抽褶", "summary": "玛玛绨连衣裙a字版型的裙式设计,少女感十足,洋溢着满满的青春俏皮范儿,穿着巧妙的减龄。a字显瘦的效果也十分的出色。的裙摆,隐藏住调皮的赘肉,衬托出更纤细修长的腿部线条。较高的包容度也方便不同身型的女生穿着。裙身随意的褶皱,更为整体增添了一丝休闲感。百搭又减龄的款式,轻松的凹凸你的造型。"} +{"content": "类型#裙*材质#网纱*风格#清新*图案#渐变*图案#线条*裙下摆#花边*裙长#半身裙*裙领型#立领*裙袖型#喇叭袖*裙衣门襟#系带*裙款式#拼接*裙款式#木耳边*裙款式#抽褶*裙款式#不规则", "summary": "木耳边半立领,增添褶皱即视感,塑造花边效果,俏皮清新甜美可爱,进一步拉伸颈部线条,提升你的气质。袖口设计为系带结合,塑造喇叭袖造型,展现少女气息,轻松减龄。搭配同色系半身裙,网纱面料拼接,不规则裙摆,更显飘逸与大气,随着你的走姿翩翩起舞,引人注目。整体设计为同色系的两色搭配,形成渐变效果,拒绝单一丰富色调,靓丽清新,打造森系少女形象。"} +{"content": "类型#裙*材质#蕾丝*图案#刺绣*图案#蕾丝*裙型#a字*裙款式#收腰", "summary": "采用利落的剪裁工艺,将胸腰曲线展现的十分立体,收腰的设计,展示出你纤细傲人的身材。a字裙摆自然展开,弥漫出优雅的气息。袖子部分采用刺绣的设计,增添浪漫唯美的气息,饱满圆润的珍珠扣装饰,泛出莹润的雅致光泽,蕾丝装点添出一份小女人的柔美气息。"} +{"content": "类型#上衣*图案#线条*衣样式#外套*衣领型#翻领", "summary": "小外套使用翻领设计,剪裁简洁,给人干净利落的感觉,展现女性少有的帅气。制作工艺匠心独运,线条流畅,将优雅和时尚结合起来,衬托女性气质。款型设计贴身,修饰女性身材。材质优良上档次,做工精细展现不凡品味,适合多种年龄层、多种场合穿搭。"} +{"content": "类型#上衣*图案#线条*衣样式#外套*衣领型#翻领", "summary": "在简单利落的外套上融入利落的翻领设计,呈现出穿着者独有的率性气息,剪裁自然流畅,领边规整而又精致的走线。从视觉上拉高了颈部的线条,结合立体干练的版型,上身就能体现无懈可击的时尚调性。"} +{"content": "类型#上衣*颜色#黑白*图案#线条*衣样式#卫衣*衣领型#v领*衣领型#高领*衣门襟#套头", "summary": "这款卫衣是一件不挑身材的版型,黑白两色基础的色系都非常百搭,卫衣套头的,前面扣子敞开v领的效果,修饰颈部的线条,扣上就是高领的效果,无论是单穿或者是做内搭,都非常好看。"} +{"content": "类型#裙*版型#显瘦*图案#线条*裙款式#勾花镂空*裙款式#收腰", "summary": "以修身收腰的版型勾勒出优雅的裙装,展现出纤细的腰身和修长的腿部线条。在清爽镂空的面料上加入别致的钉钻点缀,平添几分俏丽感,搭配清雅淡雅的色调,赋予衣衫浪漫的气质。"} +{"content": "类型#裙*风格#复古*图案#复古*图案#刺绣*裙长#短裙*裙款式#盘扣*裙款式#收腰", "summary": 
"复古的刺绣工艺,经过现代时尚的演绎,展现出女性柔美气质,同时增强时尚气场。中短裙的版型设计,飘逸的裙摆,小个子也可以穿出大长腿的既视感。精致的后领盘扣设计,古典雅致,衬托出女性独有的韵味。收腰的版型剪裁,勾勒出曼妙身姿曲线。"} +{"content": "类型#裤*风格#青春*风格#性感*裤长#九分裤*裤型#直筒裤*裤腰型#高腰", "summary": "这款休闲裤走的是青春时尚的风格路线,尽显出你的与众不同的个性,体现出你的帅气气场,不失自信的一面。采用了直筒的版型,带来舒适自在的体验感。配合高腰的腰型设计,尽显出纤瘦性感的一面,九分的裤长细节,突显出时尚个性的一面。"} +{"content": "类型#裤*版型#显瘦*风格#简约*裤长#九分裤*裤款式#口袋*裤款式#抽褶*裤腰型#松紧腰", "summary": "经受的强大的考验才会被人们所信赖的BRAND,以其出色的设计理念吸引你的眼球。这条运动裤看着简简单单,实际上却是满满的心机小细节,整体简约而不简单,九分裤的设计,显瘦的同时还能拉长身材比例,让你穿上运动裤也能分分钟变大长腿,细节处的口袋和褶皱松紧裤腰,走向干净利索,彰显高超技术,精致logo凸显质感。"} +{"content": "类型#裤*版型#显瘦*风格#简约*裤长#九分裤*裤款式#口袋*裤款式#抽褶*裤腰型#松紧腰", "summary": "简约不简单的一款裤装,后身松紧收腰设计,时髦显瘦展现优美比例,释放女性姣好身形;前后多口袋点缀,实用方便不乏俏皮,同时还让潮女插兜街拍更有型。匠心的九分的设计,隐约露出一丝打破你的一本正经,无意间让你吸睛无限;不甘于平平无奇,耳目一新的褶皱细节,更显前卫和别具腔调。"} +{"content": "类型#上衣*材质#牛仔布*材质#水洗*颜色#金色*风格#复古*图案#复古*衣样式#外套*衣领型#一字领*衣长#短款*衣门襟#拉链*衣款式#拉链", "summary": "这款一字肩做旧水洗牛仔外套,妥妥的时尚又百搭的初春单品。短款的设计,同时还拉长身材比例。复古的旧金色拉链和好玩的主题四合扣,给衣身注入了新鲜的活力,不管怎么搭配都很赞。"} +{"content": "类型#上衣*风格#性感*图案#创意*图案#印花*衣样式#衬衫*衣领型#v领", "summary": "萌趣十足的印花点缀在衬衣之中,形成一种特别的视觉效果来提升你的魅力,展现柔美又很温和的一面让人发现你的与众不同。v领的设计效果凸显颈部白皙与精致,可以在扣饰开合的作用下将性感的味道加以提升,穿出百变的创意理念。"} +{"content": "类型#上衣*版型#宽松*材质#针织*风格#复古*风格#文艺*风格#潮*风格#性感*图案#复古*衣样式#开衫*衣领型#v领*衣长#短款*衣袖型#落肩袖*衣袖型#收口*衣款式#纽扣*衣款式#罗纹", "summary": "针织开衫体现出的是温婉大方的东方古典美韵,加上小v领的版型却具有了性感潮流的个性。前襟的单排树脂纽扣门襟,穿脱方便装饰感强,突显复古典雅的美感。紧密的编织设计,具有文艺复古气息,宽松的落肩短款版型,适合多种身材穿搭,领口袖口的罗纹包边收口,做工细腻美观大方。"} +{"content": "类型#裤*版型#宽松*裤长#长裤*裤型#直筒裤*裤款式#不对称*裤腰型#松紧腰", "summary": "这样一款颜值与质感皆在的长裤,面料是精选的顺滑面料,具有一定的光泽度。比较宽松的直筒裤型,做了松紧腰的设计,对各种腿型都很友好,更是让你轻松驾驭。侧边加入了双色的织带,一根是较暗的深藏蓝织带,另一根是哑光的涤棉织带,两边还是特别的不对称式。"} +{"content": "类型#裙*风格#复古*风格#性感*图案#格子*图案#复古*裙型#鱼尾裙*裙长#连衣裙*裙领型#v领", "summary": "这款连衣裙穿着轻便而舒适,具有良好的弹性和恢复性能,面料不起皱不起球,穿起来挺括有型。格子底纹加上配色显得非常的时髦复古,前后v领的领口设计性感又优雅,腰间上提的款式独特新颖。微微鱼尾的裙摆款式显得非常的优雅气质。"} +{"content": "类型#裤*图案#线条*裤长#短裤*裤长#五分裤*裤型#直筒裤*裤款式#口袋*裤腰型#高腰", "summary": "纹提花直筒五分短裤,口袋和下摆运用面料的反面,在图案和颜色上打破了直筒五分短裤给人一本正经的感觉,穿起来更加夺取眼球。直筒裤包容性强,不仅可修饰大腿线条,高腰版型更有效拉伸整体比例。"} +{"content": "类型#上衣*版型#宽松*风格#知性*风格#休闲*衣样式#衬衫*衣款式#拼接*衣款式#不规则", "summary": "足以当做连衣裙来穿着的一款衬衫,带有几分知性风格。宽松的版型剪裁,气质休闲且包容性强大。而丰富衣身的不规则拼接,更是成为了格外吸睛醒目的存在,搭载着肩膀处新颖的斜排扣设计,增加整个衣身的亮点,丰富视感个性的时尚感。"} +{"content": "类型#裙*材质#蕾丝*风格#性感*图案#蕾丝*裙下摆#花边*裙长#连衣裙*裙领型#一字领*裙袖型#喇叭袖", "summary": "这一款连衣裙喇叭袖的设计,看起来具有十足的仙气,加上花边下摆的设计,行走之间显得很有美感。时尚的一字肩设计,美肩微露特别性感。蕾丝装饰,妩媚动人风情浪漫。穿上让人深深着迷无法自拔,时尚精致。"} +{"content": "类型#上衣*材质#蕾丝*图案#条纹*图案#蕾丝*衣样式#衬衫*衣领型#翻领*衣款式#拼接*衣款式#纽扣*衣款式#吊带", "summary": "这款衬衫上衣采用了蓝白条纹的图案,再加上它一侧拼接了蕾丝吊带,打破了传统衬衫的设计款式,在设计上更是凸显出它与众不同之处,带来一丝甜美气质。采用了翻领的设计,加上单排纽扣的衣门襟设计,简洁时尚,方便穿脱。"} +{"content": "类型#裙*版型#显瘦*风格#复古*风格#简约*图案#条纹*图案#植物*图案#复古*图案#刺绣*裙长#连衣裙*裙袖长#短袖*裙款式#收腰", "summary": "非常具有细节设计感的连衣裙,灰白色的条纹设计,色调经典又美观,带来恬静的贵族风格,上身横纹简约大气,下身的竖纹设计,更加显瘦和纤细。肩部的两朵小刺绣花卉,显得精致典雅,具有点睛之效,喇叭短袖设计,复古又很时尚,收腰更加突出造型。"} +{"content": "类型#上衣*版型#显瘦*风格#简约*图案#线条*衣样式#针织衫*衣领型#圆领*衣长#短款*衣袖长#七分袖", "summary": "简约百搭的针织衫特别有范,胸前火烈鸟图案充满了个性和活力,精致优雅带来满满的俏皮感。经典圆领设计修饰颈脖,显得简洁大方,短款修身的版型从视觉上拉长身形,构造曼妙身姿,七分袖长度露出纤细修长的手腕线条,举手投足之间绽放出女性的独特魅力。"} +{"content": "类型#裙*材质#蚕丝*裙长#连衣裙*裙衣门襟#拉链*裙款式#拼接*裙款式#拉链*裙款式#木耳边", "summary": "这款拼接的真丝连衣裙采用木耳花边的设计,穿着之后甜美俏皮又灵动。后背隐形拉链的设计,细腻不容易被察觉增加了整体的美观效果。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*颜色#深蓝色*风格#清新*裤长#七分裤*裤型#阔腿裤*裤腰型#高腰", "summary": "纯白和深蓝两色设计的牛仔裤,纯白色为牛仔裤注入清新气质,极简色调清新典雅,点亮了牛仔裤,特深蓝色更加内敛显瘦,一深一浅都极具精致美感。高腰设计突出身材曲线,带来很好的一体连贯美,纤细的腰部曲线展露无余,阔腿七分更添率性,典型的现代美。"} +{"content": "类型#裙*材质#牛仔布*颜色#蓝色*风格#街头*风格#英伦*风格#休闲*风格#青春*风格#清新*图案#线条*裙型#背带裙*裙型#牛仔裙*裙型#直筒裙*裙长#连衣裙*裙衣门襟#排扣", "summary": "这款很有青春气息的连衣裙采用背带的版型设计,剪裁出更具有型利落的线条感,直筒的造型着身更是舒适休闲,结合排扣的装点,衬托出满满的英伦洋气街头感,加上小清新韵味十足的牛仔蓝色,轻松增加了百搭效果。小背带的设计又能让活力气息脱颖而出。"} +{"content": "类型#上衣*衣样式#外套*衣领型#v领*衣门襟#系带", "summary": 
"这款背心裙精选优质的面料,不仅手感舒适,穿在身上还尽显出高级质感,非常时髦大方。v领的设计也是一大亮点,衬显这款外套更加的吸睛优雅,凸显气质。腰部系带的设计还可以很好的勾勒腰线,展现好身材。"} +{"content": "类型#裙*风格#知性*风格#清新*图案#条纹*裙长#连衣裙*裙款式#不规则", "summary": "连衣裙一直是女性们出街凹造型的神器,它兼具着时尚与百搭的特性,让很多美眉们爱不释手。这款连衣裙,下摆处采用不规则的设计,时尚而带有层次感,浪漫飘逸,同时为整体增加了设计感,让你行走间展示女神般的魅力。融入条纹的设计,清新雅静,不仅丰富了整体的视觉效果,同时还能拉长身形,散发出女性优雅知性的时尚气质。"} +{"content": "类型#裙*风格#文艺*风格#清新*图案#蝴蝶结*裙长#连衣裙*裙款式#腰带*裙款式#吊带", "summary": "连衣裙采用麂皮绒面料制成,细腻顺滑有光泽,穿着自然舒适。麂皮绒经过设计师精心剪裁和缝纫,结合清新的素色印染,展现出简洁大方的版型,边缘整齐走线流畅,不易脱线经久耐穿。除此以外,吊带式的设计和腰间的蝴蝶结腰带相呼应,洋溢出浓浓的青春活力气息,打造出小清新的文艺女青年形象。"} +{"content": "类型#上衣*风格#韩版*风格#简约*风格#潮*衣样式#衬衫", "summary": "90后潮搭韩版潮牌衬衫,面料厚实透气性好显成熟稳重气质。看看用这个全新的方式来诠释潮流,穿搭很轻易让你打造出秋冬型男造型。既简约又帅气哦。"} +{"content": "类型#上衣*材质#棉*颜色#黑白*风格#清新*图案#条纹*图案#印花*衣样式#卫衣*衣领型#v领*衣长#中长款", "summary": "纯棉的面料上简单的配色,以大v领口修饰小巧脸型,不幼稚也过于成熟,黑白条纹构成的清新配色洋溢出满满的青春活力,充满了阳光的味道。一别传统卫衣,中长款膝的长度少女的甜美,胸口的印花成环形点缀,唤醒童趣,可爱炸裂。"} +{"content": "类型#裤*材质#牛仔布*风格#街头*风格#性感*图案#线条*裤款式#破洞*裤腰型#高腰", "summary": "在街头破洞元素一直是彰显个性自我与众不同的代表,这条牛仔裤的膝盖为破洞处理,露出腿部肌肤,展现性感魅力,同时也流露出不羁的态度。为了拉长腿部线条,裤子采用的是高腰版型设计,它转化了身体的比例。腿部还加有金属别针装饰,增加了裤子的时髦前卫感。"} +{"content": "类型#裙*颜色#粉色*风格#复古*风格#宫廷*风格#清新*图案#碎花*图案#复古*裙下摆#荷叶边*裙袖型#喇叭袖", "summary": "以淡雅的粉色作为底色,清新的碎花作为点缀,两者互相衬,满满的仙女气息。荷叶边蔓延至裙身,喇叭袖与荷叶边的搭配,宫廷复古轻松重现眼前。"} +{"content": "类型#裤*版型#宽松*材质#牛仔布*风格#运动*风格#休闲*风格#潮*图案#字母*图案#文字*图案#线条*图案#印花*图案#撞色", "summary": "设计师匠心独运将字母印花,巧妙点缀于上半身,远看如同翅膀,颇具新鲜趣味性。黑红、白红撞色设计,制造强烈视觉冲击,倍潮流。圆领套头款式,以基础简洁剪裁碰撞繁复图案,尤为时尚大气。裁剪细腻,线条明快流畅,宽松版型上身舒适,兼备休闲风和运动风。时尚百搭,可搭配休闲裤、运动裤、牛仔裤等。"} +{"content": "类型#上衣*材质#棉*材质#纤维*风格#清新*图案#印花*衣样式#外套", "summary": "在炎热的夏天,小清新的绿意与粉嫩的少女色搭配最是能拂去因天气带来的烦躁,首先在视觉上就是一片清爽,特别设计了多款印花,双面不同印花设计,无需在意正反面,带给你全新的使用体验。它在材质选择上外套为天然纯棉,内里为聚酯纤维,集柔软亲肤,透气吸湿于一体,在温度不下的夏天为你营造干爽舒适的睡眠氛围,让你在夏天也可好睡眠。"} +{"content": "类型#裙*材质#牛仔布*材质#水洗*颜色#浅色*颜色#深色*图案#拼色*图案#撞色*裙型#牛仔裙*裙下摆#毛边*裙款式#拼接*裙款式#破洞", "summary": "裤型很棒。裤脚口做了毛边的设计颜色了比较正的浅色水洗牛仔色,右腿膝盖的位置做了假破洞的设计,破洞是不露肉的设计,在里面拼接了一块深色牛仔布,撞色的拼色给人独特的视觉效果,对于腿型不那么好看的姑娘来说,也是可以放心入的。牛仔棉布面料,舒适度好。颜色是水洗的刚刚好的蓝。"} +{"content": "类型#裤*版型#宽松*版型#显瘦*裤腰型#高腰*裤口#小脚", "summary": "这一款裤子高腰设计,提升腰线自然显高。略微宽松的版式,上身轻松收身显瘦,惬意舒适中自然显大方。小脚裤型,精致优雅自然出彩。精挑细选的棉布材质,亲肤细腻穿着柔软,搭配出街,自然减龄。"} +{"content": "类型#裙*材质#雪纺*裙型#蛋糕*裙下摆#花边*裙款式#抽褶", "summary": "雪纺工艺,更加的突出了质感,不论是前胸的褶皱花边,还是三层波浪式蛋糕裙下摆,都强调了这款裙子的独特,穿起来像优雅的小仙子。质地柔软细滑,搭配漂亮的项链装饰穿在身上回头率高哦。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*颜色#浅蓝色*风格#清新*风格#潮*裤款式#口袋*裤款式#纽扣*裤款式#拉链*裤口#毛边*裤口#小脚", "summary": "一件百搭时尚的牛仔裤,一年四季常备的款式,茵曼的这款是浅蓝色的款式,带有一点小清新视感,很有气质。修身小脚的裤型,能修饰腿型,瘦腿美腿,打造一双纤细修长的大长腿,让你更加高挑纤瘦。裤脚用毛边装饰,增添潮流帅气感,凸显细节。门襟用经典拉链纽扣,两边实用立体口袋,方便插手凹造型,实用又美观。"} +{"content": "类型#裙*风格#性感*裙型#直筒裙*裙袖型#泡泡袖*裙款式#抽褶", "summary": "裙子采用了舒适的直筒廓形设计,版型上适当的松量,利落不松垮,不会显得瘦的过分,也不会显得曲线美不够好,简直就是藏肉的一把好手。袖子采用了别致的抽褶设计,形成了微微的泡泡袖,为整体造型增添了不少的个性感。光是百搭易穿就足够令人疯狂。"} +{"content": "类型#裙*裙长#半身裙", "summary": "黑裙背后的黑纱长至脚踝,与前面到膝盖的半裙形成对比,为原本单一的版型增添了层次感。黑纱的轻柔与飘逸,让你在气场全开的同时,又多了一份柔情与浪漫,让小礼裙瞬间变得独一无二。"} +{"content": "类型#裙*裙长#半身裙", "summary": "拥有宽大裤脚的一条休闲裤,穿在身上似裙子又似裤子。既有半裙的内敛优雅,又有裤装的潇洒惬意,宽宽大大的样式能很好的隐藏腿上的肉肉,拉长身高,看起来又瘦又高。"} +{"content": "类型#裙*裙长#半身裙", "summary": "这款半身裙腰部采用了章贴设计,设计使半身裙更具吸睛亮点,打破裙子单调沉闷感,吸睛更时尚。而且元素层次感丰富,使整体更显时髦。"} +{"content": "类型#裙*裙长#半身裙", "summary": "半身裙备受美眉的喜爱,它可以搭配多种风格。此款半身裙最突出的亮点是腰封设计,采用腰封装饰,让裙子看起来更有设计感和时髦感,让半身裙变的不再单调,而是更加丰富和内涵。同时腰封设计可以更好的拉高腰线,让腿部看起来更加修长。因此,腰封设计是此款半身裙的点睛之笔。"} +{"content": "类型#裙*裙长#半身裙", "summary": "半身裙在裙摆处添加双色织带点缀,与裙身底色形成鲜明对比,打破了单一色调带来的枯燥乏味感,为整体造型注入一丝活泼学院味道。"} +{"content": "类型#裙*裙长#半身裙", "summary": "这款的设计感在意左右两边长短不同,断层的落差感,尤其上身后,能很明显的感受到细节的质感。这款半裙的版型比较修饰身材,能把身材曲线勾勒的很美,腰头的位置剪裁平整,上身很有型。"} +{"content": "类型#裙*裙长#半身裙", "summary": "每一条半裙上面的颜色位置不是固定排列的,定做的,不止颜色很特别,就连版型也是如此,下摆是呈360半圆弧度。关于360度旋转,纱裙为半透明的效果,更为仙气。为了方便日常穿着,还在里面加了一件单独的独立内衬,也就是带橡皮筋的安全短裤,可脱卸的设计,所以实际上是两件套,穿着更加方便。"} 
+{"content": "类型#裙*裙长#半身裙", "summary": "精致的鹅黄半身裙,搭配上美观的花纹款式,穿上身后耀眼夺目,张扬属于自己的个性魅力,春天的到来,刚好这款透气的裙子适合小仙女。"} +{"content": "类型#裙*裙长#半身裙", "summary": "真正好的版型一上身就能感受它带来的惊喜,r就有这个能力。设计师在腰节处做了多道省位,包括间距,大小,长度,都是经过调整达到zui佳。"} +{"content": "类型#裙*裙长#半身裙", "summary": "一件半裙,如同一枚的郁金香花朵,优雅中含苞欲放的温润与羞涩,包容着整个身形,修饰这双腿的曲线,有着浑然天成般自然,无论是走路还是坐立,都是自然雅致的模样。"} +{"content": "类型#裙*裙长#半身裙", "summary": "春夏天谁的衣柜里面没几条半裙?女孩子在夏天怎能,搭配上很是需要,百搭而且还不容易出错。但半身裙款式、穿法基本上都差别,怎么才能脱颖而出是关键,这重点就在挑选的款式上了。"} +{"content": "类型#裙*裙长#半身裙", "summary": "这款半身裙,选用的是优质的高品质面料,做工精湛,面料的质地细腻,手感柔软舒适,纹理清晰美观,穿着结实耐磨,内衬安全裤的设计长度适中,穿着舒适透气。"} +{"content": "类型#裙*裙长#半身裙", "summary": "半裙的腰头是借用旗袍的领型为设计,像穿上旗袍的高雅,散发古典韵味。裙身像孩童的画一样的贴布绣设计,充满童真的减龄感。既有优雅女人味又有少女感的一条半身裙。"} +{"content": "类型#裙*裙长#半身裙", "summary": "这款精心设计的半身裙,能在瞬间呈现出优雅迷人的美感,造就出匀称的身材曲线,迷人至极洒脱的个性的气息惹尽人们的喜爱,随时随地展显出优雅大牌气息,而且有很好的显廋效果,气质非凡。"} +{"content": "类型#裙*裙长#半身裙", "summary": "来自于tao的女童半身裙,采用松紧的腰头设计,恰到好处的弹力不仅能紧贴于宝宝腰部,营造出舒适的穿着感,而且还有助于宝宝在日常穿脱时,更加的省时省力。裙身上精美的图案点缀,充满了时尚又俏皮的气息,宝宝穿着更能彰显天真的个性和满满的活力哦。"} +{"content": "类型#裙*裙长#半身裙", "summary": "这件半裙的裙身是用细腻的网布,轻盈有质感不显臃肿,加上做了袭击的袭击的烫金工艺,整体更有细节感,还有加了同色系里布的设计,让整体的舒适度更好。"} +{"content": "类型#裙*裙长#半身裙", "summary": "层次设计半身裙,上面一层天丝麻裙摆活片设计,从侧面和下摆露出里面的丝裙摆,灵动又不失含,营造半露的迷人风情,让你穿出个性与时尚。"} +{"content": "类型#裙*裙长#半身裙", "summary": "非常百搭的一款半身裙时尚又富有气质,选取的面料非常细腻平实,而且有很好的抗皱感性,轻薄的面料也能给人带来很好的穿着体验,能够让人在行走中自带飘逸感。"} +{"content": "类型#裙*裙长#半身裙", "summary": "雪花半身裙,亮色纱线交错,清晰的编制肌理,低调而又不沉闷,因为面料本身比较别致,所以没有做其他装饰设计,简简单单就很耐看。"} +{"content": "类型#裙*裙长#半身裙", "summary": "此款半身裙采用的是精致的粗呢小香风编织面料,从细节中传递出一种优雅的气质。凹凸不平的表面,丰富了裙身的肌理感,奢华感油然而生。"} +{"content": "类型#裙*裙长#半身裙", "summary": "这款百搭时尚的仙女半身裙,整体设计非常的飘逸随性,穿上之后每个女孩子都能瞬间变成小仙女啦。料子非常的轻盈,透气性也很好,穿到夏天也很舒适。"} +{"content": "类型#裙*图案#印花*裙长#连衣裙*裙领型#立领*裙款式#盘扣", "summary": "这款改良旗袍连衣裙与我们平日里所见的连衣裙大有不同哦,尤其是将如此栩栩如生的印花图案点缀在衣衣上,瞬间柔和了旗袍裙带来的传统正式感,平添了几分女性的趣味性与时尚感;还有那简洁的立领设计以及盘扣的点缀,已然成为旗袍连衣裙的标配,瞬间流露出一丝中式风韵味。"} +{"content": "类型#裤*版型#显瘦*风格#简约*风格#民族风*图案#线条*裤长#连体裤*裤型#阔腿裤", "summary": "民族风连体裤,线条简约透彻,拥有自己冷静的一套处事。绚烂色彩结合阔腿裤剪裁,露出修身的线条,显得十分刻板端正,不。下半身的阔腿裤走特立独行的法则,避免腿型分明的尴尬。将比例很好的隐藏,可以气场。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*图案#条纹*图案#格子*衣样式#衬衫*衣领型#polo领*衣门襟#单排扣*衣款式#拼接", "summary": "polo领BRAND单排扣的经典衬衫版型搭配略宽松的oversize版型,打破呆板,看起来更加轻松随意。在经典的法式蓝白竖条纹基础上增加格子拼接元素,匠心独具。还能从视觉上更加显瘦,而精致细腻的做工将衬衫的质感又提升一个level。"} +{"content": "类型#上衣*颜色#黑色*颜色#灰色*风格#复古*风格#文艺*图案#格子*图案#复古*衣样式#外套*衣款式#拼接*衣款式#口袋", "summary": "柔软的外套肩膀与胸口口袋处拼接着皮革材质,使表面呈现出丰富多变的质感;黑色与灰色相间的格纹之间,点缀着细小的花朵图案,颇具复古文艺气息的同时,又不会过于死板。"} +{"content": "类型#裙*材质#蚕丝*图案#印花*裙长#连衣裙*裙款式#拼接*裙款式#木耳*裙款式#抽褶", "summary": "oz家的这款真丝连衣裙,舒适质感搭配层次丰富的裙摆拼接,丰富而不会觉得繁琐,视觉上尽显优雅的气质!腰部拼接的木耳褶边,加以细碎的褶皱尽显俏皮和可爱感!裙身上的印花,是一种很有艺术韵味的花朵,素雅好搭配的色彩,上身大气又尽显温柔感!"} +{"content": "类型#裙*版型#显瘦*裙长#连衣裙*裙衣长#中长款*裙领型#v领*裙衣门襟#系带", "summary": "这件连衣裙的腰部带有侧边系带的装饰,十分美观也能够很好的勾勒腰部曲线。常见小v领修饰脖领曲线微露迷人锁骨,展现出女性独特的魅力。整体中长款修身的版型巧妙的包裹着身体,得体,拥有十足的魅力。"} +{"content": "类型#裤*版型#宽松*风格#街头*风格#清新*裤型#直筒裤*裤款式#破洞*裤口#毛边", "summary": "带着一点小清新的裤子在设计上一定会有着自己的独特魅力。街头元素风格的破洞设计增加了毛边的做旧,看着也变得不会那么单调,给人一种非常时尚的前卫风格。宽松的直筒版型可以很好的掩饰腿部缺陷,适合大多数人穿着,裤腿则是有着做旧的毛边,折叠起来也非常有个性,整体都是属于非常耐看的款式。"} +{"content": "类型#裙*材质#雪纺*颜色#黄色*风格#清新*图案#碎花*裙长#连衣裙*裙袖长#五分袖", "summary": "这是款自带小清新感觉的连衣裙,穿起来很是减龄,清爽的黄色小碎花在雪纺的渲染下,格外具有层次感。色彩如同水墨画的晕染,那般富有想象力和动感,五分袖的设计格外端庄优雅,彰显出女性由内而外的自信与美丽。"} +{"content": "类型#裤*版型#宽松*材质#棉*颜色#黑色*风格#休闲*图案#线条*裤长#七分裤*裤腰型#高腰", "summary": "这款休闲裤是经典的黑色款,非常的实穿百搭,出街实力吸睛!采用纯棉的面料设计,上身质感更舒适。高腰的款式优化身材比例,七分的长度设计,更显双腿修长。宽松的裤身版型,更好的修饰臀部和大腿的线条。上身后彰显休闲时尚的气质感!"} +{"content": "类型#上衣*版型#显瘦*材质#羊毛*图案#印花*衣样式#毛衣*衣领型#圆领*衣袖长#长袖*衣门襟#套头", "summary": "来自BRAND,黑豹嵌花毛衣。精选100%羊毛材质打造,软糯轻薄,穿着透气。简洁小圆领长袖套头款式设计,略微修身的版型作为内搭或是外穿皆出彩。个性黑豹印花图案装饰,彰显霸气设计细节。"} +{"content": "类型#裙*版型#宽松*颜色#黑色*风格#性感", "summary": 
"candie’s是拉夏贝尔旗下主打少女风的时装品牌,它旗下的这款宽松打底黑色吊带裙,以宽松剪裁构筑版型,助力于甜美少女完成舒适自在的衣着造型。辅以兼具性感与清纯气息的吊带裙样式,尽显轻熟少女妩媚气息。"} +{"content": "类型#裤*版型#显瘦*材质#棉*颜色#白色*颜色#黑色*风格#复古*图案#复古*裤款式#拼接*裤款式#抽绳*裤口#小脚", "summary": "这款复古束脚裤,经典时尚的纯黑色为底色,侧边的白色拼接,给整件裤子增添了一分活力感。精选优质棉质面料,手感舒适,纹理细致,耐洗耐磨。裤脚的罗纹缩口设计,防风有型又显瘦。腰部抽绳设计,可调节松紧度,穿着舒适,无束缚感。"} +{"content": "类型#上衣*版型#宽松*颜色#军绿色*衣样式#风衣*衣领型#翻领*衣领型#小立领*衣长#常规*衣款式#腰带", "summary": "这款过膝长款风衣,采用经典军绿设计,搭配双排口点缀,立刻凸显优雅时尚范儿,袢加上精致肩章,层次感分明,给整体大大加分,形版型则不会显得臃肿。再加上宽松腰带的加持,瘦高的身材立体一秒体现。小立领设计,打破了风衣常规的翻领,勾勒出脖颈曲线。"} +{"content": "类型#裙*材质#雪纺*图案#几何*图案#印花*裙型#百褶*裙下摆#弧形*裙下摆#垂坠*裙长#连衣裙", "summary": "这款连衣裙甄选了柔软的雪纺面料打造,垂坠有型,穿着很舒适。弧形的领口,恰到好处的露出领口,浓浓的女人味。几何印花的造型,超级有女神范,自然百褶的元素,唯美浪漫。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*颜色#白色*颜色#黑色*颜色#黑白*风格#英伦*风格#复古*风格#文艺*图案#格子*图案#复古*衣样式#衬衫*衣领型#v领*衣门襟#系带*衣款式#拼接*衣款式#荷叶边", "summary": "这件宽松款式的的格纹衬衫,特别选用了细密的黑白格纹,经典的黑白格纹可以是“格纹”,暗黑系的配色加入反光的白色纹路形成的小格子,带来了许多活泼跳跃的感觉,又营造出英伦复古的怀旧文艺气息,这种格纹不仅时尚还暗藏显瘦心机。拼接的荷叶边,廓形立体飘逸灵动,垂于胸前更显浪漫甜美。领口黑色拼接系带,打造出v领视觉,精致又洋气。"} +{"content": "类型#裙*材质#棉*风格#文艺*图案#格子*裙型#包臀裙*裙型#鱼尾裙*裙下摆#荷叶边", "summary": "藏青色的格纹图案,将整件包臀裙饰以文艺复古风的设计,100%的纯棉面料,舒适亲肤,贴合身形的同时。前身立体感的荷叶边点缀,结合下身的鱼尾裙下摆,更为凸显浪漫情怀。"} +{"content": "类型#裙*版型#显瘦*颜色#红色*风格#简约*图案#条纹*图案#刺绣*图案#撞色*裙型#直筒裙*裙长#连衣裙*裙领型#圆领*裙款式#拼接*裙款式#不规则", "summary": "富有设计感的一款连衣裙,用红色与条纹的撞色不规则拼接,凸显新颖有个性,吸睛亮眼,不仅丰富视觉感受,更层次鲜明。侧边还有一个刺绣的点缀,精美可爱,让衣身更加时髦靓丽,充满个性帅气。连衣裙的版型是修身直筒的,搭配简约的圆领,方便好穿,也很百搭显瘦,不挑身形。"} +{"content": "类型#裙*版型#显瘦*颜色#白色*风格#休闲*风格#清新*裙型#背带裙*裙腰型#自然腰", "summary": "休闲感十足的一款背带裙,是少女的专属。修身的版型,勾勒出婀娜多姿的黄金曲线。自然腰的打造,彰显出优雅大方的少女感,清新的本白色仿佛在诉说着一个简单的故事,雅致而又十分的显韵味,简单呈现出都市时尚摩登感。"} +{"content": "类型#裙*风格#日系*风格#文艺*风格#知性*风格#性感*图案#格子*裙下摆#压褶*裙长#连衣裙*裙款式#拼接*裙款式#勾花镂空*裙款式#不规则", "summary": "镂空的挂脖领设计个性且十分吸睛,精巧别致的露出性感的锁骨,却不失文艺的气息,洋溢出日系风格的时尚感。不规则的压褶处理,结合衣身与衣袖的双层拼接,不显单调且充满了层次感,还可以遮肉,举手投足间凸显灵动柔美感。裙摆的大胆设计使原本中规中矩的连衣裙瞬间变得时髦又有趣,洋溢出随意风的惬意感。凹凸有致的立体格子装饰,彰显精致感,迸发着时尚亲和力,尽显知性优雅的气息。使整体透露出日系风格的甜美。"} +{"content": "类型#裙*风格#性感*图案#刺绣*裙型#大裙摆*裙长#连衣裙*裙衣长#中长款*裙袖型#喇叭袖*裙款式#勾花镂空*裙款式#收腰", "summary": "一款洋气而性感的镂空连衣裙,整体采用了镂空的刺绣面料,给肌肤带来舒适的清爽感,透视也增加了性感的韵味;自然散开的喇叭袖,微露肌肤性感而优雅;收腰大摆的中长款版型,上身尽显高挑身姿。"} +{"content": "类型#裤*颜色#白色*颜色#黑色*风格#通勤*风格#ol", "summary": "这是一款ol通勤风的西装裤。设计师选用经典黑色,显得腿很直哦!裤脚卷起来露出白色的设计,特别有感觉,打造个性时尚。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*材质#棉*颜色#黑色*风格#复古*图案#条纹*图案#复古*衣样式#衬衫*衣长#中长款*衣款式#露肩", "summary": "issdm这款竖条纹露肩衬衫带来满满复古BRAND风。经典的竖条纹元素时尚减龄更具年轻风采。足量纯棉面料质感绵软舒适穿着更亲肤透气。宽松慵懒的版型轮廓鲜明生动方便自由舒展身躯更展现年轻活力。酷劲十足的纯黑色调优雅高级不做作。个性十足的穿衣肩带设计带来一丝潮酷十足的哥特式风范气场十足。中长款版型显高显瘦更有气质。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*颜色#白色*裙下摆#花边*裙领型#娃娃领*裙款式#拼接*裙款式#飘带", "summary": "白色花边拼接娃娃领,配合飘带设计,妥妥的心机,乖巧的风格,减龄指数飙升;宽松的裙型,自然随身且十分显瘦;裙摆,宽摆设计,增加了整体层次感,既活泼又提气精神。"} +{"content": "类型#裤*颜色#粉色*风格#简约*风格#休闲*图案#拼色", "summary": "的这款拼色休闲女鞋,简约而自然,充满个性化。采用拼色粉色底+经典系带款式,是此单品的一大重要亮点,穿起来既年轻又有活力,特别好看。外型讨巧,包裹性良好,可搭配各种裤型,漂亮又实用。轻盈鞋身设计,让你几乎忘记鞋子的存在,使步伐更加轻松。"} +{"content": "类型#裙*颜色#红色*图案#印花*裙下摆#荷叶边*裙领型#v领", "summary": "大气活泼的红色裙身,会是一抹很独特的存在;精致的v领,能优化颈部的曲线,还能在不经意间,看到迷人的事业线;腰间增加了荷叶边的点缀,既能优化腰部的曲线,还带着立体的美感;精美的印花,装饰着衣身,丰富了裙身的色彩,精美有韵味会吸睛;轻薄的材质,透气性保护肌肤。"} +{"content": "类型#上衣*颜色#白色*风格#简约*衣样式#衬衫*衣款式#荷叶边", "summary": "这款衬衫崇尚简约风格,一切从简但又不失别样质感。简单白色的运用,看起来毫无瑕疵,有种的气质。前襟和袖口的荷叶边设计,使整体散发着优雅清纯的韵味,时尚无比。"} +{"content": "类型#上衣*版型#宽松*材质#棉*颜色#纯色*风格#休闲*风格#清新*图案#纯色*衣样式#开衫", "summary": "拥有春天般清新的纯色开衫,在春季怎么能少。基础的纯色系渲染,在搭配上能与衣橱所有的衣物组合,打造宝贝帅气的模样。宽松随性的版型,穿着休闲又日常,精选柔软的棉面料,让宝贝在春季穿着刚刚好,清爽又透气,享受舒适的穿着感。"} +{"content": "类型#裙*版型#h*风格#简约*图案#印花*裙长#连衣裙*裙领型#圆领", "summary": "经典的h型版型,打造出精致的连衣裙。秀气简约的印花,让你更具名媛韵味。简洁大气的圆领设计,更能修饰优美脖颈和脸型。尽情彰显女性优雅的魅力。暗色系的颜色,低调又有内涵。连衣裙的面料也很舒适,给你亲肤新体验。"} +{"content": "类型#上衣*颜色#纯色*图案#纯色*图案#线条*图案#撞色*衣样式#风衣*衣款式#口袋", "summary": 
"简洁大气的风衣廓形,自带着飘洒的自由属性,清爽的没有冗杂的线条,只用纯色与材质彰显内心的纯粹气质。两道斜插口袋轻巧在两侧,可以舒适自在地安放双手。正中点缀着一只撞色小扣,轻巧透出一点玩趣的少女个性,打破单一色调点亮整体。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*风格#休闲*图案#字母*图案#文字*图案#刺绣*裙型#牛仔裙", "summary": "时髦帅气的字母刺绣元素设计,很能突显女性青春活力的感觉。宽松休闲的长款牛仔裙款式,显瘦又遮肉,随性又让人轻松自如哦!百搭又个性十足!"} +{"content": "类型#上衣*版型#宽松*材质#网纱*颜色#白色*风格#复古*风格#性感*图案#斑马纹*图案#复古*图案#创意*衣样式#卫衣*衣款式#拼接", "summary": "这款由黑白色打造的卫衣,打破传统卫衣的局限性,衣袖处采用网纱拼接,复古优雅中充满着性感魅惑。宽窄不一的斑马纹,打破世俗的旧看法,创意无限让人浮想联翩。上身的效果宽松舒适,却能勾勒出前凸后翘的好身材,尤显做工精致。"} +{"content": "类型#裤*材质#棉*材质#牛仔布*材质#混纺*风格#复古*风格#简约*风格#休闲*图案#复古*图案#撞色*裤型#铅笔裤*裤款式#拼接", "summary": "这款铅笔裤采用棉质混纺面料,吸湿透气,手感厚实,穿着舒适。裤身两侧经典复古对称插袋,随意休闲,具有良好的实用性。设计师在简约的版型设计上添加了撞色拼接的元素,让简约的牛仔裤多了几分街头时尚腔调,同时又不失减少原本的复古调性。"} +{"content": "类型#裤*版型#宽松*材质#水洗*风格#运动*裤口#小脚", "summary": "水洗做旧工艺让整款裤子的色彩感得到很好的调试,让裤子不显平凡枯燥。而宽松的裤腿加上小脚的设计,增加了裤子穿着的舒适性还不显拖沓累赘,更加便于日常出行和运动。"} +{"content": "类型#裙*版型#宽松*材质#针织*风格#高贵*图案#线条*裙型#a字*裙型#鱼尾裙*裙领型#圆领*裙领型#高领*裙款式#钉珠*裙款式#抽褶*裙款式#收腰", "summary": "舒适的圆领,修饰颈部线条,时尚狐狸头针织图案,立体生动。高领保暖,褶皱弹力,鱼尾裙摆,充满浪漫风情又结实耐穿,风格精致、迷人而不失物料的轻薄、飘逸的特性。重工钉珠点缀,轻奢华丽的质感璀璨闪耀光泽。收腰a字版型收腰塑型效果好,上身高贵名媛范,宽松的设计不挑身材。"} +{"content": "类型#裙*版型#显瘦*图案#线条*裙下摆#开叉*裙长#半身裙*裙款式#不规则", "summary": "这款半身裙采用贴身版型设计,有一定的廓形度,上身舒适无束缚。正面不规则斜向小开叉,隐约露出腿部线条,非常显瘦。满的提花猫咪面料,设计感十足,为你赚足回头率。"} +{"content": "类型#上衣*材质#蕾丝*颜色#黑色*图案#创意*图案#撞色*图案#蕾丝*衣样式#外套*衣长#短款*衣门襟#拉链*衣款式#拼接*衣款式#螺纹*衣款式#拉链", "summary": "全新的时代,让我们摒弃千篇一律的设计。极具创意个性的黑色蕾丝短款外套,给了我们独特的视觉感受。黑色短款上衣能够拉高腰际线,修饰身形,塑造出大长腿女神形象。衣摆处的撞色螺纹拼接,彰显着新时尚趣味。而金属拉链又毫无突兀的期待装饰的作用,让衣身看起来更加时尚大气。"} +{"content": "类型#裤*材质#牛仔布*颜色#黑色*颜色#黄色*风格#复古*图案#字母*图案#文字*图案#复古", "summary": "来自BRAND的牛仔裤,依旧采用了复古的黑色牛仔布来塑造,并将黄色的车缝线密集缝制,让黑色的裤身添入了几分靓丽感。正面设计以简洁款式为主,背部则是经典的大m从腰头下一只贯穿至裤腿上,并将其印制出黄色的字母装饰,让经典得到颠覆,使得裤身潮感十足。"} +{"content": "类型#上衣*材质#天丝*衣样式#风衣*衣领型#翻领*衣款式#腰带", "summary": "BRAND这款天丝轻薄风衣采用了层次感的翻领设计,显得十分有个性,展现个人气质。腰间带有可拆腰带设计,方便人们秀出腰线,展现好身材。"} +{"content": "类型#上衣*材质#天丝*衣样式#风衣*衣领型#翻领*衣款式#腰带", "summary": "BRAND打造的一款比较适合初春搭配的薄款风衣,采用大翻领的设计,更加的潇洒,更能凸显你的气场。可以拆卸的腰带设计,随心搭出专属自己的风格,凸显你的魅力。采用的是天丝的面料,轻盈又爽滑,穿起来更加的舒适。"} +{"content": "类型#裤*颜色#绿色*风格#青春*风格#清新*风格#性感*图案#条纹*裤长#连体裤*裤型#阔腿裤", "summary": "连体裤是十分减龄又有范的单品,上身不用担心穿搭,又轻松显气质。这款连体裤是十分清新的绿色设计,淡淡的条纹极具气质范,缝的包边彰显精致的做工。裤子是无袖的设计,具有性感的气息,阔腿裤的版型搭配摇曳的荷叶边,更显俏皮和活力,也带来满满的青春范。"} +{"content": "类型#裙*风格#淑女*风格#英伦*风格#复古*风格#文艺*图案#格子*图案#复古*裙下摆#荷叶边*裙长#连衣裙*裙领型#娃娃领*裙衣门襟#系带*裙款式#抽绳", "summary": "最近时尚界大玩复古风潮,90年代的复古裙型,又再度流行起来。像这样一件极具文艺与复古味道的连衣裙,将双层荷叶边塑造娃娃领效果,甜美中透出怀旧的情调。格子图案的加入渲染英伦气息,让身上气质更显优雅干练。腰部的抽绳系带,让裙摆更显蓬松效果,展现淑女的一面。"} +{"content": "类型#裙*颜色#藏蓝色*风格#文艺*裙型#百褶*裙下摆#垂坠*裙款式#亮片", "summary": "对于爱好学院风的文艺少女来说,百褶裙简直就是的标志。百褶裙是学院风的经典单品,可以说是永不过时。这条百褶裙采用了文艺的藏蓝色,看起来温和低调充满了宁静的气息。简单的设计搭配亮片装饰让这条裙子穿起来更加少女。面料的原因让裙子拥有良好的垂坠感,穿在身上不会显得臃肿。这样一条裙子完全可以衣柜。"} +{"content": "类型#裙*颜色#黑白*风格#性感*图案#拼色*图案#线条*裙下摆#荷叶边*裙长#连衣裙*裙领型#一字领*裙款式#露肩*裙款式#收腰", "summary": "这是女神款一字领连衣裙,简洁性感的露肩一字领口设计令优美的肩颈线条一览无余。充满立体结构感的夸张袖口和下摆丰盈荷叶边裙摆,与收腰裁剪形成廓型上的对比,加以经典的黑白拼色,更是本季不容错过的主打元素。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*图案#印花*衣样式#衬衫*衣袖长#长袖*衣门襟#单排扣", "summary": "本款上衣整体采用较为宽松的直筒版型设计,藏肉显瘦,不挑身材,适合各种身形的人穿着。飘带领的领口设计,使得本款衬衫穿在身上看起来十分的甜美可爱。单排扣的衣门襟设计,又使得本款衬衫十分的经典大方。精美的印花图案装饰,使得本款衬衫不显得单调,上身给人一种独特的时尚魅力。长袖的设计,更加的贴合手臂曲线,上身更加的舒适贴身。"} +{"content": "类型#裙*材质#天丝*裙长#连衣裙*裙袖长#长袖", "summary": "这一款廓形简洁的长袖连衣裙,我们选用了优质的天丝面料,天丝材质细腻顺滑,手感柔软舒适,却不失筋骨,有着爽滑的触感,非常适合做春夏季的时装搭配。"} +{"content": "类型#上衣*版型#显瘦*风格#通勤*风格#运动*风格#休闲*衣样式#衬衫*衣领型#翻领", "summary": "简单配色和基础翻领设计让这款衬衫看起来大方利落,不加修饰反而更具穿搭性,日常休闲、运动出游甚至是通勤场合皆可穿着;且修身版型设计显高不挑人,驾驭轻松穿出绅士范;辅以袖口的搭扣让穿法灵活多变,时尚出街。"} +{"content": "类型#裙*颜色#纯色*图案#纯色*裙型#百褶*裙长#连衣裙*裙款式#木耳边*裙款式#抽褶", "summary": "有时候纯色布料看多了会造成视觉上的疲劳,但这款连衣裙却不会。虽然其主体部分只采用了一种面料,但在它的裙摆处融入了细腻褶皱工艺处理,所呈现出来的是层次丰富的百褶裙摆,它结合前襟的木耳边装饰,给连衣裙增添了浓浓的甜美气息。"} +{"content": "类型#裙*版型#宽松*材质#棉*材质#雪纺*图案#刺绣", "summary": 
"胸襟处彩色花朵刺绣,充满民族风情,透露一丝古典美感。优质雪纺面料配上宽松裙摆样式,不仅活动方便,还呈现飘逸灵动美感。做工整齐细腻,一丝不苟,彰显高端品质。优质纯棉内里,亲肤柔软,呵护宝宝的每一寸娇嫩肌肤。"} +{"content": "类型#裙*材质#针织*颜色#粉色*风格#简约*裙下摆#开叉*裙领型#圆领*裙领型#v领*裙衣门襟#系带*裙款式#亮丝*裙款式#拼接", "summary": "针织亮丝的材质,把人带入未来科技感的新领域。简约的拼接小圆领,区分层次感。藕粉色闪着微微的星星光泽,好似人鱼一般的梦幻迷人。开叉系带的袖口,摩登时尚。外搭的v领吊带裙,具有减重的视觉效果。"} +{"content": "类型#裙*风格#性感*裙型#包臀裙*裙下摆#荷叶边*裙腰型#高腰*裙款式#不规则", "summary": "高腰的版型,使双腿看起来更加修长,实现了身材比例的完美分割。包臀设计,展现出曼妙的身材曲线,散发出性感的女性气息。不规则的荷叶边设计是整条裙子的亮点,上身更是造型感十足。"} +{"content": "类型#裤*风格#通勤*风格#青春*图案#条纹*图案#印花*裤型#阔腿裤", "summary": "阔腿裤样式和竖条纹元素结合,更好的修饰身形,掩藏身材上的小缺点,显得身高腿长。作为近两年流行热点,阔腿裤利落大气的裤型更加适合通勤风格,增强女性职业感,气场强大。精致的印花蝴蝶穿梭在竖条纹间,增添几分灵动和有趣。"} +{"content": "类型#裙*版型#宽松*材质#蕾丝*颜色#黑色*风格#知性*风格#性感*图案#蕾丝*裙长#长裙", "summary": "这款宽松的长裙,穿着起来有着性感的感觉。精美的蕾丝花装饰衣身,既充满年轻女孩的活力,又展现着知性优雅的熟女风范。黑色调自带一分清冷气质,米色调轻松打造青春少女风,两款可满足你任何场合的需求。"} +{"content": "类型#裤*版型#宽松*材质#牛仔布*颜色#白色*风格#休闲*风格#青春*图案#条纹*图案#撞色*裤型#直筒裤*裤腰型#高腰", "summary": "这款很有青春气息的牛仔裤采用白色作为色调设计,撞色的明线的条纹勾勒其中,衬托出十足的活力气息,同时又营造了视觉上的立体感。宽松的直筒版型轮廓不挑身材,穿搭起来慵懒而休闲,自由驾驭感十足。高腰的版型剪裁着身更具舒适度,贴合不同人的需求。"} +{"content": "类型#上衣*材质#涤纶*风格#简约*风格#知性*图案#拼色*图案#线条*图案#撞色*衣样式#外套*衣领型#圆领*衣袖型#落肩袖", "summary": "raive这款粗花呢外套,采用涤纶材质打造,触感柔软舒适;整体采用杂色拼色的设计,色彩搭配和谐,温婉又不失活力;简约的低圆领,能够柔和脖颈线条,打造天鹅颈;慵懒型的落肩款式,彰显随性自如格调;撞色勾边处理,能够冲击视觉亮点,活跃整体氛围,打造优雅知性的名媛气质。"} +{"content": "类型#上衣*版型#显瘦*图案#线条*衣样式#卫衣*衣款式#抽绳", "summary": "七格格卫衣连衣裙在腰部处加入了可调节抽绳设计,使裙身的版型样式获得巧妙的提升与蜕变,层次感明显的同时很好的勾勒了腰部处的线条轮廓,缔造出纤细的视觉美感,傲人小蛮分钟。可调节抽绳的融入也带来了很大的实用性,以便于在上身穿着时可以根据自己的腰围进行适当灵活的调节,就算是微胖型的姑娘也可以轻松灵活的驾驭,既时髦又显瘦。"} +{"content": "类型#裙*版型#显瘦*材质#天丝*风格#青春*图案#格子*图案#线条*裙下摆#荷叶边*裙领型#v领*裙款式#不规则*裙款式#收腰", "summary": "格子是永不褪流行的经典元素,搭配天丝质感的吊带裙,更显别致优雅。v领设计增加精致感,能很好的修饰脸部线条。同色系带收腰提高腰线,勾勒曲线更显婀娜。不规则荷叶下摆增加层次感,修饰腿部线条,更显瘦显高挑。"} +{"content": "类型#上衣*风格#简约*风格#性感*图案#线条*衣样式#针织衫*衣领型#一字领*衣门襟#系带", "summary": "能够让你在一票极简风格中,领略到不一样美感的针织衫甚是独特!一字领带出流畅的设计线条,加入一点性感的小心思,就是肩部的系带穿绳,个性十足的同时,展示出性感的锁骨线,让你在简约中美美的迎来春天的穿搭!舒适贴身的款式,也是属于实穿又时髦的BRAND!"} +{"content": "类型#裙*裙下摆#开叉*裙领型#西装领", "summary": "一款很显女性风骨的单品,穿出不凡气质。优雅的西装领设计,经典时尚增添女性干练气场;裙摆式开叉下摆,带来更多活动空间,自由随性,穿出洒脱气质。"} +{"content": "类型#上衣*版型#显瘦*版型#立体剪裁*材质#水洗*风格#休闲*衣样式#衬衫*衣领型#翻领", "summary": "这款军事徽章休闲衬衫选用特色牛津纺面料制作,经过怀旧水洗工艺处理后,复刻出自然落色效果,更显时尚美观;翻领的设计让衣身更加具有时尚感,干练精神,更显年轻活力;而修身版型采用立体裁剪设计,提升时尚气质,彰显男士独特魅力。"} +{"content": "类型#裙*版型#显瘦*颜色#白色*图案#条纹*裙腰型#高腰*裙长#半身裙", "summary": "这款BRAND的半身裙,采用高腰修身的剪裁。其在视觉上巧妙的修饰着身材曲线,较短的裙摆露出修长的双腿,瞬间穿出大长腿的既视感。除此之外,其白色条纹的裙身点缀,具有较强的视觉冲击,在个性吸睛的同时,尽显不凡的穿搭品味。"} +{"content": "类型#上衣*版型#显瘦*风格#通勤*图案#花色*图案#印花*衣样式#衬衫", "summary": "这一款衬衫创造性的运用了印花元素,是整件衬衫最抢眼的地方,与其他衬衫不同,这款不论是在花色选择还是颜色搭配上都别具一格。前面的竖式剪裁更是特别,显瘦又前卫。上班族可以参考的方式将衬衫扎进半身裙中,时尚感浓浓的通勤风同样粉无数!"} +{"content": "类型#上衣*材质#棉*颜色#白色*颜色#蓝色*风格#淑女*风格#高贵*风格#清新*衣样式#衬衫*衣领型#翻领*衣袖型#泡泡袖", "summary": "这是一款专为女童打造的精致淑女范衬衫。拥有清新柔美的蓝色与白色两个色系,纯粹的色彩似将美好的梦想牢牢守护,同时凸显气质的高贵与优雅。采用纯棉面料并添加氨纶弹力成分,上身包容感更强,配合立体多片式剪裁,更显纤细窈窕的腰身曲线。小巧的翻领,易于衬托宝贝乖巧圆润的脸蛋,而压褶泡泡袖肩的设计,更是将甜美与浪漫尽情挥洒。"} +{"content": "类型#裙*版型#宽松*材质#棉*风格#复古*风格#清新*图案#复古*图案#刺绣*裙长#连衣裙*裙衣门襟#系带", "summary": "一款柔软透气的棉质连衣裙;具有复古韵味的绣花图案,巧妙的蔓延在袖部,精致唯美又不失趣味的设计感,为单调的裙身赋予了清新素雅的韵味;腰间可调的同色系系带,收紧则凸显纤腰穿出优雅范,松弛而系则显宽松穿出随性的魅力。"} +{"content": "类型#上衣*版型#宽松*颜色#纯色*风格#淑女*风格#简约*风格#清新*风格#职场*图案#纯色*衣样式#衬衫*衣款式#绑带*衣款式#收腰", "summary": "这款连衣裙采用衬衫设计风格,尽显职场简约干练气息,整体版型比较宽松,但是在腰部添加绑带设计,有收腰效果,勾勒完美身材曲线。衣身的纯色面料制作,简约大方,打造甜美淑女气质,轻薄面料自带垂感,清新时尚。"} +{"content": "类型#裙*风格#休闲*图案#条纹*裙下摆#开叉*裙长#半身裙*裙衣长#短款", "summary": "简洁的短款半身裙,在侧边做了两条条纹的装饰,让裙子变得休闲时髦起来,更具百搭性。裙摆的开叉设计,让你穿着更舒适,无束缚感。"} +{"content": "类型#上衣*图案#刺绣*衣样式#衬衫*衣门襟#系带", "summary": "古典风格的衬衣,选用素雅的颜色,聚集仙气。领口系带设计,翩翩东方少年的感觉。狐狸图案刺绣,打破沉闷,乐趣,下摆两边大的开叉,给腰身更多一点的自由空间。"} +{"content": "类型#裤*版型#宽松*颜色#纯色*风格#休闲*风格#潮*图案#纯色*裤长#短裤*裤款式#口袋", "summary": 
"这件韦恩泽维尔宽松休闲纯色短裤,本身在设计风格中就是非常的与众不同的,主要就是体现在裤子上的翻盖口袋装饰,让一条再简单不过的裤子展现出不一样的潮流。裤子采用的宽松的版型设计,让你不管有再粗的腿也能很好的穿着,采用的优质面料,透气吸汗,也是非常的亲肤。"} +{"content": "类型#上衣*风格#休闲*风格#潮*图案#印花*衣样式#开衫*衣样式#毛衣", "summary": "这是一款集柔软凉爽。不易产静电、起毛和起球为衣身的开衫,在我们印象中。提到开衫可能会联想到毛衣开衫,但是这款开衫是夏季的,透气又舒适。独特的印花底纹潮流时尚,流苏的装饰点缀个性新颖。蝙蝠版型设计休闲又时髦。"} +{"content": "类型#上衣*版型#显瘦*材质#棉*风格#简约*风格#性感*图案#线条*衣样式#针织衫*衣领型#v领*衣款式#罗纹", "summary": "这款来自massimodutti的针织衫,百分百全棉材质,自带亲肤触感,具有较好的弹力和透气性,结实耐穿。整体的版型简约大方,在修身的廓形下尽显高挑纤细的身形魅力。v领的领口直观展现颈部的线条和傲人的锁骨,性感十足。领口和下摆加入的罗纹束口,干练大气。"} +{"content": "类型#裙*风格#复古*图案#复古*图案#线条*图案#印花*裙下摆#荷叶边*裙长#长裙*裙袖型#喇叭袖*裙袖型#收口*裙衣门襟#系带", "summary": "一袭飘逸长裙再点缀甜美印花图案,温婉间不失清爽感,彰显了优雅气质。荷叶边收口的喇叭袖设计经典复古,体现了做工的精致繁复,增添了浪漫的风情。腰部的系带设计拉长了腿部线条,看起来更显高挑纤细。"} +{"content": "类型#裙*版型#显瘦*风格#清新*图案#植物*图案#印花*裙下摆#荷叶边*裙长#连衣裙*裙款式#收腰", "summary": "设计师将清新的花卉印花元素使用在裙子的设计当中,结合上荷叶边,赋予了连衣裙更多几分的甜美气质。收腰版型的设计非常贴心,可以避免裙子的腰身过于宽大,穿上之后,显瘦效果颇为不错,能够让你的身形看着更为窈窕。"} +{"content": "类型#上衣*风格#简约*衣样式#风衣*衣门襟#双排扣*衣款式#拼接*衣款式#绑带", "summary": "所有的时尚都离不开经典的延续。这款风衣的设计经典而简约,上身却自然流露出一份洒脱感。翻驳领挺括有型,加之衣片拼接立体感十足。双排扣开合飒爽帅气,开合穿着都有范儿。束腰绑带勾勒出纤细腰姿,随意一个整体更有美感。"} +{"content": "类型#裙*版型#显瘦*风格#民族风*图案#刺绣*裙长#长裙*裙款式#勾花镂空", "summary": "这条民族风的波西米亚长裙,有几处心机设计特别夺目。首先是精致的手工刺绣,传承了民族文化的美感而不失严密的针法,其次就是腰部镂空设计,既能遮肚又显瘦。"} +{"content": "类型#上衣*风格#潮*图案#字母*图案#文字*图案#印花*衣样式#外套*衣长#中长款", "summary": "在多风的季节,还是让孩子穿着巴拉巴拉中长款版型的女童外套吧,它不仅能很好的修饰身材,而且还很防风穿着不怕冷呢。看它身后那不同色彩的字母印花图案,在打破衣身单调性的同时,还给整体增添了一丝时尚气息,能轻松塑造具有潮流气息的萌娃形象哦。"} +{"content": "类型#裤*版型#显瘦*风格#性感*图案#线条*图案#刺绣*图案#撞色*裤款式#口袋*裤口#翻折", "summary": "修身版型,收裤脚设计,修正腿型衬出纤细柔美的腿部线条,精致显瘦。穿着时挽起裤脚,形成撞色的翻边观感,更具个性感。两侧口袋设计,实用美观。裤管花朵刺绣设计,增添几分神秘浪漫的气质。"} +{"content": "类型#裙*风格#性感*裙长#连衣裙*裙款式#立体装饰*裙款式#吊带", "summary": "推出的度假风的吊带连衣裙,采用的是深v的设计,很好的拉长你的脖颈,同时还能露出你优美的锁骨,显得超性感。吊带的款式设计,可以轻松凸显你的高挑感,看起来魅力十足。裙身上的立体装饰,更有设计感,让裙子更有仙气。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*颜色#纯色*风格#通勤*图案#纯色*图案#线条*衣样式#打底衫*衣领型#v领", "summary": "一件舒服好穿的打底可以说是衣橱里的许多单品的“救星”了。温柔的猪猪这款通勤感打底衫,采用v领设计,更加修饰肩颈线条;领口采用包边工艺,不易变形,更显质感;腰身微宽松版型,不限制身材,轻松掩盖不完美之处,更加显瘦;纯色设计,低调优雅。"} +{"content": "类型#裤*颜色#白色*颜色#黑白*图案#条纹*图案#线条*图案#撞色*裤长#九分裤*裤型#哈伦裤", "summary": "腰部的撞色设计将白色条纹缀在腰间,双层纹路不但增添了层次感还有着黑白色彩的对比显得简洁中有精致的设计感。九分裤长设计的哈伦版型有着彰显腿长的效果,看上去修长很有拔高身形的作用。上松下紧的设计让裤子完美地贴合腿部线条,勾勒出腿型的优势。醋酸纤维的面料使得裤身有适当的弹性,不论腿粗还是细都能驾驭好,有着自然的垂坠感。"} +{"content": "类型#上衣*材质#天丝*材质#针织*颜色#宝蓝色*风格#休闲*风格#高贵*图案#刺绣*衣样式#开衫*衣款式#口袋*衣款式#连帽", "summary": "针织开衫的实穿性能非常强,已然成为了都市女性衣橱必备的单品。这款开衫选用宝蓝色调,凸显出优雅且高贵的气质。添加了天丝材质,触感是爽滑而舒适的。采用连帽设计,又展现出了随性的休闲范。而大口袋的设计,不仅实用还提升了造型感。再加上小狗刺绣的点缀,还增添了趣味性和活力。"} +{"content": "类型#裙*材质#丝绒*风格#居家*风格#性感*裙长#连衣裙*裙领型#立领*裙款式#勾花镂空", "summary": "一款别致的两件套长衫连衣裙,多种穿搭,各具风格。肩部镂空的丝绒吊带裙,保留了传统中式立领,居家外穿皆适宜,传统中带着性感的小心思。"} +{"content": "类型#上衣*衣样式#雪纺衫*衣款式#绑带*衣款式#荷叶边", "summary": "这一款雪纺衫时尚的荷叶边装饰,随风摇曳轻盈出彩。精挑细选顺滑柔软的布料,泛着柔亮的光泽,穿着体验度不一般。加上领口绑带,精致美丽错落有致。整体看起来,款型不会显得单调,更显轻松随意感。"} +{"content": "类型#裤*颜色#黑白*风格#简约*风格#性感*图案#线条*图案#撞色*裤长#短裤", "summary": "长袖上衣加短裤的巧妙结合,瞬间让平淡无奇的造型立刻变得生动俏皮,特别的吸睛亮眼。此款黑白撞色上衣,显得非常有气质,简约中透露着时髦味道,既清雅又显气质,大v领设计,更为性感抚媚,且展露迷人纤细的颈部线条,波动着男士的心,且短裤的搭配,穿着干练脱俗,更加增添女人的柔情,洋溢着青春活力感。"} +{"content": "类型#上衣*材质#棉*颜色#白色*风格#休闲*图案#刺绣*衣样式#衬衫*衣领型#尖领*衣袖长#七分袖", "summary": "太平鸟与可口可乐推出的一款品,而且还是男士衬衫,尖领以及七分袖的设计定位于半休闲的场合。全身采用素净的白色,用料是纯棉质地,尽显柔和斯文的气质。而可口可乐的logo以白色刺绣的形式低调点缀于腰间。"} +{"content": "类型#裙*裙型#花苞裙*裙下摆#开叉*裙长#连衣裙*裙领型#v领", "summary": "一款甜美优雅气质连衣裙,选用一字带v领设计,展现的时尚魅力。另外选用花苞袖设计,丰富整体造型美感,更显甜美气质。搭配开叉裙摆设计,打造出一款与众不同的气质美衣,也就是此款连衣裙的亮点之处。"} +{"content": "类型#裙*风格#街头*裙下摆#花边*裙长#连衣裙*裙领型#立领*裙袖型#灯笼袖*裙款式#拼接*裙款式#钉珠", "summary": "镶边立领搭配精致的花边和桀骜的钉珠,将新时代年轻女孩的性情展现的淋漓尽致。拼接的半灯笼袖设计,不会显得手臂粗壮,还给整条裙子增添了活力。这条连衣裙,精致华美的同时又兼具桀骜不驯的街头气息,自由自在做自己,才是年轻人应有的态度。"} +{"content": "类型#上衣*风格#性感*衣样式#风衣*衣领型#翻领*衣门襟#双排扣*衣款式#露肩", "summary": 
"这款风衣双排扣的设计,延续了军装的那种英姿飒爽的感觉,加入了两侧露肩设计,展现优美性感的肩部曲线;简洁的翻领设计,凸显女性的大气干练气场,上身效果立体感强;同色系袢设计,勾勒出纤细腰身和迷人曲线。"} +{"content": "类型#裤*风格#复古*图案#蝴蝶结*图案#复古*图案#线条*裤型#阔腿裤*裤款式#松紧带", "summary": "设计师新颖的将金丝绒面料与阔腿裤的版型相结合,赋予了裤装几分复古的风情,垂顺的面料呈现出拉长腿部线条的视觉效果,更显高挑纤细身姿。松紧带设计,即穿脱更方便,又令其舒适没有束缚感。裤头蝴蝶结系带点缀,尽显女性闲适的自在感。"} +{"content": "类型#裙*颜色#黑白*风格#复古*风格#知性*图案#条纹*图案#蝴蝶结*图案#复古*裙长#连衣裙*裙款式#不规则*裙款式#收腰", "summary": "这款收腰连衣裙走的是端庄知性的风格路线,尽显出你的温柔温和的性情,非常的优美迷人。配合黑白条纹的设计,尽显出复古端庄的气质。不规则的裙摆,尽显出你的华丽大方。配合蝴蝶结的袖袢,体现出新鲜新颖的视觉效果,说明其品质可嘉。"} +{"content": "类型#上衣*版型#宽松*材质#棉*颜色#纯色*风格#文艺*风格#运动*风格#清新*图案#纯色*衣样式#polo", "summary": "仿佛是专门为了夏天而存在的一款polo衫,纯净耐看的纯色系列穿出男性的好气色,瞬间年轻好几岁不是问题。经典的版型延续了一贯的舒适性,上身宽松不紧勒肌肤,更显随性慵懒。柔软的纯棉材质轻松吸汗透气,即使是在运动后的状态也能完美应付。干净的纯白色更显清新文艺。"} +{"content": "类型#上衣*材质#蕾丝*图案#蕾丝*衣样式#西装*衣款式#拼接", "summary": "春暖花开,万物复苏。又到了西装发挥作用的时候,西服的硬朗是不可磨灭的。这款袖口蕾丝的皮西装,中和了中性的感觉,在帅气与女人之间随意切换,肌理感pu皮面料。袖口与拼接蕾丝可脱卸,复古风十足~"} +{"content": "类型#上衣*材质#蕾丝*图案#蕾丝*衣样式#西装*衣款式#拼接", "summary": "春暖花开,万物复苏。又到了西装发挥作用的时候,西服的硬朗是不可磨灭的。这款袖口蕾丝的皮西装,中和了中性的感觉,在帅气与女人之间随意切换,肌理感pu皮面料。袖口与拼接蕾丝可脱卸,复古风十足"} +{"content": "类型#上衣*材质#蕾丝*图案#蕾丝*衣样式#西装*衣款式#拼接", "summary": "春暖花开,万物复苏。又到了西装发挥作用的时候,西服的硬朗是不可磨灭的。这款袖口蕾丝的皮西装,中和了中性的感觉,在帅气与女人之间随意切换,肌理感pu皮面料。袖口与拼接蕾丝可脱卸,复古风十足。"} +{"content": "类型#上衣*版型#宽松*版型#显瘦*颜色#白色*颜色#绿色*风格#简约*风格#休闲*衣样式#衬衫*衣款式#不规则", "summary": "纯粹利落的经典衬衫款式,打造休闲时尚风,这款BRAND白色女士衬衫,简约的白色百搭又不乏大气美感。不规则的弧形下摆处理,让整体更具随性自然的休闲气息。宽松版型上身具有显瘦的效果,个性绿色贴布点缀,带来夏季清爽之感。"} +{"content": "类型#裙*版型#显瘦*材质#牛仔布*风格#复古*风格#性感*图案#复古*裙型#牛仔裙*裙型#包臀裙*裙下摆#开叉*裙下摆#毛边*裙长#半身裙", "summary": "牛仔半裙的出相信是每个mm都知道的,这是一款既复古又有范的牛仔半裙,全手工的漆点尽显个性和潇洒,为半裙增添满满的不羁气质。半裙是修身的包臀版型设计,视觉上尽显你的曼妙身材,也性感又抢眼。裙身有一道开叉设计,搭配下摆的毛边设计,更添率性气息。"} +{"content": "类型#上衣*衣样式#风衣*衣样式#外套*衣领型#翻领", "summary": "这件风衣外套在设计上极具突破性,不再是经典的风衣廓形,做了很多创新和改变。首先保留了翻领设计,但是不再是规矩的翻领,还有面料不再是挺阔的面料,加上下摆的百褶处理,穿着更显飘逸和灵动,没有一点点拘束感。"} +{"content": "类型#上衣*衣样式#风衣*衣样式#外套*衣领型#翻领", "summary": "风衣的干练和帅气,是其他外套无法给予的,上身随意但又不乏气场的这款风衣,简洁的翻领设计,流畅的剪裁尽显干练大气;同色系袢设计,勾勒纤细腰身展现出迷人的曲线;后背开叉和贴布点缀的亮点,增添设计细节又洒脱随性。"} +{"content": "类型#裤*版型#宽松*材质#亚麻*裤长#九分裤*裤长#连体裤*裤款式#勾花镂空*裤腰型#松紧腰", "summary": "这款连体裤版型宽松,对衣者的身材要求没有过多要求;九分裤设计则是小个子妹子们不可错过的显高武器,而腰间系绳设计,更是进一步提升了腰线;同时圆环镂空搭配系绳更方便调节腰部的松紧程度。亚麻面料上身舒适,是值得尝试的时尚单品。"} +{"content": "类型#裙*版型#宽松*版型#显瘦*裙领型#圆领*裙款式#勾花镂空*裙款式#纽扣", "summary": "让你孕期也能魅力依旧,经典的圆领,更具百搭感,宽松的镂空袖子设计,让裙子更立体,同时还能显瘦遮肉。精致的花瓣领设计,整体造型更具女人味,后背的一字纽扣设计,更适合孕妈穿着。"} +{"content": "类型#裤*版型#显瘦*材质#牛仔布*颜色#深蓝色*裤腰型#高腰", "summary": "misssixty的这条牛仔裤采用了时尚大气的烟灰色,改变了常规的深蓝色带来的审美疲劳。它是高腰裤的版型,腰身部位用一排扣子装饰,适合外穿把纤细的腰身展露出来。带有弹力的修身版型,让你即使是穿着牛仔裤也不会感到束缚感。"} +{"content": "类型#裤*材质#牛仔布*颜色#白色*风格#简约*图案#线条*裤长#短裤*裤型#阔腿裤*裤腰型#高腰*裤口#毛边", "summary": "设计剪裁立体有心意,穿着更加有型有范儿。设计师匠心独制,高腰阔腿设计修饰腿部线条为此款白色牛仔短裤添彩,另外做旧毛边工艺设计又将短裤点缀得恰到好处。简约大方的牛仔面料,使每个女孩子都能轻松驾驭,展现夏日里的好身材,释放女生独特魅力。"} +{"content": "类型#裙*版型#显瘦*材质#网纱*风格#性感*裙型#百褶*裙下摆#压褶*裙长#连衣裙*裙衣门襟#拉链*裙衣门襟#套头*裙款式#拼接*裙款式#拉链*裙款式#木耳边*裙款式#抽褶*裙款式#不规则", "summary": "内搭的连衣裙,几乎每个人都有一件。但是像这件这么性感妩媚的很少,没有拉链,套头就可以穿,网纱的垂感很好,因此是很显瘦的,而且大身网纱另外做了压褶,因此有了百褶的效果,不规则木耳边抽褶拼接,更添吸晴亮点。"} diff --git a/ptuning/datasets/AdvertiseGen/train.json b/ptuning/datasets/AdvertiseGen/train.json new file mode 100644 index 0000000000000000000000000000000000000000..45a137ea5034e4cd0ec37c25ce21a9cf2b4f396a Binary files /dev/null and b/ptuning/datasets/AdvertiseGen/train.json differ diff --git a/ptuning/datasets/Zettels/dev.json b/ptuning/datasets/Zettels/dev.json new file mode 100644 index 0000000000000000000000000000000000000000..1eb63187fc7758ead4acb23d72a41ccfc01ac732 --- /dev/null +++ b/ptuning/datasets/Zettels/dev.json @@ -0,0 +1,21 @@ +{"content": "语料描述-为什么要进行低光增强?", "summary": "可见光图像的成像过程受光照强度及环境影响,低照度环境下采集的可见光图像存在信噪比、对比度、分辨率均较低等特点,给进一步图像处理,图像识别、目标检测等任务带来更严峻的挑战."} +{"content": 
"语料描述-为什么要进行低光增强?", "summary": "当前,人类在水下和低光环境中的活动越来越频繁,比如:目标检测、智能识别、水下探测等。因水下和低光图像具有很多的相似情况,因此,对于如何解决两者共同存在的问题成为了比较热门的话题"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "随着国家科技水平的发展和国民经济水平的提高,人们对于出行的要求也让汽车行业得以迅猛的发展.但是,汽车行业的迅猛发展,在方便人们的出行的同时,也导致了交通事故率的显著提高."} +{"content": "语料描述-为什么要进行低光增强?", "summary": "在有雾的天气下,大气中存在着很多悬浮的小液滴,导致所获取图像的可视性显著下降,会出现场景细节模糊、图像色彩衰减等问题。随着社会的发展,机器视觉被广泛应用于安全监控、目标识别、遥感成像、图像分类等领域,机器视觉的有效性通常建立在具有较好可视性的输入图像之上。因此,消除有雾天气产生的不良视觉效果,获得可视性良好的图像具有重要的意义。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "随着智能算法的发展,图像识别技术越来越受到人们的关注,科研人员在训练识别模型的时候也希望得到更准确、更稳定的识别模型。但是,图像数据库的图像质量可能会受到各种各样外在因素的影响,如光照不均、相对运动和色彩失真等。对图像进行增强处理,可提高图像品质,丰富图像的信息量,便于计算机进一步理解图像。因此,彩色图像的增强处理技术在交通智能化、军事国防和医学等方面能够发挥相当重要的作用。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "日常生活中经常需要在低光条件下捕捉图像,例如在夜间或昏暗的室内房间。在此环境下拍摄的图像往往会出现能见度差、对比度低、噪声大等多种问题。虽然自动曝光机制(如ISO、快门、闪光灯等)可以增强图像亮度,但同时也会产生其他的影响(如模糊、过饱和度等)。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "受光照强度的影响,在夜间和背光条件下采集的图像往往含有较低的对比度、大面积的暗区域和明显的噪声污染。这些降质图像往往导致人们无法正确地辨识场景内容,也常常给图像检索、多媒体信息安全等后续计算机视觉任务带来严峻的挑战。因此,低照度图像增强具有重要的理论价值和现实意义,受到学界广泛关注。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "在光线昏暗的环境下,摄影师拍摄出正常成像非常困难。在更极端的黑暗情况下,如光照严重受限(月光)或短时间的曝光,成像就更为困难。低光下拍摄的图像和极低光下拍摄的图像对比如图1所示,明显看出极低光下拍摄的图像相比低光下拍摄的图像被隐藏的信息更多。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "在图像采集过程中,所在场景的光照条件往往是影响图像质量的重要因素之一,在现代社会生产生活中,人们采集图像变的更为方便和快捷,由于光照条件不足产生的低照度图像识别度不高,导致缺乏可用性,并对后续的图像处理、目标识别、语义分割等仸务造成了困难"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "可见光图像的成像过程受光照强度及环境影响,低照度环境下采集的可见光图像存在信噪比、对比度、分辨率均较低等特点,给进一步图像处理,图像识别、目标检测等任务带来更严峻的挑战"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "在很多计算机视觉任务中,如目标检测、图像检索、图像分割等,都要求输入图像亮度合适、细节清晰。然而,在弱光照或者曝光不足的情况下,采集到的图像存在亮度低、色彩不饱和、细节模糊等缺点,这些缺点将影响到来操作来都不像像后续的计算机视觉任务。因此,研究弱光照图像的增强很有必要。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "随着计算机视觉领域的収展,携带着丰富信息的高质量图像无论是在日常生活还是科学研究中都有着巨大的研究潜力。但是,由于不同的光照条件、周围的噪声等原因,图像质量高低不一,严重影响了人们判别图片中的信息,从而引起不必要的冲突和结果。尤其是在黑暗条件下,人们难以识别摄像头捕捉到的图像信息,而且智能系统很大程度上也依赖于高质量的输入图像,为了解决这个问题,本文就低照度图像问题迚行了研究。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "在光线昏暗的环境下,摄影师拍摄出正常成像非常困难。在更极端的黑暗情况下,如光照严重受限(月光)或短时间的曝光,成像就更为困难。低光下拍摄的图像和极低光下拍摄的图像对比如图1所示,明显看出极低光下拍摄的图像相比低光下拍摄的图像被隐藏的信息更多。"} +{"content": "语料描述-为什么要进行低光增强?", "summary": "部分图像增强在计算机断层成像、工业产品质量检验、交通监控及卫星图像处理中有着广泛的应用。其中低照度图像是一种常见的图像种类,它的主要特点是微光、暗色区域占图像主要部分。造成图像低照度的原因有很多,例如光线不足,摄影设备性能有限以及设备配置不正确等,这类图像可见性偏低,不便于观察与分析,且会对相关应用产生负面影响,尤其是在图像的匹配、融合、分析、检测以及分割方面,给数字图像处理带来极大挑战。"} +{"content": "结论 ", "summary": "本文提出了一种可以对夜间的油茶果果实图像进行快速、精准识别的目标检测网络YOLON,(1)可以对夜间的油茶果果实目标进行进行检测,mAP可以达到98.17%,高于YOLO v4、EfficientNet、YOLOX等对比算法的mAP,表明YOLON可以较好地对夜间油茶果的进行检测。(2)对不同遮挡程度下的(3)但是YOLON对的运行速度尚有待提高,此外运行时对GPU的依赖程度较大,无法在CPU上完成快速推理,对于嵌入式设备的部署尚不理想,这也是本文未来重点的研究方向。"} +{"content": "单果、双果、多果、整树的识别效果 ", "summary": "结果表明,随着果实数量的增加,成功率呈下降趋势。在2个果实中,成功率为100%。三果、四果、五果串的成功率分别为91.6%、88.9%和85.3%。"} +{"content": "检测效果对比 ", "summary": "使用YOLON、YOLOR、和YOLO V4在COA数据集上进行了进一步的比较, 本文提出的YOLON其在COA数据集上的表现优于YOLO V4 和YOLOR。"} +{"content": "算法的运行时间 ", "summary": "随着不同分辨率的图像尺寸的增加,时间与mAP的变化,选取最适合的分辨率,可以花上去两个曲线,找两条曲线相交的那个点,类似于下面的效果"} +{"content": "结果 ", "summary": "为了验证该方法的有效性,对COA数据集中216张测试图像进行了测试,结果如表1所示,该方法的准备率和召回率分别为90.00%和90.00%,平均IOU和mAP分别为90.00%和90.00%,检测速度可达30f/s,能够满足油茶果采摘机器人对果实进行实时检测的需求,可以为算法部署在油茶果采摘机器人的开发上提供技术支持。"} +{"content": "本文要研究的重点问题 ", "summary": "夜间油茶果的果实目标识别"} +{"content": "本文提出的基于多分支结构和U-net结合的低照度图像增强算法有以下三点贡献: ", "summary": "1)本文提出了一种新颖的端到端的低照度图像增强网络,可以应用于多种场景,计算速度和准确率也都有所提升。此网络还结合了多分支网络和U-net迚行特征提取,取得了不错的结果。2)本文的方法在噪声抑制、对比度增强等方面有着较好的效果,能够有效地减少噪声的影响。3)本文使用大量的实验来证明所提出模型的有效性,且使用了很多优秀的方法来进行对比,并且从定性和定量的角度分析对比结果,取得了满意的效果。总体来说,本文提出的斱法在各方面都很大程度上优于现有的算法。"} diff --git 
a/ptuning/datasets/Zettels/train.json b/ptuning/datasets/Zettels/train.json new file mode 100644 index 0000000000000000000000000000000000000000..be7d7c0ee7443048bd95aad3e0ff195e05854b8f --- /dev/null +++ b/ptuning/datasets/Zettels/train.json @@ -0,0 +1,140 @@ +{"content": "目前国内外常见的低照度图像增强方法主要分为四种: ", "summary": "(1)基于直方图增强法(HistogramEqualization,HE).该方法通过整体调整图像直方图分布来增强图像亮度和对比度,此类方法精简快捷,但常出现颜色失真、细节丢失等问题."} +{"content": "目前国内外常见的低照度图像增强方法主要分为四种: ", "summary": "(2)基于Retinex增强法.Land提出人眼视觉亮度与颜色感知由实际物体自身的反射率决定,与环境光强度无关.根据Retinex理论提出带色彩恢复的多尺度Retinex(multi-scaleRetinexwithcolorrestora-tion,MSRCR)和LIME等多种经典算法.此类方法容易出现颜色失真,虽然有学者增加颜色校正模块,但仍无法完全克服颜色失真问题."} +{"content": "目前国内外常见的低照度图像增强方法主要分为四种: ", "summary": "(3)基于伪雾图增强法.该方法利用低照度图像的反转图像通过去雾算法进行增强.如Dong等提出增强方法取得较好的照度增强效果,但在应对复杂场景增强时容易出现块效应和噪声."} +{"content": "目前国内外常见的低照度图像增强方法主要分为四种: ", "summary": "(4)基于神经网络的增强法.此类方法利用神经网络学习低照度图像到正常照度图像的映射,如刘超等提出利用卷积自编码网络从低照度图像训练集中学习图像特征.此类方法能够有效的对低照度图像进行照度增强,但增强的图像在细节及色彩方面有所欠缺."} +{"content": "单纯地增加曝光量的问题 ", "summary": "相机的动态范围有限,如果增加相机曝光度来揭示曝光不足区域的信息,那么曝光良好的区域可能会出现过度曝光甚至饱和的现象。"} +{"content": "单纯地增加曝光量的问题 ", "summary": "在图像拍摄过程中可以通过提高感光度(ISO)增加亮度,但不可避免地会放大噪音。采用缩放或直方图拉伸等一系列后期处理方法可以减弱噪音的影响,但并不能有效解决信噪比低 的问题。在拍摄过程中采用打开光圈、延长曝光时间、使用闪光灯等物理方法虽然能达到在低光环境下增加信噪比的效果,但图像会因为相机抖动或物体的移动而变得模糊。"} +{"content": "说明别人方法不好 ", "summary": "为了解决这一问题,研究人员提出了一些有效的图像增强技术。采用这些方法虽然可以获到良好的主观质量,却不能准确地反映场景图像的真实亮度和对比度。"} +{"content": "说明别人方法不好 ", "summary": "LIME虽整体图片偏亮,但是却与原色彩模式相比,存在失真的问题"} +{"content": "深度学习算法的好处 ", "summary": "深度学习方法具有较高的 能力,同时可以通过训练样本的控制来提供较多的先验知识 来降低低照度图像重建问题的病态程度,可以针对特定的低照度数据集实现较好的增强效果。"} +{"content": "油茶果描述 ", "summary": "油茶Camelliaoleifera是我国最主要的经济树种之一,与油棕、油橄榄和椰子并称为世界四大木本食用油料树种。因其良好的适应性,油茶适宜种植在我国南方广大的红壤丘陵地区,主要分布在湖南、江西、浙江、海南、广东、广西、重庆、四川、贵州、湖北、安徽、福建等地。其中湖南、江西和广西三省(区)合计栽培面积占全国总面积的75%以上,常年产茶油量占全国总产量的80%以上。据初步统计,截至2011年,全国已选育出的油茶新品种共有365个,其中通过国家审定的油茶良种有73个,省级油茶良种292个。品种繁多、良莠不齐,优良品种经过长期嫁接繁殖逐步退化等问题给油茶产业的发展带来了严重的消极影响。"} +{"content": "油茶果描述 ", "summary": "油茶林主要分布于我国南方地区,其油茶果实经加工后,可广泛用于食品、工业、医药等重要领域,是重要的经济林树种之一。油茶树树叶繁茂,枝桠交叠,油茶果成熟时呈球形或椭圆形,颜色呈淡黄色或暗红色、或呈绿色,与树叶颜色相近。油茶果目标与背景成多元信息叠加,且受光照、摇晃等不稳定因素的影响,以上因素致使快速、准确地识别油茶果极其困难。在对林果采摘机器人视觉识别技术的相关研究中,目前主要集中于对目标的颜色。形态、纹理、光谱等单特征识别。在多特征识别研究中,Hayashi等人提出了基于形态学特征及颜色特征的茄子图像分割法,并采用网格模板获取完整的茄子目标;Zhao等人利用纹理特征和颜色特征识别树上苹果;Blasco等人结合多光谱特征和形态学特征,利用贝叶斯分类器检测柑橘表面;王津京等人利用支持向量机算法对待识别的目标苹果的颜色和形状特征进行了综合分类。"} +{"content": "油茶果描述 ", "summary": "卷积神经网络可以自动提取特征,并分类检测,精度高,实时性强,成为果蔬目标检测的主流框架。而Faster R-CNN 卷积神经网络经过RCNN、Fast-RCNN的不断改进,精度和检测效率都得到了进一步的提高。在卷积神经网络果蔬识别方面已经有大量的研究。西北农林科技大学冯亚利使用改进的DY3TNet模型实现了田间猕猴桃果实的检测。闫建伟等为了快速准确识别自然环境下刺梨果实,提出了一种基于改进的Faster-RCNN的刺梨果实识别方法。傅隆生等为了实现田间条件下快速、准确地识别多簇猕猴桃果实,提出了一种基于LeNet卷积神经网络的深度学习模型进行多簇猕猴桃果实图像的识别。程鸿芳等针对传统的基于内容的识别方法在特征提取方面存在的计算复杂、特征不可迁移等问题,对LeNet卷积神经网络结构进行改进,设计了一种基于改进LeNet卷积神经网络的苹果目标识别模型,并利用该模型对不同场景的苹果图像进行了识别与验证,综合识别率达到93.79%;与其他方法相比,该算法具有较强的抗干扰能力,图像识别速度快,识别率更高。中南林业科技大学张习之等提出了一种基于改进卷积自编码机神经网络的油茶果图像自动识别方法,该改进算法100次迭代所需时间为166s,平均识别准确率为90.4%。"} +{"content": "低光增强深度学习算法使用的数据集 ", "summary": "CHEN等在2018年首次采用SID(see in the dark)数据集,基于数据驱动方法训练一个端到端的网络,实现了极端低光情况下图像的增强,取得了良好的效果,但该方法设计的网络对物体细节的还原仍然有着很大的不足,增强之后的图像中物体的边缘存在模糊现象,且该方法只在其构建的数据集上表现良好。"} +{"content": "说明目前用的人很少 ", "summary": "目前在卷积神经网络识别油茶果方面的研究较少,尚无文献用Faster-RCNN卷积网络的方法识别油茶果。本文选用Faster-RCNN交替优化训练方法,并使用Faster-RCNN卷积神经网络对油茶果进行了识别。"} +{"content": "最后一段说明自己要干啥 ", "summary": "为了提高低照度图像的图像质量,本文利用生成对抗网络的学习能力,提出一种基于U-Net生成对抗网络的低照度图像增强方法.该方法利用U-Net结构和深度卷积结构构造生成对抗网络,实现利用低照度图像生成照度与正常照度图像拟合的增强图像.实验表明,该方法能够有效提高低照度图像的亮度、对比度."} +{"content": "低照度图像产生的硬件原因 ", "summary": 
"在低照度条件下拍摄的图像通常会出现不同程度的质量退化,虽然专业设备和先进的摄影技术可以在一定程度上缓解这些退化,但固有原因产生的噪声是不可避免的。由于没有足够的光,相机传感器的输出易受系统的固有噪声干扰,因此输出的图像可能会在曝光不足的区域丢失部分重要信息,从而加大了计算机视觉任务的难度。"} +{"content": "深度学习发展很快 ", "summary": "近年来,深度学习发展迅速,在高层次视觉任务中应用非常广泛,如图像识别、语义分割等。与此同时,也有一些研究人员尝试用深度学习算法去解决低层次图像领域问题,如图像去噪、图像去雾、图像超分辨率等,这些算法也取得了较好的成绩。相对于传统算法,深度学习算法具有不需要人工设计特征提取方法,可直接端到端地训练和输出结果等优势"} +{"content": "本文的贡献 ", "summary": "本文提出的基于多分支结构和U-net结合的低照度图像增强算法有以下三点贡献: ∑"} +{"content": "引出我们研究的问题 ", "summary": "针对低照度条件下的图像对比度不高、颜色失衡和存在噪声的问题,提出了一种基于卷积神经网络的低照度图像增强模型。"} +{"content": "引言中引出本研究的表述 ", "summary": "为了对低照度图像进行增强,本文利用卷积卷积神经网络的非线性映射能力,提出了一种基于U-Net网络的低照度图像增强方法。"} +{"content": "引言中引出本研究的表述 ", "summary": "利用数据驱动的方式,实现了对低照度的图像进行增强,实验表明,该方法能够有效提高低照度油茶果图像的亮度、对比度,且细节可以保持完整"} +{"content": "Dong的方法", "summary": "基于Dong的去雾算法增强。Dong等发现,低照度图像经过翻转后图像与雾天图像具有一定的相似性,因此可以采用图像去雾的思想对低照度图像进行增强,可以取得一定的增强效果。但由于Dong未考虑雾图中存在的白色区域所导致的暗通道理论失效的问题,在应对复杂场景增强时容易出现块效应和噪声等问题。基于伪雾图增强法。该方法利用低照度图像的反转图像通过去雾算法进行增强,如Dong等提出的增强方法取得了较好的照度增强效果,但在应对复杂场景增强时容易出现块效应和噪声等问题。"} +{"content": "将注意力机制引入CNN结构中有什么好处 ", "summary": "将注意力机制应用于UNet分割网络中,可以比较好的实现对显著性区域的关注,以及对无关背景区域的抑制。"} +{"content": "将注意力机制引入CNN结构中有什么好处 ", "summary": "注意力模型可以很好的嵌入到CNN框架中,而且不增加计算量的同时提高模型性能。"} +{"content": "将注意力机制引入CNN结构中有什么好处 ", "summary": "提出了Attention-U-Net算法,该算法将通道注意力及空间注意力机制加入低光增强网络之中"} +{"content": "CIL数据集的采集条件", "summary": "用于网络训练的低照度/正常图像对,于2020年10月25日-11月5日白天在湖南省天沃科技有限公司雷叔叔油茶果种植基地拍摄,拍摄的油茶果的品种为华顺203,205。图像采集设备为尼康D90数码相机,相机的光圈为F22,快门模式为M模式,ISO为200,拍摄设备如图1所示,尼康相机固定在三脚架上,三角架轴心到树干中心的距离为200cm,相机镜头到油茶树最外侧的距离为95cm,相机距离地面的高度为150cm;在对同一场景同一位置的油茶果采集不同曝光度的图像时,若采用手动按快门进行拍摄,会不可控地引起相机镜头的抖动,从而造成拍摄的不同曝光度的图像,无法达到像素级的对应,对后续网络的训练造成较大干扰,因此本研究在制作油茶果的低照度/正常光照图像的数据集CIL(Camellia oleifera Abel Image in low light)时,采用红外遥控的方式的进行图像的采集2021.01.25,其中红外遥控器的发射端由采集人员手持,红外遥控器的接收端置于单反相机的上方,两者通过热靴连接。"} +{"content": "注意力机制引入的区域 ", "summary": "attention模块用在了skip connection上,原始U-Net只是单纯的把同层的下采样层的特征直接concate到上采样层中,改进后的使用attention模块对下采样层同层和上采样层上一层的特征图进行处理后再和上采样后的特征图进行concate"} +{"content": "注意力机制引入的区域 ", "summary": "Attention coefficients(取值0~1)与feature map相乘,会让不相关的区域的值变小(抑制),target区域的值变大(Attention)。"} +{"content": "注意力机制引入的区域 ", "summary": "利用下采样层的结构化信息和当前层纹理信息的融合,利用sigmoid归一化,得到关联性强的区域,和当前层做乘积,从而强调本层的显著性区域的特征。"} +{"content": "注意力机制引入的区域 ", "summary": "在基础的UNet的基础上增加了attention 的机制,通过自动学习参数来调整激活值,attention的可视化效果还是主要部分,不像non-local的方式每一个像素点都要和其他像素点进行关联,可以视作一种隐式的注意力机制。"} +{"content": "我们的方法如何好 ", "summary": "通过大量的实验表明,运用深度残差网络和U-net,可以更好地进行特征提取,低照度图像增强的效果也更好,很大程度上优于现有的技术。提出的方法不仅在视觉上提高了亮度和对比度,色彩更真实,更加符合人眼视觉系统特性,而且PSNR、SSIM等七项客观图像质量指标在几种算法中都是最优的。"} +{"content": "我们的方法如何好 ", "summary": "3)本文使用大量的实验来证明所提出模型的有效性,且使用了很多"} +{"content": "我们的方法如何好 ", "summary": "其可对不同尺度的低照度图像进行增强"} +{"content": "我们的方法如何好 ", "summary": "通过在公开数据集(LOL,SID)上验证表明,RUNet方法在效果上有所改进,尤其是整体视觉效果。"} +{"content": "我们的方法如何好 ", "summary": "不仅在客观评价指标上的评分更高,PSNR、SSIM等七项客观图像质量指标在几种算法中都是最优的,而且在视觉上提高了亮度和对比度,色彩更真实,更符合人眼的视觉原理。"} +{"content": "合成数据集 ", "summary": "我们从RAISE收集了780张原始图像,700张用于生成训练对,80张用于网络的验证。使用Adobe Photoshop Lightroom调整对应的参数,包括曝光度、振动和对比度,具体参数为:曝光参数E设置为了[-5,0],振动参数V设置为[-100,0],对比度参数C设置为[-100,0]来合成低照度图像。为了防止颜色偏差的问题,在训练数据集中加入了700对灰度图像对。这些灰度图像对被转换为彩色图像对。为了使增强前后的黑白区域保持一致,我们添加了五对白-白训练对和五对黑-黑训练对。最后,将所有图像的大小调整为400x600,并转换为JPG格式。"} +{"content": "合成数据集 ", "summary": "通过伽玛校正来合成数据集,但对于伽玛校正,可能无法反映不同曝光级别之间的关系。"} +{"content": "CIL数据集的明细 ", "summary": 
"CIL数据集,分别拍摄了单果、多果、整树、花朵、花和果实、无果等多种油茶果图像场景的低照度/正常图像对,以确保算法对不同的场景的低照度图像的增强效果。CIL数据集用较短的快门时间拍摄的油茶果图像做为低照度图像,较长的快门时间拍摄的图像做为对应的参考图像,一共包含1200组不同曝光度的图像,各个场景下的具体图像数量如表1所示,曝光时间分别为1/1000s、1/500s、1/250s、1/60s、1/10s,其中做为低照度图像的曝光时间分别为1/1000s、1/500s、1/250s、1/60s,每组图像对应的参考图像的曝光时间为1/10s。"} +{"content": "为什么要使用U-Net", "summary": "在数据集规模较小的情况,使用U-Net进行烟尘分割。相比于其他语义分割网络,U-Net的优点在于能针对小型数据集进行端到端的快速、有效训练。"} +{"content": "存在的问题 ", "summary": "油茶果处于高山丘陵地区,因此山上多风,所以对在非同一时刻采集到的油茶果图像可能会存在轻微的晃动,"} +{"content": "U-Net的结构描述", "summary": "U-net网络结构因其对称结构与英文字母“U”相似而得此名,主要由下采样、上采样以及“桥”连接三部分组成,其网络结构见图2。左边称为下采样,也叫压缩路径或编码器(encoder),主要作用是提取图像的浅层特征,如图像的位置信息;右边称为上采样,也叫扩展路径或解码器(decoder),主要作用是提取图像的深层特征,如图像中像素的类别信息;中间的箭头表示“桥”连接,也叫跳跃连接(skipconnection,SC),主要作用是把下采样得到的特征图与上采样得到的特征图进行复制拼接,形成一个同时具有深层次和浅层次信息的特征图,实现更为有效的分割。除此之外,上、下采样和“桥”连接部分由卷积层、池化层、反卷积层以及激励层组成。其中卷积层用于提取甲状腺图像的特征;池化层用于下采样部分,其将获取的特征数据和参数进行压缩,减少模型过拟合;反卷积层用来还原特征图的尺寸大小,使得最后输出大小与原图大小一致;激励层是将卷积所得的输出进行一个非线性映射,来保证模型能够更好地拟合图像。本研究所使用的激活函数为ReLU,最后输出层使用Sigmoid函数对像素点进行分类"} +{"content": "对U-Net增强效果的描述", "summary": "对单通道的图像进行伪彩色处理,可以直观地看到,经过U-Net增强后的图像,其整体照度得到了增强,而且,各个通道的轮廓和细节信息也得到了增强,变得更为突出。例如R通道,增强前的树叶、树枝等信息耦合在一起,无法区分,而增强后,两者的边界则变得非常显著,有利于后续图像的进一步分析。"} +{"content": "对U-Net增强效果的描述", "summary": "自己的U-Net代码是对图像进行了端对端 的处理,不仅仅是可以对图像进行低照度增强,还进行去噪,"} +{"content": "我们提出的网络", "summary": "受SID算法的启发,本文验证了利用神经网络对低照度图像进行端到端的增强,并提出了一个有连续的自网络组成的新型网络RV-UNet。"} +{"content": "我们提出的网络", "summary": "通常原始输入图像的比编码-解码网络的输出包含更多细节,因为可以为恢复细节提供信息。采用级联代替跳跃连接,将上一个上采样块的特征映射与输入图像相结合,使原始信息和照明估计都能完全保留并传输到下一步。级联层之后是带有ReLU的三个激活层,它将输入的图像信息与估计的全局光照信息想结合,最终生成细节更好的增强效果。"} +{"content": "多级", "summary": "在训练过程中,首先针对不同曝光时间的图像乘上对应的放大系数进行粗增强,,该系数不参与网络训练,由研究人员指定;而在网络测试时,模型先由输入图像的直方图进行亮度模式判定,根据先验知识分别乘上对应的放大因子,对图像进行粗调整后,再输入到A-UNet网络中进行端到端处理 。"} +{"content": "油茶果图像的采集", "summary": "本研究采集油茶果的试验地点为湖南省永州市雷叔叔油茶果种植基地,拍摄时间为2020年10月28日-2020年11月6日的白天,,"} +{"content": "油茶果图像的采集", "summary": "江西省林业科学院国家油茶林基地。拍摄相机为索尼相机,像素为640×480。于2019年9月14~23日的晴天拍摄,采集了典青、赣兴46、赣抚20等34个品种的油茶果图像并保存成JPG的格式。图1为拍摄的部分油茶果图像。环境下的油茶果图像,将训练集图像数量扩充到3820张,使卷积神经网络学习到各种情况下油茶果图像的特征(若训练数据集中没有包含多样化的样本,则会导致机器学习不足,识别结果置信度降低)。"} +{"content": "试验环境", "summary": "本试验在Windows图形工作站计算机上进行,处理器为Intel-i5-9400F,内存为32GB,操作系统Windows10(64位)。考虑到GPU算力的需要,选用显卡为RTX 2080Ti,显存11GB。Python的版本是3.6.4,在pycharmIDE上编译。深度学习框架选择PyTorch。同时为了提高训练速度,采用GPU加速方法,cuda版本是8.1,cudnn版本为7.6.0。"} +{"content": "U-Net的优点", "summary": "由于U型网络可以较好地保留图像的细节,为了学习低光图像和正常光照图像之间的映射关系,本文采用U型结构的卷积神经网络。如图3所示,该网络由编码器、解码器和跳跃连接构成。"} +{"content": "实验结果的图表", "summary": "各个算法对应多个评价指标,使用多个图表"} +{"content": "实验结果的图表", "summary": "在一个图表上对应上所有的方法和评价指标"} +{"content": "说明本文的做法好", "summary": "以上数据表明本文算法在低照度图像的增强效果以及运行效率上均有一个较好的表现,为自然环境下油茶果的检测和识别提供了保障。"} +{"content": "说明本文的做法好", "summary": "大量的实验表明,该方法不仅可以获得视觉上令人满意的低光增强效果,而且能够很好地表示图像的分解过程"} +{"content": "说明本文的做法好", "summary": "我们进行了大量的实验,以证明我们设计算法的有效性及相对于常见低照度增强算法的优越性。此外,我们算法在2080Ti GPU上花费不高50ms来处理VGA分辨率的图像,这些特性使得我们的产品具有是有价值。"} +{"content": "针对于本文不利的指标进行申辩", "summary": "本文所提的增强算法,尽管在时间效率上低于LIME和HE等传统算法,但是增强效果上,则远远高于这些算法。表明本文算法具有良好的实时性和准确性,为后续的计算机视觉的准确识别,有助于提高油茶果采摘机器人采摘作业的准确性和效率。"} +{"content": "我们算法的增强效果", "summary": "实验结果表明,与传统的低照度增强算法相比,本文所以提出的模型,不仅可以最大程度地还原图像的真实亮度,而且能够有效提高图像的对比度、调整颜色失衡已经去除噪声,客观图像质量评价指标也高手同类算法。"} +{"content": "我们算法的增强效果", "summary": "实验结果表明,本文所提算法不仅在主客观评价上有较好的表现,而且利用增强后的图像训练出来的YOLO v4网络比未经过增强处理的图像训练出来的YOLO v4网络识别准确率更高。"} +{"content": "我们算法的增强效果", "summary": "该增强算法是利用神经网络进行端到端的训练,参数调整过程中没有任何人工参与,大大减少了人工设计的复杂性。"} +{"content": "我们算法的增强效果", "summary": "实验表明,本文算法,不仅在合成图像中表现出色,而且"} +{"content": "我们算法的增强效果", "summary": 
"本文分别从定性和定量的角度将RV-UNet与流行的低照度增强方法进行了比较,结果表明,本文算法不仅在各个增强指标上表现出了,而且也最符合人眼特性,xxx效果最好。"} +{"content": "我们算法的增强效果", "summary": "综上所述,RV-UNet的增强效果优于LIME等传统增强算法,而且与Retinex-Net等深度学习增强算法相比,图像的主客观增强效果均表现出色。但本文方法也存在一些问题,例如对一些区域进行处理时存在过度增强,从而引起亮斑问题。"} +{"content": "我们算法的增强效果", "summary": "与其他算法相比,本文提出的算法可以有更生动和自然的结果,由于RV-UNet对输入图像具有全局感知能力,并且可以按照语义信息对整个图像进行增强,因此可以避免在正常或者较亮区域的过度曝光或者在较暗区域的曝光不足。此外,低照度图像的细节了增强后仍然可以保持不变,得益于细节重构步骤。"} +{"content": "损失函数", "summary": "本文实验选择低照度增强公开基准数据集LOL进行训练的,该数据集包含1500组低光/亮光图像对。具体来说,有500对大小为400×600的真实场景图像和1000对大小为384×384的原始数据合成图像。本文使用1000对合成图像对和485对真实图像对进行训练,其余15对进行测试。由于LOL测试集的图像是在极弱光条件下拍摄的,图像的黑暗区域充满了强烈的噪声,因此通过这个数据集的结果可以很好地显示了本文算法在低光照条件下的增强性能。"} +{"content": "常见的图像去噪算法", "summary": "最流行的方法可以归纳为BM3D 和WNNM 。由于测试中优化过程的高度复杂性,以及参数需要人工进行精心设计,这些传统去噪算法在实际应用中往往表现不是很出色。"} +{"content": "人们对于图像去噪的态度", "summary": "在图像处理、多媒体、计算机视觉等领域,图像去噪一直是人们关注的热点,近几十年来提出了很多经典算法。"} +{"content": "数据驱动", "summary": "Retinex分解图像的方法是直接在输入图像上估计反射率和照度,但设计适合各种场景的适当约束函数并不容易。因此,尝试以数据驱动的方式解决此问题。DeNet每次都会获取成对的弱光/正常光图像,并在弱光和正常光图像的指导下学习弱光及其对应的正常光图像的分解。"} +{"content": "基于深度学习的去噪算法", "summary": "基于深度学习的去噪算法显示出了优越性,有代表性的工作,如使用堆叠稀疏去噪自动编码器的SSDA 、可训练非线性反应扩散的TNRD、具有残差学习和批处理归一化的DnCNN,由于"} +{"content": "编码-解码描述", "summary": "在编码-解码网络中,首先是由编码网络对图像进行编码,然后将图像的全部信息压缩为瓶颈层的细长向量,最后由解码网络解码进行解码。"} +{"content": "时间复杂度", "summary": "评价一个算法的优劣,一般主要从算法的执行时间(计算量)和所需要占用的存储空间(访问量)两方面进行衡量,但时间复杂度计算的不是程序具体运行的时间,而是算法执行语句的次数。"} +{"content": "评价增强效果", "summary": "客观评价方式是通过峰值信噪比、信息熵、标准差等参数对低照度图像的增强效果进行定量分析。"} +{"content": "评价增强效果", "summary": "使用PSNR评价指标,偶尔会出现评价结果与主观评价不一致的情况,这是由于该指标"} +{"content": "评价增强效果", "summary": "PSNR在对图像进行评价的时候,只是对图像的绝对质量进行数字打分,而忽略了人眼的主观感受,例如人眼对一个区域的感知结果会受到其周围邻近区域的影响,因此常常会出现PSNR分数与主观评价不一致的情况。"} +{"content": "评价增强效果", "summary": ",使用PSNR对图像进行评价,有时候会出现评价分数与主观评价不一致的情况,"} +{"content": "评价指标", "summary": "本文在测试数据集上对所提出的网络性能进行了评估,并分别将其与文献、文献和文献三种主流低照度图像增强的方法进行了定性和定量的比较。为了公平起见,本文应用了作者提供的带有推荐参数设置的代码。为了评价这些算法的性能,本文采用峰值信噪比(PSNR)、结构相似性指数(SSIM)和自然图像质量评估器(NIQE)来量化增强后图像的恢复质量。"} +{"content": "评价指标", "summary": "PSNR是一个绝对误差,使用像素相对于其最大可能值的均方误差来计算。在假设人类的视觉系统高度协调以提取结构信息的情况下,SSIM试图通过更紧密地与人类的感知保持一致来改进绝对误差度量。这两个客观评价指标的值越大,表明图像处理效果越好,而NIQE值越大说明与自然图像差距越大,质量越差。可以从表1中看出本文网络表现出最优的性能。"} +{"content": "空间复杂度", "summary": "空间复杂度是对一个算法在运行过程中临时占用存储空间大小的量度。在深度学习中,空间复杂度决定了模型的参数数量,模型的参数越多,训练模型所需的数据量就越大,而现实生活中的数据集通常不会太大,这会导致模型的训练容易过拟合"} +{"content": "使用Google Cloud Vision 进行检测", "summary": "这个东西相对来说会客观一点儿,大家都是认这个东西的"} +{"content": "使用Google Cloud Vision 进行检测", "summary": "图3为谷歌云识别低照度/正常光照图像对,原始图像来自于MEF数据集。由于光照较低,谷歌云视觉只能将图像标记为“天空”、“云”和“尖顶”,经过增强后,前景埃菲尔铁塔被成功地检测到,并用一个绿色的boundingbox精确地标记出来,表明了我们方法的有效性"} +{"content": "缺乏数据集", "summary": "由于在现实世界中,获取可以做到像素级匹配的低照度/正常图像对的难度较大,早期采用深度学习算法对低照度图像进行增强的研究中,多是采用人工合成过的低照度图像数据集。然而,这些低照度图像通常是由研究者们通过一些已知参数的具体算法,对正常光照条件下的图像进行随机变暗处理,以及增加随机噪声等方式获得。与真实场景下采集到低照度图像相比,这类人工合成的图像,往往过于简化,缺乏真实场景下的复杂的噪声与图像失真。"} +{"content": "基于深度学习的低光增强方法", "summary": "由于深度学习网络对非线性映射有较好的拟合作用,近年来研究者相继尝试了使用深度学习来对低照度图像进行增强的方式。"} +{"content": "基于深度学习的低光增强方法", "summary": "这类算法能够利用神经网络学习低照度图像到正常照度图像之间的非线性映射,例如Lore等人最早提出了将LLNet网络用于处理低照度图像的"} +{"content": "基于深度学习的低光增强方法", "summary": "基于伪雾图增强法。该方法利用低照度图像的反转图像通过去雾算法进行增强,如Dong等提出的增强方法取得了较好的照度增强效果,但在应对复杂场景增强时容易出现块效应和噪声等问题。"} +{"content": "基于深度学习的低光增强方法", "summary": "随着深度学习的出现,许多低级视觉任务都从中受益,例如[14]、[15]用于去噪,[16]用于超分辨率。"} +{"content": "基于深度学习的低光增强方法", "summary": "RetinexNet"} +{"content": "基于深度学习的低光增强方法", "summary": "最近出现的做法"} +{"content": "防止卷积后图像变小", "summary": "为了避免卷积时对图像边缘造成影响,丢失图像边缘,每个卷积层在进行卷积操作前都会对图像的边缘进行0填充,是的图像在卷积前与卷积后的大小可以保持一致。"} +{"content": "为什么不能直接对低照度图像进行亮度增强", "summary": 
"在低照度条件下拍摄的图像通常xxx质量很差,这是因为除了不理想的光照条件外,还存在着多种类型的图像退化,例如噪声和颜色失真,都一起被隐藏在图像中,因此仅仅提高低照度图像的亮度将不可避免地放到这些噪声,使图像产生伪影和失真。"} +{"content": "为什么不能直接对低照度图像进行亮度增强", "summary": "不同低照度区域具有不同程度的噪声,如果直接对低照度图像进行亮度增强的话,也会不可避免的放大这些噪声,使图像产生伪影和失真"} +{"content": "闪光灯", "summary": "闪光灯是点状光源,而自然光是平行光源,因此使用闪光灯辅助成像有可能会产生不自然的曝光,而且使用闪光灯可以在一定程度上使环境变得明亮,但也会了成像中引入高光和不平衡的照明,使图像在视觉上缺乏真实感。"} +{"content": "拍出低照度图像的场景", "summary": "由于油茶果生长环境的特殊性,油茶果采收机器人在作业时,视觉系统并不是在所有时刻和地点都可以获得良好的光照条件,例如黄昏、阴天、雾天等光照不足的情景下,甚至在晴天的一些背光处也会有欠曝光的现象。在这种光照条件下直接捕获的图像,往往具有较低的信噪比,容易对后续机器视觉的目标检测等工作造成较大干扰。"} +{"content": "拍出低照度图像的场景", "summary": "造成低照度图像的原因有很多,例如拍摄环境光照条件较差、摄影设备性能有限以及设备配置不当等。"} +{"content": "必要性", "summary": "油茶果采摘机器人的工作环境复杂,尤其是在阴天、傍晚,采集到的果实图像存在整体偏暗、模糊、对比度不高、细节不清晰、动态范围压缩有限等问题,给后续油茶果的自动化采收带来了较大困难。"} +{"content": "必要性", "summary": "从低光环境下采集的油茶果图像,经常存在对比度低、细节丢失、噪声污染严重等问题,不利于人眼的观察和计算机视觉的检测。"} +{"content": "必要性", "summary": "图像质量与很多计算机视觉相关的相关技术的效果息息相关,高质量的图像可以带来更多信息,方便后续的增强任务。油茶果机器人作业时的很多因素都会直接或者间接地影响图像质量,低光照便是其中之一。"} +{"content": "必要性", "summary": "低质量的图像会降低很多计算机视觉的性能,因为这些算法,通常是针对高质量输入图像设计的,因此对低照度图像进行增强,不仅可能图像的视觉效果,还可以提高相关视觉算法的作业效果。"} +{"content": "必要性", "summary": "在拍照成像中,光照不足,会非常明显的影响成像质量,使其对比度降低并且丢失细节信息,不仅影响视觉效果,而且会给后续为自然光照图像设计的计算机视觉系统的性能造成较大影响。为了使这些隐藏在低照度区域的细节清晰可见,提高计算机视觉系统的准确性,需要对低照度图像进行增强。"} +{"content": "延时摄影", "summary": "长时间曝光延长了拍摄时的曝光时间,可以让更多的光子到达成像设备的感光元件上,但是仅限于静态摄影,若是物体在拍摄过程中发生了位移,很可能导致成像结果模糊"} +{"content": "目的", "summary": "研究低照度增强的主要目的在于提升图像亮度的同时,降低图像噪声、减小色彩偏差、增强图像整体与局部的对比度,以增强图像的视觉效果与质量,例如锐化图像特征,使图像具有更高的视觉质量。低照度图像增强的目的是"} +{"content": "目的", "summary": "低照度图像增强的目的是改善低光照条件下的成像图像质量与视觉效果,同时减小对光照以及拍摄设备的依赖程度,具有广大的应用前景。"} +{"content": "发展水平", "summary": "国内外很多学者,对低照度图像增强技术进行了相关研究,取得了不错的效果,但大多是针对通用场景的增强,针对农业领域里面的低照度图像进行增强的研究相对较少。"} +{"content": "发展水平", "summary": "近年来,很多学者对低照度增强领域进行了研究,使低照度增强技术有了较大的发展,但是开发一种可以用于实际领域的低光增强策略仍然面临着较大挑战,因为常见的低光增强算法往往受限于特定的场景,增强效果只是限制在这个特定领域里面,换另外一个场景有可能就不再适用了。XXX"} +{"content": "目标", "summary": "理想的低照度图像增强算法,不仅应该可以对低照度图像进行增强,而且还可以有效地去除隐藏在暗区域中的图像,并且灵活地调整图像的增强等级。"} +{"content": "应用", "summary": "低照度图像增强的目的是改善低光照条件下的成像图像质量与视觉效果,同时减小对光照以及拍摄设备的依赖程度,具有广大的应用前景。"} +{"content": "应用", "summary": "RV-UNet主要应用场景为帮助提高其他计算机视觉任务的性能,比如物体检测和识别。由于大部分视觉识别模型都基于自然"} +{"content": "难度所在", "summary": "对于大部分低照度图像,对图像进行简单的调整,并不能同时提升图像的亮度与质量。xxxx(中间省略很多具体算法的做法)这些难以调和之处正是低照度增强算法的难点所在。"} +{"content": "难度所在", "summary": "很难确定精准的Ground Truth∑"} +{"content": "难度所在", "summary": "深度学习应用于低照度图像增强的难度所在∑"} +{"content": "深度学习应用于低照度图像增强的难度所在", "summary": "如何从单个图像中有效地估计出照明分量,并灵活地调整光照水平在对低照度区域进行增强后,如何消除之前隐藏在暗区域中的噪声和颜色失真等变换如何只通过少量的数据集训练一个没有Ground Truth的低照度增强网络?"} +{"content": "很难确定精准的Ground Truth", "summary": "从用户的角度考虑,不同的人、不同的需求场景可能需要不同的图像亮度值,对于研究人员而言,很难精准确定一个适用于所有人的Ground Truth。"} +{"content": "直方图均衡化 ", "summary": "在常规的图像增强任务中,直方图均衡化被广泛地应用到了各种图像增强任务中,通过对图像的直方图进行变换,得到从当前像素值。"} +{"content": "HSV颜色空间", "summary": "HSV色彩空间的H通道采用环形数据表示,在该通道上设计增强网络的损失函数难度较大,"} +{"content": "为什么要将低照度图像转换到不同的颜色空间中进行处理", "summary": "低光照图像与自然光照图像之间的主要区别为:图像的亮度和色彩的偏移,而亮度和色度则可以很好地反映出两者之间的差别。 因此将图像转换到YCbCr 颜色空间中,从而得到一个比在RGB颜色空间更适合改变图像亮度和色度的调整模型。"} +{"content": "YCbCr颜色空间", "summary": "图像在YCbCr颜色空间的数字矩阵,相比于在RGB颜色空间的数字矩阵,更适合进行低照度增强处理,可以帮助提升模型的性能。"} +{"content": "直方图的作用 ", "summary": "直方图可以统计数字图像中具有不同像素值的像素数量,图像直方图描绘了图像中像素值的分布情况。直方图被广泛应用于各种图像增强任务中,通过对图像的直方图进行变换,得到从当前像素值到新的像素值的直接映射。这类方法中最经典最常用的是直方图均衡化,但是直方图均衡化不会对图像的内容进行判断,只是在单纯地对所有的像素值进行计算映射,容易放大原始图像中的噪声。同时直方图均衡会使图像的平均亮度保持在像素值的动态范围中间,这会破坏一些场景的整体亮度。"} +{"content": "局部直方图 ", "summary": "一些方法使用局部直方图均衡来避免破坏图像整体的平均亮度,但容易导致一些边界问题,而使得图像出现Checkerboard效应 等伪像。"} +{"content": "传统算法", "summary": "LIME的色彩保持度较好,但是存在过度增强的问题,细节处会有损失。"} 
+{"content": "传统算法", "summary": "Wang等人提出了一种称为NPE的算法,这种算法可以在增强对比度的同时保持照度的自然性。"} +{"content": "传统算法", "summary": "FU等人提出了一种算法,该算法通过融合最初估计的光照图的多个导数来调整图像亮度,但这种方法往往会牺牲包含丰富纹理区域的真实感。"} +{"content": "传统算法", "summary": "Guo等考虑从最初的结构光照图估计结构光照图。这些方法通常假定图像无噪声和颜色失真,并且不明确考虑退化。"} +{"content": "SID ", "summary": "chen等人提出了一种可以用于处理低照度图像的算法,该算法是基于全卷积网络的端到端训练,可以同时处理噪声和颜色失真。但该算法受限于特定格式的数据集,如果修改网络以接收JPEG格式的数据(这个说法是原文自己说的),性能将会显著下降。"} +{"content": "伽马变换 ", "summary": "伽玛以非线性的方式对每个像素都进行非线性映射,虽然可以提高亮度,尤其是较暗区域的亮度,但是却没有考虑单个像素与其相邻像素之间的关系,因此增强效果往往会失真。"} +{"content": "Retinex", "summary": "Retinex理论的关键假设是图像可以分解为两个分量,即反射和照明。早期的算法包括单尺度的Retinex(SSR)和多尺度的Retinex(MSR),但是其结果通常看起来不自然,并且在某些地方存在过度增强。"} +{"content": "Retinex", "summary": "基于深度学习的算法∑"} +{"content": "基于深度学习的算法", "summary": "shen等人认为多尺度Retinex等价于具有不同高斯卷积核的前馈卷积神经网络,受此启发,他们构建了一个卷积神经网络(MSR网络)来学习低照度图像和正常光照图像之间的端到端映射。ΞWei等人设计了一个深度网络,称为RetinexNet,Retinex-NetΞKindling the Darkness"} +{"content": "传统Retinex算法的限制所在", "summary": "虽然这些算法在特定情况下会有较好的增强效果,但是对于不同场景的低照度图像增强,每个场景都需要进行人工进行精心的参数设计,费时费力,实际应用价值相对较弱。"} +{"content": "传统Retinex算法的限制所在", "summary": "现在大多数基于Retinex的方法都为这种高度不适定分解精心设计了约束和参数,而这些约束和参数在应用于其他场景时可能受到模型容量的限制,从而无法取得良好的效果。"} +{"content": "Retinex-Net", "summary": "在Retinex理论的指导下,我们设计了一个Deep Retinex-Net网络来一起完成反射/照明分解和弱光增强。网络由三个步骤组成:分解、调整和重建。在分解步骤中,Retinex网络通过Decom网络将输入图像分解为R和I。在训练阶段,同时将低光/正常光照图像输入到网络,而在测试阶段,则仅将低照度图像作为输入。在低/正常光图像具有相同反射率和照明平滑度的约束下,Decom网络通过数据驱动的方式,对反射图的分解参数进行学习,使得不同照明图像分解的反射图R保持一致。在调整步骤中,使用Enhance-Net来使照明变亮。Enhance-Net采用了编码-解码网络的总体框架。采用多尺度级联保持当对图像进行亮度调整时,使全局与局部之间保持语义上下文的连贯性。此外,如果需要,通常会将在弱光条件下产生的放大噪声从反射图中去除。然后,在重建阶段,我们将调整后的照明图和反射图通过元素的乘法结合起来,从而得到最终的低照度增强效果图。"} +{"content": "shen等人认为多尺度Retinex等价于具有不同高斯卷积核的前馈卷积神经网络,受此启发,他们构建了一个卷积神经网络(MSR网络)来学习低照度图像和正常光照图像之间的端到端映射。", "summary": "L. Shen, Z. Yue, F. Feng, Q. Chen, S. Liu, and J. Ma, “Msr-net:low-light image enhancement using deep convolutional network,”p. 
arXiv , 11 2017."} +{"content": "未来的研究方向", "summary": "本文未来的研究方向是:1.增加算法对不同场景的鲁棒性,将算法的应用场景扩展到雾天、雨天等背景中 2.对算法的增强效果进行合理化约束,避免对非低照度区域进行增强,从而导致增强效果过亮的问题。"} +{"content": "未来的研究方向", "summary": "反光部分,也会被放大"} +{"content": "批标准化", "summary": "当前实验实验使用的增强网络包含了6个卷积层,我们在其前5个卷积层的激活层后再加入批标准化层,使得目前的CNN结构由Conv-Relu变为Conv-Relu-BN结构。"} +{"content": "对网络图的描述", "summary": "图中的K表示卷积核的尺寸,K3即表示卷积核尺寸为3x3,Conv表示卷积层,Conv前的数字表示卷积层中卷积核的个数,"} +{"content": "低光照图像与正常光照图像的区别所在", "summary": "低光照图像与自然光照图像之间的主要区别为:图像的亮度和色彩的偏移,而亮度和色度则可以很好地反映出两者之间的差别。"} +{"content": "对颜色的影响", "summary": "在低光照条件下,亮度和色度受到的损耗程度是不同的,亮度通道相比于色度通道受到了更多的全局损耗。"} +{"content": "实验条件及参数设置", "summary": "本实验使用PyTorch 深度学习框架实现网络,在显存为11G的RTX 2080Ti上训练。使用的SID 数据集包含5094个原始的短曝光极低光图像,每个极低光图像均有对应的长曝光参考图像。长曝光参考图像I^{Hgt}高斯滤波 缩小4倍得到低分辨率图像I^{Lgt},图像宽度W和高度H均为512,每批次输入1张图片。预训练中,转换子网络T-1采用Adam 优化算法,beta1设为0.900,beta2设为0.999,训练总批次为4000,学习率为10^–4,在批次大于2000后,学习率为10^–5。转换子网络T-2采用Adam优化算法,beta1设为0.500,beta2设为0.999,损失函数中λr设为1.000,λp设为0.006,λg设为0.001。训练总批次为2104,学习率为10–4,在104批次后将学习率慢慢衰减至10–6。转换网络T采用Adam优化算法,beta1设为0.500,beta2设为0.999,损失函数中λr设为1.000,λp设为0.006,λg设为0.001。训练总批次为2104,学习率为10^–4,在前100个批次将学习率线性衰减到10^–5,在10^4批次内将学习率线性衰减到10–6,之后再以学习率10–6训练104次。"} +{"content": "实验结果与分析", "summary": "为了评估本方法的性能,与近期已有的几种方法包括多通道融合的方法(BIMEF)、带色彩恢复多尺度Retinex算法(MSRCR)、自然保留增强算法(NPE)、基于光照估计的方法(LIME)、多偏差融合方法(MF)、反射光照估计方法(SRIE)进行比较。本文在两个公共数据集(LIME数据和DICM数据)的低照度图像上对上述方法进行了性能评估。"} +{"content": "结论", "summary": "本文提出了一种面向低照度图像增强的双曝光融合处理算法。首先,利用照度估计技术得到用于图像融合的权重矩阵;然后,通过摄像机响应模型合成双曝光图像。鉴于不同曝光量下图像颜色基本相同,定义低亮度像素和亮度分量找到最佳曝光率,使合成图像在原始图像曝光不足的区域得到更好的曝光;最后,根据权重矩阵将输入图像与合成图像进行融合,得到增强结果。和已有算法相比,本文方法能够获得较小的亮度失真,且具有合理的时间开销。由于实际环境的复杂性,对过度曝光进行优化建模仍是一个充满挑战性的问题,未来将对此展开进一步研究。"} +{"content": "结论", "summary": "本文针对极端低光情况下的图像增强问题,提出一种新的增强模型,引入残差网络和感知损失重构图片的高频信息,更好地还原了图像的细节,得到了更好的视觉效果,在PSNR和SSIM这2个定量指标上也有所提升。"} +{"content": "结论", "summary": "另一方面目前亮度放大倍数为人为输入,未来可以根据极低光图像的信息估算出亮度放大倍数。如何在进一步地提升增强后图像视觉效果的同时提高PSNR和SSIM定量指标的值,以及如何估算光度放大倍数,将是未来研究的方向。"} diff --git a/ptuning/datasets/chat/dev.json b/ptuning/datasets/chat/dev.json new file mode 100644 index 0000000000000000000000000000000000000000..3a60fb0bd8af53d93897dbe1b69e7f49d40e5639 --- /dev/null +++ b/ptuning/datasets/chat/dev.json @@ -0,0 +1,20 @@ +{"content": "101", "summary": "10201"} +{"content": "102", "summary": "10404"} +{"content": "103", "summary": "10609"} +{"content": "104", "summary": "10816"} +{"content": "105", "summary": "11025"} +{"content": "106", "summary": "11236"} +{"content": "107", "summary": "11449"} +{"content": "108", "summary": "11664"} +{"content": "109", "summary": "11881"} +{"content": "110", "summary": "12100"} +{"content": "111", "summary": "12321"} +{"content": "112", "summary": "12544"} +{"content": "113", "summary": "12769"} +{"content": "114", "summary": "12996"} +{"content": "115", "summary": "13225"} +{"content": "116", "summary": "13456"} +{"content": "117", "summary": "13689"} +{"content": "118", "summary": "13924"} +{"content": "119", "summary": "14161"} +{"content": "120", "summary": "14400"} \ No newline at end of file diff --git a/ptuning/datasets/chat/train.json b/ptuning/datasets/chat/train.json new file mode 100644 index 0000000000000000000000000000000000000000..d72c9c8f04ef470c03e8c65fc95207055ace21b7 --- /dev/null +++ b/ptuning/datasets/chat/train.json @@ -0,0 +1,102 @@ +{"content": "0", "summary": "0"} +{"content": "1", "summary": "1"} +{"content": "2", "summary": "4"} +{"content": "3", "summary": "9"} +{"content": "4", "summary": "16"} +{"content": "5", "summary": "25"} +{"content": "6", "summary": "36"} +{"content": "7", 
"summary": "49"} +{"content": "8", "summary": "64"} +{"content": "9", "summary": "81"} +{"content": "10", "summary": "100"} +{"content": "11", "summary": "121"} +{"content": "12", "summary": "144"} +{"content": "13", "summary": "169"} +{"content": "14", "summary": "196"} +{"content": "15", "summary": "225"} +{"content": "16", "summary": "256"} +{"content": "17", "summary": "289"} +{"content": "18", "summary": "324"} +{"content": "19", "summary": "361"} +{"content": "20", "summary": "400"} +{"content": "21", "summary": "441"} +{"content": "22", "summary": "484"} +{"content": "23", "summary": "529"} +{"content": "24", "summary": "576"} +{"content": "25", "summary": "625"} +{"content": "26", "summary": "676"} +{"content": "27", "summary": "729"} +{"content": "28", "summary": "784"} +{"content": "29", "summary": "841"} +{"content": "30", "summary": "900"} +{"content": "31", "summary": "961"} +{"content": "32", "summary": "1024"} +{"content": "33", "summary": "1089"} +{"content": "34", "summary": "1156"} +{"content": "35", "summary": "1225"} +{"content": "36", "summary": "1296"} +{"content": "37", "summary": "1369"} +{"content": "38", "summary": "1444"} +{"content": "39", "summary": "1521"} +{"content": "40", "summary": "1600"} +{"content": "41", "summary": "1681"} +{"content": "42", "summary": "1764"} +{"content": "43", "summary": "1849"} +{"content": "44", "summary": "1936"} +{"content": "45", "summary": "2025"} +{"content": "46", "summary": "2116"} +{"content": "47", "summary": "2209"} +{"content": "48", "summary": "2304"} +{"content": "49", "summary": "2401"} +{"content": "50", "summary": "2500"} +{"content": "51", "summary": "2601"} +{"content": "52", "summary": "2704"} +{"content": "53", "summary": "2809"} +{"content": "54", "summary": "2916"} +{"content": "55", "summary": "3025"} +{"content": "56", "summary": "3136"} +{"content": "57", "summary": "3249"} +{"content": "58", "summary": "3364"} +{"content": "59", "summary": "3481"} +{"content": "60", "summary": "3600"} +{"content": "61", "summary": "3721"} +{"content": "62", "summary": "3844"} +{"content": "63", "summary": "3969"} +{"content": "64", "summary": "4096"} +{"content": "65", "summary": "4225"} +{"content": "66", "summary": "4356"} +{"content": "67", "summary": "4489"} +{"content": "68", "summary": "4624"} +{"content": "69", "summary": "4761"} +{"content": "70", "summary": "4900"} +{"content": "71", "summary": "5041"} +{"content": "72", "summary": "5184"} +{"content": "73", "summary": "5329"} +{"content": "74", "summary": "5476"} +{"content": "75", "summary": "5625"} +{"content": "76", "summary": "5776"} +{"content": "77", "summary": "5929"} +{"content": "78", "summary": "6084"} +{"content": "79", "summary": "6241"} +{"content": "80", "summary": "6400"} +{"content": "81", "summary": "6561"} +{"content": "82", "summary": "6724"} +{"content": "83", "summary": "6889"} +{"content": "84", "summary": "7056"} +{"content": "85", "summary": "7225"} +{"content": "86", "summary": "7396"} +{"content": "87", "summary": "7569"} +{"content": "88", "summary": "7744"} +{"content": "89", "summary": "7921"} +{"content": "90", "summary": "8100"} +{"content": "91", "summary": "8281"} +{"content": "92", "summary": "8464"} +{"content": "93", "summary": "8649"} +{"content": "94", "summary": "8836"} +{"content": "95", "summary": "9025"} +{"content": "96", "summary": "9216"} +{"content": "97", "summary": "9409"} +{"content": "98", "summary": "9604"} +{"content": "99", "summary": "9801"} +{"content": "100", "summary": "10000"} + diff --git 
a/ptuning/deepspeed.json b/ptuning/deepspeed.json new file mode 100644 index 0000000000000000000000000000000000000000..798932966f38b2df8a468c72a4b41d8b47033ccc --- /dev/null +++ b/ptuning/deepspeed.json @@ -0,0 +1,21 @@ +{ + "train_micro_batch_size_per_gpu": "auto", + "zero_allow_untested_optimizer": true, + "fp16": { + "enabled": "auto", + "loss_scale": 0, + "initial_scale_power": 16, + "loss_scale_window": 1000, + "hysteresis": 2, + "min_loss_scale": 1 + }, + "zero_optimization": { + "stage": 2, + "allgather_partitions": true, + "allgather_bucket_size": 5e8, + "overlap_comm": false, + "reduce_scatter": true, + "reduce_bucket_size": 5e8, + "contiguous_gradients" : true + } +} \ No newline at end of file diff --git a/ptuning/ds_train_finetune.sh b/ptuning/ds_train_finetune.sh new file mode 100644 index 0000000000000000000000000000000000000000..531a8004dbed00819aa767c420cdc483e7c0abed --- /dev/null +++ b/ptuning/ds_train_finetune.sh @@ -0,0 +1,28 @@ + +LR=1e-4 + +MASTER_PORT=$(shuf -n 1 -i 10000-65535) + +deepspeed --num_gpus=4 --master_port $MASTER_PORT main.py \ + --deepspeed deepspeed.json \ + --do_train \ + --train_file AdvertiseGen/train.json \ + --test_file AdvertiseGen/dev.json \ + --prompt_column content \ + --response_column summary \ + --overwrite_cache \ + --model_name_or_path THUDM/chatglm-6b \ + --output_dir ./output/adgen-chatglm-6b-ft-$LR \ + --overwrite_output_dir \ + --max_source_length 64 \ + --max_target_length 64 \ + --per_device_train_batch_size 4 \ + --per_device_eval_batch_size 1 \ + --gradient_accumulation_steps 1 \ + --predict_with_generate \ + --max_steps 5000 \ + --logging_steps 10 \ + --save_steps 1000 \ + --learning_rate $LR \ + --fp16 + diff --git a/ptuning/evaluate.sh b/ptuning/evaluate.sh new file mode 100644 index 0000000000000000000000000000000000000000..d98251f638034c658c94a53cd5d47d216e955fd2 --- /dev/null +++ b/ptuning/evaluate.sh @@ -0,0 +1,22 @@ +PRE_SEQ_LEN=128 +CHECKPOINT=adgen-chatglm-6b-pt-128-2e-2 +STEP=1000 + +CUDA_VISIBLE_DEVICES=0 python main.py \ + --do_predict \ + --train_file .\\datasets\\AdvertiseGen\\train.json \ + --validation_file .\\datasets\\AdvertiseGen\\dev.json \ + --test_file .\\datasets\\AdvertiseGen\\dev.json \ + --overwrite_cache \ + --prompt_column content \ + --response_column summary \ + --model_name_or_path ..\\models\\chatglm-6b-int4 \ + --ptuning_checkpoint .\\output\\$CHECKPOINT\\checkpoint-$STEP \ + --output_dir .\\output\\$CHECKPOINT \ + --overwrite_output_dir \ + --max_source_length 64 \ + --max_target_length 64 \ + --per_device_eval_batch_size 1 \ + --predict_with_generate \ + --pre_seq_len $PRE_SEQ_LEN \ + --quantization_bit 4 diff --git a/ptuning/evaluate_finetune.sh b/ptuning/evaluate_finetune.sh new file mode 100644 index 0000000000000000000000000000000000000000..e275c3cbbec9ee65ad5e4a958a0ea52c248964c4 --- /dev/null +++ b/ptuning/evaluate_finetune.sh @@ -0,0 +1,18 @@ +CHECKPOINT=adgen-chatglm-6b-ft-1e-4 +STEP=3000 + +CUDA_VISIBLE_DEVICES=0 python3 main.py \ + --do_predict \ + --validation_file AdvertiseGen/dev.json \ + --test_file AdvertiseGen/dev.json \ + --overwrite_cache \ + --prompt_column content \ + --response_column summary \ + --model_name_or_path ./output/$CHECKPOINT/checkpoint-$STEP \ + --output_dir ./output/$CHECKPOINT \ + --overwrite_output_dir \ + --max_source_length 256 \ + --max_target_length 256 \ + --per_device_eval_batch_size 1 \ + --predict_with_generate \ + --fp16_full_eval diff --git a/ptuning/main.py b/ptuning/main.py new file mode 100644 index 
0000000000000000000000000000000000000000..4d11f74e90ed2a501880556b42276cf9ac797912 --- /dev/null +++ b/ptuning/main.py @@ -0,0 +1,445 @@ +#!/usr/bin/env python +# coding=utf-8 +# Copyright 2021 The HuggingFace Team. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +""" +Fine-tuning the library models for sequence to sequence. +""" +# You can also adapt this script on your own sequence to sequence task. Pointers for this are left as comments. + +import logging +import os +import sys +import json + +import numpy as np +from datasets import load_dataset +import jieba +from rouge_chinese import Rouge +from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction +import torch + +import transformers +from transformers import ( + AutoConfig, + AutoModel, + AutoTokenizer, + AutoTokenizer, + DataCollatorForSeq2Seq, + HfArgumentParser, + Seq2SeqTrainingArguments, + set_seed, +) +from trainer_seq2seq import Seq2SeqTrainer + +from arguments import ModelArguments, DataTrainingArguments + +logger = logging.getLogger(__name__) + +def main(): + + parser = HfArgumentParser((ModelArguments, DataTrainingArguments, Seq2SeqTrainingArguments)) + if len(sys.argv) == 2 and sys.argv[1].endswith(".json"): + # If we pass only one argument to the script and it's the path to a json file, + # let's parse it to get our arguments. + model_args, data_args, training_args = parser.parse_json_file(json_file=os.path.abspath(sys.argv[1])) + else: + model_args, data_args, training_args = parser.parse_args_into_dataclasses() + + # Setup logging + logging.basicConfig( + format="%(asctime)s - %(levelname)s - %(name)s - %(message)s", + datefmt="%m/%d/%Y %H:%M:%S", + handlers=[logging.StreamHandler(sys.stdout)], + ) + + if training_args.should_log: + # The default of training_args.log_level is passive, so we set log level at info here to have that default. + transformers.utils.logging.set_verbosity_info() + + log_level = training_args.get_process_log_level() + logger.setLevel(log_level) + # datasets.utils.logging.set_verbosity(log_level) + transformers.utils.logging.set_verbosity(log_level) + transformers.utils.logging.enable_default_handler() + transformers.utils.logging.enable_explicit_format() + + # Log on each process the small summary: + logger.warning( + f"Process rank: {training_args.local_rank}, device: {training_args.device}, n_gpu: {training_args.n_gpu}" + + f"distributed training: {bool(training_args.local_rank != -1)}, 16-bits training: {training_args.fp16}" + ) + logger.info(f"Training/evaluation parameters {training_args}") + + # Set seed before initializing model. 
+ set_seed(training_args.seed) + + # Load dataset + data_files = {} + if data_args.train_file is not None: + data_files["train"] = data_args.train_file + extension = data_args.train_file.split(".")[-1] + if data_args.validation_file is not None: + data_files["validation"] = data_args.validation_file + extension = data_args.validation_file.split(".")[-1] + if data_args.test_file is not None: + data_files["test"] = data_args.test_file + extension = data_args.test_file.split(".")[-1] + + raw_datasets = load_dataset( + extension, + data_files=data_files, + cache_dir=model_args.cache_dir, + use_auth_token=True if model_args.use_auth_token else None, + ) + print('---------------------------------------------------') + + print("raw_datasets:", raw_datasets) + + + # Load pretrained model and tokenizer + config = AutoConfig.from_pretrained(model_args.model_name_or_path, trust_remote_code=True) + config.pre_seq_len = model_args.pre_seq_len + config.prefix_projection = model_args.prefix_projection + + tokenizer = AutoTokenizer.from_pretrained(model_args.model_name_or_path, trust_remote_code=True) + + if model_args.ptuning_checkpoint is not None: + # Evaluation + # Loading extra state dict of prefix encoder + model = AutoModel.from_pretrained(model_args.model_name_or_path, config=config, trust_remote_code=True) + prefix_state_dict = torch.load(os.path.join(model_args.ptuning_checkpoint, "pytorch_model.bin")) + new_prefix_state_dict = {} + for k, v in prefix_state_dict.items(): + if k.startswith("transformer.prefix_encoder."): + new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v + model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict) + else: + model = AutoModel.from_pretrained(model_args.model_name_or_path, config=config, trust_remote_code=True) + + if model_args.quantization_bit is not None: + print(f"Quantized to {model_args.quantization_bit} bit") + + try: + # kernel_file = "{}\\quantization_kernels.so".format(model_args.model_name_or_path) + kernel_file = "{}/quantization_kernels.so".format(model_args.model_name_or_path) + model = model.quantize(bits=model_args.quantization_bit, kernel_file=kernel_file) + + except: + model = model.quantize(bits=model_args.quantization_bit) + + + if model_args.pre_seq_len is not None: + # P-tuning v2 + model = model.half() + model.transformer.prefix_encoder.float() + else: + # Finetune + model = model.float() + + prefix = data_args.source_prefix if data_args.source_prefix is not None else "" + + # Preprocessing the datasets. + # We need to tokenize inputs and targets. + if training_args.do_train: + column_names = raw_datasets["train"].column_names + elif training_args.do_eval: + column_names = raw_datasets["validation"].column_names + elif training_args.do_predict: + column_names = raw_datasets["test"].column_names + + else: + logger.info("There is nothing to do. Please pass `do_train`, `do_eval` and/or `do_predict`.") + return + + # Get the column names for input/target. + prompt_column = data_args.prompt_column + response_column = data_args.response_column + history_column = data_args.history_column + + # Temporarily set max_target_length for training. 
+ max_target_length = data_args.max_target_length + + def preprocess_function_eval(examples): + inputs, targets = [], [] + for i in range(len(examples[prompt_column])): + if examples[prompt_column][i] and examples[response_column][i]: + query = examples[prompt_column][i] + if history_column is None or len(examples[history_column][i]) == 0: + prompt = query + else: + prompt = "" + history = examples[history_column][i] + for turn_idx, (old_query, response) in enumerate(history): + prompt += "[Round {}]\n问:{}\n答:{}\n".format(turn_idx, old_query, response) + prompt += "[Round {}]\n问:{}\n答:".format(len(history), query) + inputs.append(prompt) + targets.append(examples[response_column][i]) + + inputs = [prefix + inp for inp in inputs] + model_inputs = tokenizer(inputs, max_length=data_args.max_source_length, truncation=True, padding=True) + labels = tokenizer(text_target=targets, max_length=max_target_length, truncation=True) + + if data_args.ignore_pad_token_for_loss: + labels["input_ids"] = [ + [(l if l != tokenizer.pad_token_id else -100) for l in label] for label in labels["input_ids"] + ] + model_inputs["labels"] = labels["input_ids"] + + return model_inputs + + def preprocess_function_train(examples): + max_seq_length = data_args.max_source_length + data_args.max_target_length + + model_inputs = { + "input_ids": [], + "labels": [], + } + for i in range(len(examples[prompt_column])): + if examples[prompt_column][i] and examples[response_column][i]: + query, answer = examples[prompt_column][i], examples[response_column][i] + + if history_column is None: + prompt = query + else: + prompt = "" + history = examples[history_column][i] + for turn_idx, (old_query, response) in enumerate(history): + prompt += "[Round {}]\n问:{}\n答:{}\n".format(turn_idx, old_query, response) + prompt += "[Round {}]\n问:{}\n答:".format(len(history), query) + + prompt = prefix + prompt + a_ids = tokenizer.encode(text=prompt, add_special_tokens=False) + b_ids = tokenizer.encode(text=answer, add_special_tokens=False) + + if len(a_ids) > data_args.max_source_length - 1: + a_ids = a_ids[: data_args.max_source_length - 1] + + if len(b_ids) > data_args.max_target_length - 2: + b_ids = b_ids[: data_args.max_target_length - 2] + + input_ids = tokenizer.build_inputs_with_special_tokens(a_ids, b_ids) + + context_length = input_ids.index(tokenizer.bos_token_id) + mask_position = context_length - 1 + labels = [-100] * context_length + input_ids[mask_position+1:] + + pad_len = max_seq_length - len(input_ids) + input_ids = input_ids + [tokenizer.pad_token_id] * pad_len + labels = labels + [tokenizer.pad_token_id] * pad_len + if data_args.ignore_pad_token_for_loss: + labels = [(l if l != tokenizer.pad_token_id else -100) for l in labels] + + model_inputs["input_ids"].append(input_ids) + model_inputs["labels"].append(labels) + + return model_inputs + + def print_dataset_example(example): + print("input_ids",example["input_ids"]) + print("inputs", tokenizer.decode(example["input_ids"])) + print("label_ids", example["labels"]) + print("labels", tokenizer.decode(example["labels"])) + + if training_args.do_train: + if "train" not in raw_datasets: + raise ValueError("--do_train requires a train dataset") + train_dataset = raw_datasets["train"] + if data_args.max_train_samples is not None: + max_train_samples = min(len(train_dataset), data_args.max_train_samples) + train_dataset = train_dataset.select(range(max_train_samples)) + with training_args.main_process_first(desc="train dataset map pre-processing"): + train_dataset = 
train_dataset.map( + preprocess_function_train, + batched=True, + num_proc=data_args.preprocessing_num_workers, + remove_columns=column_names, + load_from_cache_file=not data_args.overwrite_cache, + desc="Running tokenizer on train dataset", + ) + print_dataset_example(train_dataset[0]) + + if training_args.do_eval: + max_target_length = data_args.val_max_target_length + if "validation" not in raw_datasets: + raise ValueError("--do_eval requires a validation dataset") + eval_dataset = raw_datasets["validation"] + if data_args.max_eval_samples is not None: + max_eval_samples = min(len(eval_dataset), data_args.max_eval_samples) + eval_dataset = eval_dataset.select(range(max_eval_samples)) + with training_args.main_process_first(desc="validation dataset map pre-processing"): + eval_dataset = eval_dataset.map( + preprocess_function_eval, + batched=True, + num_proc=data_args.preprocessing_num_workers, + remove_columns=column_names, + load_from_cache_file=not data_args.overwrite_cache, + desc="Running tokenizer on validation dataset", + ) + print_dataset_example(eval_dataset[0]) + + if training_args.do_predict: + max_target_length = data_args.val_max_target_length + if "test" not in raw_datasets: + raise ValueError("--do_predict requires a test dataset") + predict_dataset = raw_datasets["test"] + if data_args.max_predict_samples is not None: + max_predict_samples = min(len(predict_dataset), data_args.max_predict_samples) + predict_dataset = predict_dataset.select(range(max_predict_samples)) + with training_args.main_process_first(desc="prediction dataset map pre-processing"): + predict_dataset = predict_dataset.map( + preprocess_function_eval, + batched=True, + num_proc=data_args.preprocessing_num_workers, + remove_columns=column_names, + load_from_cache_file=not data_args.overwrite_cache, + desc="Running tokenizer on prediction dataset", + ) + print_dataset_example(predict_dataset[0]) + + # Data collator + label_pad_token_id = -100 if data_args.ignore_pad_token_for_loss else tokenizer.pad_token_id + data_collator = DataCollatorForSeq2Seq( + tokenizer, + model=model, + label_pad_token_id=label_pad_token_id, + pad_to_multiple_of=None, + padding=False + ) + + # Metric + def compute_metrics(eval_preds): + preds, labels = eval_preds + if isinstance(preds, tuple): + preds = preds[0] + decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True) + if data_args.ignore_pad_token_for_loss: + # Replace -100 in the labels as we can't decode them. 
+ labels = np.where(labels != -100, labels, tokenizer.pad_token_id) + decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True) + + score_dict = { + "rouge-1": [], + "rouge-2": [], + "rouge-l": [], + "bleu-4": [] + } + for pred, label in zip(decoded_preds, decoded_labels): + hypothesis = list(jieba.cut(pred)) + reference = list(jieba.cut(label)) + rouge = Rouge() + scores = rouge.get_scores(' '.join(hypothesis) , ' '.join(reference)) + result = scores[0] + + for k, v in result.items(): + score_dict[k].append(round(v["f"] * 100, 4)) + bleu_score = sentence_bleu([list(label)], list(pred), smoothing_function=SmoothingFunction().method3) + score_dict["bleu-4"].append(round(bleu_score * 100, 4)) + + for k, v in score_dict.items(): + score_dict[k] = float(np.mean(v)) + return score_dict + + # Override the decoding parameters of Seq2SeqTrainer + training_args.generation_max_length = ( + training_args.generation_max_length + if training_args.generation_max_length is not None + else data_args.val_max_target_length + ) + training_args.generation_num_beams = ( + data_args.num_beams if data_args.num_beams is not None else training_args.generation_num_beams + ) + # Initialize our Trainer + trainer = Seq2SeqTrainer( + model=model, + args=training_args, + train_dataset=train_dataset if training_args.do_train else None, + eval_dataset=eval_dataset if training_args.do_eval else None, + tokenizer=tokenizer, + data_collator=data_collator, + compute_metrics=compute_metrics if training_args.predict_with_generate else None, + save_prefixencoder=model_args.pre_seq_len is not None + ) + + # Training + if training_args.do_train: + checkpoint = None + if training_args.resume_from_checkpoint is not None: + checkpoint = training_args.resume_from_checkpoint + # elif last_checkpoint is not None: + # checkpoint = last_checkpoint + model.gradient_checkpointing_enable() + model.enable_input_require_grads() + train_result = trainer.train(resume_from_checkpoint=checkpoint) + # trainer.save_model() # Saves the tokenizer too for easy upload + + metrics = train_result.metrics + max_train_samples = ( + data_args.max_train_samples if data_args.max_train_samples is not None else len(train_dataset) + ) + metrics["train_samples"] = min(max_train_samples, len(train_dataset)) + + trainer.log_metrics("train", metrics) + trainer.save_metrics("train", metrics) + trainer.save_state() + + # Evaluation + results = {} + if training_args.do_eval: + logger.info("*** Evaluate ***") + metrics = trainer.evaluate(metric_key_prefix="eval", do_sample=True, top_p=0.7, max_length=512, temperature=0.95) + max_eval_samples = data_args.max_eval_samples if data_args.max_eval_samples is not None else len(eval_dataset) + metrics["eval_samples"] = min(max_eval_samples, len(eval_dataset)) + + trainer.log_metrics("eval", metrics) + trainer.save_metrics("eval", metrics) + + if training_args.do_predict: + logger.info("*** Predict ***") + + predict_results = trainer.predict(predict_dataset, metric_key_prefix="predict", max_length=512, do_sample=True, top_p=0.7, temperature=0.95) + metrics = predict_results.metrics + max_predict_samples = ( + data_args.max_predict_samples if data_args.max_predict_samples is not None else len(predict_dataset) + ) + metrics["predict_samples"] = min(max_predict_samples, len(predict_dataset)) + + trainer.log_metrics("predict", metrics) + trainer.save_metrics("predict", metrics) + + if trainer.is_world_process_zero(): + if training_args.predict_with_generate: + predictions = tokenizer.batch_decode( + 
predict_results.predictions, skip_special_tokens=True, clean_up_tokenization_spaces=True + ) + predictions = [pred.strip() for pred in predictions] + labels = tokenizer.batch_decode( + predict_results.label_ids, skip_special_tokens=True, clean_up_tokenization_spaces=True + ) + labels = [label.strip() for label in labels] + output_prediction_file = os.path.join(training_args.output_dir, "generated_predictions.txt") + with open(output_prediction_file, "w", encoding="utf-8") as writer: + for p, l in zip(predictions, labels): + res = json.dumps({"labels": l, "predict": p}, ensure_ascii=False) + writer.write(f"{res}\n") + return results + + +def _mp_fn(index): + # For xla_spawn (TPUs) + main() + + +if __name__ == "__main__": + main() diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/all_results.json b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/all_results.json new file mode 100644 index 0000000000000000000000000000000000000000..fbee11a3326003f6a42e4d6c4994820128f2cacf --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/all_results.json @@ -0,0 +1,8 @@ +{ + "epoch": 114.29, + "train_loss": 0.2596614052057266, + "train_runtime": 12879.4173, + "train_samples": 140, + "train_samples_per_second": 1.242, + "train_steps_per_second": 0.078 +} \ No newline at end of file diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/config.json b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/config.json new file mode 100644 index 0000000000000000000000000000000000000000..161856ba351dc0967fa253fe6935a2931941cc6d --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/config.json @@ -0,0 +1,32 @@ +{ + "_name_or_path": "..\\models\\chatglm-6b-int4", + "architectures": [ + "ChatGLMForConditionalGeneration" + ], + "auto_map": { + "AutoConfig": "configuration_chatglm.ChatGLMConfig", + "AutoModel": "modeling_chatglm.ChatGLMForConditionalGeneration", + "AutoModelForSeq2SeqLM": "modeling_chatglm.ChatGLMForConditionalGeneration" + }, + "bos_token_id": 130004, + "eos_token_id": 130005, + "gmask_token_id": 130001, + "hidden_size": 4096, + "inner_hidden_size": 16384, + "layernorm_epsilon": 1e-05, + "mask_token_id": 130000, + "max_sequence_length": 2048, + "model_type": "chatglm", + "num_attention_heads": 32, + "num_layers": 28, + "pad_token_id": 3, + "position_encoding_2d": true, + "pre_seq_len": 128, + "prefix_projection": false, + "quantization_bit": 4, + "quantization_embeddings": false, + "torch_dtype": "float16", + "transformers_version": "4.27.1", + "use_cache": true, + "vocab_size": 130528 +} diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/configuration_chatglm.py b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/configuration_chatglm.py new file mode 100644 index 0000000000000000000000000000000000000000..5680c1a5d5412a831b054694cabdf53880af0469 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/configuration_chatglm.py @@ -0,0 +1,105 @@ +""" ChatGLM model configuration """ + +from transformers.configuration_utils import PretrainedConfig +from transformers.utils import logging + +logger = logging.get_logger(__name__) + + +class ChatGLMConfig(PretrainedConfig): + r""" + This is the configuration class to store the configuration of a [`~ChatGLMModel`]. + It is used to instantiate an ChatGLM model according to the specified arguments, defining the model + architecture. 
Instantiating a configuration with the defaults will yield a similar configuration to that of + the ChatGLM-6B [THUDM/ChatGLM-6B](https://huggingface.co/THUDM/chatglm-6b) architecture. + + Configuration objects inherit from [`PretrainedConfig`] and can be used + to control the model outputs. Read the documentation from [`PretrainedConfig`] + for more information. + + + Args: + vocab_size (`int`, *optional*, defaults to 150528): + Vocabulary size of the ChatGLM-6B model. Defines the number of different tokens that can be represented by the + `inputs_ids` passed when calling [`~ChatGLMModel`] or + [`~TFChatGLMModel`]. + hidden_size (`int`, *optional*, defaults to 4096): + Dimension of the encoder layers and the pooler layer. + num_hidden_layers (`int`, *optional*, defaults to 28): + Number of hidden layers in the Transformer encoder. + num_attention_heads (`int`, *optional*, defaults to 32): + Number of attention heads for each attention layer in the Transformer encoder. + inner_hidden_size (`int`, *optional*, defaults to 16384): + Dimension of the "intermediate" (i.e., feed-forward) layer in the Transformer encoder. + max_sequence_length (`int`, *optional*, defaults to 512): + The maximum sequence length that this model might ever be used with. + Typically set this to something large just in case (e.g., 512 or 1024 or 2048). + layernorm_epsilon (`float`, *optional*, defaults to 1e-5): + The epsilon used by the layer normalization layers. + use_cache (`bool`, *optional*, defaults to `True`): + Whether the model should return the last key/values attentions (not used by all models). + Example: + + ```python + >>> from configuration_chatglm import ChatGLMConfig + >>> from modeling_chatglm import ChatGLMModel + + >>> # Initializing a ChatGLM-6B THUDM/ChatGLM-6B style configuration + >>> configuration = ChatGLMConfig() + + >>> # Initializing a model from the THUDM/ChatGLM-6B style configuration + >>> model = ChatGLMModel(configuration) + + >>> # Accessing the model configuration + >>> configuration = model.config + ``` +""" + model_type = "chatglm" + + def __init__( + self, + vocab_size=150528, + hidden_size=4096, + num_layers=28, + num_attention_heads=32, + layernorm_epsilon=1e-5, + use_cache=False, + bos_token_id=150004, + eos_token_id=150005, + mask_token_id=150000, + gmask_token_id=150001, + pad_token_id=0, + max_sequence_length=2048, + inner_hidden_size=16384, + position_encoding_2d=True, + quantization_bit=0, + quantization_embeddings=False, + pre_seq_len=None, + prefix_projection=False, + **kwargs + ): + self.num_layers = num_layers + self.vocab_size = vocab_size + self.hidden_size = hidden_size + self.num_attention_heads = num_attention_heads + self.max_sequence_length = max_sequence_length + self.layernorm_epsilon = layernorm_epsilon + self.inner_hidden_size = inner_hidden_size + self.use_cache = use_cache + self.bos_token_id = bos_token_id + self.eos_token_id = eos_token_id + self.pad_token_id = pad_token_id + self.mask_token_id = mask_token_id + self.gmask_token_id = gmask_token_id + self.position_encoding_2d = position_encoding_2d + self.quantization_bit = quantization_bit + self.quantization_embeddings = quantization_embeddings + self.pre_seq_len = pre_seq_len + self.prefix_projection = prefix_projection + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + **kwargs + ) diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/generation_config.json 
b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/generation_config.json new file mode 100644 index 0000000000000000000000000000000000000000..e6191613b8cca2cd0d91cc92e90f2a353388ec3e --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/generation_config.json @@ -0,0 +1,7 @@ +{ + "_from_model_config": true, + "bos_token_id": 130004, + "eos_token_id": 130005, + "pad_token_id": 3, + "transformers_version": "4.27.1" +} diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/ice_text.model b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/ice_text.model new file mode 100644 index 0000000000000000000000000000000000000000..0dcfe31e02ad0767e0c80a469340bf97f58e777a --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/ice_text.model @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5e974d9a69c242ce014c88c2b26089270f6198f3c0b700a887666cd3e816f17e +size 2706249 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/modeling_chatglm.py b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/modeling_chatglm.py new file mode 100644 index 0000000000000000000000000000000000000000..52ac0515faabb9df4551f15ef29c2802feb693e0 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/modeling_chatglm.py @@ -0,0 +1,1472 @@ +""" PyTorch ChatGLM model. """ + +import math +import copy +import os +import warnings +import re +import sys + +import torch +import torch.utils.checkpoint +import torch.nn.functional as F +from torch import nn +from torch.nn import CrossEntropyLoss, LayerNorm +from torch.nn.utils import skip_init +from typing import Optional, Tuple, Union, List, Callable, Dict, Any + +from transformers.utils import ( + add_code_sample_docstrings, + add_start_docstrings, + add_start_docstrings_to_model_forward, +) +from transformers.modeling_outputs import ( + BaseModelOutputWithPast, + CausalLMOutputWithPast, + BaseModelOutputWithPastAndCrossAttentions, +) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging +from transformers.generation.logits_process import LogitsProcessor +from transformers.generation.utils import LogitsProcessorList, StoppingCriteriaList, GenerationConfig, ModelOutput + +from .configuration_chatglm import ChatGLMConfig + + +# flags required to enable jit fusion kernels + +if sys.platform != 'darwin': + torch._C._jit_set_profiling_mode(False) + torch._C._jit_set_profiling_executor(False) + torch._C._jit_override_can_fuse_on_cpu(True) + torch._C._jit_override_can_fuse_on_gpu(True) + +logger = logging.get_logger(__name__) + +_CHECKPOINT_FOR_DOC = "THUDM/ChatGLM-6B" +_CONFIG_FOR_DOC = "ChatGLM6BConfig" + +CHATGLM_6B_PRETRAINED_MODEL_ARCHIVE_LIST = [ + "THUDM/chatglm-6b", + # See all ChatGLM-6B models at https://huggingface.co/models?filter=chatglm +] + + +class InvalidScoreLogitsProcessor(LogitsProcessor): + def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor: + if torch.isnan(scores).any() or torch.isinf(scores).any(): + scores.zero_() + scores[..., 5] = 5e4 + return scores + + +def load_tf_weights_in_chatglm_6b(model, config, tf_checkpoint_path): + """Load tf checkpoints in a pytorch model.""" + try: + import re + + import numpy as np + import tensorflow as tf + except ImportError: + logger.error( + "Loading a TensorFlow model in PyTorch, requires TensorFlow to be installed. 
Please see " + "https://www.tensorflow.org/install/ for installation instructions." + ) + raise + tf_path = os.path.abspath(tf_checkpoint_path) + logger.info(f"Converting TensorFlow checkpoint from {tf_path}") + # Load weights from TF model + init_vars = tf.train.list_variables(tf_path) + names = [] + arrays = [] + for name, shape in init_vars: + logger.info(f"Loading TF weight {name} with shape {shape}") + array = tf.train.load_variable(tf_path, name) + names.append(name) + arrays.append(array) + + for name, array in zip(names, arrays): + name = name.split("/") + # adam_v and adam_m are variables used in AdamWeightDecayOptimizer to calculated m and v + # which are not required for using pretrained model + if any( + n in ["adam_v", "adam_m", "AdamWeightDecayOptimizer", "AdamWeightDecayOptimizer_1", "global_step"] + for n in name + ): + logger.info(f"Skipping {'/'.join(name)}") + continue + pointer = model + for m_name in name: + if re.fullmatch(r"[A-Za-z]+_\d+", m_name): + scope_names = re.split(r"_(\d+)", m_name) + else: + scope_names = [m_name] + if scope_names[0] == "kernel" or scope_names[0] == "gamma": + pointer = getattr(pointer, "weight") + elif scope_names[0] == "output_bias" or scope_names[0] == "beta": + pointer = getattr(pointer, "bias") + elif scope_names[0] == "output_weights": + pointer = getattr(pointer, "weight") + elif scope_names[0] == "squad": + pointer = getattr(pointer, "classifier") + else: + try: + pointer = getattr(pointer, scope_names[0]) + except AttributeError: + logger.info(f"Skipping {'/'.join(name)}") + continue + if len(scope_names) >= 2: + num = int(scope_names[1]) + pointer = pointer[num] + if m_name[-11:] == "_embeddings": + pointer = getattr(pointer, "weight") + elif m_name == "kernel": + array = np.transpose(array) + try: + assert ( + pointer.shape == array.shape + ), f"Pointer shape {pointer.shape} and array shape {array.shape} mismatched" + except AssertionError as e: + e.args += (pointer.shape, array.shape) + raise + logger.info(f"Initialize PyTorch weight {name}") + pointer.data = torch.from_numpy(array) + return model + + +class PrefixEncoder(torch.nn.Module): + """ + The torch.nn model to encode the prefix + Input shape: (batch-size, prefix-length) + Output shape: (batch-size, prefix-length, 2*layers*hidden) + """ + + def __init__(self, config): + super().__init__() + self.prefix_projection = config.prefix_projection + if self.prefix_projection: + # Use a two-layer MLP to encode the prefix + self.embedding = torch.nn.Embedding(config.pre_seq_len, config.hidden_size) + self.trans = torch.nn.Sequential( + torch.nn.Linear(config.hidden_size, config.hidden_size), + torch.nn.Tanh(), + torch.nn.Linear(config.hidden_size, config.num_layers * config.hidden_size * 2) + ) + else: + self.embedding = torch.nn.Embedding(config.pre_seq_len, config.num_layers * config.hidden_size * 2) + + def forward(self, prefix: torch.Tensor): + if self.prefix_projection: + prefix_tokens = self.embedding(prefix) + past_key_values = self.trans(prefix_tokens) + else: + past_key_values = self.embedding(prefix) + return past_key_values + + +@torch.jit.script +def gelu_impl(x): + """OpenAI's gelu implementation.""" + return 0.5 * x * (1.0 + torch.tanh(0.7978845608028654 * x * + (1.0 + 0.044715 * x * x))) + + +def gelu(x): + return gelu_impl(x) + + +class RotaryEmbedding(torch.nn.Module): + def __init__(self, dim, base=10000, precision=torch.half, learnable=False): + super().__init__() + inv_freq = 1. 
/ (base ** (torch.arange(0, dim, 2).float() / dim)) + inv_freq = inv_freq.half() + self.learnable = learnable + if learnable: + self.inv_freq = torch.nn.Parameter(inv_freq) + self.max_seq_len_cached = None + else: + self.register_buffer('inv_freq', inv_freq) + self.max_seq_len_cached = None + self.cos_cached = None + self.sin_cached = None + self.precision = precision + + def _load_from_state_dict(self, state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, + error_msgs): + pass + + def forward(self, x, seq_dim=1, seq_len=None): + if seq_len is None: + seq_len = x.shape[seq_dim] + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): + self.max_seq_len_cached = None if self.learnable else seq_len + t = torch.arange(seq_len, device=x.device, dtype=self.inv_freq.dtype) + freqs = torch.einsum('i,j->ij', t, self.inv_freq) + # Different from paper, but it uses a different permutation in order to obtain the same calculation + emb = torch.cat((freqs, freqs), dim=-1).to(x.device) + if self.precision == torch.bfloat16: + emb = emb.float() + + # [sx, 1 (b * np), hn] + cos_cached = emb.cos()[:, None, :] + sin_cached = emb.sin()[:, None, :] + if self.precision == torch.bfloat16: + cos_cached = cos_cached.bfloat16() + sin_cached = sin_cached.bfloat16() + if self.learnable: + return cos_cached, sin_cached + self.cos_cached, self.sin_cached = cos_cached, sin_cached + return self.cos_cached[:seq_len, ...], self.sin_cached[:seq_len, ...] + + def _apply(self, fn): + if self.cos_cached is not None: + self.cos_cached = fn(self.cos_cached) + if self.sin_cached is not None: + self.sin_cached = fn(self.sin_cached) + return super()._apply(fn) + +def rotate_half(x): + x1, x2 = x[..., :x.shape[-1] // 2], x[..., x.shape[-1] // 2:] + return torch.cat((-x2, x1), dim=x1.ndim - 1) # dim=-1 triggers a bug in earlier torch versions + + +@torch.jit.script +def apply_rotary_pos_emb_index(q, k, cos, sin, position_id): + # position_id: [sq, b], q, k: [sq, b, np, hn], cos: [sq, 1, hn] -> [sq, b, 1, hn] + cos, sin = F.embedding(position_id, cos.squeeze(1)).unsqueeze(2), \ + F.embedding(position_id, sin.squeeze(1)).unsqueeze(2) + q, k = (q * cos) + (rotate_half(q) * sin), (k * cos) + (rotate_half(k) * sin) + return q, k + + +def attention_fn( + self, + query_layer, + key_layer, + value_layer, + attention_mask, + hidden_size_per_partition, + layer_id, + layer_past=None, + scaling_attention_score=True, + use_cache=False, +): + if layer_past is not None: + past_key, past_value = layer_past[0], layer_past[1] + key_layer = torch.cat((past_key, key_layer), dim=0) + value_layer = torch.cat((past_value, value_layer), dim=0) + + # seqlen, batch, num_attention_heads, hidden_size_per_attention_head + seq_len, b, nh, hidden_size = key_layer.shape + + if use_cache: + present = (key_layer, value_layer) + else: + present = None + + query_key_layer_scaling_coeff = float(layer_id + 1) + if scaling_attention_score: + query_layer = query_layer / (math.sqrt(hidden_size) * query_key_layer_scaling_coeff) + + # =================================== + # Raw attention scores. 
[b, np, s, s] + # =================================== + + # [b, np, sq, sk] + output_size = (query_layer.size(1), query_layer.size(2), query_layer.size(0), key_layer.size(0)) + + # [sq, b, np, hn] -> [sq, b * np, hn] + query_layer = query_layer.view(output_size[2], output_size[0] * output_size[1], -1) + # [sk, b, np, hn] -> [sk, b * np, hn] + key_layer = key_layer.view(output_size[3], output_size[0] * output_size[1], -1) + + matmul_result = torch.zeros( + 1, 1, 1, + dtype=query_layer.dtype, + device=query_layer.device, + ) + + matmul_result = torch.baddbmm( + matmul_result, + query_layer.transpose(0, 1), # [b * np, sq, hn] + key_layer.transpose(0, 1).transpose(1, 2), # [b * np, hn, sk] + beta=0.0, + alpha=1.0, + ) + + # change view to [b, np, sq, sk] + attention_scores = matmul_result.view(*output_size) + + if self.scale_mask_softmax: + self.scale_mask_softmax.scale = query_key_layer_scaling_coeff + attention_probs = self.scale_mask_softmax(attention_scores, attention_mask.contiguous()) + else: + if not (attention_mask == 0).all(): + # if auto-regressive, skip + attention_scores.masked_fill_(attention_mask, -10000.0) + dtype = attention_scores.dtype + attention_scores = attention_scores.float() + attention_scores = attention_scores * query_key_layer_scaling_coeff + + attention_probs = F.softmax(attention_scores, dim=-1) + + attention_probs = attention_probs.type(dtype) + + # ========================= + # Context layer. [sq, b, hp] + # ========================= + + # value_layer -> context layer. + # [sk, b, np, hn] --> [b, np, sq, hn] + + # context layer shape: [b, np, sq, hn] + output_size = (value_layer.size(1), value_layer.size(2), query_layer.size(0), value_layer.size(3)) + + # change view [sk, b * np, hn] + value_layer = value_layer.view(value_layer.size(0), output_size[0] * output_size[1], -1) + + # change view [b * np, sq, sk] + attention_probs = attention_probs.view(output_size[0] * output_size[1], output_size[2], -1) + + # matmul: [b * np, sq, hn] + context_layer = torch.bmm(attention_probs, value_layer.transpose(0, 1)) + + # change view [b, np, sq, hn] + context_layer = context_layer.view(*output_size) + + # [b, np, sq, hn] --> [sq, b, np, hn] + context_layer = context_layer.permute(2, 0, 1, 3).contiguous() + + # [sq, b, np, hn] --> [sq, b, hp] + new_context_layer_shape = context_layer.size()[:-2] + (hidden_size_per_partition,) + context_layer = context_layer.view(*new_context_layer_shape) + + outputs = (context_layer, present, attention_probs) + + return outputs + + +def default_init(cls, *args, **kwargs): + return cls(*args, **kwargs) + + +class SelfAttention(torch.nn.Module): + def __init__(self, hidden_size, num_attention_heads, + layer_id, hidden_size_per_attention_head=None, bias=True, + params_dtype=torch.float, position_encoding_2d=True, empty_init=True): + if empty_init: + init_method = skip_init + else: + init_method = default_init + super(SelfAttention, self).__init__() + + self.layer_id = layer_id + self.hidden_size = hidden_size + self.hidden_size_per_partition = hidden_size + self.num_attention_heads = num_attention_heads + self.num_attention_heads_per_partition = num_attention_heads + self.position_encoding_2d = position_encoding_2d + self.rotary_emb = RotaryEmbedding( + self.hidden_size // (self.num_attention_heads * 2) + if position_encoding_2d + else self.hidden_size // self.num_attention_heads, + base=10000, + precision=torch.half, + learnable=False, + ) + + self.scale_mask_softmax = None + + if hidden_size_per_attention_head is None: + 
self.hidden_size_per_attention_head = hidden_size // num_attention_heads + else: + self.hidden_size_per_attention_head = hidden_size_per_attention_head + + self.inner_hidden_size = num_attention_heads * self.hidden_size_per_attention_head + + # Strided linear layer. + self.query_key_value = init_method( + torch.nn.Linear, + hidden_size, + 3 * self.inner_hidden_size, + bias=bias, + dtype=params_dtype, + ) + + self.dense = init_method( + torch.nn.Linear, + self.inner_hidden_size, + hidden_size, + bias=bias, + dtype=params_dtype, + ) + + @staticmethod + def attention_mask_func(attention_scores, attention_mask): + attention_scores.masked_fill_(attention_mask, -10000.0) + return attention_scores + + def split_tensor_along_last_dim(self, tensor, num_partitions, + contiguous_split_chunks=False): + """Split a tensor along its last dimension. + Arguments: + tensor: input tensor. + num_partitions: number of partitions to split the tensor + contiguous_split_chunks: If True, make each chunk contiguous + in memory. + """ + # Get the size and dimension. + last_dim = tensor.dim() - 1 + last_dim_size = tensor.size()[last_dim] // num_partitions + # Split. + tensor_list = torch.split(tensor, last_dim_size, dim=last_dim) + # Note: torch.split does not create contiguous tensors by default. + if contiguous_split_chunks: + return tuple(chunk.contiguous() for chunk in tensor_list) + + return tensor_list + + def forward( + self, + hidden_states: torch.Tensor, + position_ids, + attention_mask: torch.Tensor, + layer_id, + layer_past: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + use_cache: bool = False, + output_attentions: bool = False, + ): + """ + hidden_states: [seq_len, batch, hidden_size] + attention_mask: [(1, 1), seq_len, seq_len] + """ + + # [seq_len, batch, 3 * hidden_size] + mixed_raw_layer = self.query_key_value(hidden_states) + + # [seq_len, batch, 3 * hidden_size] --> [seq_len, batch, num_attention_heads, 3 * hidden_size_per_attention_head] + new_tensor_shape = mixed_raw_layer.size()[:-1] + ( + self.num_attention_heads_per_partition, + 3 * self.hidden_size_per_attention_head, + ) + mixed_raw_layer = mixed_raw_layer.view(*new_tensor_shape) + + # [seq_len, batch, num_attention_heads, hidden_size_per_attention_head] + (query_layer, key_layer, value_layer) = self.split_tensor_along_last_dim(mixed_raw_layer, 3) + + if self.position_encoding_2d: + q1, q2 = query_layer.chunk(2, dim=(query_layer.ndim - 1)) + k1, k2 = key_layer.chunk(2, dim=(key_layer.ndim - 1)) + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + position_ids, block_position_ids = position_ids[:, 0, :].transpose(0, 1).contiguous(), \ + position_ids[:, 1, :].transpose(0, 1).contiguous() + q1, k1 = apply_rotary_pos_emb_index(q1, k1, cos, sin, position_ids) + q2, k2 = apply_rotary_pos_emb_index(q2, k2, cos, sin, block_position_ids) + query_layer = torch.concat([q1, q2], dim=(q1.ndim - 1)) + key_layer = torch.concat([k1, k2], dim=(k1.ndim - 1)) + else: + position_ids = position_ids.transpose(0, 1) + cos, sin = self.rotary_emb(value_layer, seq_len=position_ids.max() + 1) + # [seq_len, batch, num_attention_heads, hidden_size_per_attention_head] + query_layer, key_layer = apply_rotary_pos_emb_index(query_layer, key_layer, cos, sin, position_ids) + + # [seq_len, batch, hidden_size] + context_layer, present, attention_probs = attention_fn( + self=self, + query_layer=query_layer, + key_layer=key_layer, + value_layer=value_layer, + attention_mask=attention_mask, + hidden_size_per_partition=self.hidden_size_per_partition, + 
layer_id=layer_id, + layer_past=layer_past, + use_cache=use_cache + ) + + output = self.dense(context_layer) + + outputs = (output, present) + + if output_attentions: + outputs += (attention_probs,) + + return outputs # output, present, attention_probs + + +class GEGLU(torch.nn.Module): + def __init__(self): + super().__init__() + self.activation_fn = F.gelu + + def forward(self, x): + # dim=-1 breaks in jit for pt<1.10 + x1, x2 = x.chunk(2, dim=(x.ndim - 1)) + return x1 * self.activation_fn(x2) + + +class GLU(torch.nn.Module): + def __init__(self, hidden_size, inner_hidden_size=None, + layer_id=None, bias=True, activation_func=gelu, params_dtype=torch.float, empty_init=True): + super(GLU, self).__init__() + if empty_init: + init_method = skip_init + else: + init_method = default_init + self.layer_id = layer_id + self.activation_func = activation_func + + # Project to 4h. + self.hidden_size = hidden_size + if inner_hidden_size is None: + inner_hidden_size = 4 * hidden_size + self.inner_hidden_size = inner_hidden_size + self.dense_h_to_4h = init_method( + torch.nn.Linear, + self.hidden_size, + self.inner_hidden_size, + bias=bias, + dtype=params_dtype, + ) + # Project back to h. + self.dense_4h_to_h = init_method( + torch.nn.Linear, + self.inner_hidden_size, + self.hidden_size, + bias=bias, + dtype=params_dtype, + ) + + def forward(self, hidden_states): + """ + hidden_states: [seq_len, batch, hidden_size] + """ + + # [seq_len, batch, inner_hidden_size] + intermediate_parallel = self.dense_h_to_4h(hidden_states) + + intermediate_parallel = self.activation_func(intermediate_parallel) + + output = self.dense_4h_to_h(intermediate_parallel) + + return output + + +class GLMBlock(torch.nn.Module): + def __init__( + self, + hidden_size, + num_attention_heads, + layernorm_epsilon, + layer_id, + inner_hidden_size=None, + hidden_size_per_attention_head=None, + layernorm=LayerNorm, + use_bias=True, + params_dtype=torch.float, + num_layers=28, + position_encoding_2d=True, + empty_init=True + ): + super(GLMBlock, self).__init__() + # Set output layer initialization if not provided. + + self.layer_id = layer_id + + # Layernorm on the input data. + self.input_layernorm = layernorm(hidden_size, eps=layernorm_epsilon) + + self.position_encoding_2d = position_encoding_2d + + # Self attention. + self.attention = SelfAttention( + hidden_size, + num_attention_heads, + layer_id, + hidden_size_per_attention_head=hidden_size_per_attention_head, + bias=use_bias, + params_dtype=params_dtype, + position_encoding_2d=self.position_encoding_2d, + empty_init=empty_init + ) + + # Layernorm on the input data. + self.post_attention_layernorm = layernorm(hidden_size, eps=layernorm_epsilon) + + self.num_layers = num_layers + + # GLU + self.mlp = GLU( + hidden_size, + inner_hidden_size=inner_hidden_size, + bias=use_bias, + layer_id=layer_id, + params_dtype=params_dtype, + empty_init=empty_init + ) + + def forward( + self, + hidden_states: torch.Tensor, + position_ids, + attention_mask: torch.Tensor, + layer_id, + layer_past: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + use_cache: bool = False, + output_attentions: bool = False, + ): + """ + hidden_states: [seq_len, batch, hidden_size] + attention_mask: [(1, 1), seq_len, seq_len] + """ + + # Layer norm at the begining of the transformer layer. + # [seq_len, batch, hidden_size] + attention_input = self.input_layernorm(hidden_states) + + # Self attention. 
+ attention_outputs = self.attention( + attention_input, + position_ids, + attention_mask=attention_mask, + layer_id=layer_id, + layer_past=layer_past, + use_cache=use_cache, + output_attentions=output_attentions + ) + + attention_output = attention_outputs[0] + + outputs = attention_outputs[1:] + + # Residual connection. + alpha = (2 * self.num_layers) ** 0.5 + hidden_states = attention_input * alpha + attention_output + + mlp_input = self.post_attention_layernorm(hidden_states) + + # MLP. + mlp_output = self.mlp(mlp_input) + + # Second residual connection. + output = mlp_input * alpha + mlp_output + + if use_cache: + outputs = (output,) + outputs + else: + outputs = (output,) + outputs[1:] + + return outputs # hidden_states, present, attentions + + +class ChatGLMPreTrainedModel(PreTrainedModel): + """ + An abstract class to handle weights initialization and + a simple interface for downloading and loading pretrained models. + """ + + is_parallelizable = False + supports_gradient_checkpointing = True + config_class = ChatGLMConfig + base_model_prefix = "transformer" + _no_split_modules = ["GLMBlock"] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights(self, module: nn.Module): + """Initialize the weights.""" + return + + def get_masks(self, input_ids, device): + batch_size, seq_length = input_ids.shape + context_lengths = [seq.tolist().index(self.config.bos_token_id) for seq in input_ids] + attention_mask = torch.ones((batch_size, seq_length, seq_length), device=device) + attention_mask.tril_() + for i, context_length in enumerate(context_lengths): + attention_mask[i, :, :context_length] = 1 + attention_mask.unsqueeze_(1) + attention_mask = (attention_mask < 0.5).bool() + + return attention_mask + + def get_position_ids(self, input_ids, mask_positions, device, use_gmasks=None): + batch_size, seq_length = input_ids.shape + if use_gmasks is None: + use_gmasks = [False] * batch_size + context_lengths = [seq.tolist().index(self.config.bos_token_id) for seq in input_ids] + if self.position_encoding_2d: + position_ids = torch.arange(seq_length, dtype=torch.long, device=device).unsqueeze(0).repeat(batch_size, 1) + for i, context_length in enumerate(context_lengths): + position_ids[i, context_length:] = mask_positions[i] + block_position_ids = [torch.cat(( + torch.zeros(context_length, dtype=torch.long, device=device), + torch.arange(seq_length - context_length, dtype=torch.long, device=device) + 1 + )) for context_length in context_lengths] + block_position_ids = torch.stack(block_position_ids, dim=0) + position_ids = torch.stack((position_ids, block_position_ids), dim=1) + else: + position_ids = torch.arange(seq_length, dtype=torch.long, device=device).unsqueeze(0).repeat(batch_size, 1) + for i, context_length in enumerate(context_lengths): + if not use_gmasks[i]: + position_ids[context_length:] = mask_positions[i] + + return position_ids + + def _set_gradient_checkpointing(self, module, value=False): + if isinstance(module, ChatGLMModel): + module.gradient_checkpointing = value + + +CHATGLM_6B_START_DOCSTRING = r""" + This model is a PyTorch [torch.nn.Module](https://pytorch.org/docs/stable/nn.html#torch.nn.Module) sub-class. + Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matter related to general + usage and behavior. + + Parameters: + config ([`~ChatGLM6BConfig`]): Model configuration class with all the parameters of the model. 
+ Initializing with a config file does not load the weights associated with the model, only the configuration. + Check out the [`~PreTrainedModel.from_pretrained`] method to load the model weights. +""" + +CHATGLM_6B_INPUTS_DOCSTRING = r""" + Args: + input_ids (`torch.LongTensor` of shape `({0})`): + Indices of input sequence tokens in the vocabulary. + + Indices can be obtained using [`ChatGLM6BTokenizer`]. + See [`PreTrainedTokenizer.encode`] and + [`PreTrainedTokenizer.__call__`] for details. + + [What are input IDs?](../glossary#input-ids) + attention_mask (`torch.FloatTensor` of shape `({0})`, *optional*): + Mask to avoid performing attention on padding token indices. Mask values selected in `[0, 1]`: + + - 1 for tokens that are **not masked**, + - 0 for tokens that are **masked**. + + [What are attention masks?](../glossary#attention-mask) + token_type_ids (`torch.LongTensor` of shape `({0})`, *optional*): + Segment token indices to indicate first and second portions of the inputs. Indices are selected in `[0, 1]`: + + - 0 corresponds to a *sentence A* token, + - 1 corresponds to a *sentence B* token. + + [What are token type IDs?](../glossary#token-type-ids) + position_ids (`torch.LongTensor` of shape `({0})`, *optional*): + Indices of positions of each input sequence tokens in the position embeddings. + Selected in the range `[0, config.max_position_embeddings - 1]`. + + [What are position IDs?](../glossary#position-ids) + head_mask (`torch.FloatTensor` of shape `(num_heads,)` or `(num_layers, num_heads)`, *optional*): + Mask to nullify selected heads of the self-attention modules. Mask values selected in `[0, 1]`: + + - 1 indicates the head is **not masked**, + - 0 indicates the head is **masked**. + + inputs_embeds (`torch.FloatTensor` of shape `({0}, hidden_size)`, *optional*): + Optionally, instead of passing `input_ids` you can choose to directly pass an embedded representation. + This is useful if you want more control over how to convert *input_ids* indices into associated vectors + than the model's internal embedding lookup matrix. + output_attentions (`bool`, *optional*): + Whether or not to return the attentions tensors of all attention layers. See `attentions` under returned + tensors for more detail. + output_hidden_states (`bool`, *optional*): + Whether or not to return the hidden states of all layers. See `hidden_states` under returned tensors for + more detail. + return_dict (`bool`, *optional*): + Whether or not to return a [`~utils.ModelOutput`] instead of a plain tuple. +""" + + +@add_start_docstrings( + "The bare ChatGLM-6B Model transformer outputting raw hidden-states without any specific head on top.", + CHATGLM_6B_START_DOCSTRING, +) +class ChatGLMModel(ChatGLMPreTrainedModel): + """ + + The model can behave as an encoder (with only self-attention) as well + as a decoder, in which case a layer of cross-attention is added between + the self-attention layers, following the architecture described in [Attention is + all you need](https://arxiv.org/abs/1706.03762) by Ashish Vaswani, + Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser and Illia Polosukhin. + + To behave as an decoder the model needs to be initialized with the + `is_decoder` argument of the configuration set to `True`. + To be used in a Seq2Seq model, the model needs to initialized with both `is_decoder` + argument and `add_cross_attention` set to `True`; an + `encoder_hidden_states` is then expected as an input to the forward pass. 
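+ 
+ Note: the encoder / cross-attention behaviour described above comes from the generic
+ model template; in this file ChatGLM-6B is used as a decoder-only prefix-LM and no
+ cross-attention layers are instantiated.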
+ """ + + def __init__(self, config: ChatGLMConfig, empty_init=True): + super().__init__(config) + if empty_init: + init_method = skip_init + else: + init_method = default_init + # recording parameters + self.max_sequence_length = config.max_sequence_length + self.hidden_size = config.hidden_size + self.params_dtype = torch.half + self.num_attention_heads = config.num_attention_heads + self.vocab_size = config.vocab_size + self.num_layers = config.num_layers + self.layernorm_epsilon = config.layernorm_epsilon + self.inner_hidden_size = config.inner_hidden_size + self.hidden_size_per_attention_head = self.hidden_size // self.num_attention_heads + self.position_encoding_2d = config.position_encoding_2d + self.pre_seq_len = config.pre_seq_len + self.prefix_projection = config.prefix_projection + + self.word_embeddings = init_method( + torch.nn.Embedding, + num_embeddings=self.vocab_size, embedding_dim=self.hidden_size, + dtype=self.params_dtype + ) + self.gradient_checkpointing = False + + def get_layer(layer_id): + return GLMBlock( + self.hidden_size, + self.num_attention_heads, + self.layernorm_epsilon, + layer_id, + inner_hidden_size=self.inner_hidden_size, + hidden_size_per_attention_head=self.hidden_size_per_attention_head, + layernorm=LayerNorm, + use_bias=True, + params_dtype=self.params_dtype, + position_encoding_2d=self.position_encoding_2d, + empty_init=empty_init + ) + + self.layers = torch.nn.ModuleList( + [get_layer(layer_id) for layer_id in range(self.num_layers)] + ) + + # Final layer norm before output. + self.final_layernorm = LayerNorm(self.hidden_size, eps=self.layernorm_epsilon) + + if self.pre_seq_len is not None: + for param in self.parameters(): + param.requires_grad = False + self.prefix_tokens = torch.arange(self.pre_seq_len).long() + self.prefix_encoder = PrefixEncoder(config) + self.dropout = torch.nn.Dropout(0.1) + + # total_params = sum(p.numel() for p in self.parameters()) + # trainable_params = sum(p.numel() for p in self.parameters() if p.requires_grad) + # print("Using p-tuning v2: # trainable_params = {} / {}".format(trainable_params, total_params)) + + def get_input_embeddings(self): + return self.word_embeddings + + def set_input_embeddings(self, new_embeddings: torch.Tensor): + self.word_embeddings = new_embeddings + + def get_prompt(self, batch_size, device, dtype=torch.half): + prefix_tokens = self.prefix_tokens.unsqueeze(0).expand(batch_size, -1).to(device) + past_key_values = self.prefix_encoder(prefix_tokens).type(dtype) + past_key_values = past_key_values.view( + batch_size, + self.pre_seq_len, + self.num_layers * 2, + self.num_attention_heads, + self.hidden_size // self.num_attention_heads + ) + # seq_len, b, nh, hidden_size + past_key_values = self.dropout(past_key_values) + past_key_values = past_key_values.permute([2, 1, 0, 3, 4]).split(2) + # past_key_values = [(v[0], v[1]) for v in past_key_values] + return past_key_values + + @add_start_docstrings_to_model_forward(CHATGLM_6B_INPUTS_DOCSTRING.format("batch_size, sequence_length")) + @add_code_sample_docstrings( + checkpoint=_CHECKPOINT_FOR_DOC, + output_type=BaseModelOutputWithPastAndCrossAttentions, + config_class=_CONFIG_FOR_DOC, + ) + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + position_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Tuple[Tuple[torch.Tensor, torch.Tensor], ...]] = None, + inputs_embeds: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: 
Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + ) -> Union[Tuple[torch.Tensor, ...], BaseModelOutputWithPast]: + + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + use_cache = use_cache if use_cache is not None else self.config.use_cache + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + if self.gradient_checkpointing and self.training: + if use_cache: + logger.warning_once( + "`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`..." + ) + use_cache = False + + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + elif input_ids is not None: + batch_size, seq_length = input_ids.shape[:2] + elif inputs_embeds is not None: + batch_size, seq_length, _ = inputs_embeds.shape[:2] + else: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.word_embeddings(input_ids) + + if past_key_values is None: + if self.pre_seq_len is not None: + past_key_values = self.get_prompt(batch_size=input_ids.shape[0], device=input_ids.device, + dtype=inputs_embeds.dtype) + else: + past_key_values = tuple([None] * len(self.layers)) + + if attention_mask is None: + attention_mask = self.get_masks( + input_ids, + device=input_ids.device + ) + + + if position_ids is None: + MASK, gMASK = self.config.mask_token_id, self.config.gmask_token_id + seqs = input_ids.tolist() + + mask_positions, use_gmasks = [], [] + for seq in seqs: + mask_token = gMASK if gMASK in seq else MASK + use_gmask = mask_token == gMASK + mask_positions.append(seq.index(mask_token)) + use_gmasks.append(use_gmask) + + position_ids = self.get_position_ids( + input_ids, + mask_positions=mask_positions, + device=input_ids.device, + use_gmasks=use_gmasks + ) + + if self.pre_seq_len is not None and attention_mask is not None: + prefix_attention_mask = torch.ones(batch_size, 1, input_ids.size(-1), self.pre_seq_len).to( + attention_mask.device) + prefix_attention_mask = (prefix_attention_mask < 0.5).bool() + attention_mask = torch.cat((prefix_attention_mask, attention_mask), dim=3) + + # [seq_len, batch, hidden_size] + hidden_states = inputs_embeds.transpose(0, 1) + + presents = () if use_cache else None + all_self_attentions = () if output_attentions else None + all_hidden_states = () if output_hidden_states else None + + if attention_mask is None: + attention_mask = torch.zeros(1, 1, device=input_ids.device).bool() + + else: + attention_mask = attention_mask.to(input_ids.device) + + for i, layer in enumerate(self.layers): + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + layer_past = past_key_values[i] + + if self.gradient_checkpointing and self.training: + layer_ret = torch.utils.checkpoint.checkpoint( + layer, + hidden_states, + position_ids, + attention_mask, + torch.tensor(i), + layer_past, + use_cache, + output_attentions + ) + else: + layer_ret = layer( + hidden_states, + position_ids=position_ids, + attention_mask=attention_mask, + layer_id=torch.tensor(i), + layer_past=layer_past, + use_cache=use_cache, + output_attentions=output_attentions + ) + + hidden_states = layer_ret[0] + + if use_cache: + presents = presents + 
(layer_ret[1],) + + if output_attentions: + all_self_attentions = all_self_attentions + (layer_ret[2 if use_cache else 1],) + + # Final layer norm. + hidden_states = self.final_layernorm(hidden_states) + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + + if not return_dict: + return tuple(v for v in [hidden_states, presents, all_hidden_states, all_self_attentions] if v is not None) + + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=presents, + hidden_states=all_hidden_states, + attentions=all_self_attentions, + ) + + +class ChatGLMForConditionalGeneration(ChatGLMPreTrainedModel): + def __init__(self, config: ChatGLMConfig, empty_init=True): + super().__init__(config) + if empty_init: + init_method = skip_init + else: + init_method = default_init + + # self.hidden_size = config.hidden_size + # self.params_dtype = torch.half + # self.vocab_size = config.vocab_size + self.max_sequence_length = config.max_sequence_length + + self.position_encoding_2d = config.position_encoding_2d + + self.transformer = ChatGLMModel(config, empty_init=empty_init) + + self.lm_head = init_method( + nn.Linear, + config.hidden_size, + config.vocab_size, + bias=False, + dtype=torch.half + ) + + self.config = config + + self.quantized = False + + if self.config.quantization_bit: + self.quantize(self.config.quantization_bit, self.config.quantization_embeddings, use_quantization_cache=True, empty_init=True) + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def _update_model_kwargs_for_generation( + self, + outputs: ModelOutput, + model_kwargs: Dict[str, Any], + is_encoder_decoder: bool = False, + standardize_cache_format: bool = False, + ) -> Dict[str, Any]: + # update past_key_values + model_kwargs["past_key_values"] = self._extract_past_from_model_output( + outputs, standardize_cache_format=standardize_cache_format + ) + + # update attention mask + if "attention_mask" in model_kwargs: + attention_mask = model_kwargs["attention_mask"] + if attention_mask is not None and attention_mask.dtype == torch.bool: + attention_mask = torch.cat( + [attention_mask, attention_mask.new_ones((*attention_mask.shape[:3], 1))], dim=3) + new_attention_mask = attention_mask[:, :, -1:].clone() + new_attention_mask[..., -1] = False + model_kwargs["attention_mask"] = torch.cat( + [attention_mask, new_attention_mask], dim=2 + ) + + # update position ids + if "position_ids" in model_kwargs: + position_ids = model_kwargs["position_ids"] + new_position_id = position_ids[..., -1:].clone() + new_position_id[:, 1, :] += 1 + model_kwargs["position_ids"] = torch.cat( + [position_ids, new_position_id], dim=-1 + ) + + return model_kwargs + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor, + past: Optional[torch.Tensor] = None, + past_key_values: Optional[torch.Tensor] = None, + attention_mask: Optional[torch.Tensor] = None, + position_ids: Optional[torch.Tensor] = None, + **kwargs + ) -> dict: + batch_size, seq_length = input_ids.shape + MASK, gMASK = self.config.mask_token_id, self.config.gmask_token_id + seqs = input_ids.tolist() + mask_positions, use_gmasks = [], [] + for seq in seqs: + mask_token = gMASK if gMASK in seq else MASK + use_gmask = mask_token == gMASK + mask_positions.append(seq.index(mask_token)) + use_gmasks.append(use_gmask) + + # only last token for input_ids if past is not None + if past is not None or past_key_values is not None: + 
last_token = input_ids[:, -1].unsqueeze(-1) + if attention_mask is not None and attention_mask.dtype == torch.bool: + attention_mask = attention_mask[:, :, -1:] + else: + attention_mask = None + if position_ids is not None: + position_ids = position_ids[..., -1:] + else: + context_lengths = [seq.index(self.config.bos_token_id) for seq in seqs] + if self.position_encoding_2d: + position_ids = torch.tensor( + [[mask_position, seq_length - context_length] for mask_position, context_length in + zip(mask_positions, context_lengths)], dtype=torch.long, device=input_ids.device).unsqueeze(-1) + else: + position_ids = torch.tensor([mask_position for mask_position in mask_positions], dtype=torch.long, + device=input_ids.device).unsqueeze(-1) + + if past is None: + past = past_key_values + return { + "input_ids": last_token, + "past_key_values": past, + "position_ids": position_ids, + "attention_mask": attention_mask + } + else: + if attention_mask is not None and attention_mask.dtype != torch.bool: + logger.warning_once(f"The dtype of attention mask ({attention_mask.dtype}) is not bool") + attention_mask = None + if attention_mask is None: + attention_mask = self.get_masks( + input_ids, + device=input_ids.device + ) + if position_ids is None: + position_ids = self.get_position_ids( + input_ids, + device=input_ids.device, + mask_positions=mask_positions, + use_gmasks=use_gmasks + ) + + return { + "input_ids": input_ids, + "past_key_values": past, + "position_ids": position_ids, + "attention_mask": attention_mask + } + + def forward( + self, + input_ids: Optional[torch.Tensor] = None, + position_ids: Optional[torch.Tensor] = None, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Tuple[torch.FloatTensor]] = None, + inputs_embeds: Optional[torch.Tensor] = None, + labels: Optional[torch.Tensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + ): + use_cache = use_cache if use_cache is not None else self.config.use_cache + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + transformer_outputs = self.transformer( + input_ids=input_ids, + position_ids=position_ids, + attention_mask=attention_mask, + past_key_values=past_key_values, + inputs_embeds=inputs_embeds, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict, + ) + + hidden_states = transformer_outputs[0] + + lm_logits = self.lm_head(hidden_states).permute(1, 0, 2).contiguous() + + loss = None + if labels is not None: + lm_logits = lm_logits.to(torch.float32) + + # Shift so that tokens < n predict n + shift_logits = lm_logits[..., :-1, :].contiguous() + shift_labels = labels[..., 1:].contiguous() + # Flatten the tokens + loss_fct = CrossEntropyLoss(ignore_index=-100) + loss = loss_fct(shift_logits.view(-1, shift_logits.size(-1)), shift_labels.view(-1)) + + lm_logits = lm_logits.to(hidden_states.dtype) + loss = loss.to(hidden_states.dtype) + + if not return_dict: + output = (lm_logits,) + transformer_outputs[1:] + return ((loss,) + output) if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=lm_logits, + past_key_values=transformer_outputs.past_key_values, + hidden_states=transformer_outputs.hidden_states, + attentions=transformer_outputs.attentions, + ) + + @staticmethod + def _reorder_cache( + past: Tuple[Tuple[torch.Tensor, torch.Tensor], ...], 
beam_idx: torch.LongTensor + ) -> Tuple[Tuple[torch.Tensor, torch.Tensor], ...]: + """ + This function is used to re-order the `past_key_values` cache if [`~PreTrainedModel.beam_search`] or + [`~PreTrainedModel.beam_sample`] is called. This is required to match `past_key_values` with the correct + beam_idx at every generation step. + + Output shares the same memory storage as `past`. + """ + return tuple( + ( + layer_past[0].index_select(1, beam_idx.to(layer_past[0].device)), + layer_past[1].index_select(1, beam_idx.to(layer_past[1].device)), + ) + for layer_past in past + ) + + def process_response(self, response): + response = response.strip() + response = response.replace("[[训练时间]]", "2023年") + punkts = [ + [",", ","], + ["!", "!"], + [":", ":"], + [";", ";"], + ["\?", "?"], + ] + for item in punkts: + response = re.sub(r"([\u4e00-\u9fff])%s" % item[0], r"\1%s" % item[1], response) + response = re.sub(r"%s([\u4e00-\u9fff])" % item[0], r"%s\1" % item[1], response) + return response + + @torch.no_grad() + def chat(self, tokenizer, query: str, history: List[Tuple[str, str]] = None, max_length: int = 2048, num_beams=1, + do_sample=True, top_p=0.7, temperature=0.95, logits_processor=None, **kwargs): + if history is None: + history = [] + if logits_processor is None: + logits_processor = LogitsProcessorList() + logits_processor.append(InvalidScoreLogitsProcessor()) + gen_kwargs = {"max_length": max_length, "num_beams": num_beams, "do_sample": do_sample, "top_p": top_p, + "temperature": temperature, "logits_processor": logits_processor, **kwargs} + if not history: + prompt = query + else: + prompt = "" + for i, (old_query, response) in enumerate(history): + prompt += "[Round {}]\n问:{}\n答:{}\n".format(i, old_query, response) + prompt += "[Round {}]\n问:{}\n答:".format(len(history), query) + inputs = tokenizer([prompt], return_tensors="pt") + inputs = inputs.to(self.device) + outputs = self.generate(**inputs, **gen_kwargs) + outputs = outputs.tolist()[0][len(inputs["input_ids"][0]):] + response = tokenizer.decode(outputs) + response = self.process_response(response) + history = history + [(query, response)] + return response, history + + @torch.no_grad() + def stream_chat(self, tokenizer, query: str, history: List[Tuple[str, str]] = None, max_length: int = 2048, + do_sample=True, top_p=0.7, temperature=0.95, logits_processor=None, **kwargs): + if history is None: + history = [] + if logits_processor is None: + logits_processor = LogitsProcessorList() + logits_processor.append(InvalidScoreLogitsProcessor()) + gen_kwargs = {"max_length": max_length, "do_sample": do_sample, "top_p": top_p, + "temperature": temperature, "logits_processor": logits_processor, **kwargs} + if not history: + prompt = query + else: + prompt = "" + for i, (old_query, response) in enumerate(history): + prompt += "[Round {}]\n问:{}\n答:{}\n".format(i, old_query, response) + prompt += "[Round {}]\n问:{}\n答:".format(len(history), query) + inputs = tokenizer([prompt], return_tensors="pt") + inputs = inputs.to(self.device) + for outputs in self.stream_generate(**inputs, **gen_kwargs): + outputs = outputs.tolist()[0][len(inputs["input_ids"][0]):] + response = tokenizer.decode(outputs) + response = self.process_response(response) + new_history = history + [(query, response)] + yield response, new_history + + @torch.no_grad() + def stream_generate( + self, + input_ids, + generation_config: Optional[GenerationConfig] = None, + logits_processor: Optional[LogitsProcessorList] = None, + stopping_criteria: Optional[StoppingCriteriaList] = 
None, + prefix_allowed_tokens_fn: Optional[Callable[[int, torch.Tensor], List[int]]] = None, + **kwargs, + ): + batch_size, input_ids_seq_length = input_ids.shape[0], input_ids.shape[-1] + + if generation_config is None: + generation_config = self.generation_config + generation_config = copy.deepcopy(generation_config) + model_kwargs = generation_config.update(**kwargs) + bos_token_id, eos_token_id = generation_config.bos_token_id, generation_config.eos_token_id + + if isinstance(eos_token_id, int): + eos_token_id = [eos_token_id] + + has_default_max_length = kwargs.get("max_length") is None and generation_config.max_length is not None + if has_default_max_length and generation_config.max_new_tokens is None: + warnings.warn( + f"Using `max_length`'s default ({generation_config.max_length}) to control the generation length. " + "This behaviour is deprecated and will be removed from the config in v5 of Transformers -- we" + " recommend using `max_new_tokens` to control the maximum length of the generation.", + UserWarning, + ) + elif generation_config.max_new_tokens is not None: + generation_config.max_length = generation_config.max_new_tokens + input_ids_seq_length + if not has_default_max_length: + logger.warn( + f"Both `max_new_tokens` (={generation_config.max_new_tokens}) and `max_length`(=" + f"{generation_config.max_length}) seem to have been set. `max_new_tokens` will take precedence. " + "Please refer to the documentation for more information. " + "(https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)", + UserWarning, + ) + + if input_ids_seq_length >= generation_config.max_length: + input_ids_string = "decoder_input_ids" if self.config.is_encoder_decoder else "input_ids" + logger.warning( + f"Input length of {input_ids_string} is {input_ids_seq_length}, but `max_length` is set to" + f" {generation_config.max_length}. This can lead to unexpected behavior. You should consider" + " increasing `max_new_tokens`." + ) + + # 2. 
Set generation parameters if not already defined + logits_processor = logits_processor if logits_processor is not None else LogitsProcessorList() + stopping_criteria = stopping_criteria if stopping_criteria is not None else StoppingCriteriaList() + + logits_processor = self._get_logits_processor( + generation_config=generation_config, + input_ids_seq_length=input_ids_seq_length, + encoder_input_ids=input_ids, + prefix_allowed_tokens_fn=prefix_allowed_tokens_fn, + logits_processor=logits_processor, + ) + + stopping_criteria = self._get_stopping_criteria( + generation_config=generation_config, stopping_criteria=stopping_criteria + ) + logits_warper = self._get_logits_warper(generation_config) + + unfinished_sequences = input_ids.new(input_ids.shape[0]).fill_(1) + scores = None + while True: + model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs) + # forward pass to get next token + outputs = self( + **model_inputs, + return_dict=True, + output_attentions=False, + output_hidden_states=False, + ) + + next_token_logits = outputs.logits[:, -1, :] + + # pre-process distribution + next_token_scores = logits_processor(input_ids, next_token_logits) + next_token_scores = logits_warper(input_ids, next_token_scores) + + # sample + probs = nn.functional.softmax(next_token_scores, dim=-1) + if generation_config.do_sample: + next_tokens = torch.multinomial(probs, num_samples=1).squeeze(1) + else: + next_tokens = torch.argmax(probs, dim=-1) + + # update generated ids, model inputs, and length for next step + input_ids = torch.cat([input_ids, next_tokens[:, None]], dim=-1) + model_kwargs = self._update_model_kwargs_for_generation( + outputs, model_kwargs, is_encoder_decoder=self.config.is_encoder_decoder + ) + unfinished_sequences = unfinished_sequences.mul((sum(next_tokens != i for i in eos_token_id)).long()) + + # stop when each sentence is finished, or if we exceed the maximum length + if unfinished_sequences.max() == 0 or stopping_criteria(input_ids, scores): + break + yield input_ids + + def quantize(self, bits: int, quantize_embeddings=False, use_quantization_cache=False, empty_init=False, **kwargs): + if bits == 0: + return + + from .quantization import quantize, QuantizedEmbedding, QuantizedLinear, load_cpu_kernel + + if self.quantized: + if self.device == torch.device("cpu"): + logger.info("Already quantized, reloading cpu kernel.") + load_cpu_kernel(**kwargs) + else: + logger.info("Already quantized.") + return self + + self.quantized = True + + self.config.quantization_bit = bits + self.config.quantization_embeddings = quantize_embeddings + + self.transformer = quantize(self.transformer, bits, use_quantization_cache=use_quantization_cache, empty_init=empty_init, **kwargs) + + if self.device == torch.device("cpu"): + dtype = torch.float32 + else: + dtype = torch.half + + if quantize_embeddings: + logger.info("Applying quantization to embeddings") + self.transformer.word_embeddings = QuantizedEmbedding( + weight_bit_width=bits, + weight_tensor=self.transformer.word_embeddings.weight.to(self.device), + num_embeddings=self.transformer.word_embeddings.num_embeddings, + embedding_dim=self.transformer.word_embeddings.embedding_dim, + dtype=dtype, + empty_init=empty_init, + device=self.transformer.word_embeddings.weight.device, + ) + self.lm_head = QuantizedLinear( + weight_bit_width=bits, + weight_tensor=self.lm_head.weight.to(self.device), + bias_tensor=None, + in_features=self.lm_head.in_features, + out_features=self.lm_head.out_features, + bias=False, + 
quantized_weight=self.transformer.word_embeddings.weight, + quantized_weight_scale=self.transformer.word_embeddings.weight_scale, + dtype=dtype, + empty_init=empty_init, + device=self.lm_head.weight.device, + ) + + return self diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/optimizer.pt b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/optimizer.pt new file mode 100644 index 0000000000000000000000000000000000000000..ce078b46cc6415478606eb343972d3bfc7e772bf --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/optimizer.pt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:0c1bae103ea1bfc1bd47213378dc044d55554c61d5da8d3771230013f4300f11 +size 234882351 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/pytorch_model.bin b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/pytorch_model.bin new file mode 100644 index 0000000000000000000000000000000000000000..eccacde62b053bfa920c0d8c887a2cd7295fd5d1 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/pytorch_model.bin @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:4f00304c7adb89a5779f15a408af0d674fd60b3c341455bb7ca8d1d88b519dbe +size 117441341 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/quantization.py b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/quantization.py new file mode 100644 index 0000000000000000000000000000000000000000..5be8b0b790b4cc3f96a2fd66a62f396d2b9d5dec --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/quantization.py @@ -0,0 +1,515 @@ +from torch.nn import Linear, Embedding +from torch.nn.parameter import Parameter +import torch.nn.functional as F + +import os +import bz2 +import torch +import base64 +import ctypes +from transformers.utils import logging + +from typing import List +from functools import partial + +logger = logging.get_logger(__name__) + +try: + from cpm_kernels.kernels.base import LazyKernelCModule, KernelFunction, round_up + + class Kernel: + def __init__(self, code: bytes, function_names: List[str]): + self.code = code + self._function_names = function_names + self._cmodule = LazyKernelCModule(self.code) + + for name in self._function_names: + setattr(self, name, KernelFunction(self._cmodule, name)) + + quantization_code = 
"$QlpoOTFBWSZTWU9yuJUAQHN//////////f/n/8/n///n//bt4dTidcVx8X3V9FV/92/v4B7/AD5FBQFAAAChSgKpFCFAFVSigUAAAEKhSgUUqgFBKigqVREQAABQBQIANDTTIGI00BkZBkNGE0A0BkBkGQGRkaNAaAGQNBoGgDIAAYIGTI0DQAQAaGmmQMRpoDIyDIaMJoBoDIDIMgMjI0aA0AMgaDQNAGQAAwQMmRoGgAgA0NNMgYjTQGRkGQ0YTQDQGQGQZAZGRo0BoAZA0GgaAMgABggZMjQNABABoaaZAxGmgMjIMhowmgGgMgMgyAyMjRoDQAyBoNA0AZAADBAyZGgaAAmqU1NEgJqnptU/Sn4jRR6J6epk2pqb1Q/SgAPUGgyNNGjQ2SBpoAZAAGg0NB6mgDIAAAAA2oaApSREBNAARhGiYEaEwU8pvImlP0k2aam1GaGqbFNM1MHpTwmkepmyU9R6nqPKekHqNNPUxNGhp6n6p6QaZ6o9TG1GMqcoV9ly6nRanHlq6zPNbnGZNi6HSug+2nPiZ13XcnFYZW+45W11CumhzYhchOJ2GLLV1OBjBjGf4TptOddTSOcVxhqYZMYwZXZZY00zI1paX5X9J+b+f4e+x43RXSxXPOdquiGpduatGyXneN696M9t4HU2eR5XX/kPhP261NTx3JO1Ow7LyuDmeo9a7d351T1ZxnvnrvYnrXv/hXxPCeuYx2XsNmO003eg9J3Z6U7b23meJ4ri01OdzTk9BNO96brz+qT5nuvvH3ds/G+m/JcG/F2XYuhXlvO+jP7U3XgrzPN/lr8Sf1n6j4j7jZs+s/T0tNaNNYzTs12rxjwztHlnire3Nzc3N1wuBwOBwXBvZfoHpD7rFmR99V5vj3aXza3xdBbXMalubTg/jIv5dfAi54Pdc75j4z412n3Npj3Ld/ENm7a3b/Cod6h/ret1/5vn/C+l+gdslMvgPSLJ8d8q+U66fevYn/tW1chleEtNTGlcHCbLRlq0tHzF5tsbbZZfHjjLgZu42XCuC3NrdjTasZGNzgxPIrGqp7r3p7L2p5XjnpPSmTd5XtzqnB6U87zzg1Ol0zd0zsLszxR6lkxp35u6/teL0L0W922cR7Lu1lpL9CsHirzuM2T+BgsyViT6LHcm0/Vr6U/7LGGyJeqTEjt0PHWhF5mCT7R9mtlDwriYv0Tyr/OxYt6qp5r0mPVT0608TqnqMZaarU2nFwrTzzlrs1ed7z1ux60wyr4ydCaTi3enW8x68x0zU7tXSlcmPSW1mGpWJMg4zmPC2lK96tp0OE80y4MfEvnZj8zGluR6b22ki1Ou9V2nCd9xovcPvcYMZYy0lvN60ScZ45vN6yeCeeXFb1lVjnnCar5fwXwE2bzJ4HI1XVPXfXZMm44GUsMpYsmLB65TuVdm0cl0b+i/wGNN66XjeV7zuPpHcnK/juhhjdfId5jMdE5nN0dGmmm2zZs2cexD5n9p/dY352XsvXHaZNWWsmmS1atjR452nYudzvqv2HMRyvNNnlMcDl3R2+yx2uVrBubTW9icHDVtbNXlZm7jma1rM4VurZZd2y6nUau7ZXZ7bVU+mnoOVxZGMrVmvX60605JwmzGZhhhjTWtaaaMaaGTGmNMZasY0iX8VMUl8eepaIrzGSpemWOQyZORk2bNpjUybMmxqYmknCGCFynutfksaZpjTNMaaatM0xsxcGR0sociNqxNSmhhR1ZJPbsn8qyF0t2qH6iYBclclalbtTTcHTDsPaX6rlnElph2Jyumumtynv2Kk8GI7rsvXbIcJgHJOSaSXnnGaI3m87RtVXJOZ/YtgdTE6Wpha6ZlE8ayXkef1fh602r2WwvfMXtMdLlkfnLFdYYwYso+bWqm7yJqHXZGw2nrS5ZanSYnWlxBxMF1V940K2wdrI7R6OYf7DGGamMmTSbRhlS45xmVOumF1EyPCmHrrN8wwZOOrdNtLeMtzFzDlWnfTBxMk2NaXIZHBYxYLD4w8yju0ao65Vz1OIXoS9dLanwCe1PWrYuWMqf1if1z2k2yYfKJ741PDgno1ZQ8DRqvUny3mNoWTzGO6m1DkrJI8JiR5cSd+vZdGOO8nrMoc5+NDUFsMSXaZJeNlMmGLtJsovOsUp7I9S5VojKxF6bTVEelXqlfJobQr3LozSh2Jk7VcrVMfhXqszGWMzNqGhqZY0OadxkyyMssKugZR0KNFXBHlqwmJgTE/BNVMk6ItJXZMR0H47GpXv/DMOvNkmVuaV1PRfEdxuqc7Hcd+ZV/zTLaRxWk0nl9CdCeM6mn5rstHIBcpiuwmUZXeq81DacHI2rmrZ5SuE5mOZd6LQrZg9mx32TprA8BMo5jKN6yLTCi3WzQaZSuhzTtM1fUTGVpG8Tw+KXI0tjEpiWxtLYynOlktSbVlaI5kxP8TDH8kx50xoxi5KcA4pcja8KWLRlO/Ks6q06ergnvm1ca3Tq8Uw7LTUsmWyctXPWmpitl/uvGcWTGXGuAXDfhqazGmjkxcJW5hMMMMpYsXl2TZYtVOddG3XCarUt6Ptq9CZXSNzyuRzqRZOjsxdBbFVz6OA5HI43r1jityVlVpVkxmOsyaYWE1NTGq1sOVh36mHMcxtSvcy70edG0ZGR3I1Go1GRlV7mWWo1G0ZGRqlvH40l7o4m5xMWLLLYyNjnqc8556mdPqLJ31n/1nWOncxzG1tizrHs/Z+d2vP/B/l8wdJ6rHUn2nbbDq4p6htFtYzMMMTaZis1K5GKzGNmxhmUx2DDlZ/qNnIx41xnaMfCZWYaZWtNLTNW8ND4Fw1MyZOCdM428suKG1ehW8TesOydg7J+YYcD4cYR+8dFK6M4E3HM9ZfRNNL+Sn6rsl4DsrDl2HpPCnfxjGXtbZtYys1ttlyJ4T+BvexjGWRjMszK4Jpc77D3GyuVD7q0+G8m9G+2+rGm7cOR2y7FdtY2XUYx/oNlfRYxhMYyYZkyyg55enna9Kt/FFi6GMMwYwdwxWgxGMLKYmUyGExTKMZkMFhkymKuh0NOBNnBu+23LdwDoZYYzGGMxtORaTU1pjTGWTTGGtMrNWUsyyTTLLG1qy2ZjbK2DBllWqxMtBMaYZQmcE7zvvRcTkclUwdkxTaSdyySt/7fpL+T1v516Ji97fwr5JbLu305zMn5+GMTTZ9F+y7ExwmGVfG44yxn3dLv6l5i+Wth1jCrDq21nW9LqvvDzz3Vf3LLH/O/32TJ/erx3bXftO4eF+G956D952K/An4NfvOpjFjExjevP/UmE0fIoZXx6/w6lX/no3D0bLt+ixjieBM6ksRd0yB4Lt2SwYNE+gd1detlZWUnpiZfGfFaK+4PyCa/v18V8X75pe9fLXzp7l3VjF76vWZmHwGz1IZNWT7b8yddJ4q5kyrVdfru6atWc7bVYztL9Jf4GXvT+Y8m9/YsXP6H018a8D4XVOqvfzqeR+6yZOD8dPv0+U7/q5Pl+2dNb0MjzGVH5p6MNQ7cOWvw62U9aHE8DprDek+McLyvDz+te+9Zhq5+YTruufMcWMabqy
sTmZVWjKPfnK0wyVcrsuhjZRdLkHNvD72b9abriOSGIxiLixMOoalNPXzy+wT/tf+U6HHONfsz+xe8ufHBdQWWGWLA9if0rsnmrxK5LvRZQeWsTCsrmOYy8VteVfuRfcVTtDLItLIsMYxZLdU/DbtSemxF6Z6Zo5WBXE4tFdCyVMMXMTEMZXVlS6Xec2T4e0tHsRcEuWshcJ2YsNF5rUx1E8ifCq6Z+ZP7qdCeu/aTwFd53l16/o0NOw6O3dLavP4Hbi4RdmuDk6DoYaninC0+o4uZjbJ7Rxeu0/FbuFg+q7DVS6fQe0rZ6NDGUNNU6DEqOaLTicKnYZMnBWruljQxoaS3dZhocDge0bSTyOvdAbG5hxe2xji7E/L55xX13wWNDi6HCekcFxfCPGxY0MXC+s7afWaMdDyjyr+o8Rudm/NabOZvdl274zH4f5XK9z6On1Pe/K5TdPAslg77BjuO6Y3eO7GqvOPG/stknp1leyvLL0Z7bl9I4noMvLkzytLhWYzrOZzLXCORe028rORzOg4N/L0HlMOQ3Pgmnbb6KczlabORpu980q37TBqRu0/p3PO6234Bl03Ynuz+9W7gnsEcmvYaYY3aMYY0wx3pYd+ujsXauWdaY5Xkbtl23fPzFHiDB/QMo0yFjBllYxTQYYyxkrwn7JufwJ/PfgJ+C83X69ni6zvXcnyXabv0ncbLwsceS+RNlyN2mnneJtX0ngYO0+e+0+UnA+Wch3ji8hj5an4h+i6XBySU4n+R0roVcbw5yvHrmr4Yw8Y7x6c+9POPYHI5HI5HI5HI5HGXGww4nE4nrVyOR8XeqPEO7PLOiukYa3Novk5hV4cdtYZLI93e+uxff2jRo0aNGjRo0aNG1bVtW1dy3m83m8+tQ5ZzHw3nObwOu8La9Rc1dtkdS8A3eTk823tnktXWlxN6Oixe06zrN70Isd9jiOgZFq9yfkPqP/SLhN2Myl8jDM43bl1nbcb4cO57jlh8Jow6pzXZdL4dyODTuuhu77FyO27DdwdRxmvO+O+3N2+BdqyTwLHVczDVY4UPE4O66/ZO2cx1LFzVdSXtF7G4HMbrauOHRw6c8FdZ5m9fHZHYZXfTlZquyynSyTTKke6vcffSD9pzPA/G7n7jxPmuhc1DHMynPMrGL6AdewYmwu5ko+UUyTwrMv27rPH1v1nGqd87+p6N6LU8k3NEng53xXyHS97+44OSg/sy/hn+Se6yfYNjW0/uTgP+PvWYzLMmjhcLB/gGpri6H83/84eUXWT6T9Hsv7785z/7z4icpW+zfXypuR7rx/gMdZb1/wC678pcs8/2a3mDitGHxl9mfPlll5MafWWqxk/eYuTDgcNMzDGWLWvsuglNxs53GtN6uWpktlW1tZZYcuinMMWmnNnJydze3b2Y1McBxrBkXw799izLMZZYyy0TkbsGM4p03S2uVu5s/XXUdSdec6smVxZYYGpVmT8A+8ajuEyV5FatkvVru2x6uxGXXbH4A+jvgP4GMYy3iPLXzq/6z65+E005ey+cwMZD3fZcqc6xpjTFjQ0P3U+e++cPYmTIwj0nrK5NPTfl3WvpfLtXDcb2HQMudYOxFXQBor4L4T6vrOauFctYXJQ++NUWmJe5bmx1jDiZS1dTqWxo4GR8jm3fttpmPHppk9PEyv4/y8/sO07XacOmcqc0x2Vi9BvNJvN5oW8x4mOsydpidRxMYJPx06m1bqPzq9KtK8sxXNXFodD/+MYYaJTLwOhc9brCsV18oOR1i4tXChyTkq4lf4y1Ke+9axjDHqs1mfBbMXuP4Hzi+X7t8vzv7bHerrUPgPCxhjre4fXdfLNtNM+Jd+Zdh8xd8wP87uNPoPgv4W7/5P2BuxfsMabNnMnza+54Pdi5U671GPZY8CehX8Voeoo7FHpkeEc6715FwHZrIrUrHaviPUbPZHND+IhczrP6FcYvhOZ0Di/ETt0OI+YwNWR9r7tpf6WDeZKZDB1+z2IthOl1mPyb5FluvEx9h9d0NnM0Y1XPFkWIsk1WotJ0PBMmkvjvQTd0e71tfeV+8r8lQ/tpzpsmxJ+InrI/dj2UajUajVTUajatRqNRtGo1Go1Go4wjeMpZFMVV9CHbofPraLsJ3JpWV2XOoanCuFky4y3PPNxucK2uKC1Lbdb1eo+m5XomN6HfeZsabHLHRX/K+offtNGGmHWctcVcG44MdSqsOLY9VzX+Zxfxn2HPdWTpzWvkrtJ8M5zorrKcquRytJ5N5DZmcaW02l76nWO+BqPXm1A2Ry/0q71dH/mqrqeFjkYxjEXtsX8qubTk67rGycyqsdm4tZx5D6D5hhi0waaWmiaMP81Yjii5qxPlPuU/GfTL1Y5E6Jyfiq63qTa39A4J0sOGDgO9WF9bOXl0XfPRbsY2bPNKPy1YrFYrFYmRhhlTIyMjJWJYZHXuCXI8OoXsvfljGLFicNifpp2XunoPiG1wtx3p1Tah+/DD66OnVtVXP9rKbVxOnL0tR/rHtqB5UDErUVcl11D4qqvjpOcxX7armUNJB3LpW6bxVvD08e8h3odKKvyCFZBdSh2FVcST9xV3n3T8t1j7Kr9qgrqXg+13Pt5U7JCvFXVIV1YG5lRhkVYZJYYDDD4KOIMoHCp26WS8GB7uBh2zIdgq/PKyInjV2STShuoapUdCpX1yTwqq/z1VvET7Kh5nVPkO8YyxjLt2MaaMmWTLQvx3qnzltnXW0p2jxgbEtSny/Osv8Y9pLMXYoHVPAhkVdWVeODhR6q9/Sxe2liwwZWMVvFXfRkeIDxAePUPIrdJ4ey6yquzH+PD/bUOWAu05qVHtFd8rrKHSoeNIOUqrYr3FXyToqfYJgwmJdKpXXOwYYegNNGMzfZPp/t3t/DVs4zjNTN61rRqaWaa4NYbRjTa0tWwy2Y2tGN8ZO8ofNKq4j9SL7I+cSm4/6ovLV5HNXLI0jJidwrtk6ynCaP6Z++GjRlWS3tLeW129Mi9evxU9mtz6s5J3Z7M2ngTgnKvmpomxpaLCzPfmx0JWE+m3NLDDGOX47RctdYYNK5jakdqLkRlI39n590T5zctGSwwZZDJj6kW8XSi6ot2MmWWJ0DUT3nuvebBudScjZ79g8cWJ8av0k+/bE5WKd5MdbFpbDVMxu1DVMmtNZGJvq1mtRbn6M+g/kP0FwDwr7quZs7xosNGpbscyxhhd9TyJyFwbLcxlTasg75vW7TsV5K7ji44XPMMrdoj+Y3rT0Hie62nlYV/pwczzOmdLqLhYkzGMzCZWGMQzGMSsZYY6Di1t4nlJ+Em63mJxrVLxPbYxNEdgc1dU2iOKyoYYWjNrEeHTYybVk0atSa7ehuwsWMWTqn1TrnS6hYsi71d1+s+k+ic70e20fzE/VaTdxT9ZtU4GIXdeNx3X77guYYfpHeTQjaMX6brOu4OY4K7Y2d9mbHarI5ox3p4GpJ2Vd/Tst60f7j999pppjR+Q/Qf8J/VaORs3cji7FfFuN61+ui9s8hix1OCh5KGVV23BPXvZfz3CLyHpix+exi8z/KnCnosY2euno
r+cxyPO/xJ0vKey9OvE9VjqaYu0x3Z3jd6o2b1T12D+F8l232lwaaacD5LE8LBxu7WTlbWraWpew8Xexjel3E+wWD4APITdNqR8F3R3T0lunCQ4GaE9R37DxeCYfcHi4xci5ovKfxVs55y2hf+65E/Xdp6jR5nrebTmi5incpkyOjs50JvrZwstbbW6kfuuQw+2mykf/EXNFzxfKTrxew929TR6bWnGL//F3JFOFCQT3K4lQ" + + kernels = Kernel( + bz2.decompress(base64.b64decode(quantization_code)), + [ + "int4WeightCompression", + "int4WeightExtractionFloat", + "int4WeightExtractionHalf", + "int8WeightExtractionFloat", + "int8WeightExtractionHalf", + ], + ) +except Exception as exception: + kernels = None + logger.warning("Failed to load cpm_kernels:", exception) + + +class W8A16Linear(torch.autograd.Function): + @staticmethod + def forward(ctx, inp: torch.Tensor, quant_w: torch.Tensor, scale_w: torch.Tensor, weight_bit_width): + ctx.inp_shape = inp.size() + ctx.weight_bit_width = weight_bit_width + out_features = quant_w.size(0) + inp = inp.contiguous().view(-1, inp.size(-1)) + weight = extract_weight_to_half(quant_w, scale_w, weight_bit_width) + ctx.weight_shape = weight.size() + output = inp.mm(weight.t()) + ctx.save_for_backward(inp, quant_w, scale_w) + return output.view(*(ctx.inp_shape[:-1] + (out_features,))) + + @staticmethod + def backward(ctx, grad_output: torch.Tensor): + inp, quant_w, scale_w = ctx.saved_tensors + weight = extract_weight_to_half(quant_w, scale_w, ctx.weight_bit_width) + grad_output = grad_output.contiguous().view(-1, weight.size(0)) + grad_input = grad_output.mm(weight) + grad_weight = grad_output.t().mm(inp) + return grad_input.view(ctx.inp_shape), grad_weight.view(ctx.weight_shape), None, None + + +class W8A16LinearCPU(torch.autograd.Function): + @staticmethod + def forward(ctx, inp: torch.Tensor, quant_w: torch.Tensor, scale_w: torch.Tensor, weight_bit_width, quantization_cache=None): + ctx.inp_shape = inp.size() + ctx.weight_bit_width = weight_bit_width + out_features = quant_w.size(0) + inp = inp.contiguous().view(-1, inp.size(-1)) + weight = extract_weight_to_float(quant_w, scale_w, weight_bit_width, quantization_cache=quantization_cache) + ctx.weight_shape = weight.size() + output = inp.mm(weight.t()) + ctx.save_for_backward(inp, quant_w, scale_w) + return output.view(*(ctx.inp_shape[:-1] + (out_features,))) + + @staticmethod + def backward(ctx, grad_output: torch.Tensor): + inp, quant_w, scale_w = ctx.saved_tensors + weight = extract_weight_to_float(quant_w, scale_w, ctx.weight_bit_width) + grad_output = grad_output.contiguous().view(-1, weight.size(0)) + grad_input = grad_output.mm(weight) + grad_weight = grad_output.t().mm(inp) + return grad_input.view(ctx.inp_shape), grad_weight.view(ctx.weight_shape), None, None + + +default_cpu_kernel_code_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "quantization_kernels.c") +default_cpu_kernel_code = "QlpoOTFBWSZTWXLbSoQAAgzbgERwQXxmTwAAr/ff3kABt0Q2oRVT0hpo9RtEAAAAyBEiSQ9EGjQGQAAAwANGhowjJoNGmgMEUplMTNSMJ5TQaDJpsoMyRMj8P4mZzFSVVwqSXG8GG7MlVwiToYEQwVD7noBxMhNfkeZYtYFtbgOBUSIGtIQjhNHCEnPJsadhb3yBmRIOD3TeAtNLSaU5GgvKUBWSNuuOIHmVt0YhW6rsmDMDUjeUJGJ64R1Jm5lrh0Aa0tKjhFwPdWcGogxLDSXPWQUWTM8Sd3Qz1HMYNxx3HMeiNqNo4jeRDEfZ3gUSHIcU/heomq0vEzL1Msz5KKGxH8FrNOYw3KaxdqaEmNHYMxJFgQbR0DyRknL2L4kwUSxKRdhjRpEtUqilVfggFL1klaMS3PPRDfNqbBOPWO7m4JTVGhS9QTBDDJaEbLbrUQNB+IpJSKQbG5SZZ5gkwJEhJ3aYKJipZ/i7kinChIOW2lQg" +default_cpu_parallel_kernel_code_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "quantization_kernels_parallel.c") +default_cpu_parallel_kernel_code = 
"QlpoOTFBWSZTWUzax5EAALXbgERwSX1mTwAAr/ff3kACNyXUbZYwBpoaNGIyAaADQwRSaVP9QoMg0A2oAPU0AEUkU9GaaKMaQB6gA09T1ARRKnpk0niaJkaaNDJ6g0DTIKVKfZ/g6v1Kem5LJLa0WmkukkuCIHUqWbtJGJMsCSQFiPEIYHgBIZDzR8R6REbYxIqD2Cu7lMkFoPu6LmHeOAy0GF83Tc40jgmTs4HnCe60QfJa2bDBZ0Y1lhgbiZjW8SNsAKCk42UOEdjWN3KoiCIYeQUCCKWIyHewhtSoInLKSG22l4jKM2ZDCVKtBm3OTYBl3jsVqMImtj7PQw7xKxLXQzwgJaPPgW1fRhrvPJICl4YFDYfNbkbBh5JDgrazFml50xEQQwQUjxNwE0IDSofLzSg7UNVKn+Rr1KErzBHUxBqdHRlXzqYsIa5K9Y0UuE2ugw3g5KYofm7AaGNTzJSMhcchhxdaU4JZ0F1UNgQ8XcGDguypqYza8yFaEoGgNRcLej+g2t0feGKFE5OY2PFluQ3q4HgycxlfvzHqo0KcM0JI8OKXtzayJFgsqC1NdUQVu8rChnA6FO3MFyGOoC9KO8ITPpYM5pRqTlczFkLES/4u5IpwoSCZtY8i" + +cpu_kernels = None + + +class CPUKernel: + def __init__(self, kernel_file="", source_code=default_cpu_kernel_code_path, compile_parallel_kernel=None, parallel_num=None): + self.load =False + self.int8WeightExtractionFloat = None + self.int4WeightExtractionFloat = None + self.int4WeightCompression = None + self.SetNumThreads = lambda x: x + + try: + if not os.path.exists(default_cpu_kernel_code_path): + with open(default_cpu_kernel_code_path, "w", encoding="utf-8") as file: + code = default_cpu_kernel_code + cpu_quantization_code = bz2.decompress(base64.b64decode(code)).decode() + file.write(cpu_quantization_code) + + if not os.path.exists(default_cpu_parallel_kernel_code_path): + with open(default_cpu_parallel_kernel_code_path, "w", encoding="utf-8") as file: + code = default_cpu_parallel_kernel_code + cpu_quantization_code = bz2.decompress(base64.b64decode(code)).decode() + file.write(cpu_quantization_code) + + except Exception as ex: + print("Error when generating default cpu kernel code(can be ignored when using custom kernels).") + + if compile_parallel_kernel is None: + compile_parallel_kernel = bool(int(os.cpu_count()) >= 4) + + if compile_parallel_kernel and source_code == default_cpu_kernel_code_path: + source_code = default_cpu_parallel_kernel_code_path + + kernels = None + + if (not kernel_file) or (not os.path.exists(kernel_file)): + print("No compiled kernel found.") + try: + if os.path.exists(source_code): + print("Compiling kernels :", source_code) + kernel_file = source_code[:-2] + ".so" + + if compile_parallel_kernel: + compile_command = "gcc -O3 -fPIC -pthread -fopenmp -std=c99 {} -shared -o {}".format(source_code, kernel_file) + print("Compiling", compile_command) + exit_state = os.system(compile_command) + if not exit_state: + try: + kernels = ctypes.cdll.LoadLibrary(kernel_file) + print("Load kernel :", kernel_file) + except: + kernels = None + print("Load parallel cpu kernel failed, using default cpu kernel code:") + import traceback + exception = traceback.format_exc() + print(exception) + else: + print("Compile default cpu kernel failed, using default cpu kernel code.") + + if kernels is None: # adjust config, use default cpu kernel + compile_parallel_kernel = False + source_code = default_cpu_kernel_code_path + kernel_file = source_code[:-2] + ".so" + + if kernels is None: + compile_command = "gcc -O3 -fPIC -std=c99 {} -shared -o {}".format(source_code, kernel_file) + print("Compiling", compile_command) + exit_state = os.system(compile_command) + if not exit_state: + try: + kernels = ctypes.cdll.LoadLibrary(kernel_file) + print("Load kernel :", kernel_file) + except: + kernels = None + print("Load default cpu kernel failed:") + import traceback + exception = traceback.format_exc() + print(exception) + else: + print("Compile default cpu kernel failed.") + else: + print("Kernel source code not found.") + return + except: + print("Failed 
to build cpu kernel:") + import traceback + exception = traceback.format_exc() + print(exception) + return + else: + try: + kernels = ctypes.cdll.LoadLibrary(kernel_file) + print("Load kernel :", kernel_file) + except: + kernels = None + print("Load custom cpu kernel failed:") + import traceback + exception = traceback.format_exc() + print(exception) + + if kernels is not None: + self.int8WeightExtractionFloat = kernels.extract_int8_weight_to_float + self.int4WeightExtractionFloat = kernels.extract_int4_weight_to_float + self.int4WeightCompression = kernels.compress_int4_weight + if compile_parallel_kernel: + try: + self.SetNumThreads = kernels.set_num_threads + except: + print("No set_num_threads() found in kernel.") + self.load = True + else: + print("Failed to load kernel.") + return + + if compile_parallel_kernel: + if parallel_num is None: + parallel_num = max(os.cpu_count() // 2, 1) + print("Setting CPU quantization kernel threads to", parallel_num) + if parallel_num < 4: + print("Parallel kernel is not recommended when parallel num < 4.") + self.SetNumThreads(parallel_num) + + self.parallel_num = parallel_num + + +def compress_int4_weight(weight: torch.Tensor): # (n, m) + """compress weight on cpu or cuda to int4""" + if weight.device == torch.device("cpu"): + assert isinstance(cpu_kernels, CPUKernel) + n, m = weight.size(0), weight.size(1) + assert m % 2 == 0 + m = m // 2 + out = torch.empty(n, m, dtype=torch.int8, device="cpu") + cpu_kernels.int4WeightCompression( + ctypes.c_void_p(weight.data_ptr()), + ctypes.c_void_p(out.data_ptr()), + ctypes.c_int32(n), + ctypes.c_int32(m) + ) + return out + else: + with torch.cuda.device(weight.device): + n, m = weight.size(0), weight.size(1) + assert m % 2 == 0 + m = m // 2 + out = torch.empty(n, m, dtype=torch.int8, device="cuda") + stream = torch.cuda.current_stream() + + gridDim = (n, 1, 1) + blockDim = (min(round_up(m, 32), 1024), 1, 1) + + kernels.int4WeightCompression( + gridDim, + blockDim, + 0, + stream, + [ctypes.c_void_p(weight.data_ptr()), ctypes.c_void_p(out.data_ptr()), ctypes.c_int32(n), ctypes.c_int32(m)], + ) + return out + + +def extract_weight_to_half(weight: torch.Tensor, scale_list: torch.Tensor, source_bit_width: int): + if source_bit_width == 8: + func = kernels.int8WeightExtractionHalf + elif source_bit_width == 4: + func = kernels.int4WeightExtractionHalf + else: + assert False, "Unsupported bit-width" + + with torch.cuda.device(weight.device): + n, m = weight.size(0), weight.size(1) + out = torch.empty(n, m * (8 // source_bit_width), dtype=torch.half, device="cuda") + stream = torch.cuda.current_stream() + + gridDim = (n, 1, 1) + blockDim = (min(round_up(m, 32), 1024), 1, 1) + + func( + gridDim, + blockDim, + 0, + stream, + [ + ctypes.c_void_p(weight.data_ptr()), + ctypes.c_void_p(scale_list.data_ptr()), + ctypes.c_void_p(out.data_ptr()), + ctypes.c_int32(n), + ctypes.c_int32(m), + ], + ) + return out + + +def extract_weight_to_float(weight: torch.Tensor, scale_list: torch.Tensor, source_bit_width: int, quantization_cache=None): + """extract weight on cpu to float32""" + if source_bit_width == 8: + func = cpu_kernels.int8WeightExtractionFloat + elif source_bit_width == 4: + func = cpu_kernels.int4WeightExtractionFloat + else: + assert False, "Unsupported bit-width" + + n, m = weight.size(0), weight.size(1) + + if quantization_cache is not None: + out = quantization_cache + func( + ctypes.c_void_p(weight.data_ptr()), + ctypes.c_void_p(scale_list.data_ptr()), + ctypes.c_void_p(out.data_ptr()), + ctypes.c_int32(n), + 
ctypes.c_int32(m) + ) + return out.tensor + else: + out = torch.empty(n, m * (8 // source_bit_width), dtype=torch.float, device="cpu") + func( + ctypes.c_void_p(weight.data_ptr()), + ctypes.c_void_p(scale_list.data_ptr()), + ctypes.c_void_p(out.data_ptr()), + ctypes.c_int32(n), + ctypes.c_int32(m) + ) + return out + + +class CacheTensor(): + def __init__(self, *args, **kwargs): + self.tensor = torch.empty(*args, **kwargs) + + def to(self, *args, **kwargs): + self.tensor = self.tensor.to(*args, **kwargs) + + def data_ptr(self): + return self.tensor.data_ptr() + + +class QuantizedLinear(Linear): + def __init__(self, weight_bit_width: int, weight_tensor=None, bias_tensor=None, quantized_weight=None, quantized_weight_scale=None, quantization_cache=None, empty_init=False, *args, **kwargs): + super(QuantizedLinear, self).__init__(*args, **kwargs) + self.weight_bit_width = weight_bit_width + self.quantization_cache = quantization_cache + + if (quantized_weight is not None) and (quantized_weight_scale is not None): + del self.weight + self.weight = Parameter(quantized_weight.to(kwargs["device"]), requires_grad=False) + self.weight_scale = Parameter(quantized_weight_scale.to(kwargs["device"]), requires_grad=False) + else: + shape = self.weight.shape + del self.weight + + if weight_tensor is None or empty_init: + self.weight = torch.empty( + shape[0], shape[1] * weight_bit_width // 8, dtype=torch.int8, device=kwargs["device"] + ) + self.weight_scale = torch.empty(shape[0], dtype=kwargs["dtype"], device=kwargs["device"]) + else: + self.weight_scale = (weight_tensor.abs().max(dim=-1).values / ((2 ** (weight_bit_width - 1)) - 1)).to(kwargs["dtype"]) + self.weight = torch.round(weight_tensor / self.weight_scale[:, None]).to(torch.int8) + if weight_bit_width == 4: + self.weight = compress_int4_weight(self.weight) + + self.weight = Parameter(self.weight.to(kwargs["device"]), requires_grad=False) + self.weight_scale = Parameter(self.weight_scale.to(kwargs["device"]), requires_grad=False) + + if bias_tensor is not None: + self.bias = Parameter(bias_tensor.to(kwargs["device"]), requires_grad=False) + else: + self.bias = None + + def reset_parameters(self): + """To accelerate initialization""" + pass + + def forward(self, input): + if self.weight.device == torch.device("cpu"): + output = W8A16LinearCPU.apply(input, self.weight, self.weight_scale, self.weight_bit_width, self.quantization_cache) + else: + output = W8A16Linear.apply(input, self.weight, self.weight_scale, self.weight_bit_width) + if self.bias is not None: + output = output + self.bias + return output + + def _apply(self, fn): + self_obj = super()._apply(fn) + if self.quantization_cache is not None: + self.quantization_cache.to(self_obj.weight.device) + self.quantization_cache.to(self_obj.weight_scale.dtype) + return self_obj + + +class QuantizedEmbedding(Embedding): # TODO: backward, check empty_init + def __init__(self, weight_bit_width: int, weight_tensor=None, quantized_weight=None, quantized_weight_scale=None, empty_init=False, *args, **kwargs): + super(QuantizedEmbedding, self).__init__(*args, **kwargs) + self.weight_bit_width = weight_bit_width + + if (quantized_weight is not None) and (quantized_weight_scale is not None): + del self.weight + self.weight = Parameter(quantized_weight.to(kwargs["device"]), requires_grad=False) + self.weight_scale = Parameter(quantized_weight_scale.to(kwargs["device"]), requires_grad=False) + else: + shape = self.weight.shape + del self.weight + + if weight_tensor is None or empty_init: + self.weight = 
torch.empty( + shape[0], shape[1] * weight_bit_width // 8, dtype=torch.int8, device=kwargs["device"] + ) + self.weight_scale = torch.empty(shape[0], dtype=kwargs["dtype"], device=kwargs["device"]) + else: + self.weight_scale = (weight_tensor.abs().max(dim=-1).values / ((2 ** (weight_bit_width - 1)) - 1)).half() + self.weight = torch.round(weight_tensor / self.weight_scale[:, None]).to(torch.int8) + if weight_bit_width == 4: + self.weight = compress_int4_weight(self.weight) + + self.weight = Parameter(self.weight.to(kwargs["device"]), requires_grad=False) + self.weight_scale = Parameter(self.weight_scale.to(kwargs["device"]), requires_grad=False) + + def forward(self, input): + if self.weight.device == torch.device("cpu"): + original_weight = extract_weight_to_float(weight=self.weight, scale_list=self.weight_scale, source_bit_width=self.weight_bit_width) + else: + original_weight = extract_weight_to_half(weight=self.weight, scale_list=self.weight_scale, source_bit_width=self.weight_bit_width) + output = F.embedding( + input, original_weight, self.padding_idx, self.max_norm, + self.norm_type, self.scale_grad_by_freq, self.sparse + ) + return output + + +def load_cpu_kernel(**kwargs): + global cpu_kernels + cpu_kernels = CPUKernel(**kwargs) + assert cpu_kernels.load + + +def quantize(model, weight_bit_width, use_quantization_cache=False, empty_init=False, **kwargs): + """Replace fp16 linear with quantized linear""" + + query_key_value_quantization_cache = None + dense_quantization_cache = None + dense_h_to_4h_quantization_cache = None + dense_4h_to_h_quantization_cache = None + + try: + load_cpu_kernel(**kwargs) + except: + if kernels is None: # CUDA kernels failed + print("Cannot load cpu or cuda kernel, quantization failed:") + assert kernels is not None + print("Cannot load cpu kernel, don't use quantized model on cpu.") + + current_device = model.device + + if model.device == torch.device("cpu"): + dtype=torch.float32 + else: + dtype = torch.half + + QuantizedLinearWithPara = partial( + QuantizedLinear, + weight_bit_width=weight_bit_width, + bias=True, + dtype=dtype, + empty_init=empty_init + ) + + if use_quantization_cache: + print("Using quantization cache") + layer = model.layers[0] + weight = layer.attention.query_key_value.weight + n, m = weight.size(0), weight.size(1) + query_key_value_quantization_cache = CacheTensor(n, m, dtype=dtype, device=current_device, requires_grad=False) + weight = layer.attention.dense.weight + n, m = weight.size(0), weight.size(1) + dense_quantization_cache = CacheTensor(n, m, dtype=dtype, device=current_device, requires_grad=False) + weight = layer.mlp.dense_h_to_4h.weight + n, m = weight.size(0), weight.size(1) + dense_h_to_4h_quantization_cache = CacheTensor(n, m, dtype=dtype, device=current_device, requires_grad=False) + weight = layer.mlp.dense_4h_to_h.weight + n, m = weight.size(0), weight.size(1) + dense_4h_to_h_quantization_cache = CacheTensor(n, m, dtype=dtype, device=current_device, requires_grad=False) + + print("Applying quantization to glm layers") + + for layer in model.layers: + layer.attention.query_key_value = QuantizedLinearWithPara( + weight_tensor=layer.attention.query_key_value.weight.to(current_device), + bias_tensor=layer.attention.query_key_value.bias, + in_features=layer.attention.query_key_value.in_features, + out_features=layer.attention.query_key_value.out_features, + device=layer.attention.query_key_value.weight.device, + quantization_cache=query_key_value_quantization_cache + ) + layer.attention.dense = 
QuantizedLinearWithPara( + weight_tensor=layer.attention.dense.weight.to(current_device), + bias_tensor=layer.attention.dense.bias, + in_features=layer.attention.dense.in_features, + out_features=layer.attention.dense.out_features, + device=layer.attention.dense.weight.device, + quantization_cache=dense_quantization_cache + ) + layer.mlp.dense_h_to_4h = QuantizedLinearWithPara( + weight_tensor=layer.mlp.dense_h_to_4h.weight.to(current_device), + bias_tensor=layer.mlp.dense_h_to_4h.bias, + in_features=layer.mlp.dense_h_to_4h.in_features, + out_features=layer.mlp.dense_h_to_4h.out_features, + device=layer.mlp.dense_h_to_4h.weight.device, + quantization_cache=dense_h_to_4h_quantization_cache + ) + layer.mlp.dense_4h_to_h = QuantizedLinearWithPara( + weight_tensor=layer.mlp.dense_4h_to_h.weight.to(current_device), + bias_tensor=layer.mlp.dense_4h_to_h.bias, + in_features=layer.mlp.dense_4h_to_h.in_features, + out_features=layer.mlp.dense_4h_to_h.out_features, + device=layer.mlp.dense_4h_to_h.weight.device, + quantization_cache=dense_4h_to_h_quantization_cache + ) + return model diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/rng_state.pth b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/rng_state.pth new file mode 100644 index 0000000000000000000000000000000000000000..29b5067aee9dbae9e625ba35a85074a02d6f5a58 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/rng_state.pth @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6ced23e0e417f2dfcf6eed7e0bfc1d56c4c6ed8e2c602d433d06066af88ab2e7 +size 14575 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/scheduler.pt b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/scheduler.pt new file mode 100644 index 0000000000000000000000000000000000000000..496a9c7bb642e5aca1569f3626a409095679a933 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/scheduler.pt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:ce21225b3a04f92b1556e137b9e4f00a59dda32eda3027243d8cfd9edab1acb7 +size 627 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/special_tokens_map.json b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/special_tokens_map.json new file mode 100644 index 0000000000000000000000000000000000000000..1f897c919b758e64c56eb1a7b34b39b569040086 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/special_tokens_map.json @@ -0,0 +1,7 @@ +{ + "bos_token": "", + "eos_token": "", + "mask_token": "[MASK]", + "pad_token": "", + "unk_token": "" +} diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/tokenization_chatglm.py b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/tokenization_chatglm.py new file mode 100644 index 0000000000000000000000000000000000000000..1d4f0ba532543b6dbdacdd83d30324b7a6abfad3 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/tokenization_chatglm.py @@ -0,0 +1,430 @@ +"""Tokenization classes for ChatGLM.""" +from typing import List, Optional, Union +import os + +from transformers.tokenization_utils import PreTrainedTokenizer +from transformers.utils import logging, PaddingStrategy +from transformers.tokenization_utils_base import EncodedInput, BatchEncoding +from typing import Dict +import sentencepiece as spm +import numpy as np + +logger = logging.get_logger(__name__) + +PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES = { + "THUDM/chatglm-6b": 2048, +} + + +class 
TextTokenizer: + def __init__(self, model_path): + self.sp = spm.SentencePieceProcessor() + self.sp.Load(model_path) + self.num_tokens = self.sp.vocab_size() + + def encode(self, text): + return self.sp.EncodeAsIds(text) + + def decode(self, ids: List[int]): + return self.sp.DecodeIds(ids) + + def tokenize(self, text): + return self.sp.EncodeAsPieces(text) + + def convert_tokens_to_ids(self, tokens): + return [self.sp.PieceToId(token) for token in tokens] + + def convert_token_to_id(self, token): + return self.sp.PieceToId(token) + + def convert_id_to_token(self, idx): + return self.sp.IdToPiece(idx) + + def __len__(self): + return self.num_tokens + + +class SPTokenizer: + def __init__( + self, + vocab_file, + num_image_tokens=20000, + max_blank_length=80, + byte_fallback=True, + ): + assert vocab_file is not None + self.vocab_file = vocab_file + self.num_image_tokens = num_image_tokens + self.special_tokens = ["[MASK]", "[gMASK]", "[sMASK]", "<unused_0>", "<sop>", "<eop>", "<ENC>", "<dBLOCK>"] + self.max_blank_length = max_blank_length + self.byte_fallback = byte_fallback + self.text_tokenizer = TextTokenizer(vocab_file) + + def _get_text_tokenizer(self): + return self.text_tokenizer + + @staticmethod + def get_blank_token(length: int): + assert length >= 2 + return f"<|blank_{length}|>" + + @staticmethod + def get_tab_token(): + return f"<|tab|>" + + @property + def num_text_tokens(self): + return self.text_tokenizer.num_tokens + + @property + def num_tokens(self): + return self.num_image_tokens + self.num_text_tokens + + @staticmethod + def _encode_whitespaces(text: str, max_len: int = 80): + text = text.replace("\t", SPTokenizer.get_tab_token()) + for i in range(max_len, 1, -1): + text = text.replace(" " * i, SPTokenizer.get_blank_token(i)) + return text + + def _preprocess(self, text: str, linebreak=True, whitespaces=True): + if linebreak: + text = text.replace("\n", "<n>") + if whitespaces: + text = self._encode_whitespaces(text, max_len=self.max_blank_length) + return text + + def encode( + self, text: str, linebreak=True, whitespaces=True, add_dummy_prefix=True + ) -> List[int]: + """ + @param text: Text to encode. + @param linebreak: Whether to encode newline (\n) in text. + @param whitespaces: Whether to encode multiple whitespaces or tab in text, useful for source code encoding. + @param special_tokens: Whether to encode special token ([MASK], [gMASK], etc.) in text. + @param add_dummy_prefix: Whether to add dummy blank space in the beginning. + """ + text = self._preprocess(text, linebreak, whitespaces) + if not add_dummy_prefix: + text = "<n>" + text + tmp = self._get_text_tokenizer().encode(text) + tokens = [x + self.num_image_tokens for x in tmp] + return tokens if add_dummy_prefix else tokens[2:] + + def decode(self, text_ids: List[int]) -> str: + ids = [int(_id) - self.num_image_tokens for _id in text_ids] + ids = [_id for _id in ids if _id >= 0] + text = self._get_text_tokenizer().decode(ids) + text = text.replace("<n>", "\n") + text = text.replace(SPTokenizer.get_tab_token(), "\t") + for i in range(2, self.max_blank_length + 1): + text = text.replace(self.get_blank_token(i), " " * i) + return text + + def tokenize( + self, text: str, linebreak=True, whitespaces=True, add_dummy_prefix=True + ) -> List[str]: + """ + @param text: Text to encode. + @param linebreak: Whether to encode newline (\n) in text. + @param whitespaces: Whether to encode multiple whitespaces or tab in text, useful for source code encoding. + @param special_tokens: Whether to encode special token ([MASK], [gMASK], etc.) in text.
+ @param add_dummy_prefix: Whether to add dummy blank space in the beginning. + """ + text = self._preprocess(text, linebreak, whitespaces) + if not add_dummy_prefix: + text = "<n>" + text + tokens = self._get_text_tokenizer().tokenize(text) + return tokens if add_dummy_prefix else tokens[2:] + + def __getitem__(self, x: Union[int, str]): + if isinstance(x, int): + if x < self.num_image_tokens: + return "<image_{}>".format(x) + else: + return self.text_tokenizer.convert_id_to_token(x - self.num_image_tokens) + elif isinstance(x, str): + if x.startswith("<image_") and x[7:-1].isdigit(): + return int(x[7:-1]) + else: + return self.text_tokenizer.convert_token_to_id(x) + self.num_image_tokens + else: + raise ValueError("The key should be str or int.") + + +class ChatGLMTokenizer(PreTrainedTokenizer): + """ + Construct a ChatGLM tokenizer. Based on byte-level Byte-Pair-Encoding. + + Args: + vocab_file (`str`): + Path to the vocabulary file. + """ + + vocab_files_names = {"vocab_file": "ice_text.model"} + max_model_input_sizes = PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES + model_input_names = ["input_ids", "attention_mask", "position_ids"] + + def __init__( + self, + vocab_file, + do_lower_case=False, + remove_space=False, + bos_token='<sop>', + eos_token='<eop>', + end_token='</s>', + mask_token='[MASK]', + gmask_token='[gMASK]', + padding_side="left", + pad_token="<pad>", + unk_token="<unk>", + num_image_tokens=20000, + **kwargs + ) -> None: + super().__init__( + do_lower_case=do_lower_case, + remove_space=remove_space, + padding_side=padding_side, + bos_token=bos_token, + eos_token=eos_token, + end_token=end_token, + mask_token=mask_token, + gmask_token=gmask_token, + pad_token=pad_token, + unk_token=unk_token, + num_image_tokens=num_image_tokens, + **kwargs + ) + + self.do_lower_case = do_lower_case + self.remove_space = remove_space + self.vocab_file = vocab_file + + self.bos_token = bos_token + self.eos_token = eos_token + self.end_token = end_token + self.mask_token = mask_token + self.gmask_token = gmask_token + + self.sp_tokenizer = SPTokenizer(vocab_file, num_image_tokens=num_image_tokens) + + """ Initialisation """ + + @property + def gmask_token_id(self) -> Optional[int]: + if self.gmask_token is None: + return None + return self.convert_tokens_to_ids(self.gmask_token) + + @property + def end_token_id(self) -> Optional[int]: + """ + `Optional[int]`: Id of the end of context token in the vocabulary. Returns `None` if the token has not been + set. + """ + if self.end_token is None: + return None + return self.convert_tokens_to_ids(self.end_token) + + @property + def vocab_size(self): + """ Returns vocab size """ + return self.sp_tokenizer.num_tokens + + def get_vocab(self): + """ Returns vocab as a dict """ + vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)} + vocab.update(self.added_tokens_encoder) + return vocab + + def preprocess_text(self, inputs): + if self.remove_space: + outputs = " ".join(inputs.strip().split()) + else: + outputs = inputs + + if self.do_lower_case: + outputs = outputs.lower() + + return outputs + + def _tokenize(self, text, **kwargs): + """ Returns a tokenized string.
""" + text = self.preprocess_text(text) + + seq = self.sp_tokenizer.tokenize(text) + + return seq + + def _decode( + self, + token_ids: Union[int, List[int]], + skip_special_tokens: bool = False, + clean_up_tokenization_spaces: bool = True, + **kwargs + ) -> str: + if isinstance(token_ids, int): + token_ids = [token_ids] + if len(token_ids) == 0: + return "" + if self.pad_token_id in token_ids: # remove pad + token_ids = list(filter((self.pad_token_id).__ne__, token_ids)) + return self.sp_tokenizer.decode(token_ids) + + def _convert_token_to_id(self, token): + """ Converts a token (str) in an id using the vocab. """ + return self.sp_tokenizer[token] + + def _convert_id_to_token(self, index): + """Converts an index (integer) in a token (str) using the vocab.""" + return self.sp_tokenizer[index] + + def save_vocabulary(self, save_directory, filename_prefix=None): + """ + Save the vocabulary and special tokens file to a directory. + + Args: + save_directory (`str`): + The directory in which to save the vocabulary. + filename_prefix (`str`, *optional*): + An optional prefix to add to the named of the saved files. + + Returns: + `Tuple(str)`: Paths to the files saved. + """ + if os.path.isdir(save_directory): + vocab_file = os.path.join( + save_directory, self.vocab_files_names["vocab_file"] + ) + else: + vocab_file = save_directory + + with open(self.vocab_file, 'rb') as fin: + proto_str = fin.read() + + with open(vocab_file, "wb") as writer: + writer.write(proto_str) + + return (vocab_file,) + + def build_inputs_with_special_tokens( + self, token_ids_0: List[int], token_ids_1: Optional[List[int]] = None + ) -> List[int]: + """ + Build model inputs from a sequence or a pair of sequence for sequence classification tasks by concatenating and + adding special tokens. A BERT sequence has the following format: + + - single sequence: `[CLS] X [SEP]` + - pair of sequences: `[CLS] A [SEP] B [SEP]` + + Args: + token_ids_0 (`List[int]`): + List of IDs to which the special tokens will be added. + token_ids_1 (`List[int]`, *optional*): + Optional second list of IDs for sequence pairs. + + Returns: + `List[int]`: List of [input IDs](../glossary#input-ids) with the appropriate special tokens. + """ + gmask_id = self.sp_tokenizer[self.gmask_token] + eos_id = self.sp_tokenizer[self.eos_token] + token_ids_0 = token_ids_0 + [gmask_id, self.sp_tokenizer[self.bos_token]] + if token_ids_1 is not None: + token_ids_0 = token_ids_0 + token_ids_1 + [eos_id] + return token_ids_0 + + def _pad( + self, + encoded_inputs: Union[Dict[str, EncodedInput], BatchEncoding], + max_length: Optional[int] = None, + padding_strategy: PaddingStrategy = PaddingStrategy.DO_NOT_PAD, + pad_to_multiple_of: Optional[int] = None, + return_attention_mask: Optional[bool] = None, + ) -> dict: + """ + Pad encoded inputs (on left/right and up to predefined length or max length in the batch) + + Args: + encoded_inputs: + Dictionary of tokenized inputs (`List[int]`) or batch of tokenized inputs (`List[List[int]]`). + max_length: maximum length of the returned list and optionally padding length (see below). + Will truncate by taking into account the special tokens. + padding_strategy: PaddingStrategy to use for padding. 
+ + - PaddingStrategy.LONGEST Pad to the longest sequence in the batch + - PaddingStrategy.MAX_LENGTH: Pad to the max length (default) + - PaddingStrategy.DO_NOT_PAD: Do not pad + The tokenizer padding sides are defined in self.padding_side: + + - 'left': pads on the left of the sequences + - 'right': pads on the right of the sequences + pad_to_multiple_of: (optional) Integer if set will pad the sequence to a multiple of the provided value. + This is especially useful to enable the use of Tensor Core on NVIDIA hardware with compute capability + `>= 7.5` (Volta). + return_attention_mask: + (optional) Set to False to avoid returning attention mask (default: set to model specifics) + """ + # Load from model defaults + bos_token_id = self.sp_tokenizer[self.bos_token] + mask_token_id = self.sp_tokenizer[self.mask_token] + gmask_token_id = self.sp_tokenizer[self.gmask_token] + assert self.padding_side == "left" + + required_input = encoded_inputs[self.model_input_names[0]] + seq_length = len(required_input) + + if padding_strategy == PaddingStrategy.LONGEST: + max_length = len(required_input) + + if max_length is not None and pad_to_multiple_of is not None and (max_length % pad_to_multiple_of != 0): + max_length = ((max_length // pad_to_multiple_of) + 1) * pad_to_multiple_of + + needs_to_be_padded = padding_strategy != PaddingStrategy.DO_NOT_PAD and len(required_input) != max_length + + # Initialize attention mask if not present. + if max_length is not None: + if "attention_mask" not in encoded_inputs: + if bos_token_id in required_input: + context_length = required_input.index(bos_token_id) + else: + context_length = seq_length + attention_mask = np.ones((1, seq_length, seq_length)) + attention_mask = np.tril(attention_mask) + attention_mask[:, :, :context_length] = 1 + attention_mask = np.bool_(attention_mask < 0.5) + encoded_inputs["attention_mask"] = attention_mask + + if "position_ids" not in encoded_inputs: + if bos_token_id in required_input: + context_length = required_input.index(bos_token_id) + else: + context_length = seq_length + position_ids = np.arange(seq_length, dtype=np.int64) + mask_token = mask_token_id if mask_token_id in required_input else gmask_token_id + if mask_token in required_input: + mask_position = required_input.index(mask_token) + position_ids[context_length:] = mask_position + block_position_ids = np.concatenate( + [np.zeros(context_length, dtype=np.int64), + np.arange(1, seq_length - context_length + 1, dtype=np.int64)]) + encoded_inputs["position_ids"] = np.stack([position_ids, block_position_ids], axis=0) + + if needs_to_be_padded: + difference = max_length - len(required_input) + + if "attention_mask" in encoded_inputs: + encoded_inputs["attention_mask"] = np.pad(encoded_inputs["attention_mask"], + pad_width=[(0, 0), (difference, 0), (difference, 0)], + mode='constant', constant_values=True) + if "token_type_ids" in encoded_inputs: + encoded_inputs["token_type_ids"] = [self.pad_token_type_id] * difference + encoded_inputs[ + "token_type_ids" + ] + if "special_tokens_mask" in encoded_inputs: + encoded_inputs["special_tokens_mask"] = [1] * difference + encoded_inputs["special_tokens_mask"] + if "position_ids" in encoded_inputs: + encoded_inputs["position_ids"] = np.pad(encoded_inputs["position_ids"], + pad_width=[(0, 0), (difference, 0)]) + encoded_inputs[self.model_input_names[0]] = [self.pad_token_id] * difference + required_input + + return encoded_inputs diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/tokenizer_config.json 
b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/tokenizer_config.json new file mode 100644 index 0000000000000000000000000000000000000000..f3f8e1c935cc40c270ff6ac75c05b4208533688a --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/tokenizer_config.json @@ -0,0 +1,22 @@ +{ + "auto_map": { + "AutoTokenizer": [ + "tokenization_chatglm.ChatGLMTokenizer", + null + ] + }, + "bos_token": "", + "do_lower_case": false, + "end_token": "", + "eos_token": "", + "gmask_token": "[gMASK]", + "mask_token": "[MASK]", + "model_max_length": 1000000000000000019884624838656, + "num_image_tokens": 0, + "pad_token": "", + "padding_side": "left", + "remove_space": false, + "special_tokens_map_file": null, + "tokenizer_class": "ChatGLMTokenizer", + "unk_token": "" +} diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/trainer_state.json b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/trainer_state.json new file mode 100644 index 0000000000000000000000000000000000000000..7f1fb0dd0f51507edb3764248a22c3a4fb6b8ed7 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/trainer_state.json @@ -0,0 +1,616 @@ +{ + "best_metric": null, + "best_model_checkpoint": null, + "epoch": 114.28571428571429, + "global_step": 1000, + "is_hyper_param_search": false, + "is_local_process_zero": true, + "is_world_process_zero": true, + "log_history": [ + { + "epoch": 1.14, + "learning_rate": 0.0198, + "loss": 4.7949, + "step": 10 + }, + { + "epoch": 2.29, + "learning_rate": 0.0196, + "loss": 3.7519, + "step": 20 + }, + { + "epoch": 3.43, + "learning_rate": 0.0194, + "loss": 3.3049, + "step": 30 + }, + { + "epoch": 4.57, + "learning_rate": 0.0192, + "loss": 2.8868, + "step": 40 + }, + { + "epoch": 5.71, + "learning_rate": 0.019, + "loss": 2.4806, + "step": 50 + }, + { + "epoch": 6.86, + "learning_rate": 0.0188, + "loss": 1.865, + "step": 60 + }, + { + "epoch": 8.0, + "learning_rate": 0.018600000000000002, + "loss": 1.4186, + "step": 70 + }, + { + "epoch": 9.14, + "learning_rate": 0.0184, + "loss": 0.9316, + "step": 80 + }, + { + "epoch": 10.29, + "learning_rate": 0.0182, + "loss": 0.5643, + "step": 90 + }, + { + "epoch": 11.43, + "learning_rate": 0.018000000000000002, + "loss": 0.3509, + "step": 100 + }, + { + "epoch": 12.57, + "learning_rate": 0.0178, + "loss": 0.2172, + "step": 110 + }, + { + "epoch": 13.71, + "learning_rate": 0.0176, + "loss": 0.1486, + "step": 120 + }, + { + "epoch": 14.86, + "learning_rate": 0.0174, + "loss": 0.1196, + "step": 130 + }, + { + "epoch": 16.0, + "learning_rate": 0.0172, + "loss": 0.0802, + "step": 140 + }, + { + "epoch": 17.14, + "learning_rate": 0.017, + "loss": 0.0739, + "step": 150 + }, + { + "epoch": 18.29, + "learning_rate": 0.0168, + "loss": 0.0619, + "step": 160 + }, + { + "epoch": 19.43, + "learning_rate": 0.0166, + "loss": 0.062, + "step": 170 + }, + { + "epoch": 20.57, + "learning_rate": 0.016399999999999998, + "loss": 0.0595, + "step": 180 + }, + { + "epoch": 21.71, + "learning_rate": 0.016200000000000003, + "loss": 0.0569, + "step": 190 + }, + { + "epoch": 22.86, + "learning_rate": 0.016, + "loss": 0.0533, + "step": 200 + }, + { + "epoch": 24.0, + "learning_rate": 0.0158, + "loss": 0.052, + "step": 210 + }, + { + "epoch": 25.14, + "learning_rate": 0.015600000000000001, + "loss": 0.0471, + "step": 220 + }, + { + "epoch": 26.29, + "learning_rate": 0.0154, + "loss": 0.0478, + "step": 230 + }, + { + "epoch": 27.43, + "learning_rate": 0.0152, + "loss": 0.0415, + "step": 240 + }, + { + "epoch": 
28.57, + "learning_rate": 0.015, + "loss": 0.0462, + "step": 250 + }, + { + "epoch": 29.71, + "learning_rate": 0.0148, + "loss": 0.0437, + "step": 260 + }, + { + "epoch": 30.86, + "learning_rate": 0.0146, + "loss": 0.0452, + "step": 270 + }, + { + "epoch": 32.0, + "learning_rate": 0.0144, + "loss": 0.041, + "step": 280 + }, + { + "epoch": 33.14, + "learning_rate": 0.014199999999999999, + "loss": 0.0418, + "step": 290 + }, + { + "epoch": 34.29, + "learning_rate": 0.013999999999999999, + "loss": 0.0469, + "step": 300 + }, + { + "epoch": 35.43, + "learning_rate": 0.0138, + "loss": 0.0394, + "step": 310 + }, + { + "epoch": 36.57, + "learning_rate": 0.013600000000000001, + "loss": 0.0444, + "step": 320 + }, + { + "epoch": 37.71, + "learning_rate": 0.0134, + "loss": 0.0393, + "step": 330 + }, + { + "epoch": 38.86, + "learning_rate": 0.013200000000000002, + "loss": 0.045, + "step": 340 + }, + { + "epoch": 40.0, + "learning_rate": 0.013000000000000001, + "loss": 0.0392, + "step": 350 + }, + { + "epoch": 41.14, + "learning_rate": 0.0128, + "loss": 0.0351, + "step": 360 + }, + { + "epoch": 42.29, + "learning_rate": 0.0126, + "loss": 0.0389, + "step": 370 + }, + { + "epoch": 43.43, + "learning_rate": 0.0124, + "loss": 0.0374, + "step": 380 + }, + { + "epoch": 44.57, + "learning_rate": 0.0122, + "loss": 0.035, + "step": 390 + }, + { + "epoch": 45.71, + "learning_rate": 0.012, + "loss": 0.0349, + "step": 400 + }, + { + "epoch": 46.86, + "learning_rate": 0.0118, + "loss": 0.0361, + "step": 410 + }, + { + "epoch": 48.0, + "learning_rate": 0.0116, + "loss": 0.0368, + "step": 420 + }, + { + "epoch": 49.14, + "learning_rate": 0.011399999999999999, + "loss": 0.0352, + "step": 430 + }, + { + "epoch": 50.29, + "learning_rate": 0.011200000000000002, + "loss": 0.0353, + "step": 440 + }, + { + "epoch": 51.43, + "learning_rate": 0.011000000000000001, + "loss": 0.0337, + "step": 450 + }, + { + "epoch": 52.57, + "learning_rate": 0.0108, + "loss": 0.032, + "step": 460 + }, + { + "epoch": 53.71, + "learning_rate": 0.0106, + "loss": 0.0366, + "step": 470 + }, + { + "epoch": 54.86, + "learning_rate": 0.010400000000000001, + "loss": 0.0317, + "step": 480 + }, + { + "epoch": 56.0, + "learning_rate": 0.0102, + "loss": 0.0332, + "step": 490 + }, + { + "epoch": 57.14, + "learning_rate": 0.01, + "loss": 0.0328, + "step": 500 + }, + { + "epoch": 58.29, + "learning_rate": 0.0098, + "loss": 0.0325, + "step": 510 + }, + { + "epoch": 59.43, + "learning_rate": 0.0096, + "loss": 0.0327, + "step": 520 + }, + { + "epoch": 60.57, + "learning_rate": 0.0094, + "loss": 0.0344, + "step": 530 + }, + { + "epoch": 61.71, + "learning_rate": 0.0092, + "loss": 0.0351, + "step": 540 + }, + { + "epoch": 62.86, + "learning_rate": 0.009000000000000001, + "loss": 0.0325, + "step": 550 + }, + { + "epoch": 64.0, + "learning_rate": 0.0088, + "loss": 0.0329, + "step": 560 + }, + { + "epoch": 65.14, + "learning_rate": 0.0086, + "loss": 0.031, + "step": 570 + }, + { + "epoch": 66.29, + "learning_rate": 0.0084, + "loss": 0.0326, + "step": 580 + }, + { + "epoch": 67.43, + "learning_rate": 0.008199999999999999, + "loss": 0.0311, + "step": 590 + }, + { + "epoch": 68.57, + "learning_rate": 0.008, + "loss": 0.033, + "step": 600 + }, + { + "epoch": 69.71, + "learning_rate": 0.0078000000000000005, + "loss": 0.0297, + "step": 610 + }, + { + "epoch": 70.86, + "learning_rate": 0.0076, + "loss": 0.0331, + "step": 620 + }, + { + "epoch": 72.0, + "learning_rate": 0.0074, + "loss": 0.0318, + "step": 630 + }, + { + "epoch": 73.14, + "learning_rate": 0.0072, + "loss": 
0.0307, + "step": 640 + }, + { + "epoch": 74.29, + "learning_rate": 0.006999999999999999, + "loss": 0.03, + "step": 650 + }, + { + "epoch": 75.43, + "learning_rate": 0.0068000000000000005, + "loss": 0.0307, + "step": 660 + }, + { + "epoch": 76.57, + "learning_rate": 0.006600000000000001, + "loss": 0.0334, + "step": 670 + }, + { + "epoch": 77.71, + "learning_rate": 0.0064, + "loss": 0.0321, + "step": 680 + }, + { + "epoch": 78.86, + "learning_rate": 0.0062, + "loss": 0.029, + "step": 690 + }, + { + "epoch": 80.0, + "learning_rate": 0.006, + "loss": 0.0315, + "step": 700 + }, + { + "epoch": 81.14, + "learning_rate": 0.0058, + "loss": 0.0294, + "step": 710 + }, + { + "epoch": 82.29, + "learning_rate": 0.005600000000000001, + "loss": 0.0323, + "step": 720 + }, + { + "epoch": 83.43, + "learning_rate": 0.0054, + "loss": 0.0274, + "step": 730 + }, + { + "epoch": 84.57, + "learning_rate": 0.005200000000000001, + "loss": 0.0305, + "step": 740 + }, + { + "epoch": 85.71, + "learning_rate": 0.005, + "loss": 0.0316, + "step": 750 + }, + { + "epoch": 86.86, + "learning_rate": 0.0048, + "loss": 0.0262, + "step": 760 + }, + { + "epoch": 88.0, + "learning_rate": 0.0046, + "loss": 0.0305, + "step": 770 + }, + { + "epoch": 89.14, + "learning_rate": 0.0044, + "loss": 0.0294, + "step": 780 + }, + { + "epoch": 90.29, + "learning_rate": 0.0042, + "loss": 0.0291, + "step": 790 + }, + { + "epoch": 91.43, + "learning_rate": 0.004, + "loss": 0.0274, + "step": 800 + }, + { + "epoch": 92.57, + "learning_rate": 0.0038, + "loss": 0.032, + "step": 810 + }, + { + "epoch": 93.71, + "learning_rate": 0.0036, + "loss": 0.0262, + "step": 820 + }, + { + "epoch": 94.86, + "learning_rate": 0.0034000000000000002, + "loss": 0.0325, + "step": 830 + }, + { + "epoch": 96.0, + "learning_rate": 0.0032, + "loss": 0.0276, + "step": 840 + }, + { + "epoch": 97.14, + "learning_rate": 0.003, + "loss": 0.029, + "step": 850 + }, + { + "epoch": 98.29, + "learning_rate": 0.0028000000000000004, + "loss": 0.0256, + "step": 860 + }, + { + "epoch": 99.43, + "learning_rate": 0.0026000000000000003, + "loss": 0.0305, + "step": 870 + }, + { + "epoch": 100.57, + "learning_rate": 0.0024, + "loss": 0.0271, + "step": 880 + }, + { + "epoch": 101.71, + "learning_rate": 0.0022, + "loss": 0.0302, + "step": 890 + }, + { + "epoch": 102.86, + "learning_rate": 0.002, + "loss": 0.0288, + "step": 900 + }, + { + "epoch": 104.0, + "learning_rate": 0.0018, + "loss": 0.0261, + "step": 910 + }, + { + "epoch": 105.14, + "learning_rate": 0.0016, + "loss": 0.0272, + "step": 920 + }, + { + "epoch": 106.29, + "learning_rate": 0.0014000000000000002, + "loss": 0.0285, + "step": 930 + }, + { + "epoch": 107.43, + "learning_rate": 0.0012, + "loss": 0.0271, + "step": 940 + }, + { + "epoch": 108.57, + "learning_rate": 0.001, + "loss": 0.0289, + "step": 950 + }, + { + "epoch": 109.71, + "learning_rate": 0.0008, + "loss": 0.028, + "step": 960 + }, + { + "epoch": 110.86, + "learning_rate": 0.0006, + "loss": 0.0263, + "step": 970 + }, + { + "epoch": 112.0, + "learning_rate": 0.0004, + "loss": 0.0279, + "step": 980 + }, + { + "epoch": 113.14, + "learning_rate": 0.0002, + "loss": 0.0272, + "step": 990 + }, + { + "epoch": 114.29, + "learning_rate": 0.0, + "loss": 0.0285, + "step": 1000 + } + ], + "max_steps": 1000, + "num_train_epochs": 125, + "total_flos": 3.4665721233408e+16, + "trial_name": null, + "trial_params": null +} diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/training_args.bin 
b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/training_args.bin new file mode 100644 index 0000000000000000000000000000000000000000..cf6cd481e104478687196098a75ae131659da2ed --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/checkpoint-1000/training_args.bin @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:800d88035249fde5c3f1ce80f96f4ccad529f4cf9101a6a495f31e43927109c4 +size 3771 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-38-54_LAPTOP-U8KCJD82/1682019627.5574055/events.out.tfevents.1682019627.LAPTOP-U8KCJD82.39620.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-38-54_LAPTOP-U8KCJD82/1682019627.5574055/events.out.tfevents.1682019627.LAPTOP-U8KCJD82.39620.1 new file mode 100644 index 0000000000000000000000000000000000000000..4f0e65d0c903ee1f731a9b2ec7a588ed13d1ab13 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-38-54_LAPTOP-U8KCJD82/1682019627.5574055/events.out.tfevents.1682019627.LAPTOP-U8KCJD82.39620.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:9dbba98e909e8f3a837bb9936dd583c354d8b83d6864e2e8b01328489a808369 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-38-54_LAPTOP-U8KCJD82/events.out.tfevents.1682019627.LAPTOP-U8KCJD82.39620.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-38-54_LAPTOP-U8KCJD82/events.out.tfevents.1682019627.LAPTOP-U8KCJD82.39620.0 new file mode 100644 index 0000000000000000000000000000000000000000..f33fcd74b294c136c092bea2aaf7b8a557e17777 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-38-54_LAPTOP-U8KCJD82/events.out.tfevents.1682019627.LAPTOP-U8KCJD82.39620.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:b561736c3552194be6ec935d7a465bfb336fbd58bb8ab1d68a78ac97eb833cd7 +size 4497 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-57-00_LAPTOP-U8KCJD82/1682020719.8067539/events.out.tfevents.1682020719.LAPTOP-U8KCJD82.34144.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-57-00_LAPTOP-U8KCJD82/1682020719.8067539/events.out.tfevents.1682020719.LAPTOP-U8KCJD82.34144.1 new file mode 100644 index 0000000000000000000000000000000000000000..43fe9a7ff199efbc57dc889ba95bd59e18553186 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-57-00_LAPTOP-U8KCJD82/1682020719.8067539/events.out.tfevents.1682020719.LAPTOP-U8KCJD82.34144.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5bc26469b12b39eefbfb19bf9be5eac4ad0c68d6e1ddd33e905d3059a06eee64 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-57-00_LAPTOP-U8KCJD82/events.out.tfevents.1682020719.LAPTOP-U8KCJD82.34144.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-57-00_LAPTOP-U8KCJD82/events.out.tfevents.1682020719.LAPTOP-U8KCJD82.34144.0 new file mode 100644 index 0000000000000000000000000000000000000000..7678df0d68cfde3fe09b6db51271942af520f9b9 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_03-57-00_LAPTOP-U8KCJD82/events.out.tfevents.1682020719.LAPTOP-U8KCJD82.34144.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:be50f347728891e15e260d563eb2c32822e3ad0a5ab4e4d5c88689d517c633db +size 4999 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-03-18_LAPTOP-U8KCJD82/1682021099.0769536/events.out.tfevents.1682021099.LAPTOP-U8KCJD82.4528.1 
b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-03-18_LAPTOP-U8KCJD82/1682021099.0769536/events.out.tfevents.1682021099.LAPTOP-U8KCJD82.4528.1 new file mode 100644 index 0000000000000000000000000000000000000000..2d30baa2305c77aba9138171f62077b1e8c7ed48 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-03-18_LAPTOP-U8KCJD82/1682021099.0769536/events.out.tfevents.1682021099.LAPTOP-U8KCJD82.4528.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:3c09d6a7d9ba37d37dfae6a9fc383664b7b345ed5b19930da85a2da075a85c8e +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-03-18_LAPTOP-U8KCJD82/events.out.tfevents.1682021099.LAPTOP-U8KCJD82.4528.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-03-18_LAPTOP-U8KCJD82/events.out.tfevents.1682021099.LAPTOP-U8KCJD82.4528.0 new file mode 100644 index 0000000000000000000000000000000000000000..9a916bc785c3de89af7ae8dc3ac853d02e7145b3 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-03-18_LAPTOP-U8KCJD82/events.out.tfevents.1682021099.LAPTOP-U8KCJD82.4528.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:44de312738e1f73fdf252da896b82969f54b288a16abdec15d9bccf5ead2ec89 +size 4497 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-07-46_LAPTOP-U8KCJD82/1682021363.5070107/events.out.tfevents.1682021363.LAPTOP-U8KCJD82.34384.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-07-46_LAPTOP-U8KCJD82/1682021363.5070107/events.out.tfevents.1682021363.LAPTOP-U8KCJD82.34384.1 new file mode 100644 index 0000000000000000000000000000000000000000..e8b5f18d29779133ad9c2ff71af3f812108b84ac --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-07-46_LAPTOP-U8KCJD82/1682021363.5070107/events.out.tfevents.1682021363.LAPTOP-U8KCJD82.34384.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:c47bc2a2092ec54c278c7f331de46b05c3da8951fc57b79b59353f46d92040cd +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-07-46_LAPTOP-U8KCJD82/events.out.tfevents.1682021363.LAPTOP-U8KCJD82.34384.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-07-46_LAPTOP-U8KCJD82/events.out.tfevents.1682021363.LAPTOP-U8KCJD82.34384.0 new file mode 100644 index 0000000000000000000000000000000000000000..8b308698209b2ef99ceafc981fd174e6e255b682 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_04-07-46_LAPTOP-U8KCJD82/events.out.tfevents.1682021363.LAPTOP-U8KCJD82.34384.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:95acacb51277b736cd043af6c3df8b0949d37c1932f8dd39775db731d552b037 +size 20519 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-45-46_LAPTOP-U8KCJD82/1682048840.1281629/events.out.tfevents.1682048840.LAPTOP-U8KCJD82.30268.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-45-46_LAPTOP-U8KCJD82/1682048840.1281629/events.out.tfevents.1682048840.LAPTOP-U8KCJD82.30268.1 new file mode 100644 index 0000000000000000000000000000000000000000..2aa5570f26bfd8cdfa36f5fddf2961d8664736f3 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-45-46_LAPTOP-U8KCJD82/1682048840.1281629/events.out.tfevents.1682048840.LAPTOP-U8KCJD82.30268.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:f8ac8fe57b89689c202e9c8543749e69915e6670e59659f288602c5fc361779b +size 6115 diff --git 
a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-45-46_LAPTOP-U8KCJD82/events.out.tfevents.1682048840.LAPTOP-U8KCJD82.30268.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-45-46_LAPTOP-U8KCJD82/events.out.tfevents.1682048840.LAPTOP-U8KCJD82.30268.0 new file mode 100644 index 0000000000000000000000000000000000000000..d9d8da4b0329216f6cc52b326e52a744426c3a5d --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-45-46_LAPTOP-U8KCJD82/events.out.tfevents.1682048840.LAPTOP-U8KCJD82.30268.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d4662640a42ea57ab2152db47ef3f50d721d5509c11c69b51651f99e19411e95 +size 4501 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-48-35_LAPTOP-U8KCJD82/events.out.tfevents.1682049010.LAPTOP-U8KCJD82.40476.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-48-35_LAPTOP-U8KCJD82/events.out.tfevents.1682049010.LAPTOP-U8KCJD82.40476.0 new file mode 100644 index 0000000000000000000000000000000000000000..3a8c41ed6b24e5a3d1195f0ff0676cfd27ad7e5e --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-48-35_LAPTOP-U8KCJD82/events.out.tfevents.1682049010.LAPTOP-U8KCJD82.40476.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:43ba45944f035296405454ac17ff21e2da2fdbc697e34273d7dc45c7eb310ff9 +size 88 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-53-35_LAPTOP-U8KCJD82/1682049309.5723379/events.out.tfevents.1682049309.LAPTOP-U8KCJD82.19992.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-53-35_LAPTOP-U8KCJD82/1682049309.5723379/events.out.tfevents.1682049309.LAPTOP-U8KCJD82.19992.1 new file mode 100644 index 0000000000000000000000000000000000000000..ed6418c1ab397897ccbbda4527fec9492ba12d40 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-53-35_LAPTOP-U8KCJD82/1682049309.5723379/events.out.tfevents.1682049309.LAPTOP-U8KCJD82.19992.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:c0d37ef4a5778b1a30ca92ff219aa6c92fb662e0df7d88cbf52f0c9d1787d81b +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-53-35_LAPTOP-U8KCJD82/events.out.tfevents.1682049309.LAPTOP-U8KCJD82.19992.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-53-35_LAPTOP-U8KCJD82/events.out.tfevents.1682049309.LAPTOP-U8KCJD82.19992.0 new file mode 100644 index 0000000000000000000000000000000000000000..9255b38175b47700fd5f54bbf87b4fd79d632575 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-53-35_LAPTOP-U8KCJD82/events.out.tfevents.1682049309.LAPTOP-U8KCJD82.19992.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:fff7d6993d790a4606be9bebec5478bd2bba7ee0b58fe9c3bfc89edb004c28c8 +size 4501 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-59-08_LAPTOP-U8KCJD82/1682049642.960882/events.out.tfevents.1682049642.LAPTOP-U8KCJD82.21296.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-59-08_LAPTOP-U8KCJD82/1682049642.960882/events.out.tfevents.1682049642.LAPTOP-U8KCJD82.21296.1 new file mode 100644 index 0000000000000000000000000000000000000000..5b2647c52a45071aee46eed291193e742c3bd703 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-59-08_LAPTOP-U8KCJD82/1682049642.960882/events.out.tfevents.1682049642.LAPTOP-U8KCJD82.21296.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:e2f300310a299cfb4131842c1f4a0b0741d07142c5ad82c82086a08ffb1d0716 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-59-08_LAPTOP-U8KCJD82/events.out.tfevents.1682049642.LAPTOP-U8KCJD82.21296.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-59-08_LAPTOP-U8KCJD82/events.out.tfevents.1682049642.LAPTOP-U8KCJD82.21296.0 new file mode 100644 index 0000000000000000000000000000000000000000..fe2b79787ccb4d7b463574b7f6d394d32b9bfd69 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_11-59-08_LAPTOP-U8KCJD82/events.out.tfevents.1682049642.LAPTOP-U8KCJD82.21296.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:adb237b2fa30205061e1d9ba3f2423f8f065aba5b3f0cdaad6908760cc69ff0c +size 4501 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_12-04-16_LAPTOP-U8KCJD82/1682049952.9553204/events.out.tfevents.1682049952.LAPTOP-U8KCJD82.3092.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_12-04-16_LAPTOP-U8KCJD82/1682049952.9553204/events.out.tfevents.1682049952.LAPTOP-U8KCJD82.3092.1 new file mode 100644 index 0000000000000000000000000000000000000000..e6a3b89c3687cc0296c364b05f2a75d20afc760c --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_12-04-16_LAPTOP-U8KCJD82/1682049952.9553204/events.out.tfevents.1682049952.LAPTOP-U8KCJD82.3092.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:854624c3499913de971099060c16894b8ff3cc8fb8ba1f912148ac43a812330f +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_12-04-16_LAPTOP-U8KCJD82/events.out.tfevents.1682049952.LAPTOP-U8KCJD82.3092.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_12-04-16_LAPTOP-U8KCJD82/events.out.tfevents.1682049952.LAPTOP-U8KCJD82.3092.0 new file mode 100644 index 0000000000000000000000000000000000000000..1204723d17b6d96032bff4c376d0f0cd26d11b7f --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_12-04-16_LAPTOP-U8KCJD82/events.out.tfevents.1682049952.LAPTOP-U8KCJD82.3092.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:9d9e876c62e1fa37df30e6a2d151671f2370546fd7a5f0803bd120affdd0404c +size 4414 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_15-57-52_LAPTOP-U8KCJD82/1682063879.4572506/events.out.tfevents.1682063879.LAPTOP-U8KCJD82.41476.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_15-57-52_LAPTOP-U8KCJD82/1682063879.4572506/events.out.tfevents.1682063879.LAPTOP-U8KCJD82.41476.1 new file mode 100644 index 0000000000000000000000000000000000000000..e0816089616c8e9f8d9d003e27ab254d71f3a0d5 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_15-57-52_LAPTOP-U8KCJD82/1682063879.4572506/events.out.tfevents.1682063879.LAPTOP-U8KCJD82.41476.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:a56b44c1ba6b2d18be7f70271867146e964eba79a175486dd6b55b702a82892b +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_15-57-52_LAPTOP-U8KCJD82/events.out.tfevents.1682063879.LAPTOP-U8KCJD82.41476.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_15-57-52_LAPTOP-U8KCJD82/events.out.tfevents.1682063879.LAPTOP-U8KCJD82.41476.0 new file mode 100644 index 0000000000000000000000000000000000000000..1eba72b51b60d89c521b4e005eec164a12730bc1 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_15-57-52_LAPTOP-U8KCJD82/events.out.tfevents.1682063879.LAPTOP-U8KCJD82.41476.0 @@ -0,0 +1,3 @@ 
+version https://git-lfs.github.com/spec/v1 +oid sha256:17547f8ac5cc000b35a1116cf4b79eed7c52ef17447e1e0e6842c7a6bee5bb34 +size 6262 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_16-51-52_LAPTOP-U8KCJD82/1682067121.1639612/events.out.tfevents.1682067121.LAPTOP-U8KCJD82.14196.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_16-51-52_LAPTOP-U8KCJD82/1682067121.1639612/events.out.tfevents.1682067121.LAPTOP-U8KCJD82.14196.1 new file mode 100644 index 0000000000000000000000000000000000000000..1a14bf31e88cef1c20ec7fb0256e35585b598bd0 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_16-51-52_LAPTOP-U8KCJD82/1682067121.1639612/events.out.tfevents.1682067121.LAPTOP-U8KCJD82.14196.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:2f9a4229035dff3bd94f3f4bd94f88c1db0bbd318e5d085a98addc2fe12e9084 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_16-51-52_LAPTOP-U8KCJD82/events.out.tfevents.1682067121.LAPTOP-U8KCJD82.14196.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_16-51-52_LAPTOP-U8KCJD82/events.out.tfevents.1682067121.LAPTOP-U8KCJD82.14196.0 new file mode 100644 index 0000000000000000000000000000000000000000..6b9ea30654928234f91dea78e31604bf92c40b61 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_16-51-52_LAPTOP-U8KCJD82/events.out.tfevents.1682067121.LAPTOP-U8KCJD82.14196.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:065d9cfeabdfa6d270d0d499386d63e08dbaf49fc48384ade446b190e3f51833 +size 4414 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-55-28_LAPTOP-U8KCJD82/1682088936.096107/events.out.tfevents.1682088936.LAPTOP-U8KCJD82.33836.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-55-28_LAPTOP-U8KCJD82/1682088936.096107/events.out.tfevents.1682088936.LAPTOP-U8KCJD82.33836.1 new file mode 100644 index 0000000000000000000000000000000000000000..7cc242cab828bf27dd2106d9c47eec491be7ee40 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-55-28_LAPTOP-U8KCJD82/1682088936.096107/events.out.tfevents.1682088936.LAPTOP-U8KCJD82.33836.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e3acb882101e7722e9d292bc0640ec972038e4af51655ceb409f37211cdb4ac4 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-55-28_LAPTOP-U8KCJD82/events.out.tfevents.1682088936.LAPTOP-U8KCJD82.33836.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-55-28_LAPTOP-U8KCJD82/events.out.tfevents.1682088936.LAPTOP-U8KCJD82.33836.0 new file mode 100644 index 0000000000000000000000000000000000000000..81837546c45ffefe7a9ccce0ab4d9e01e6f89d6a --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-55-28_LAPTOP-U8KCJD82/events.out.tfevents.1682088936.LAPTOP-U8KCJD82.33836.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:804221f0bf0f27bc4538de946ec7c3066801fd95d109e850f5a5a15dc4e6b806 +size 4414 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-57-22_LAPTOP-U8KCJD82/1682089049.5996344/events.out.tfevents.1682089049.LAPTOP-U8KCJD82.13408.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-57-22_LAPTOP-U8KCJD82/1682089049.5996344/events.out.tfevents.1682089049.LAPTOP-U8KCJD82.13408.1 new file mode 100644 index 0000000000000000000000000000000000000000..5a08c369432c955d033752e146a7079713b811c7 --- /dev/null +++ 
b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-57-22_LAPTOP-U8KCJD82/1682089049.5996344/events.out.tfevents.1682089049.LAPTOP-U8KCJD82.13408.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:54e9468bb151a2d777580c0c4a9ad702ddabba0107199b055f245e3cc92f347e +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-57-22_LAPTOP-U8KCJD82/events.out.tfevents.1682089049.LAPTOP-U8KCJD82.13408.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-57-22_LAPTOP-U8KCJD82/events.out.tfevents.1682089049.LAPTOP-U8KCJD82.13408.0 new file mode 100644 index 0000000000000000000000000000000000000000..5461a1a020eca91278c1574fa5fa2c08644bbe9d --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_22-57-22_LAPTOP-U8KCJD82/events.out.tfevents.1682089049.LAPTOP-U8KCJD82.13408.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:2e300e89c12ad068282394b98b8197e86c5fb57b87640d2e76da6ec987cd081e +size 4876 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_23-06-54_LAPTOP-U8KCJD82/1682089622.01523/events.out.tfevents.1682089622.LAPTOP-U8KCJD82.8532.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_23-06-54_LAPTOP-U8KCJD82/1682089622.01523/events.out.tfevents.1682089622.LAPTOP-U8KCJD82.8532.1 new file mode 100644 index 0000000000000000000000000000000000000000..abeacd1838614c3872426737bb26f7b18a4c3c79 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_23-06-54_LAPTOP-U8KCJD82/1682089622.01523/events.out.tfevents.1682089622.LAPTOP-U8KCJD82.8532.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:1d36ac8d0ff6b01521f791acdfa8edac49e110c5890b885948b6fa33b5f4edc7 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_23-06-54_LAPTOP-U8KCJD82/events.out.tfevents.1682089622.LAPTOP-U8KCJD82.8532.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_23-06-54_LAPTOP-U8KCJD82/events.out.tfevents.1682089622.LAPTOP-U8KCJD82.8532.0 new file mode 100644 index 0000000000000000000000000000000000000000..ae951c118210d7386e597ae7af28445947dca4b5 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr21_23-06-54_LAPTOP-U8KCJD82/events.out.tfevents.1682089622.LAPTOP-U8KCJD82.8532.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6da59371a96a34ed690eb6cf384fbe486f96360029540bc7d07194289b8d940a +size 5029 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-52-40_LAPTOP-U8KCJD82/1682149970.7120373/events.out.tfevents.1682149970.LAPTOP-U8KCJD82.18056.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-52-40_LAPTOP-U8KCJD82/1682149970.7120373/events.out.tfevents.1682149970.LAPTOP-U8KCJD82.18056.1 new file mode 100644 index 0000000000000000000000000000000000000000..85d55c1a2e117cba782bf57f1dda68d7d603ec25 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-52-40_LAPTOP-U8KCJD82/1682149970.7120373/events.out.tfevents.1682149970.LAPTOP-U8KCJD82.18056.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6b34f772c091e4af3b58773f884568834034efff05e87d540208d7819bcc9fe0 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-52-40_LAPTOP-U8KCJD82/events.out.tfevents.1682149970.LAPTOP-U8KCJD82.18056.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-52-40_LAPTOP-U8KCJD82/events.out.tfevents.1682149970.LAPTOP-U8KCJD82.18056.0 new file mode 100644 index 
0000000000000000000000000000000000000000..bec1f885176c444aa7dbd56e1759e53c9abd2a3b --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-52-40_LAPTOP-U8KCJD82/events.out.tfevents.1682149970.LAPTOP-U8KCJD82.18056.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:4c1346e06b2205d593b884573abd454b1ded29b375b449370355a0ec9acf18ba +size 4413 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-53-54_LAPTOP-U8KCJD82/1682150042.7190838/events.out.tfevents.1682150042.LAPTOP-U8KCJD82.2616.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-53-54_LAPTOP-U8KCJD82/1682150042.7190838/events.out.tfevents.1682150042.LAPTOP-U8KCJD82.2616.1 new file mode 100644 index 0000000000000000000000000000000000000000..17830c91be59b3fa0da7c401a8c202679bc16eb4 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-53-54_LAPTOP-U8KCJD82/1682150042.7190838/events.out.tfevents.1682150042.LAPTOP-U8KCJD82.2616.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:dc6853234469e59732b5955f7f1b65f2ba2f4e0582878391d554b95d26beb831 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-53-54_LAPTOP-U8KCJD82/events.out.tfevents.1682150042.LAPTOP-U8KCJD82.2616.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-53-54_LAPTOP-U8KCJD82/events.out.tfevents.1682150042.LAPTOP-U8KCJD82.2616.0 new file mode 100644 index 0000000000000000000000000000000000000000..17ec52ce45e2847964c81b9e7ef1b150efa5be77 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr22_15-53-54_LAPTOP-U8KCJD82/events.out.tfevents.1682150042.LAPTOP-U8KCJD82.2616.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:78efc511e73606d4b45a21aacb1be5278da7a3616b81674e270b5f20b97b6712 +size 5183 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr23_02-24-50_LAPTOP-U8KCJD82/1682187898.7504838/events.out.tfevents.1682187898.LAPTOP-U8KCJD82.34748.1 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr23_02-24-50_LAPTOP-U8KCJD82/1682187898.7504838/events.out.tfevents.1682187898.LAPTOP-U8KCJD82.34748.1 new file mode 100644 index 0000000000000000000000000000000000000000..43d331c7624410fb108fb8c2d02bc65149f6d2a2 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr23_02-24-50_LAPTOP-U8KCJD82/1682187898.7504838/events.out.tfevents.1682187898.LAPTOP-U8KCJD82.34748.1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:b265830fb43a40267c580fbce0d18c6a5d19d4b413a708dd57f0f7ab9ea247c2 +size 6115 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr23_02-24-50_LAPTOP-U8KCJD82/events.out.tfevents.1682187898.LAPTOP-U8KCJD82.34748.0 b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr23_02-24-50_LAPTOP-U8KCJD82/events.out.tfevents.1682187898.LAPTOP-U8KCJD82.34748.0 new file mode 100644 index 0000000000000000000000000000000000000000..373f0148a8d729838afafbd4d9b334091a271cb2 --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/runs/Apr23_02-24-50_LAPTOP-U8KCJD82/events.out.tfevents.1682187898.LAPTOP-U8KCJD82.34748.0 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8e3bd84f0b98691c3719ece8f042640779cf29884be9b847cd5e33756a33f1c3 +size 20431 diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/train_results.json b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/train_results.json new file mode 100644 index 0000000000000000000000000000000000000000..fbee11a3326003f6a42e4d6c4994820128f2cacf --- /dev/null 
+++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/train_results.json @@ -0,0 +1,8 @@ +{ + "epoch": 114.29, + "train_loss": 0.2596614052057266, + "train_runtime": 12879.4173, + "train_samples": 140, + "train_samples_per_second": 1.242, + "train_steps_per_second": 0.078 +} \ No newline at end of file diff --git a/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/trainer_state.json b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/trainer_state.json new file mode 100644 index 0000000000000000000000000000000000000000..01d01f1000ff854dadccf6ba8750003e3fdcba3a --- /dev/null +++ b/ptuning/output/adgen-chatglm-6b-pt-128-2e-2/trainer_state.json @@ -0,0 +1,625 @@ +{ + "best_metric": null, + "best_model_checkpoint": null, + "epoch": 114.28571428571429, + "global_step": 1000, + "is_hyper_param_search": false, + "is_local_process_zero": true, + "is_world_process_zero": true, + "log_history": [ + { + "epoch": 1.14, + "learning_rate": 0.0198, + "loss": 4.7949, + "step": 10 + }, + { + "epoch": 2.29, + "learning_rate": 0.0196, + "loss": 3.7519, + "step": 20 + }, + { + "epoch": 3.43, + "learning_rate": 0.0194, + "loss": 3.3049, + "step": 30 + }, + { + "epoch": 4.57, + "learning_rate": 0.0192, + "loss": 2.8868, + "step": 40 + }, + { + "epoch": 5.71, + "learning_rate": 0.019, + "loss": 2.4806, + "step": 50 + }, + { + "epoch": 6.86, + "learning_rate": 0.0188, + "loss": 1.865, + "step": 60 + }, + { + "epoch": 8.0, + "learning_rate": 0.018600000000000002, + "loss": 1.4186, + "step": 70 + }, + { + "epoch": 9.14, + "learning_rate": 0.0184, + "loss": 0.9316, + "step": 80 + }, + { + "epoch": 10.29, + "learning_rate": 0.0182, + "loss": 0.5643, + "step": 90 + }, + { + "epoch": 11.43, + "learning_rate": 0.018000000000000002, + "loss": 0.3509, + "step": 100 + }, + { + "epoch": 12.57, + "learning_rate": 0.0178, + "loss": 0.2172, + "step": 110 + }, + { + "epoch": 13.71, + "learning_rate": 0.0176, + "loss": 0.1486, + "step": 120 + }, + { + "epoch": 14.86, + "learning_rate": 0.0174, + "loss": 0.1196, + "step": 130 + }, + { + "epoch": 16.0, + "learning_rate": 0.0172, + "loss": 0.0802, + "step": 140 + }, + { + "epoch": 17.14, + "learning_rate": 0.017, + "loss": 0.0739, + "step": 150 + }, + { + "epoch": 18.29, + "learning_rate": 0.0168, + "loss": 0.0619, + "step": 160 + }, + { + "epoch": 19.43, + "learning_rate": 0.0166, + "loss": 0.062, + "step": 170 + }, + { + "epoch": 20.57, + "learning_rate": 0.016399999999999998, + "loss": 0.0595, + "step": 180 + }, + { + "epoch": 21.71, + "learning_rate": 0.016200000000000003, + "loss": 0.0569, + "step": 190 + }, + { + "epoch": 22.86, + "learning_rate": 0.016, + "loss": 0.0533, + "step": 200 + }, + { + "epoch": 24.0, + "learning_rate": 0.0158, + "loss": 0.052, + "step": 210 + }, + { + "epoch": 25.14, + "learning_rate": 0.015600000000000001, + "loss": 0.0471, + "step": 220 + }, + { + "epoch": 26.29, + "learning_rate": 0.0154, + "loss": 0.0478, + "step": 230 + }, + { + "epoch": 27.43, + "learning_rate": 0.0152, + "loss": 0.0415, + "step": 240 + }, + { + "epoch": 28.57, + "learning_rate": 0.015, + "loss": 0.0462, + "step": 250 + }, + { + "epoch": 29.71, + "learning_rate": 0.0148, + "loss": 0.0437, + "step": 260 + }, + { + "epoch": 30.86, + "learning_rate": 0.0146, + "loss": 0.0452, + "step": 270 + }, + { + "epoch": 32.0, + "learning_rate": 0.0144, + "loss": 0.041, + "step": 280 + }, + { + "epoch": 33.14, + "learning_rate": 0.014199999999999999, + "loss": 0.0418, + "step": 290 + }, + { + "epoch": 34.29, + "learning_rate": 0.013999999999999999, + "loss": 0.0469, + "step": 300 + }, + { + "epoch": 35.43, 
+ "learning_rate": 0.0138, + "loss": 0.0394, + "step": 310 + }, + { + "epoch": 36.57, + "learning_rate": 0.013600000000000001, + "loss": 0.0444, + "step": 320 + }, + { + "epoch": 37.71, + "learning_rate": 0.0134, + "loss": 0.0393, + "step": 330 + }, + { + "epoch": 38.86, + "learning_rate": 0.013200000000000002, + "loss": 0.045, + "step": 340 + }, + { + "epoch": 40.0, + "learning_rate": 0.013000000000000001, + "loss": 0.0392, + "step": 350 + }, + { + "epoch": 41.14, + "learning_rate": 0.0128, + "loss": 0.0351, + "step": 360 + }, + { + "epoch": 42.29, + "learning_rate": 0.0126, + "loss": 0.0389, + "step": 370 + }, + { + "epoch": 43.43, + "learning_rate": 0.0124, + "loss": 0.0374, + "step": 380 + }, + { + "epoch": 44.57, + "learning_rate": 0.0122, + "loss": 0.035, + "step": 390 + }, + { + "epoch": 45.71, + "learning_rate": 0.012, + "loss": 0.0349, + "step": 400 + }, + { + "epoch": 46.86, + "learning_rate": 0.0118, + "loss": 0.0361, + "step": 410 + }, + { + "epoch": 48.0, + "learning_rate": 0.0116, + "loss": 0.0368, + "step": 420 + }, + { + "epoch": 49.14, + "learning_rate": 0.011399999999999999, + "loss": 0.0352, + "step": 430 + }, + { + "epoch": 50.29, + "learning_rate": 0.011200000000000002, + "loss": 0.0353, + "step": 440 + }, + { + "epoch": 51.43, + "learning_rate": 0.011000000000000001, + "loss": 0.0337, + "step": 450 + }, + { + "epoch": 52.57, + "learning_rate": 0.0108, + "loss": 0.032, + "step": 460 + }, + { + "epoch": 53.71, + "learning_rate": 0.0106, + "loss": 0.0366, + "step": 470 + }, + { + "epoch": 54.86, + "learning_rate": 0.010400000000000001, + "loss": 0.0317, + "step": 480 + }, + { + "epoch": 56.0, + "learning_rate": 0.0102, + "loss": 0.0332, + "step": 490 + }, + { + "epoch": 57.14, + "learning_rate": 0.01, + "loss": 0.0328, + "step": 500 + }, + { + "epoch": 58.29, + "learning_rate": 0.0098, + "loss": 0.0325, + "step": 510 + }, + { + "epoch": 59.43, + "learning_rate": 0.0096, + "loss": 0.0327, + "step": 520 + }, + { + "epoch": 60.57, + "learning_rate": 0.0094, + "loss": 0.0344, + "step": 530 + }, + { + "epoch": 61.71, + "learning_rate": 0.0092, + "loss": 0.0351, + "step": 540 + }, + { + "epoch": 62.86, + "learning_rate": 0.009000000000000001, + "loss": 0.0325, + "step": 550 + }, + { + "epoch": 64.0, + "learning_rate": 0.0088, + "loss": 0.0329, + "step": 560 + }, + { + "epoch": 65.14, + "learning_rate": 0.0086, + "loss": 0.031, + "step": 570 + }, + { + "epoch": 66.29, + "learning_rate": 0.0084, + "loss": 0.0326, + "step": 580 + }, + { + "epoch": 67.43, + "learning_rate": 0.008199999999999999, + "loss": 0.0311, + "step": 590 + }, + { + "epoch": 68.57, + "learning_rate": 0.008, + "loss": 0.033, + "step": 600 + }, + { + "epoch": 69.71, + "learning_rate": 0.0078000000000000005, + "loss": 0.0297, + "step": 610 + }, + { + "epoch": 70.86, + "learning_rate": 0.0076, + "loss": 0.0331, + "step": 620 + }, + { + "epoch": 72.0, + "learning_rate": 0.0074, + "loss": 0.0318, + "step": 630 + }, + { + "epoch": 73.14, + "learning_rate": 0.0072, + "loss": 0.0307, + "step": 640 + }, + { + "epoch": 74.29, + "learning_rate": 0.006999999999999999, + "loss": 0.03, + "step": 650 + }, + { + "epoch": 75.43, + "learning_rate": 0.0068000000000000005, + "loss": 0.0307, + "step": 660 + }, + { + "epoch": 76.57, + "learning_rate": 0.006600000000000001, + "loss": 0.0334, + "step": 670 + }, + { + "epoch": 77.71, + "learning_rate": 0.0064, + "loss": 0.0321, + "step": 680 + }, + { + "epoch": 78.86, + "learning_rate": 0.0062, + "loss": 0.029, + "step": 690 + }, + { + "epoch": 80.0, + "learning_rate": 0.006, + 
"loss": 0.0315, + "step": 700 + }, + { + "epoch": 81.14, + "learning_rate": 0.0058, + "loss": 0.0294, + "step": 710 + }, + { + "epoch": 82.29, + "learning_rate": 0.005600000000000001, + "loss": 0.0323, + "step": 720 + }, + { + "epoch": 83.43, + "learning_rate": 0.0054, + "loss": 0.0274, + "step": 730 + }, + { + "epoch": 84.57, + "learning_rate": 0.005200000000000001, + "loss": 0.0305, + "step": 740 + }, + { + "epoch": 85.71, + "learning_rate": 0.005, + "loss": 0.0316, + "step": 750 + }, + { + "epoch": 86.86, + "learning_rate": 0.0048, + "loss": 0.0262, + "step": 760 + }, + { + "epoch": 88.0, + "learning_rate": 0.0046, + "loss": 0.0305, + "step": 770 + }, + { + "epoch": 89.14, + "learning_rate": 0.0044, + "loss": 0.0294, + "step": 780 + }, + { + "epoch": 90.29, + "learning_rate": 0.0042, + "loss": 0.0291, + "step": 790 + }, + { + "epoch": 91.43, + "learning_rate": 0.004, + "loss": 0.0274, + "step": 800 + }, + { + "epoch": 92.57, + "learning_rate": 0.0038, + "loss": 0.032, + "step": 810 + }, + { + "epoch": 93.71, + "learning_rate": 0.0036, + "loss": 0.0262, + "step": 820 + }, + { + "epoch": 94.86, + "learning_rate": 0.0034000000000000002, + "loss": 0.0325, + "step": 830 + }, + { + "epoch": 96.0, + "learning_rate": 0.0032, + "loss": 0.0276, + "step": 840 + }, + { + "epoch": 97.14, + "learning_rate": 0.003, + "loss": 0.029, + "step": 850 + }, + { + "epoch": 98.29, + "learning_rate": 0.0028000000000000004, + "loss": 0.0256, + "step": 860 + }, + { + "epoch": 99.43, + "learning_rate": 0.0026000000000000003, + "loss": 0.0305, + "step": 870 + }, + { + "epoch": 100.57, + "learning_rate": 0.0024, + "loss": 0.0271, + "step": 880 + }, + { + "epoch": 101.71, + "learning_rate": 0.0022, + "loss": 0.0302, + "step": 890 + }, + { + "epoch": 102.86, + "learning_rate": 0.002, + "loss": 0.0288, + "step": 900 + }, + { + "epoch": 104.0, + "learning_rate": 0.0018, + "loss": 0.0261, + "step": 910 + }, + { + "epoch": 105.14, + "learning_rate": 0.0016, + "loss": 0.0272, + "step": 920 + }, + { + "epoch": 106.29, + "learning_rate": 0.0014000000000000002, + "loss": 0.0285, + "step": 930 + }, + { + "epoch": 107.43, + "learning_rate": 0.0012, + "loss": 0.0271, + "step": 940 + }, + { + "epoch": 108.57, + "learning_rate": 0.001, + "loss": 0.0289, + "step": 950 + }, + { + "epoch": 109.71, + "learning_rate": 0.0008, + "loss": 0.028, + "step": 960 + }, + { + "epoch": 110.86, + "learning_rate": 0.0006, + "loss": 0.0263, + "step": 970 + }, + { + "epoch": 112.0, + "learning_rate": 0.0004, + "loss": 0.0279, + "step": 980 + }, + { + "epoch": 113.14, + "learning_rate": 0.0002, + "loss": 0.0272, + "step": 990 + }, + { + "epoch": 114.29, + "learning_rate": 0.0, + "loss": 0.0285, + "step": 1000 + }, + { + "epoch": 114.29, + "step": 1000, + "total_flos": 3.4665721233408e+16, + "train_loss": 0.2596614052057266, + "train_runtime": 12879.4173, + "train_samples_per_second": 1.242, + "train_steps_per_second": 0.078 + } + ], + "max_steps": 1000, + "num_train_epochs": 125, + "total_flos": 3.4665721233408e+16, + "trial_name": null, + "trial_params": null +} diff --git a/ptuning/predict.py b/ptuning/predict.py new file mode 100644 index 0000000000000000000000000000000000000000..2cb2995d606fa120cf1538a5c42ab8c504a02292 --- /dev/null +++ b/ptuning/predict.py @@ -0,0 +1,31 @@ +import os +import torch +from transformers import AutoConfig, AutoModel, AutoTokenizer + +# 载入Tokenizer +model_path = "..\\models\\chatglm-6b-int4" +CHECKPOINT_PATH = '.\\output\\adgen-chatglm-6b-pt-128-2e-2\\checkpoint-100' + +tokenizer = 
AutoTokenizer.from_pretrained(model_path, trust_remote_code=True) + +# 如果需要加载的是新 Checkpoint(只包含 PrefixEncoder 参数): +config = AutoConfig.from_pretrained(model_path, trust_remote_code=True, pre_seq_len=128) +model = AutoModel.from_pretrained(model_path, config=config, trust_remote_code=True) +prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin")) +new_prefix_state_dict = {} +for k, v in prefix_state_dict.items(): + if k.startswith("transformer.prefix_encoder."): + new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v +model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict) + +# 之后根据需求可以进行量化,也可以直接使用: +kernel_file = "{}\\quantization_kernels.so".format(model_path) +model = model.quantize(bits=4,kernel_file=kernel_file) + +model = model.half().cuda() +model.transformer.prefix_encoder.float() + +model = model.eval() + +response, history = model.chat(tokenizer, "你好呀", history=[]) +print("response:", response) \ No newline at end of file diff --git a/ptuning/predict_multi_chat.py b/ptuning/predict_multi_chat.py new file mode 100644 index 0000000000000000000000000000000000000000..b32abe2c8baf2423a6de40330696313a9e7d72d3 --- /dev/null +++ b/ptuning/predict_multi_chat.py @@ -0,0 +1,93 @@ +import os +import torch +from transformers import AutoConfig, AutoModel, AutoTokenizer + +# 载入Tokenizer +model_path = "..\\models\\chatglm-6b-int4" +CHECKPOINT_PATH = '.\\output\\adgen-chatglm-6b-pt-128-2e-2\\checkpoint-1000' + +tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True) + +# 如果需要加载的是新 Checkpoint(只包含 PrefixEncoder 参数): +config = AutoConfig.from_pretrained(model_path, trust_remote_code=True, pre_seq_len=128) +model = AutoModel.from_pretrained(model_path, config=config, trust_remote_code=True) +prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin")) +new_prefix_state_dict = {} +for k, v in prefix_state_dict.items(): + if k.startswith("transformer.prefix_encoder."): + new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v +model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict) + +# 之后根据需求可以进行量化,也可以直接使用: +kernel_file = "{}\\quantization_kernels.so".format(model_path) +model = model.quantize(bits=4,kernel_file=kernel_file) + +model = model.half().cuda() +model.transformer.prefix_encoder.float() + +model = model.eval() + +# response, history = model.chat(tokenizer, "你好呀", history=[]) +# print("response:", response) + + + +def parse_text(text): + lines = text.split("\n") + lines = [line for line in lines if line != ""] + count = 0 + for i, line in enumerate(lines): + if "```" in line: + count += 1 + items = line.split('`') + if count % 2 == 1: + lines[i] = f'
<pre><code class="language-{items[-1]}">'
+            else:
+                lines[i] = f'<br></code></pre>'
+        else:
+            if i > 0:
+                if count % 2 == 1:
+                    line = line.replace("`", "\`")
+                    line = line.replace("<", "&lt;")
+                    line = line.replace(">", "&gt;")
+                    line = line.replace(" ", "&nbsp;")
+                    line = line.replace("*", "&ast;")
+                    line = line.replace("_", "&lowbar;")
+                    line = line.replace("-", "&#45;")
+                    line = line.replace(".", "&#46;")
+                    line = line.replace("!", "&#33;")
+                    line = line.replace("(", "&#40;")
+                    line = line.replace(")", "&#41;")
+                    line = line.replace("$", "&#36;")
+                lines[i] = "<br>
"+line + text = "".join(lines) + return text + +def predict(input, chatbot, max_length, top_p, temperature, history): + chatbot.append((parse_text(input), "")) + for response, history in model.stream_chat(tokenizer, input, history, max_length=max_length, top_p=top_p, + temperature=temperature): + chatbot[-1] = (parse_text(input), parse_text(response)) + + yield chatbot, history +response_new = '' +history = [] + +for i in range(3000): + length_history = len(history) + if (length_history > 5): # 如果对话长度太长,就把之前的遗忘掉 + del history[0] + del history[0] + # print('\nYou:',end='') + print('\033[1;31m{}\033[0m'.format('\nYou:'),end='') + msg = input() + print('\033[1;34m{}\033[0m'.format('ChatGLM:'),end='') + + for chatbot, history in predict(input=msg, chatbot=[], max_length=10000, top_p=0.5, temperature=0.5, history=history): + response_old = response_new + response_new = chatbot[0][1] + new_single = response_new.replace(response_old, '') + print(new_single,end='') + + + diff --git a/ptuning/train.sh b/ptuning/train.sh new file mode 100644 index 0000000000000000000000000000000000000000..d688307b3d792d0ba9b2645a3b3a1bfd3a2f5bd2 --- /dev/null +++ b/ptuning/train.sh @@ -0,0 +1,26 @@ +PRE_SEQ_LEN=128 +LR=2e-2 + +CUDA_VISIBLE_DEVICES=0 python main.py \ + --do_train \ + --train_file .\\datasets\\Zettels\\train.json \ + --validation_file .\\datasets\\Zettels\\dev.json \ + --prompt_column content \ + --response_column summary \ + --overwrite_cache \ + --model_name_or_path ..\\models\\chatglm-6b-int4 \ + --output_dir output\\adgen-chatglm-6b-pt-$PRE_SEQ_LEN-$LR \ + --overwrite_output_dir \ + --max_source_length 64 \ + --max_target_length 64 \ + --per_device_train_batch_size 1 \ + --per_device_eval_batch_size 1 \ + --gradient_accumulation_steps 16 \ + --predict_with_generate \ + --max_steps 1000 \ + --logging_steps 10 \ + --save_steps 10 \ + --learning_rate $LR \ + --pre_seq_len $PRE_SEQ_LEN \ + --quantization_bit 4 + diff --git a/ptuning/train_chat.sh b/ptuning/train_chat.sh new file mode 100644 index 0000000000000000000000000000000000000000..b0f5cdc241ef94f039c93df483ee76cb8668ce2a --- /dev/null +++ b/ptuning/train_chat.sh @@ -0,0 +1,27 @@ +PRE_SEQ_LEN=8 +LR=1e-2 + +CUDA_VISIBLE_DEVICES=0 python3 main.py \ + --do_train \ + --train_file $CHAT_TRAIN_DATA \ + --validation_file $CHAT_VAL_DATA \ + --prompt_column prompt \ + --response_column response \ + --history_column history \ + --overwrite_cache \ + --model_name_or_path THUDM/chatglm-6b \ + --output_dir $CHECKPOINT_NAME \ + --overwrite_output_dir \ + --max_source_length 256 \ + --max_target_length 256 \ + --per_device_train_batch_size 1 \ + --per_device_eval_batch_size 1 \ + --gradient_accumulation_steps 16 \ + --predict_with_generate \ + --max_steps 3000 \ + --logging_steps 10 \ + --save_steps 1000 \ + --learning_rate $LR \ + --pre_seq_len $PRE_SEQ_LEN \ + --quantization_bit 4 + diff --git a/ptuning/train_linux.sh b/ptuning/train_linux.sh new file mode 100644 index 0000000000000000000000000000000000000000..e851346a15defd87a2b8c29008ae130171edab1a --- /dev/null +++ b/ptuning/train_linux.sh @@ -0,0 +1,26 @@ +PRE_SEQ_LEN=128 +LR=2e-2 + +CUDA_VISIBLE_DEVICES=0 python main.py \ + --do_train \ + --train_file ../AdvertiseGen/train.json \ + --validation_file ../AdvertiseGen/dev.json \ + --prompt_column content \ + --response_column summary \ + --overwrite_cache \ + --model_name_or_path ../models/chatglm-6b-int4/models--THUDM--chatglm-6b-int4/snapshots/e02ba894cf18f3fd9b2526c795f983683c4ec732 \ + --output_dir output/adgen-chatglm-6b-pt-$PRE_SEQ_LEN-$LR \ + 
--overwrite_output_dir \ + --max_source_length 64 \ + --max_target_length 64 \ + --per_device_train_batch_size 1 \ + --per_device_eval_batch_size 1 \ + --gradient_accumulation_steps 16 \ + --predict_with_generate \ + --max_steps 3000 \ + --logging_steps 10 \ + --save_steps 1000 \ + --learning_rate $LR \ + --pre_seq_len 1 \ + --quantization_bit 4 + diff --git a/ptuning/trainer.py b/ptuning/trainer.py new file mode 100644 index 0000000000000000000000000000000000000000..63101bc9d3dfb65ff5a444c7c151b8d4d241f2c9 --- /dev/null +++ b/ptuning/trainer.py @@ -0,0 +1,3830 @@ +# coding=utf-8 +# Copyright 2020-present the HuggingFace Inc. team. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +""" +The Trainer class, to easily train a 🤗 Transformers from scratch or finetune it on a new task. +""" + +import contextlib +import functools +import glob +import inspect +import math +import os +import random +import re +import shutil +import sys +import time +import warnings +from collections.abc import Mapping +from distutils.util import strtobool +from pathlib import Path +from typing import TYPE_CHECKING, Any, Callable, Dict, List, Optional, Tuple, Union + +from tqdm.auto import tqdm + + +# Integrations must be imported before ML frameworks: +# isort: off +from transformers.integrations import ( + default_hp_search_backend, + get_reporting_integration_callbacks, + hp_params, + is_fairscale_available, + is_optuna_available, + is_ray_tune_available, + is_sigopt_available, + is_wandb_available, + run_hp_search_optuna, + run_hp_search_ray, + run_hp_search_sigopt, + run_hp_search_wandb, +) + +# isort: on + +import numpy as np +import torch +import torch.distributed as dist +from huggingface_hub import Repository, create_repo +from packaging import version +from torch import nn +from torch.utils.data import DataLoader, Dataset, RandomSampler, SequentialSampler +from torch.utils.data.distributed import DistributedSampler + +from transformers import __version__ +from transformers.configuration_utils import PretrainedConfig +from transformers.data.data_collator import DataCollator, DataCollatorWithPadding, default_data_collator +from transformers.debug_utils import DebugOption, DebugUnderflowOverflow +from transformers.deepspeed import deepspeed_init, is_deepspeed_zero3_enabled +from transformers.dependency_versions_check import dep_version_check +from transformers.modelcard import TrainingSummary +from transformers.modeling_utils import PreTrainedModel, load_sharded_checkpoint, unwrap_model +from transformers.models.auto.modeling_auto import MODEL_FOR_CAUSAL_LM_MAPPING_NAMES, MODEL_MAPPING_NAMES +from transformers.optimization import Adafactor, get_scheduler +from transformers.pytorch_utils import ALL_LAYERNORM_LAYERS, is_torch_greater_or_equal_than_1_10, is_torch_less_than_1_11 +from transformers.tokenization_utils_base import PreTrainedTokenizerBase +from transformers.trainer_callback import ( + CallbackHandler, + DefaultFlowCallback, + PrinterCallback, + ProgressCallback, + TrainerCallback, + TrainerControl, + 
TrainerState, +) +from transformers.trainer_pt_utils import ( + DistributedLengthGroupedSampler, + DistributedSamplerWithLoop, + DistributedTensorGatherer, + IterableDatasetShard, + LabelSmoother, + LengthGroupedSampler, + SequentialDistributedSampler, + ShardSampler, + distributed_broadcast_scalars, + distributed_concat, + find_batch_size, + get_module_class_from_name, + get_parameter_names, + nested_concat, + nested_detach, + nested_numpify, + nested_truncate, + nested_xla_mesh_reduce, + reissue_pt_warnings, +) +from transformers.trainer_utils import ( + PREFIX_CHECKPOINT_DIR, + BestRun, + EvalLoopOutput, + EvalPrediction, + FSDPOption, + HPSearchBackend, + HubStrategy, + IntervalStrategy, + PredictionOutput, + RemoveColumnsCollator, + ShardedDDPOption, + TrainerMemoryTracker, + TrainOutput, + default_compute_objective, + default_hp_space, + denumpify_detensorize, + enable_full_determinism, + find_executable_batch_size, + get_last_checkpoint, + has_length, + number_of_arguments, + seed_worker, + set_seed, + speed_metrics, +) +from transformers.training_args import OptimizerNames, ParallelMode, TrainingArguments +from transformers.utils import ( + CONFIG_NAME, + WEIGHTS_INDEX_NAME, + WEIGHTS_NAME, + can_return_loss, + find_labels, + get_full_repo_name, + is_accelerate_available, + is_apex_available, + is_datasets_available, + is_in_notebook, + is_ipex_available, + is_sagemaker_dp_enabled, + is_sagemaker_mp_enabled, + is_torch_compile_available, + is_torch_neuroncore_available, + is_torch_tpu_available, + logging, +) +from transformers.utils.generic import ContextManagers + + +_is_native_cpu_amp_available = is_torch_greater_or_equal_than_1_10 + +DEFAULT_CALLBACKS = [DefaultFlowCallback] +DEFAULT_PROGRESS_CALLBACK = ProgressCallback + +if is_in_notebook(): + from transformers.utils.notebook import NotebookProgressCallback + + DEFAULT_PROGRESS_CALLBACK = NotebookProgressCallback + +if is_apex_available(): + from apex import amp + +if is_datasets_available(): + import datasets + +if is_torch_tpu_available(check_device=False): + import torch_xla.core.xla_model as xm + import torch_xla.debug.metrics as met + import torch_xla.distributed.parallel_loader as pl + +if is_fairscale_available(): + dep_version_check("fairscale") + import fairscale + from fairscale.nn.data_parallel import FullyShardedDataParallel as FullyShardedDDP + from fairscale.nn.data_parallel import ShardedDataParallel as ShardedDDP + from fairscale.nn.wrap import auto_wrap + from fairscale.optim import OSS + from fairscale.optim.grad_scaler import ShardedGradScaler + + +if is_sagemaker_mp_enabled(): + import smdistributed.modelparallel.torch as smp + from smdistributed.modelparallel import __version__ as SMP_VERSION + + IS_SAGEMAKER_MP_POST_1_10 = version.parse(SMP_VERSION) >= version.parse("1.10") + + from transformers.trainer_pt_utils import smp_forward_backward, smp_forward_only, smp_gather, smp_nested_concat +else: + IS_SAGEMAKER_MP_POST_1_10 = False + + +skip_first_batches = None +if is_accelerate_available(): + from accelerate import __version__ as accelerate_version + + if version.parse(accelerate_version) >= version.parse("0.16"): + from accelerate import skip_first_batches + + +if TYPE_CHECKING: + import optuna + +logger = logging.get_logger(__name__) + + +# Name of the files used for checkpointing +TRAINING_ARGS_NAME = "training_args.bin" +TRAINER_STATE_NAME = "trainer_state.json" +OPTIMIZER_NAME = "optimizer.pt" +SCHEDULER_NAME = "scheduler.pt" +SCALER_NAME = "scaler.pt" + + +class Trainer: + """ + Trainer is a simple 
but feature-complete training and eval loop for PyTorch, optimized for 🤗 Transformers. + + Args: + model ([`PreTrainedModel`] or `torch.nn.Module`, *optional*): + The model to train, evaluate or use for predictions. If not provided, a `model_init` must be passed. + + + + [`Trainer`] is optimized to work with the [`PreTrainedModel`] provided by the library. You can still use + your own models defined as `torch.nn.Module` as long as they work the same way as the 🤗 Transformers + models. + + + + args ([`TrainingArguments`], *optional*): + The arguments to tweak for training. Will default to a basic instance of [`TrainingArguments`] with the + `output_dir` set to a directory named *tmp_trainer* in the current directory if not provided. + data_collator (`DataCollator`, *optional*): + The function to use to form a batch from a list of elements of `train_dataset` or `eval_dataset`. Will + default to [`default_data_collator`] if no `tokenizer` is provided, an instance of + [`DataCollatorWithPadding`] otherwise. + train_dataset (`torch.utils.data.Dataset` or `torch.utils.data.IterableDataset`, *optional*): + The dataset to use for training. If it is a [`~datasets.Dataset`], columns not accepted by the + `model.forward()` method are automatically removed. + + Note that if it's a `torch.utils.data.IterableDataset` with some randomization and you are training in a + distributed fashion, your iterable dataset should either use a internal attribute `generator` that is a + `torch.Generator` for the randomization that must be identical on all processes (and the Trainer will + manually set the seed of this `generator` at each epoch) or have a `set_epoch()` method that internally + sets the seed of the RNGs used. + eval_dataset (Union[`torch.utils.data.Dataset`, Dict[str, `torch.utils.data.Dataset`]), *optional*): + The dataset to use for evaluation. If it is a [`~datasets.Dataset`], columns not accepted by the + `model.forward()` method are automatically removed. If it is a dictionary, it will evaluate on each + dataset prepending the dictionary key to the metric name. + tokenizer ([`PreTrainedTokenizerBase`], *optional*): + The tokenizer used to preprocess the data. If provided, will be used to automatically pad the inputs to the + maximum length when batching inputs, and it will be saved along the model to make it easier to rerun an + interrupted training or reuse the fine-tuned model. + model_init (`Callable[[], PreTrainedModel]`, *optional*): + A function that instantiates the model to be used. If provided, each call to [`~Trainer.train`] will start + from a new instance of the model as given by this function. + + The function may have zero argument, or a single one containing the optuna/Ray Tune/SigOpt trial object, to + be able to choose different architectures according to hyper parameters (such as layer count, sizes of + inner layers, dropout probabilities etc). + compute_metrics (`Callable[[EvalPrediction], Dict]`, *optional*): + The function that will be used to compute metrics at evaluation. Must take a [`EvalPrediction`] and return + a dictionary string to metric values. + callbacks (List of [`TrainerCallback`], *optional*): + A list of callbacks to customize the training loop. Will add those to the list of default callbacks + detailed in [here](callback). + + If you want to remove one of the default callbacks used, use the [`Trainer.remove_callback`] method. 
+ optimizers (`Tuple[torch.optim.Optimizer, torch.optim.lr_scheduler.LambdaLR]`, *optional*): A tuple + containing the optimizer and the scheduler to use. Will default to an instance of [`AdamW`] on your model + and a scheduler given by [`get_linear_schedule_with_warmup`] controlled by `args`. + preprocess_logits_for_metrics (`Callable[[torch.Tensor, torch.Tensor], torch.Tensor]`, *optional*): + A function that preprocess the logits right before caching them at each evaluation step. Must take two + tensors, the logits and the labels, and return the logits once processed as desired. The modifications made + by this function will be reflected in the predictions received by `compute_metrics`. + + Note that the labels (second parameter) will be `None` if the dataset does not have them. + + Important attributes: + + - **model** -- Always points to the core model. If using a transformers model, it will be a [`PreTrainedModel`] + subclass. + - **model_wrapped** -- Always points to the most external model in case one or more other modules wrap the + original model. This is the model that should be used for the forward pass. For example, under `DeepSpeed`, + the inner model is wrapped in `DeepSpeed` and then again in `torch.nn.DistributedDataParallel`. If the inner + model hasn't been wrapped, then `self.model_wrapped` is the same as `self.model`. + - **is_model_parallel** -- Whether or not a model has been switched to a model parallel mode (different from + data parallelism, this means some of the model layers are split on different GPUs). + - **place_model_on_device** -- Whether or not to automatically place the model on the device - it will be set + to `False` if model parallel or deepspeed is used, or if the default + `TrainingArguments.place_model_on_device` is overridden to return `False` . + - **is_in_train** -- Whether or not a model is currently running `train` (e.g. 
when `evaluate` is called while + in `train`) + + """ + + from transformers.trainer_pt_utils import _get_learning_rate, log_metrics, metrics_format, save_metrics, save_state + + def __init__( + self, + model: Union[PreTrainedModel, nn.Module] = None, + args: TrainingArguments = None, + data_collator: Optional[DataCollator] = None, + train_dataset: Optional[Dataset] = None, + eval_dataset: Optional[Union[Dataset, Dict[str, Dataset]]] = None, + tokenizer: Optional[PreTrainedTokenizerBase] = None, + model_init: Optional[Callable[[], PreTrainedModel]] = None, + compute_metrics: Optional[Callable[[EvalPrediction], Dict]] = None, + callbacks: Optional[List[TrainerCallback]] = None, + optimizers: Tuple[torch.optim.Optimizer, torch.optim.lr_scheduler.LambdaLR] = (None, None), + preprocess_logits_for_metrics: Optional[Callable[[torch.Tensor, torch.Tensor], torch.Tensor]] = None, + save_prefixencoder: bool = False, + ): + self.save_prefixencoder = save_prefixencoder + if args is None: + output_dir = "tmp_trainer" + logger.info(f"No `TrainingArguments` passed, using `output_dir={output_dir}`.") + args = TrainingArguments(output_dir=output_dir) + self.args = args + # Seed must be set before instantiating the model when using model + enable_full_determinism(self.args.seed) if self.args.full_determinism else set_seed(self.args.seed) + self.hp_name = None + self.deepspeed = None + self.is_in_train = False + + # memory metrics - must set up as early as possible + self._memory_tracker = TrainerMemoryTracker(self.args.skip_memory_metrics) + self._memory_tracker.start() + + # set the correct log level depending on the node + log_level = args.get_process_log_level() + logging.set_verbosity(log_level) + + # force device and distributed setup init explicitly + args._setup_devices + + if model is None: + if model_init is not None: + self.model_init = model_init + model = self.call_model_init() + else: + raise RuntimeError("`Trainer` requires either a `model` or `model_init` argument") + else: + if model_init is not None: + warnings.warn( + "`Trainer` requires either a `model` or `model_init` argument, but not both. `model_init` will" + " overwrite your model when calling the `train` method. This will become a fatal error in the next" + " release.", + FutureWarning, + ) + self.model_init = model_init + + if model.__class__.__name__ in MODEL_MAPPING_NAMES: + raise ValueError( + f"The model you have picked ({model.__class__.__name__}) cannot be used as is for training: it only " + "computes hidden states and does not accept any labels. You should choose a model with a head " + "suitable for your task like any of the `AutoModelForXxx` listed at " + "https://huggingface.co/docs/transformers/model_doc/auto." + ) + + if hasattr(model, "is_parallelizable") and model.is_parallelizable and model.model_parallel: + self.is_model_parallel = True + else: + self.is_model_parallel = False + + # At this stage the model is already loaded + if getattr(model, "is_loaded_in_8bit", False): + if getattr(model, "_is_int8_training_enabled", False): + logger.info( + "The model is loaded in 8-bit precision. To train this model you need to add additional modules" + " inside the model such as adapters using `peft` library and freeze the model weights. Please" + " check " + " the examples in https://github.com/huggingface/peft for more details." + ) + else: + raise ValueError( + "The model you want to train is loaded in 8-bit precision. 
if you want to fine-tune an 8-bit" + " model, please make sure that you have installed `bitsandbytes>=0.37.0`. " + ) + + # Setup Sharded DDP training + self.sharded_ddp = None + if len(args.sharded_ddp) > 0: + if args.deepspeed: + raise ValueError( + "Using --sharded_ddp xxx together with --deepspeed is not possible, deactivate one of those flags." + ) + if len(args.fsdp) > 0: + raise ValueError( + "Using --sharded_ddp xxx together with --fsdp is not possible, deactivate one of those flags." + ) + + if args.local_rank == -1: + raise ValueError("Using sharded DDP only works in distributed training.") + elif not is_fairscale_available(): + raise ImportError("Sharded DDP training requires fairscale: `pip install fairscale`.") + elif ShardedDDPOption.SIMPLE not in args.sharded_ddp and FullyShardedDDP is None: + raise ImportError( + "Sharded DDP in a mode other than simple training requires fairscale version >= 0.3, found " + f"{fairscale.__version__}. Upgrade your fairscale library: `pip install --upgrade fairscale`." + ) + elif ShardedDDPOption.SIMPLE in args.sharded_ddp: + self.sharded_ddp = ShardedDDPOption.SIMPLE + elif ShardedDDPOption.ZERO_DP_2 in args.sharded_ddp: + self.sharded_ddp = ShardedDDPOption.ZERO_DP_2 + elif ShardedDDPOption.ZERO_DP_3 in args.sharded_ddp: + self.sharded_ddp = ShardedDDPOption.ZERO_DP_3 + + self.fsdp = None + if len(args.fsdp) > 0: + if args.deepspeed: + raise ValueError( + "Using --fsdp xxx together with --deepspeed is not possible, deactivate one of those flags." + ) + if not args.fsdp_config["xla"] and args.local_rank == -1: + raise ValueError("Using fsdp only works in distributed training.") + + # dep_version_check("torch>=1.12.0") + # Would have to update setup.py with torch>=1.12.0 + # which isn't ideally given that it will force people not using FSDP to also use torch>=1.12.0 + # below is the current alternative. + if version.parse(version.parse(torch.__version__).base_version) < version.parse("1.12.0"): + raise ValueError("FSDP requires PyTorch >= 1.12.0") + + from torch.distributed.fsdp.fully_sharded_data_parallel import BackwardPrefetch, ShardingStrategy + + if FSDPOption.FULL_SHARD in args.fsdp: + self.fsdp = ShardingStrategy.FULL_SHARD + elif FSDPOption.SHARD_GRAD_OP in args.fsdp: + self.fsdp = ShardingStrategy.SHARD_GRAD_OP + elif FSDPOption.NO_SHARD in args.fsdp: + self.fsdp = ShardingStrategy.NO_SHARD + + self.backward_prefetch = BackwardPrefetch.BACKWARD_PRE + if "backward_prefetch" in self.args.fsdp_config and "backward_pos" not in self.backward_prefetch: + self.backward_prefetch = BackwardPrefetch.BACKWARD_POST + + self.forword_prefetch = False + if self.args.fsdp_config.get("forword_prefect", False): + self.forword_prefetch = True + + self.limit_all_gathers = False + if self.args.fsdp_config.get("limit_all_gathers", False): + self.limit_all_gathers = True + + # one place to sort out whether to place the model on device or not + # postpone switching model to cuda when: + # 1. MP - since we are trying to fit a much bigger than 1 gpu model + # 2. fp16-enabled DeepSpeed loads the model in half the size and it doesn't need .to() anyway, + # and we only use deepspeed for training at the moment + # 3. full bf16 or fp16 eval - since the model needs to be cast to the right dtype first + # 4. Sharded DDP - same as MP + # 5. 
FSDP - same as MP + self.place_model_on_device = args.place_model_on_device + if ( + self.is_model_parallel + or args.deepspeed + or ((args.fp16_full_eval or args.bf16_full_eval) and not args.do_train) + or (self.sharded_ddp in [ShardedDDPOption.ZERO_DP_2, ShardedDDPOption.ZERO_DP_3]) + or (self.fsdp is not None) + ): + self.place_model_on_device = False + + default_collator = default_data_collator if tokenizer is None else DataCollatorWithPadding(tokenizer) + self.data_collator = data_collator if data_collator is not None else default_collator + self.train_dataset = train_dataset + self.eval_dataset = eval_dataset + self.tokenizer = tokenizer + + if self.place_model_on_device and not getattr(model, "is_loaded_in_8bit", False): + self._move_model_to_device(model, args.device) + + # Force n_gpu to 1 to avoid DataParallel as MP will manage the GPUs + if self.is_model_parallel: + self.args._n_gpu = 1 + + # later use `self.model is self.model_wrapped` to check if it's wrapped or not + self.model_wrapped = model + self.model = model + + self.compute_metrics = compute_metrics + self.preprocess_logits_for_metrics = preprocess_logits_for_metrics + self.optimizer, self.lr_scheduler = optimizers + if model_init is not None and (self.optimizer is not None or self.lr_scheduler is not None): + raise RuntimeError( + "Passing a `model_init` is incompatible with providing the `optimizers` argument. " + "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method." + ) + if is_torch_tpu_available() and self.optimizer is not None: + for param in self.model.parameters(): + model_device = param.device + break + for param_group in self.optimizer.param_groups: + if len(param_group["params"]) > 0: + optimizer_device = param_group["params"][0].device + break + if model_device != optimizer_device: + raise ValueError( + "The model and the optimizer parameters are not on the same device, which probably means you" + " created an optimizer around your model **before** putting on the device and passing it to the" + " `Trainer`. Make sure the lines `import torch_xla.core.xla_model as xm` and" + " `model.to(xm.xla_device())` is performed before the optimizer creation in your script." + ) + if ((self.sharded_ddp is not None) or args.deepspeed or (self.fsdp is not None)) and ( + self.optimizer is not None or self.lr_scheduler is not None + ): + raise RuntimeError( + "Passing `optimizers` is not allowed if Fairscale, Deepspeed or PyTorch FSDP is enabled." + "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method." + ) + default_callbacks = DEFAULT_CALLBACKS + get_reporting_integration_callbacks(self.args.report_to) + callbacks = default_callbacks if callbacks is None else default_callbacks + callbacks + self.callback_handler = CallbackHandler( + callbacks, self.model, self.tokenizer, self.optimizer, self.lr_scheduler + ) + self.add_callback(PrinterCallback if self.args.disable_tqdm else DEFAULT_PROGRESS_CALLBACK) + + # Will be set to True by `self._setup_loggers()` on first call to `self.log()`. + self._loggers_initialized = False + + # Create clone of distant repo and output directory if needed + if self.args.push_to_hub: + self.init_git_repo(at_init=True) + # In case of pull, we need to make sure every process has the latest. 
+ if is_torch_tpu_available(): + xm.rendezvous("init git repo") + elif args.local_rank != -1: + dist.barrier() + + if self.args.should_save: + os.makedirs(self.args.output_dir, exist_ok=True) + + if not callable(self.data_collator) and callable(getattr(self.data_collator, "collate_batch", None)): + raise ValueError("The `data_collator` should be a simple callable (function, class with `__call__`).") + + if args.max_steps > 0: + logger.info("max_steps is given, it will override any value given in num_train_epochs") + + if train_dataset is not None and not has_length(train_dataset) and args.max_steps <= 0: + raise ValueError("train_dataset does not implement __len__, max_steps has to be specified") + + if ( + train_dataset is not None + and isinstance(train_dataset, torch.utils.data.IterableDataset) + and args.group_by_length + ): + raise ValueError("the `--group_by_length` option is only available for `Dataset`, not `IterableDataset") + + self._signature_columns = None + + # Mixed precision setup + self.use_apex = False + self.use_cuda_amp = False + self.use_cpu_amp = False + + # Mixed precision setup for SageMaker Model Parallel + if is_sagemaker_mp_enabled(): + # BF16 + model parallelism in SageMaker: currently not supported, raise an error + if args.bf16: + raise ValueError("SageMaker Model Parallelism does not support BF16 yet. Please use FP16 instead ") + + if IS_SAGEMAKER_MP_POST_1_10: + # When there's mismatch between SMP config and trainer argument, use SMP config as truth + if args.fp16 != smp.state.cfg.fp16: + logger.warning( + f"FP16 provided in SM_HP_MP_PARAMETERS is {smp.state.cfg.fp16}," + f"but FP16 provided in trainer argument is {args.fp16}," + f"setting to {smp.state.cfg.fp16}" + ) + args.fp16 = smp.state.cfg.fp16 + else: + # smp < 1.10 does not support fp16 in trainer. + if hasattr(smp.state.cfg, "fp16"): + logger.warning( + f"FP16 provided in SM_HP_MP_PARAMETERS is {smp.state.cfg.fp16}, " + "but SageMaker Model Parallelism < 1.10 does not support FP16 in trainer." 
+ ) + + if args.fp16 or args.bf16: + if args.half_precision_backend == "auto": + if args.device == torch.device("cpu"): + if args.fp16: + raise ValueError("Tried to use `fp16` but it is not supported on cpu") + elif _is_native_cpu_amp_available: + args.half_precision_backend = "cpu_amp" + else: + raise ValueError("Tried to use cpu amp but native cpu amp is not available") + else: + args.half_precision_backend = "cuda_amp" + + logger.info(f"Using {args.half_precision_backend} half precision backend") + + self.do_grad_scaling = False + if (args.fp16 or args.bf16) and not (args.deepspeed or is_sagemaker_mp_enabled() or is_torch_tpu_available()): + # deepspeed and SageMaker Model Parallel manage their own half precision + if args.half_precision_backend == "cuda_amp": + self.use_cuda_amp = True + self.amp_dtype = torch.float16 if args.fp16 else torch.bfloat16 + # bf16 does not need grad scaling + self.do_grad_scaling = self.amp_dtype == torch.float16 + if self.do_grad_scaling: + if self.sharded_ddp is not None: + self.scaler = ShardedGradScaler() + elif self.fsdp is not None: + from torch.distributed.fsdp.sharded_grad_scaler import ( + ShardedGradScaler as FSDPShardedGradScaler, + ) + + self.scaler = FSDPShardedGradScaler() + elif is_torch_tpu_available(): + from torch_xla.amp import GradScaler + + self.scaler = GradScaler() + else: + self.scaler = torch.cuda.amp.GradScaler() + elif args.half_precision_backend == "cpu_amp": + self.use_cpu_amp = True + self.amp_dtype = torch.bfloat16 + else: + if not is_apex_available(): + raise ImportError( + "Using FP16 with APEX but APEX is not installed, please refer to" + " https://www.github.com/nvidia/apex." + ) + self.use_apex = True + + # FP16 + model parallelism in SageMaker: gradient clipping does not work for now so we raise a helpful error. + if ( + is_sagemaker_mp_enabled() + and self.use_cuda_amp + and args.max_grad_norm is not None + and args.max_grad_norm > 0 + ): + raise ValueError( + "SageMaker Model Parallelism in mixed precision mode does not support gradient clipping yet. Pass " + "along 'max_grad_norm': 0 in your hyperparameters." + ) + + # Label smoothing + if self.args.label_smoothing_factor != 0: + self.label_smoother = LabelSmoother(epsilon=self.args.label_smoothing_factor) + else: + self.label_smoother = None + + self.state = TrainerState( + is_local_process_zero=self.is_local_process_zero(), + is_world_process_zero=self.is_world_process_zero(), + ) + + self.control = TrainerControl() + # Internal variable to count flos in each process, will be accumulated in `self.state.total_flos` then + # returned to 0 every time flos need to be logged + self.current_flos = 0 + self.hp_search_backend = None + self.use_tune_checkpoints = False + default_label_names = find_labels(self.model.__class__) + self.label_names = default_label_names if self.args.label_names is None else self.args.label_names + self.can_return_loss = can_return_loss(self.model.__class__) + self.control = self.callback_handler.on_init_end(self.args, self.state, self.control) + + # Internal variables to keep track of the original batch size + self._train_batch_size = args.train_batch_size + + # very last + self._memory_tracker.stop_and_update_metrics() + + # torch.compile + if args.torch_compile and not is_torch_compile_available(): + raise RuntimeError("Using torch.compile requires PyTorch 2.0 or higher.") + + def add_callback(self, callback): + """ + Add a callback to the current list of [`~transformer.TrainerCallback`]. 
+ + Args: + callback (`type` or [`~transformer.TrainerCallback`]): + A [`~transformer.TrainerCallback`] class or an instance of a [`~transformer.TrainerCallback`]. In the + first case, will instantiate a member of that class. + """ + self.callback_handler.add_callback(callback) + + def pop_callback(self, callback): + """ + Remove a callback from the current list of [`~transformer.TrainerCallback`] and returns it. + + If the callback is not found, returns `None` (and no error is raised). + + Args: + callback (`type` or [`~transformer.TrainerCallback`]): + A [`~transformer.TrainerCallback`] class or an instance of a [`~transformer.TrainerCallback`]. In the + first case, will pop the first member of that class found in the list of callbacks. + + Returns: + [`~transformer.TrainerCallback`]: The callback removed, if found. + """ + return self.callback_handler.pop_callback(callback) + + def remove_callback(self, callback): + """ + Remove a callback from the current list of [`~transformer.TrainerCallback`]. + + Args: + callback (`type` or [`~transformer.TrainerCallback`]): + A [`~transformer.TrainerCallback`] class or an instance of a [`~transformer.TrainerCallback`]. In the + first case, will remove the first member of that class found in the list of callbacks. + """ + self.callback_handler.remove_callback(callback) + + def _move_model_to_device(self, model, device): + model = model.to(device) + # Moving a model to an XLA device disconnects the tied weights, so we have to retie them. + if self.args.parallel_mode == ParallelMode.TPU and hasattr(model, "tie_weights"): + model.tie_weights() + + def _set_signature_columns_if_needed(self): + if self._signature_columns is None: + # Inspect model forward signature to keep only the arguments it accepts. + signature = inspect.signature(self.model.forward) + self._signature_columns = list(signature.parameters.keys()) + # Labels may be named label or label_ids, the default data collator handles that. + self._signature_columns += list(set(["label", "label_ids"] + self.label_names)) + + def _remove_unused_columns(self, dataset: "datasets.Dataset", description: Optional[str] = None): + if not self.args.remove_unused_columns: + return dataset + self._set_signature_columns_if_needed() + signature_columns = self._signature_columns + + ignored_columns = list(set(dataset.column_names) - set(signature_columns)) + if len(ignored_columns) > 0: + dset_description = "" if description is None else f"in the {description} set" + logger.info( + f"The following columns {dset_description} don't have a corresponding argument in " + f"`{self.model.__class__.__name__}.forward` and have been ignored: {', '.join(ignored_columns)}." + f" If {', '.join(ignored_columns)} are not expected by `{self.model.__class__.__name__}.forward`, " + " you can safely ignore this message." 
+ ) + + columns = [k for k in signature_columns if k in dataset.column_names] + + if version.parse(datasets.__version__) < version.parse("1.4.0"): + dataset.set_format( + type=dataset.format["type"], columns=columns, format_kwargs=dataset.format["format_kwargs"] + ) + return dataset + else: + return dataset.remove_columns(ignored_columns) + + def _get_collator_with_removed_columns( + self, data_collator: Callable, description: Optional[str] = None + ) -> Callable: + """Wrap the data collator in a callable removing unused columns.""" + if not self.args.remove_unused_columns: + return data_collator + self._set_signature_columns_if_needed() + signature_columns = self._signature_columns + + remove_columns_collator = RemoveColumnsCollator( + data_collator=data_collator, + signature_columns=signature_columns, + logger=logger, + description=description, + model_name=self.model.__class__.__name__, + ) + return remove_columns_collator + + def _get_train_sampler(self) -> Optional[torch.utils.data.Sampler]: + if self.train_dataset is None or not has_length(self.train_dataset): + return None + + generator = None + if self.args.world_size <= 1: + generator = torch.Generator() + # for backwards compatibility, we generate a seed here (which is sampled from a generator seeded with + # `args.seed`) if data_seed isn't provided. + # Further on in this method, we default to `args.seed` instead. + if self.args.data_seed is None: + seed = int(torch.empty((), dtype=torch.int64).random_().item()) + else: + seed = self.args.data_seed + generator.manual_seed(seed) + + seed = self.args.data_seed if self.args.data_seed is not None else self.args.seed + + # Build the sampler. + if self.args.group_by_length: + if is_datasets_available() and isinstance(self.train_dataset, datasets.Dataset): + lengths = ( + self.train_dataset[self.args.length_column_name] + if self.args.length_column_name in self.train_dataset.column_names + else None + ) + else: + lengths = None + model_input_name = self.tokenizer.model_input_names[0] if self.tokenizer is not None else None + if self.args.world_size <= 1: + return LengthGroupedSampler( + self.args.train_batch_size * self.args.gradient_accumulation_steps, + dataset=self.train_dataset, + lengths=lengths, + model_input_name=model_input_name, + generator=generator, + ) + else: + return DistributedLengthGroupedSampler( + self.args.train_batch_size * self.args.gradient_accumulation_steps, + dataset=self.train_dataset, + num_replicas=self.args.world_size, + rank=self.args.process_index, + lengths=lengths, + model_input_name=model_input_name, + seed=seed, + ) + + else: + if self.args.world_size <= 1: + return RandomSampler(self.train_dataset, generator=generator) + elif ( + self.args.parallel_mode in [ParallelMode.TPU, ParallelMode.SAGEMAKER_MODEL_PARALLEL] + and not self.args.dataloader_drop_last + ): + # Use a loop for TPUs when drop_last is False to have all batches have the same size. + return DistributedSamplerWithLoop( + self.train_dataset, + batch_size=self.args.per_device_train_batch_size, + num_replicas=self.args.world_size, + rank=self.args.process_index, + seed=seed, + ) + else: + return DistributedSampler( + self.train_dataset, + num_replicas=self.args.world_size, + rank=self.args.process_index, + seed=seed, + ) + + def get_train_dataloader(self) -> DataLoader: + """ + Returns the training [`~torch.utils.data.DataLoader`]. + + Will use no sampler if `train_dataset` does not implement `__len__`, a random sampler (adapted to distributed + training if necessary) otherwise. 
+ + Subclass and override this method if you want to inject some custom behavior. + """ + if self.train_dataset is None: + raise ValueError("Trainer: training requires a train_dataset.") + + train_dataset = self.train_dataset + data_collator = self.data_collator + if is_datasets_available() and isinstance(train_dataset, datasets.Dataset): + train_dataset = self._remove_unused_columns(train_dataset, description="training") + else: + data_collator = self._get_collator_with_removed_columns(data_collator, description="training") + + if isinstance(train_dataset, torch.utils.data.IterableDataset): + if self.args.world_size > 1: + train_dataset = IterableDatasetShard( + train_dataset, + batch_size=self._train_batch_size, + drop_last=self.args.dataloader_drop_last, + num_processes=self.args.world_size, + process_index=self.args.process_index, + ) + + return DataLoader( + train_dataset, + batch_size=self._train_batch_size, + collate_fn=data_collator, + num_workers=self.args.dataloader_num_workers, + pin_memory=self.args.dataloader_pin_memory, + ) + + train_sampler = self._get_train_sampler() + + return DataLoader( + train_dataset, + batch_size=self._train_batch_size, + sampler=train_sampler, + collate_fn=data_collator, + drop_last=self.args.dataloader_drop_last, + num_workers=self.args.dataloader_num_workers, + pin_memory=self.args.dataloader_pin_memory, + worker_init_fn=seed_worker, + ) + + def _get_eval_sampler(self, eval_dataset: Dataset) -> Optional[torch.utils.data.Sampler]: + # Deprecated code + if self.args.use_legacy_prediction_loop: + if is_torch_tpu_available(): + return SequentialDistributedSampler( + eval_dataset, num_replicas=xm.xrt_world_size(), rank=xm.get_ordinal() + ) + elif is_sagemaker_mp_enabled(): + return SequentialDistributedSampler( + eval_dataset, + num_replicas=smp.dp_size(), + rank=smp.dp_rank(), + batch_size=self.args.per_device_eval_batch_size, + ) + elif self.args.local_rank != -1: + return SequentialDistributedSampler(eval_dataset) + else: + return SequentialSampler(eval_dataset) + + if self.args.world_size <= 1: + return SequentialSampler(eval_dataset) + else: + return ShardSampler( + eval_dataset, + batch_size=self.args.per_device_eval_batch_size, + num_processes=self.args.world_size, + process_index=self.args.process_index, + ) + + def get_eval_dataloader(self, eval_dataset: Optional[Dataset] = None) -> DataLoader: + """ + Returns the evaluation [`~torch.utils.data.DataLoader`]. + + Subclass and override this method if you want to inject some custom behavior. + + Args: + eval_dataset (`torch.utils.data.Dataset`, *optional*): + If provided, will override `self.eval_dataset`. If it is a [`~datasets.Dataset`], columns not accepted + by the `model.forward()` method are automatically removed. It must implement `__len__`. 
+ """ + if eval_dataset is None and self.eval_dataset is None: + raise ValueError("Trainer: evaluation requires an eval_dataset.") + eval_dataset = eval_dataset if eval_dataset is not None else self.eval_dataset + data_collator = self.data_collator + + if is_datasets_available() and isinstance(eval_dataset, datasets.Dataset): + eval_dataset = self._remove_unused_columns(eval_dataset, description="evaluation") + else: + data_collator = self._get_collator_with_removed_columns(data_collator, description="evaluation") + + if isinstance(eval_dataset, torch.utils.data.IterableDataset): + if self.args.world_size > 1: + eval_dataset = IterableDatasetShard( + eval_dataset, + batch_size=self.args.per_device_eval_batch_size, + drop_last=self.args.dataloader_drop_last, + num_processes=self.args.world_size, + process_index=self.args.process_index, + ) + return DataLoader( + eval_dataset, + batch_size=self.args.eval_batch_size, + collate_fn=data_collator, + num_workers=self.args.dataloader_num_workers, + pin_memory=self.args.dataloader_pin_memory, + ) + + eval_sampler = self._get_eval_sampler(eval_dataset) + + return DataLoader( + eval_dataset, + sampler=eval_sampler, + batch_size=self.args.eval_batch_size, + collate_fn=data_collator, + drop_last=self.args.dataloader_drop_last, + num_workers=self.args.dataloader_num_workers, + pin_memory=self.args.dataloader_pin_memory, + ) + + def get_test_dataloader(self, test_dataset: Dataset) -> DataLoader: + """ + Returns the test [`~torch.utils.data.DataLoader`]. + + Subclass and override this method if you want to inject some custom behavior. + + Args: + test_dataset (`torch.utils.data.Dataset`, *optional*): + The test dataset to use. If it is a [`~datasets.Dataset`], columns not accepted by the + `model.forward()` method are automatically removed. It must implement `__len__`. + """ + data_collator = self.data_collator + + if is_datasets_available() and isinstance(test_dataset, datasets.Dataset): + test_dataset = self._remove_unused_columns(test_dataset, description="test") + else: + data_collator = self._get_collator_with_removed_columns(data_collator, description="test") + + if isinstance(test_dataset, torch.utils.data.IterableDataset): + if self.args.world_size > 1: + test_dataset = IterableDatasetShard( + test_dataset, + batch_size=self.args.eval_batch_size, + drop_last=self.args.dataloader_drop_last, + num_processes=self.args.world_size, + process_index=self.args.process_index, + ) + return DataLoader( + test_dataset, + batch_size=self.args.eval_batch_size, + collate_fn=data_collator, + num_workers=self.args.dataloader_num_workers, + pin_memory=self.args.dataloader_pin_memory, + ) + + test_sampler = self._get_eval_sampler(test_dataset) + + # We use the same batch_size as for eval. + return DataLoader( + test_dataset, + sampler=test_sampler, + batch_size=self.args.eval_batch_size, + collate_fn=data_collator, + drop_last=self.args.dataloader_drop_last, + num_workers=self.args.dataloader_num_workers, + pin_memory=self.args.dataloader_pin_memory, + ) + + def create_optimizer_and_scheduler(self, num_training_steps: int): + """ + Setup the optimizer and the learning rate scheduler. + + We provide a reasonable default that works well. If you want to use something else, you can pass a tuple in the + Trainer's init through `optimizers`, or subclass and override this method (or `create_optimizer` and/or + `create_scheduler`) in a subclass. 
+ """ + self.create_optimizer() + if IS_SAGEMAKER_MP_POST_1_10 and smp.state.cfg.fp16: + # If smp >= 1.10 and fp16 is enabled, we unwrap the optimizer + optimizer = self.optimizer.optimizer + else: + optimizer = self.optimizer + self.create_scheduler(num_training_steps=num_training_steps, optimizer=optimizer) + + def create_optimizer(self): + """ + Setup the optimizer. + + We provide a reasonable default that works well. If you want to use something else, you can pass a tuple in the + Trainer's init through `optimizers`, or subclass and override this method in a subclass. + """ + opt_model = self.model_wrapped if is_sagemaker_mp_enabled() else self.model + + if self.optimizer is None: + decay_parameters = get_parameter_names(opt_model, ALL_LAYERNORM_LAYERS) + decay_parameters = [name for name in decay_parameters if "bias" not in name] + optimizer_grouped_parameters = [ + { + "params": [ + p for n, p in opt_model.named_parameters() if (n in decay_parameters and p.requires_grad) + ], + "weight_decay": self.args.weight_decay, + }, + { + "params": [ + p for n, p in opt_model.named_parameters() if (n not in decay_parameters and p.requires_grad) + ], + "weight_decay": 0.0, + }, + ] + + optimizer_cls, optimizer_kwargs = Trainer.get_optimizer_cls_and_kwargs(self.args) + + if self.sharded_ddp == ShardedDDPOption.SIMPLE: + self.optimizer = OSS( + params=optimizer_grouped_parameters, + optim=optimizer_cls, + **optimizer_kwargs, + ) + else: + self.optimizer = optimizer_cls(optimizer_grouped_parameters, **optimizer_kwargs) + if optimizer_cls.__name__ == "Adam8bit": + import bitsandbytes + + manager = bitsandbytes.optim.GlobalOptimManager.get_instance() + + skipped = 0 + for module in opt_model.modules(): + if isinstance(module, nn.Embedding): + skipped += sum({p.data_ptr(): p.numel() for p in module.parameters()}.values()) + print(f"skipped {module}: {skipped/2**20}M params") + manager.register_module_override(module, "weight", {"optim_bits": 32}) + logger.debug(f"bitsandbytes: will optimize {module} in fp32") + print(f"skipped: {skipped/2**20}M params") + + if is_sagemaker_mp_enabled(): + self.optimizer = smp.DistributedOptimizer(self.optimizer) + + return self.optimizer + + @staticmethod + def get_optimizer_cls_and_kwargs(args: TrainingArguments) -> Tuple[Any, Any]: + """ + Returns the optimizer class and optimizer parameters based on the training arguments. + + Args: + args (`transformers.training_args.TrainingArguments`): + The training arguments for the training session. 
+ + """ + + # parse args.optim_args + optim_args = {} + if args.optim_args: + for mapping in args.optim_args.replace(" ", "").split(","): + key, value = mapping.split("=") + optim_args[key] = value + + optimizer_kwargs = {"lr": args.learning_rate} + + adam_kwargs = { + "betas": (args.adam_beta1, args.adam_beta2), + "eps": args.adam_epsilon, + } + if args.optim == OptimizerNames.ADAFACTOR: + optimizer_cls = Adafactor + optimizer_kwargs.update({"scale_parameter": False, "relative_step": False}) + elif args.optim == OptimizerNames.ADAMW_HF: + from transformers.optimization import AdamW + + optimizer_cls = AdamW + optimizer_kwargs.update(adam_kwargs) + elif args.optim in [OptimizerNames.ADAMW_TORCH, OptimizerNames.ADAMW_TORCH_FUSED]: + from torch.optim import AdamW + + optimizer_cls = AdamW + optimizer_kwargs.update(adam_kwargs) + if args.optim == OptimizerNames.ADAMW_TORCH_FUSED: + optimizer_kwargs.update({"fused": True}) + elif args.optim == OptimizerNames.ADAMW_TORCH_XLA: + try: + from torch_xla.amp.syncfree import AdamW + + optimizer_cls = AdamW + optimizer_kwargs.update(adam_kwargs) + except ImportError: + raise ValueError("Trainer failed to import syncfree AdamW from torch_xla.") + elif args.optim == OptimizerNames.ADAMW_APEX_FUSED: + try: + from apex.optimizers import FusedAdam + + optimizer_cls = FusedAdam + optimizer_kwargs.update(adam_kwargs) + except ImportError: + raise ValueError("Trainer tried to instantiate apex FusedAdam but apex is not installed!") + elif args.optim == OptimizerNames.ADAMW_BNB: + try: + from bitsandbytes.optim import Adam8bit + + optimizer_cls = Adam8bit + optimizer_kwargs.update(adam_kwargs) + except ImportError: + raise ValueError("Trainer tried to instantiate bnb Adam8bit but bnb is not installed!") + elif args.optim == OptimizerNames.ADAMW_ANYPRECISION: + try: + from torchdistx.optimizers import AnyPrecisionAdamW + + optimizer_cls = AnyPrecisionAdamW + optimizer_kwargs.update(adam_kwargs) + + # TODO Change dtypes back to M=FP32, Var = BF16, Kahan = False once they can be cast together in torchdistx. + optimizer_kwargs.update( + { + "use_kahan_summation": strtobool(optim_args.get("use_kahan_summation", "False")), + "momentum_dtype": getattr(torch, optim_args.get("momentum_dtype", "float32")), + "variance_dtype": getattr(torch, optim_args.get("variance_dtype", "float32")), + "compensation_buffer_dtype": getattr( + torch, optim_args.get("compensation_buffer_dtype", "bfloat16") + ), + } + ) + except ImportError: + raise ValueError("Please install https://github.com/pytorch/torchdistx") + elif args.optim == OptimizerNames.SGD: + optimizer_cls = torch.optim.SGD + elif args.optim == OptimizerNames.ADAGRAD: + optimizer_cls = torch.optim.Adagrad + else: + raise ValueError(f"Trainer cannot instantiate unsupported optimizer: {args.optim}") + return optimizer_cls, optimizer_kwargs + + def create_scheduler(self, num_training_steps: int, optimizer: torch.optim.Optimizer = None): + """ + Setup the scheduler. The optimizer of the trainer must have been set up either before this method is called or + passed as an argument. + + Args: + num_training_steps (int): The number of training steps to do. 
+ """ + if self.lr_scheduler is None: + self.lr_scheduler = get_scheduler( + self.args.lr_scheduler_type, + optimizer=self.optimizer if optimizer is None else optimizer, + num_warmup_steps=self.args.get_warmup_steps(num_training_steps), + num_training_steps=num_training_steps, + ) + return self.lr_scheduler + + def num_examples(self, dataloader: DataLoader) -> int: + """ + Helper to get number of samples in a [`~torch.utils.data.DataLoader`] by accessing its dataset. When + dataloader.dataset does not exist or has no length, estimates as best it can + """ + try: + dataset = dataloader.dataset + # Special case for IterableDatasetShard, we need to dig deeper + if isinstance(dataset, IterableDatasetShard): + return len(dataloader.dataset.dataset) + return len(dataloader.dataset) + except (NameError, AttributeError, TypeError): # no dataset or length, estimate by length of dataloader + return len(dataloader) * self.args.per_device_train_batch_size + + def _hp_search_setup(self, trial: Union["optuna.Trial", Dict[str, Any]]): + """HP search setup code""" + self._trial = trial + + if self.hp_search_backend is None or trial is None: + return + if self.hp_search_backend == HPSearchBackend.OPTUNA: + params = self.hp_space(trial) + elif self.hp_search_backend == HPSearchBackend.RAY: + params = trial + params.pop("wandb", None) + elif self.hp_search_backend == HPSearchBackend.SIGOPT: + params = {k: int(v) if isinstance(v, str) else v for k, v in trial.assignments.items()} + elif self.hp_search_backend == HPSearchBackend.WANDB: + params = trial + + for key, value in params.items(): + if not hasattr(self.args, key): + logger.warning( + f"Trying to set {key} in the hyperparameter search but there is no corresponding field in" + " `TrainingArguments`." + ) + continue + old_attr = getattr(self.args, key, None) + # Casting value to the proper type + if old_attr is not None: + value = type(old_attr)(value) + setattr(self.args, key, value) + if self.hp_search_backend == HPSearchBackend.OPTUNA: + logger.info(f"Trial: {trial.params}") + if self.hp_search_backend == HPSearchBackend.SIGOPT: + logger.info(f"SigOpt Assignments: {trial.assignments}") + if self.hp_search_backend == HPSearchBackend.WANDB: + logger.info(f"W&B Sweep parameters: {trial}") + if self.args.deepspeed: + # Rebuild the deepspeed config to reflect the updated training parameters + from transformers.deepspeed import HfTrainerDeepSpeedConfig + + self.args.hf_deepspeed_config = HfTrainerDeepSpeedConfig(self.args.deepspeed) + self.args.hf_deepspeed_config.trainer_config_process(self.args) + + def _report_to_hp_search(self, trial: Union["optuna.Trial", Dict[str, Any]], step: int, metrics: Dict[str, float]): + if self.hp_search_backend is None or trial is None: + return + self.objective = self.compute_objective(metrics.copy()) + if self.hp_search_backend == HPSearchBackend.OPTUNA: + import optuna + + trial.report(self.objective, step) + if trial.should_prune(): + self.callback_handler.on_train_end(self.args, self.state, self.control) + raise optuna.TrialPruned() + elif self.hp_search_backend == HPSearchBackend.RAY: + from ray import tune + + if self.control.should_save: + self._tune_save_checkpoint() + tune.report(objective=self.objective, **metrics) + + def _tune_save_checkpoint(self): + from ray import tune + + if not self.use_tune_checkpoints: + return + with tune.checkpoint_dir(step=self.state.global_step) as checkpoint_dir: + output_dir = os.path.join(checkpoint_dir, f"{PREFIX_CHECKPOINT_DIR}-{self.state.global_step}") + 
self.save_model(output_dir, _internal_call=True) + if self.args.should_save: + self.state.save_to_json(os.path.join(output_dir, TRAINER_STATE_NAME)) + torch.save(self.optimizer.state_dict(), os.path.join(output_dir, OPTIMIZER_NAME)) + torch.save(self.lr_scheduler.state_dict(), os.path.join(output_dir, SCHEDULER_NAME)) + + def call_model_init(self, trial=None): + model_init_argcount = number_of_arguments(self.model_init) + if model_init_argcount == 0: + model = self.model_init() + elif model_init_argcount == 1: + model = self.model_init(trial) + else: + raise RuntimeError("model_init should have 0 or 1 argument.") + + if model is None: + raise RuntimeError("model_init should not return None.") + + return model + + def torch_jit_model_eval(self, model, dataloader, training=False): + if not training: + if dataloader is None: + logger.warning("failed to use PyTorch jit mode due to current dataloader is none.") + return model + example_batch = next(iter(dataloader)) + example_batch = self._prepare_inputs(example_batch) + try: + jit_model = model.eval() + with ContextManagers([self.autocast_smart_context_manager(cache_enabled=False), torch.no_grad()]): + if version.parse(version.parse(torch.__version__).base_version) >= version.parse("1.14.0"): + if isinstance(example_batch, dict): + jit_model = torch.jit.trace(jit_model, example_kwarg_inputs=example_batch, strict=False) + else: + jit_model = torch.jit.trace( + jit_model, + example_kwarg_inputs={key: example_batch[key] for key in example_batch}, + strict=False, + ) + else: + jit_inputs = [] + for key in example_batch: + example_tensor = torch.ones_like(example_batch[key]) + jit_inputs.append(example_tensor) + jit_inputs = tuple(jit_inputs) + jit_model = torch.jit.trace(jit_model, jit_inputs, strict=False) + jit_model = torch.jit.freeze(jit_model) + with torch.no_grad(): + jit_model(**example_batch) + jit_model(**example_batch) + model = jit_model + self.use_cpu_amp = False + self.use_cuda_amp = False + except (RuntimeError, TypeError, ValueError, NameError, IndexError) as e: + logger.warning(f"failed to use PyTorch jit mode due to: {e}.") + + return model + + def ipex_optimize_model(self, model, training=False, dtype=torch.float32): + if not is_ipex_available(): + raise ImportError( + "Using IPEX but IPEX is not installed or IPEX's version does not match current PyTorch, please refer" + " to https://github.com/intel/intel-extension-for-pytorch." + ) + + import intel_extension_for_pytorch as ipex + + if not training: + model.eval() + dtype = torch.bfloat16 if not self.is_in_train and self.args.bf16_full_eval else dtype + # conv_bn_folding is disabled as it fails in symbolic tracing, resulting in ipex warnings + model = ipex.optimize(model, dtype=dtype, level="O1", conv_bn_folding=False, inplace=not self.is_in_train) + else: + if not model.training: + model.train() + model, self.optimizer = ipex.optimize( + model, dtype=dtype, optimizer=self.optimizer, inplace=True, level="O1" + ) + + return model + + def _wrap_model(self, model, training=True, dataloader=None): + if self.args.torch_compile: + model = torch.compile(model, backend=self.args.torch_compile_backend, mode=self.args.torch_compile_mode) + + if self.args.use_ipex: + dtype = torch.bfloat16 if self.use_cpu_amp else torch.float32 + model = self.ipex_optimize_model(model, training, dtype=dtype) + + if is_sagemaker_mp_enabled(): + # Wrapping the base model twice in a DistributedModel will raise an error. 
+ if isinstance(self.model_wrapped, smp.model.DistributedModel): + return self.model_wrapped + return smp.DistributedModel(model, backward_passes_per_step=self.args.gradient_accumulation_steps) + + # already initialized its own DDP and AMP + if self.deepspeed: + return self.deepspeed + + # train/eval could be run multiple-times - if already wrapped, don't re-wrap it again + if unwrap_model(model) is not model: + return model + + # Mixed precision training with apex (torch < 1.6) + if self.use_apex and training: + model, self.optimizer = amp.initialize(model, self.optimizer, opt_level=self.args.fp16_opt_level) + + # Multi-gpu training (should be after apex fp16 initialization) + if self.args.n_gpu > 1: + model = nn.DataParallel(model) + + if self.args.jit_mode_eval: + start_time = time.time() + model = self.torch_jit_model_eval(model, dataloader, training) + self.jit_compilation_time = round(time.time() - start_time, 4) + + # Note: in torch.distributed mode, there's no point in wrapping the model + # inside a DistributedDataParallel as we'll be under `no_grad` anyways. + if not training: + return model + + # Distributed training (should be after apex fp16 initialization) + if self.sharded_ddp is not None: + # Sharded DDP! + if self.sharded_ddp == ShardedDDPOption.SIMPLE: + model = ShardedDDP(model, self.optimizer) + else: + mixed_precision = self.args.fp16 or self.args.bf16 + cpu_offload = ShardedDDPOption.OFFLOAD in self.args.sharded_ddp + zero_3 = self.sharded_ddp == ShardedDDPOption.ZERO_DP_3 + # XXX: Breaking the self.model convention but I see no way around it for now. + if ShardedDDPOption.AUTO_WRAP in self.args.sharded_ddp: + model = auto_wrap(model) + self.model = model = FullyShardedDDP( + model, + mixed_precision=mixed_precision, + reshard_after_forward=zero_3, + cpu_offload=cpu_offload, + ).to(self.args.device) + # Distributed training using PyTorch FSDP + elif self.fsdp is not None: + if not self.args.fsdp_config["xla"]: + # PyTorch FSDP! 
+ from torch.distributed.fsdp.fully_sharded_data_parallel import CPUOffload, MixedPrecision + from torch.distributed.fsdp.fully_sharded_data_parallel import FullyShardedDataParallel as FSDP + from torch.distributed.fsdp.wrap import size_based_auto_wrap_policy, transformer_auto_wrap_policy + + if FSDPOption.OFFLOAD in self.args.fsdp: + cpu_offload = CPUOffload(offload_params=True) + else: + cpu_offload = CPUOffload(offload_params=False) + + auto_wrap_policy = None + + if FSDPOption.AUTO_WRAP in self.args.fsdp: + if self.args.fsdp_config["fsdp_min_num_params"] > 0: + auto_wrap_policy = functools.partial( + size_based_auto_wrap_policy, min_num_params=self.args.fsdp_config["fsdp_min_num_params"] + ) + elif self.args.fsdp_config.get("fsdp_transformer_layer_cls_to_wrap", None) is not None: + transformer_cls_to_wrap = set() + for layer_class in self.args.fsdp_config["fsdp_transformer_layer_cls_to_wrap"]: + transformer_cls = get_module_class_from_name(model, layer_class) + if transformer_cls is None: + raise Exception("Could not find the transformer layer class to wrap in the model.") + else: + transformer_cls_to_wrap.add(transformer_cls) + auto_wrap_policy = functools.partial( + transformer_auto_wrap_policy, + # Transformer layer class to wrap + transformer_layer_cls=transformer_cls_to_wrap, + ) + mixed_precision_policy = None + dtype = None + if self.args.fp16: + dtype = torch.float16 + elif self.args.bf16: + dtype = torch.bfloat16 + if dtype is not None: + mixed_precision_policy = MixedPrecision(param_dtype=dtype, reduce_dtype=dtype, buffer_dtype=dtype) + if type(model) != FSDP: + # XXX: Breaking the self.model convention but I see no way around it for now. + self.model = model = FSDP( + model, + sharding_strategy=self.fsdp, + cpu_offload=cpu_offload, + auto_wrap_policy=auto_wrap_policy, + mixed_precision=mixed_precision_policy, + device_id=self.args.device, + backward_prefetch=self.backward_prefetch, + forward_prefetch=self.forword_prefetch, + limit_all_gathers=self.limit_all_gathers, + ) + else: + try: + from torch_xla.distributed.fsdp import XlaFullyShardedDataParallel as FSDP + from torch_xla.distributed.fsdp import checkpoint_module + from torch_xla.distributed.fsdp.wrap import ( + size_based_auto_wrap_policy, + transformer_auto_wrap_policy, + ) + except ImportError: + raise ImportError("Missing XLA FSDP related module; please make sure to use torch-xla >= 2.0.") + auto_wrap_policy = None + auto_wrapper_callable = None + if self.args.fsdp_config["fsdp_min_num_params"] > 0: + auto_wrap_policy = functools.partial( + size_based_auto_wrap_policy, min_num_params=self.args.fsdp_config["fsdp_min_num_params"] + ) + elif self.args.fsdp_config.get("fsdp_transformer_layer_cls_to_wrap", None) is not None: + transformer_cls_to_wrap = set() + for layer_class in self.args.fsdp_config["fsdp_transformer_layer_cls_to_wrap"]: + transformer_cls = get_module_class_from_name(model, layer_class) + if transformer_cls is None: + raise Exception("Could not find the transformer layer class to wrap in the model.") + else: + transformer_cls_to_wrap.add(transformer_cls) + auto_wrap_policy = functools.partial( + transformer_auto_wrap_policy, + # Transformer layer class to wrap + transformer_layer_cls=transformer_cls_to_wrap, + ) + fsdp_kwargs = self.args.xla_fsdp_config + if self.args.fsdp_config["xla_fsdp_grad_ckpt"]: + # Apply gradient checkpointing to auto-wrapped sub-modules if specified + def auto_wrapper_callable(m, *args, **kwargs): + return FSDP(checkpoint_module(m), *args, **kwargs) + + # Wrap the base model 
with an outer FSDP wrapper + self.model = model = FSDP( + model, + auto_wrap_policy=auto_wrap_policy, + auto_wrapper_callable=auto_wrapper_callable, + **fsdp_kwargs, + ) + + # Patch `xm.optimizer_step` should not reduce gradients in this case, + # as FSDP does not need gradient reduction over sharded parameters. + def patched_optimizer_step(optimizer, barrier=False, optimizer_args={}): + loss = optimizer.step(**optimizer_args) + if barrier: + xm.mark_step() + return loss + + xm.optimizer_step = patched_optimizer_step + elif is_sagemaker_dp_enabled(): + model = nn.parallel.DistributedDataParallel( + model, device_ids=[int(os.getenv("SMDATAPARALLEL_LOCAL_RANK"))] + ) + elif self.args.local_rank != -1: + kwargs = {} + if self.args.ddp_find_unused_parameters is not None: + kwargs["find_unused_parameters"] = self.args.ddp_find_unused_parameters + elif isinstance(model, PreTrainedModel): + # find_unused_parameters breaks checkpointing as per + # https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 + kwargs["find_unused_parameters"] = not model.is_gradient_checkpointing + else: + kwargs["find_unused_parameters"] = True + + if self.args.ddp_bucket_cap_mb is not None: + kwargs["bucket_cap_mb"] = self.args.ddp_bucket_cap_mb + if is_torch_neuroncore_available(): + return model + model = nn.parallel.DistributedDataParallel( + model, + device_ids=[self.args.local_rank] if self.args._n_gpu != 0 else None, + output_device=self.args.local_rank if self.args._n_gpu != 0 else None, + **kwargs, + ) + + return model + + def train( + self, + resume_from_checkpoint: Optional[Union[str, bool]] = None, + trial: Union["optuna.Trial", Dict[str, Any]] = None, + ignore_keys_for_eval: Optional[List[str]] = None, + **kwargs, + ): + """ + Main training entry point. + + Args: + resume_from_checkpoint (`str` or `bool`, *optional*): + If a `str`, local path to a saved checkpoint as saved by a previous instance of [`Trainer`]. If a + `bool` and equals `True`, load the last checkpoint in *args.output_dir* as saved by a previous instance + of [`Trainer`]. If present, training will resume from the model/optimizer/scheduler states loaded here. + trial (`optuna.Trial` or `Dict[str, Any]`, *optional*): + The trial run or the hyperparameter dictionary for hyperparameter search. + ignore_keys_for_eval (`List[str]`, *optional*) + A list of keys in the output of your model (if it is a dictionary) that should be ignored when + gathering predictions for evaluation during the training. + kwargs: + Additional keyword arguments used to hide deprecated arguments + """ + if resume_from_checkpoint is False: + resume_from_checkpoint = None + + # memory metrics - must set up as early as possible + self._memory_tracker.start() + + args = self.args + + self.is_in_train = True + + # do_train is not a reliable argument, as it might not be set and .train() still called, so + # the following is a workaround: + if (args.fp16_full_eval or args.bf16_full_eval) and not args.do_train: + self._move_model_to_device(self.model, args.device) + + if "model_path" in kwargs: + resume_from_checkpoint = kwargs.pop("model_path") + warnings.warn( + "`model_path` is deprecated and will be removed in a future version. Use `resume_from_checkpoint` " + "instead.", + FutureWarning, + ) + if len(kwargs) > 0: + raise TypeError(f"train() received got unexpected keyword arguments: {', '.join(list(kwargs.keys()))}.") + # This might change the seed so needs to run first. 
+ self._hp_search_setup(trial) + self._train_batch_size = self.args.train_batch_size + + # Model re-init + model_reloaded = False + if self.model_init is not None: + # Seed must be set before instantiating the model when using model_init. + enable_full_determinism(self.args.seed) if self.args.full_determinism else set_seed(self.args.seed) + self.model = self.call_model_init(trial) + model_reloaded = True + # Reinitializes optimizer and scheduler + self.optimizer, self.lr_scheduler = None, None + + # Load potential model checkpoint + if isinstance(resume_from_checkpoint, bool) and resume_from_checkpoint: + resume_from_checkpoint = get_last_checkpoint(args.output_dir) + if resume_from_checkpoint is None: + raise ValueError(f"No valid checkpoint found in output directory ({args.output_dir})") + + if resume_from_checkpoint is not None and not is_sagemaker_mp_enabled() and args.deepspeed is None: + self._load_from_checkpoint(resume_from_checkpoint) + + # If model was re-initialized, put it on the right device and update self.model_wrapped + if model_reloaded: + if self.place_model_on_device: + self._move_model_to_device(self.model, args.device) + self.model_wrapped = self.model + + inner_training_loop = find_executable_batch_size( + self._inner_training_loop, self._train_batch_size, args.auto_find_batch_size + ) + return inner_training_loop( + args=args, + resume_from_checkpoint=resume_from_checkpoint, + trial=trial, + ignore_keys_for_eval=ignore_keys_for_eval, + ) + + def _inner_training_loop( + self, batch_size=None, args=None, resume_from_checkpoint=None, trial=None, ignore_keys_for_eval=None + ): + self._train_batch_size = batch_size + # Data loader and number of training steps + train_dataloader = self.get_train_dataloader() + + # Setting up training control variables: + # number of training epochs: num_train_epochs + # number of training steps per epoch: num_update_steps_per_epoch + # total number of training steps to execute: max_steps + total_train_batch_size = args.train_batch_size * args.gradient_accumulation_steps * args.world_size + + len_dataloader = None + if has_length(train_dataloader): + len_dataloader = len(train_dataloader) + num_update_steps_per_epoch = len_dataloader // args.gradient_accumulation_steps + num_update_steps_per_epoch = max(num_update_steps_per_epoch, 1) + num_examples = self.num_examples(train_dataloader) + if args.max_steps > 0: + max_steps = args.max_steps + num_train_epochs = args.max_steps // num_update_steps_per_epoch + int( + args.max_steps % num_update_steps_per_epoch > 0 + ) + # May be slightly incorrect if the last batch in the training dataloader has a smaller size but it's + # the best we can do. + num_train_samples = args.max_steps * total_train_batch_size + else: + max_steps = math.ceil(args.num_train_epochs * num_update_steps_per_epoch) + num_train_epochs = math.ceil(args.num_train_epochs) + num_train_samples = self.num_examples(train_dataloader) * args.num_train_epochs + elif args.max_steps > 0: # Rely on max_steps when dataloader does not have a working size + max_steps = args.max_steps + # Setting a very large number of epochs so we go as many times as necessary over the iterator. 
+ num_train_epochs = sys.maxsize + num_update_steps_per_epoch = max_steps + num_examples = total_train_batch_size * args.max_steps + num_train_samples = args.max_steps * total_train_batch_size + else: + raise ValueError( + "args.max_steps must be set to a positive value if dataloader does not have a length, was" + f" {args.max_steps}" + ) + + if DebugOption.UNDERFLOW_OVERFLOW in self.args.debug: + if self.args.n_gpu > 1: + # nn.DataParallel(model) replicates the model, creating new variables and module + # references registered here no longer work on other gpus, breaking the module + raise ValueError( + "Currently --debug underflow_overflow is not supported under DP. Please use DDP" + " (torch.distributed.launch)." + ) + else: + debug_overflow = DebugUnderflowOverflow(self.model) # noqa + + delay_optimizer_creation = ( + self.sharded_ddp is not None + and self.sharded_ddp != ShardedDDPOption.SIMPLE + or is_sagemaker_mp_enabled() + or self.fsdp is not None + ) + if args.deepspeed: + deepspeed_engine, optimizer, lr_scheduler = deepspeed_init( + self, num_training_steps=max_steps, resume_from_checkpoint=resume_from_checkpoint + ) + self.model = deepspeed_engine.module + self.model_wrapped = deepspeed_engine + self.deepspeed = deepspeed_engine + self.optimizer = optimizer + self.lr_scheduler = lr_scheduler + elif not delay_optimizer_creation: + self.create_optimizer_and_scheduler(num_training_steps=max_steps) + + self.state = TrainerState() + self.state.is_hyper_param_search = trial is not None + + # Activate gradient checkpointing if needed + if args.gradient_checkpointing: + self.model.gradient_checkpointing_enable() + + model = self._wrap_model(self.model_wrapped) + + if is_sagemaker_mp_enabled() and resume_from_checkpoint is not None: + self._load_from_checkpoint(resume_from_checkpoint, model) + + # for the rest of this function `model` is the outside model, whether it was wrapped or not + if model is not self.model: + self.model_wrapped = model + + if delay_optimizer_creation: + self.create_optimizer_and_scheduler(num_training_steps=max_steps) + + # Check if saved optimizer or scheduler states exist + self._load_optimizer_and_scheduler(resume_from_checkpoint) + + # important: at this point: + # self.model is the Transformers Model + # self.model_wrapped is DDP(Transformers Model), Deepspeed(Transformers Model), etc. + + # Train! + logger.info("***** Running training *****") + logger.info(f" Num examples = {num_examples}") + logger.info(f" Num Epochs = {num_train_epochs}") + logger.info(f" Instantaneous batch size per device = {args.per_device_train_batch_size}") + logger.info(f" Total train batch size (w. 
parallel, distributed & accumulation) = {total_train_batch_size}") + logger.info(f" Gradient Accumulation steps = {args.gradient_accumulation_steps}") + logger.info(f" Total optimization steps = {max_steps}") + logger.info( + f" Number of trainable parameters = {sum(p.numel() for p in model.parameters() if p.requires_grad)}" + ) + + self.state.epoch = 0 + start_time = time.time() + epochs_trained = 0 + steps_trained_in_current_epoch = 0 + steps_trained_progress_bar = None + + # Check if continuing training from a checkpoint + if resume_from_checkpoint is not None and os.path.isfile( + os.path.join(resume_from_checkpoint, TRAINER_STATE_NAME) + ): + self.state = TrainerState.load_from_json(os.path.join(resume_from_checkpoint, TRAINER_STATE_NAME)) + epochs_trained = self.state.global_step // num_update_steps_per_epoch + if not args.ignore_data_skip: + steps_trained_in_current_epoch = self.state.global_step % (num_update_steps_per_epoch) + steps_trained_in_current_epoch *= args.gradient_accumulation_steps + else: + steps_trained_in_current_epoch = 0 + + logger.info(" Continuing training from checkpoint, will skip to saved global_step") + logger.info(f" Continuing training from epoch {epochs_trained}") + logger.info(f" Continuing training from global step {self.state.global_step}") + if not args.ignore_data_skip: + if skip_first_batches is None: + logger.info( + f" Will skip the first {epochs_trained} epochs then the first" + f" {steps_trained_in_current_epoch} batches in the first epoch. If this takes a lot of time," + " you can install the latest version of Accelerate with `pip install -U accelerate`.You can" + " also add the `--ignore_data_skip` flag to your launch command, but you will resume the" + " training on data already seen by your model." + ) + else: + logger.info( + f" Will skip the first {epochs_trained} epochs then the first" + f" {steps_trained_in_current_epoch} batches in the first epoch." + ) + if self.is_local_process_zero() and not args.disable_tqdm and skip_first_batches is None: + steps_trained_progress_bar = tqdm(total=steps_trained_in_current_epoch) + steps_trained_progress_bar.set_description("Skipping the first batches") + + # Update the references + self.callback_handler.model = self.model + self.callback_handler.optimizer = self.optimizer + self.callback_handler.lr_scheduler = self.lr_scheduler + self.callback_handler.train_dataloader = train_dataloader + if self.hp_name is not None and self._trial is not None: + # use self._trial because the SigOpt/Optuna hpo only call `_hp_search_setup(trial)` instead of passing trial + # parameter to Train when using DDP. + self.state.trial_name = self.hp_name(self._trial) + if trial is not None: + assignments = trial.assignments if self.hp_search_backend == HPSearchBackend.SIGOPT else trial + self.state.trial_params = hp_params(assignments) + else: + self.state.trial_params = None + # This should be the same if the state has been saved but in case the training arguments changed, it's safer + # to set this after the load. 
+ self.state.max_steps = max_steps + self.state.num_train_epochs = num_train_epochs + self.state.is_local_process_zero = self.is_local_process_zero() + self.state.is_world_process_zero = self.is_world_process_zero() + + # tr_loss is a tensor to avoid synchronization of TPUs through .item() + tr_loss = torch.tensor(0.0).to(args.device) + # _total_loss_scalar is updated everytime .item() has to be called on tr_loss and stores the sum of all losses + self._total_loss_scalar = 0.0 + self._globalstep_last_logged = self.state.global_step + model.zero_grad() + + self.control = self.callback_handler.on_train_begin(args, self.state, self.control) + + # Skip the first epochs_trained epochs to get the random state of the dataloader at the right point. + if not args.ignore_data_skip: + for epoch in range(epochs_trained): + is_random_sampler = hasattr(train_dataloader, "sampler") and isinstance( + train_dataloader.sampler, RandomSampler + ) + if is_torch_less_than_1_11 or not is_random_sampler: + # We just need to begin an iteration to create the randomization of the sampler. + # That was before PyTorch 1.11 however... + for _ in train_dataloader: + break + else: + # Otherwise we need to call the whooooole sampler cause there is some random operation added + # AT THE VERY END! + _ = list(train_dataloader.sampler) + + total_batched_samples = 0 + for epoch in range(epochs_trained, num_train_epochs): + if isinstance(train_dataloader, DataLoader) and isinstance(train_dataloader.sampler, DistributedSampler): + train_dataloader.sampler.set_epoch(epoch) + elif hasattr(train_dataloader, "dataset") and isinstance(train_dataloader.dataset, IterableDatasetShard): + train_dataloader.dataset.set_epoch(epoch) + + if is_torch_tpu_available(): + parallel_loader = pl.ParallelLoader(train_dataloader, [args.device]).per_device_loader(args.device) + epoch_iterator = parallel_loader + else: + epoch_iterator = train_dataloader + + # Reset the past mems state at the beginning of each epoch if necessary. 
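+ # Note: `args.past_index` points at the model output (e.g. the `mems` of Transformer-XL/XLNet-style models) that is cached in `self._past` and fed back to the model on the next step.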
+ if args.past_index >= 0: + self._past = None + + steps_in_epoch = ( + len(epoch_iterator) + if len_dataloader is not None + else args.max_steps * args.gradient_accumulation_steps + ) + self.control = self.callback_handler.on_epoch_begin(args, self.state, self.control) + + if epoch == epochs_trained and resume_from_checkpoint is not None and steps_trained_in_current_epoch == 0: + self._load_rng_state(resume_from_checkpoint) + + rng_to_sync = False + steps_skipped = 0 + if skip_first_batches is not None and steps_trained_in_current_epoch > 0: + epoch_iterator = skip_first_batches(epoch_iterator, steps_trained_in_current_epoch) + steps_skipped = steps_trained_in_current_epoch + steps_trained_in_current_epoch = 0 + rng_to_sync = True + + step = -1 + for step, inputs in enumerate(epoch_iterator): + total_batched_samples += 1 + if rng_to_sync: + self._load_rng_state(resume_from_checkpoint) + rng_to_sync = False + + # Skip past any already trained steps if resuming training + if steps_trained_in_current_epoch > 0: + steps_trained_in_current_epoch -= 1 + if steps_trained_progress_bar is not None: + steps_trained_progress_bar.update(1) + if steps_trained_in_current_epoch == 0: + self._load_rng_state(resume_from_checkpoint) + continue + elif steps_trained_progress_bar is not None: + steps_trained_progress_bar.close() + steps_trained_progress_bar = None + + if step % args.gradient_accumulation_steps == 0: + self.control = self.callback_handler.on_step_begin(args, self.state, self.control) + + if ( + (total_batched_samples % args.gradient_accumulation_steps != 0) + and args.local_rank != -1 + and args._no_sync_in_gradient_accumulation + ): + # Avoid unnecessary DDP synchronization since there will be no backward pass on this example. + with model.no_sync(): + tr_loss_step = self.training_step(model, inputs) + else: + tr_loss_step = self.training_step(model, inputs) + + if ( + args.logging_nan_inf_filter + and not is_torch_tpu_available() + and (torch.isnan(tr_loss_step) or torch.isinf(tr_loss_step)) + ): + # if loss is nan or inf simply add the average of previous logged losses + tr_loss += tr_loss / (1 + self.state.global_step - self._globalstep_last_logged) + else: + tr_loss += tr_loss_step + + self.current_flos += float(self.floating_point_ops(inputs)) + + # Optimizer step for deepspeed must be called on every step regardless of the value of gradient_accumulation_steps + if self.deepspeed: + self.deepspeed.step() + + if total_batched_samples % args.gradient_accumulation_steps == 0 or ( + # last step in epoch but step is always smaller than gradient_accumulation_steps + steps_in_epoch <= args.gradient_accumulation_steps + and (step + 1) == steps_in_epoch + ): + # Gradient clipping + if args.max_grad_norm is not None and args.max_grad_norm > 0 and not self.deepspeed: + # deepspeed does its own clipping + + if self.do_grad_scaling: + # Reduce gradients first for XLA + if is_torch_tpu_available(): + gradients = xm._fetch_gradients(self.optimizer) + xm.all_reduce("sum", gradients, scale=1.0 / xm.xrt_world_size()) + # AMP: gradients need unscaling + self.scaler.unscale_(self.optimizer) + + if is_sagemaker_mp_enabled() and args.fp16: + self.optimizer.clip_master_grads(args.max_grad_norm) + elif hasattr(self.optimizer, "clip_grad_norm"): + # Some optimizers (like the sharded optimizer) have a specific way to do gradient clipping + self.optimizer.clip_grad_norm(args.max_grad_norm) + elif hasattr(model, "clip_grad_norm_"): + # Some models (like FullyShardedDDP) have a specific way to do gradient clipping + 
model.clip_grad_norm_(args.max_grad_norm)
+ else:
+ # Revert to normal clipping otherwise, handling Apex or full precision
+ nn.utils.clip_grad_norm_(
+ amp.master_params(self.optimizer) if self.use_apex else model.parameters(),
+ args.max_grad_norm,
+ )
+
+ # Optimizer step
+ optimizer_was_run = True
+ if self.deepspeed:
+ pass # called outside the loop
+ elif is_torch_tpu_available():
+ if self.do_grad_scaling:
+ self.scaler.step(self.optimizer)
+ self.scaler.update()
+ else:
+ xm.optimizer_step(self.optimizer)
+ elif self.do_grad_scaling:
+ scale_before = self.scaler.get_scale()
+ self.scaler.step(self.optimizer)
+ self.scaler.update()
+ scale_after = self.scaler.get_scale()
+ optimizer_was_run = scale_before <= scale_after
+ else:
+ self.optimizer.step()
+
+ if optimizer_was_run and not self.deepspeed:
+ self.lr_scheduler.step()
+
+ model.zero_grad()
+ self.state.global_step += 1
+ self.state.epoch = epoch + (step + 1 + steps_skipped) / steps_in_epoch
+ self.control = self.callback_handler.on_step_end(args, self.state, self.control)
+
+ self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
+ else:
+ self.control = self.callback_handler.on_substep_end(args, self.state, self.control)
+
+ if self.control.should_epoch_stop or self.control.should_training_stop:
+ break
+ if step < 0:
+ logger.warning(
+ "There seem to be no samples in your epoch_iterator, stopping training at step"
+ f" {self.state.global_step}! This is expected if you're using an IterableDataset and set"
+ f" max_steps ({max_steps}) higher than the number of available samples."
+ )
+ self.control.should_training_stop = True
+
+ self.control = self.callback_handler.on_epoch_end(args, self.state, self.control)
+ self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
+
+ if DebugOption.TPU_METRICS_DEBUG in self.args.debug:
+ if is_torch_tpu_available():
+ # tpu-comment: Logging debug metrics for PyTorch/XLA (compile, execute times, ops, etc.)
+ xm.master_print(met.metrics_report())
+ else:
+ logger.warning(
+ "You enabled PyTorch/XLA debug metrics but you don't have a TPU "
+ "configured. Check your training configuration if this is unexpected."
+ )
+ if self.control.should_training_stop:
+ break
+
+ if args.past_index and hasattr(self, "_past"):
+ # Clean the state at the end of training
+ delattr(self, "_past")
+
+ logger.info("\n\nTraining completed. Do not forget to share your model on huggingface.co/models =)\n\n")
+ if args.load_best_model_at_end and self.state.best_model_checkpoint is not None:
+ # Wait for everyone to get here so we are sure the model has been saved by process 0.
+ if is_torch_tpu_available(): + xm.rendezvous("load_best_model_at_end") + elif args.local_rank != -1: + dist.barrier() + elif is_sagemaker_mp_enabled(): + smp.barrier() + + self._load_best_model() + + # add remaining tr_loss + self._total_loss_scalar += tr_loss.item() + train_loss = self._total_loss_scalar / self.state.global_step + + metrics = speed_metrics("train", start_time, num_samples=num_train_samples, num_steps=self.state.max_steps) + self.store_flos() + metrics["total_flos"] = self.state.total_flos + metrics["train_loss"] = train_loss + + self.is_in_train = False + + self._memory_tracker.stop_and_update_metrics(metrics) + + self.log(metrics) + + run_dir = self._get_output_dir(trial) + checkpoints_sorted = self._sorted_checkpoints(use_mtime=False, output_dir=run_dir) + + # Delete the last checkpoint when save_total_limit=1 if it's different from the best checkpoint and process allowed to save. + if self.args.should_save and self.state.best_model_checkpoint is not None and self.args.save_total_limit == 1: + for checkpoint in checkpoints_sorted: + if checkpoint != self.state.best_model_checkpoint: + logger.info(f"Deleting older checkpoint [{checkpoint}] due to args.save_total_limit") + shutil.rmtree(checkpoint) + + self.control = self.callback_handler.on_train_end(args, self.state, self.control) + + return TrainOutput(self.state.global_step, train_loss, metrics) + + def _get_output_dir(self, trial): + if self.hp_search_backend is not None and trial is not None: + if self.hp_search_backend == HPSearchBackend.OPTUNA: + run_id = trial.number + elif self.hp_search_backend == HPSearchBackend.RAY: + from ray import tune + + run_id = tune.get_trial_id() + elif self.hp_search_backend == HPSearchBackend.SIGOPT: + run_id = trial.id + elif self.hp_search_backend == HPSearchBackend.WANDB: + import wandb + + run_id = wandb.run.id + run_name = self.hp_name(trial) if self.hp_name is not None else f"run-{run_id}" + run_dir = os.path.join(self.args.output_dir, run_name) + else: + run_dir = self.args.output_dir + return run_dir + + def _load_from_checkpoint(self, resume_from_checkpoint, model=None): + if model is None: + model = self.model + + if not os.path.isfile(os.path.join(resume_from_checkpoint, WEIGHTS_NAME)) and not os.path.isfile( + os.path.join(resume_from_checkpoint, WEIGHTS_INDEX_NAME) + ): + raise ValueError(f"Can't find a valid checkpoint at {resume_from_checkpoint}") + + logger.info(f"Loading model from {resume_from_checkpoint}.") + + if os.path.isfile(os.path.join(resume_from_checkpoint, CONFIG_NAME)): + config = PretrainedConfig.from_json_file(os.path.join(resume_from_checkpoint, CONFIG_NAME)) + checkpoint_version = config.transformers_version + if checkpoint_version is not None and checkpoint_version != __version__: + logger.warning( + f"You are resuming training from a checkpoint trained with {checkpoint_version} of " + f"Transformers but your current version is {__version__}. This is not recommended and could " + "yield to errors or unwanted behaviors." + ) + + if os.path.isfile(os.path.join(resume_from_checkpoint, WEIGHTS_NAME)): + # If the model is on the GPU, it still works! + if is_sagemaker_mp_enabled(): + if os.path.isfile(os.path.join(resume_from_checkpoint, "user_content.pt")): + # If the 'user_content.pt' file exists, load with the new smp api. + # Checkpoint must have been saved with the new smp api. 
+ smp.resume_from_checkpoint(
+ path=resume_from_checkpoint, tag=WEIGHTS_NAME, partial=False, load_optimizer=False
+ )
+ else:
+ # If the 'user_content.pt' file does NOT exist, load with the old smp api.
+ # Checkpoint must have been saved with the old smp api.
+ if hasattr(self.args, "fp16") and self.args.fp16 is True:
+ logger.warning(
+ "Enabling FP16 and loading from smp < 1.10 checkpoint together is not supported."
+ )
+ state_dict = torch.load(os.path.join(resume_from_checkpoint, WEIGHTS_NAME), map_location="cpu")
+ # Required for smp to not auto-translate state_dict from hf to smp (is already smp).
+ state_dict["_smp_is_partial"] = False
+ load_result = model.load_state_dict(state_dict, strict=True)
+ # release memory
+ del state_dict
+ else:
+ # We load the model state dict on the CPU to avoid an OOM error.
+ state_dict = torch.load(os.path.join(resume_from_checkpoint, WEIGHTS_NAME), map_location="cpu")
+ # workaround for FSDP bug https://github.com/pytorch/pytorch/issues/82963
+ # which takes *args instead of **kwargs
+ load_result = model.load_state_dict(state_dict, False)
+ # release memory
+ del state_dict
+ self._issue_warnings_after_load(load_result)
+ else:
+ # We load the sharded checkpoint
+ load_result = load_sharded_checkpoint(model, resume_from_checkpoint, strict=is_sagemaker_mp_enabled())
+ if not is_sagemaker_mp_enabled():
+ self._issue_warnings_after_load(load_result)
+
+ def _load_best_model(self):
+ logger.info(f"Loading best model from {self.state.best_model_checkpoint} (score: {self.state.best_metric}).")
+ best_model_path = os.path.join(self.state.best_model_checkpoint, WEIGHTS_NAME)
+ model = self.model_wrapped if is_sagemaker_mp_enabled() else self.model
+ if os.path.exists(best_model_path):
+ if self.deepspeed:
+ if self.model_wrapped is not None:
+ # this removes the pre-hooks from the previous engine
+ self.model_wrapped.destroy()
+ self.model_wrapped = None
+
+ # temp hack until Deepspeed fixes the problem with resume from an existing engine that did some stepping
+ deepspeed_engine, optimizer, lr_scheduler = deepspeed_init(
+ self,
+ num_training_steps=self.args.max_steps,
+ resume_from_checkpoint=self.state.best_model_checkpoint,
+ )
+ self.model = deepspeed_engine.module
+ self.model_wrapped = deepspeed_engine
+ self.deepspeed = deepspeed_engine
+ self.optimizer = optimizer
+ self.lr_scheduler = lr_scheduler
+ else:
+ if is_sagemaker_mp_enabled():
+ if os.path.isfile(os.path.join(self.state.best_model_checkpoint, "user_content.pt")):
+ # If the 'user_content.pt' file exists, load with the new smp api.
+ # Checkpoint must have been saved with the new smp api.
+ smp.resume_from_checkpoint(
+ path=self.state.best_model_checkpoint,
+ tag=WEIGHTS_NAME,
+ partial=False,
+ load_optimizer=False,
+ )
+ else:
+ # If the 'user_content.pt' file does NOT exist, load with the old smp api.
+ # Checkpoint must have been saved with the old smp api.
+ state_dict = torch.load(best_model_path, map_location="cpu")
+ state_dict["_smp_is_partial"] = False
+ load_result = model.load_state_dict(state_dict, strict=True)
+ else:
+ # We load the model state dict on the CPU to avoid an OOM error.
+ state_dict = torch.load(best_model_path, map_location="cpu")
+ # If the model is on the GPU, it still works!
+ # workaround for FSDP bug https://github.com/pytorch/pytorch/issues/82963 + # which takes *args instead of **kwargs + load_result = model.load_state_dict(state_dict, False) + if not is_sagemaker_mp_enabled(): + self._issue_warnings_after_load(load_result) + elif os.path.exists(os.path.join(self.state.best_model_checkpoint, WEIGHTS_INDEX_NAME)): + load_result = load_sharded_checkpoint( + model, self.state.best_model_checkpoint, strict=is_sagemaker_mp_enabled() + ) + if not is_sagemaker_mp_enabled(): + self._issue_warnings_after_load(load_result) + else: + logger.warning( + f"Could not locate the best model at {best_model_path}, if you are running a distributed training " + "on multiple nodes, you should activate `--save_on_each_node`." + ) + + def _issue_warnings_after_load(self, load_result): + if len(load_result.missing_keys) != 0: + if self.model._keys_to_ignore_on_save is not None and set(load_result.missing_keys) == set( + self.model._keys_to_ignore_on_save + ): + self.model.tie_weights() + else: + logger.warning(f"There were missing keys in the checkpoint model loaded: {load_result.missing_keys}.") + if len(load_result.unexpected_keys) != 0: + logger.warning( + f"There were unexpected keys in the checkpoint model loaded: {load_result.unexpected_keys}." + ) + + def _maybe_log_save_evaluate(self, tr_loss, model, trial, epoch, ignore_keys_for_eval): + if self.control.should_log: + if is_torch_tpu_available(): + xm.mark_step() + + logs: Dict[str, float] = {} + + # all_gather + mean() to get average loss over all processes + tr_loss_scalar = self._nested_gather(tr_loss).mean().item() + + # reset tr_loss to zero + tr_loss -= tr_loss + + logs["loss"] = round(tr_loss_scalar / (self.state.global_step - self._globalstep_last_logged), 4) + logs["learning_rate"] = self._get_learning_rate() + + self._total_loss_scalar += tr_loss_scalar + self._globalstep_last_logged = self.state.global_step + self.store_flos() + + self.log(logs) + + metrics = None + if self.control.should_evaluate: + if isinstance(self.eval_dataset, dict): + for eval_dataset_name, eval_dataset in self.eval_dataset.items(): + metrics = self.evaluate( + eval_dataset=eval_dataset, + ignore_keys=ignore_keys_for_eval, + metric_key_prefix=f"eval_{eval_dataset_name}", + ) + else: + metrics = self.evaluate(ignore_keys=ignore_keys_for_eval) + self._report_to_hp_search(trial, self.state.global_step, metrics) + + if self.control.should_save: + self._save_checkpoint(model, trial, metrics=metrics) + self.control = self.callback_handler.on_save(self.args, self.state, self.control) + + def _load_rng_state(self, checkpoint): + # Load RNG states from `checkpoint` + if checkpoint is None: + return + + if self.args.world_size > 1: + process_index = self.args.process_index + rng_file = os.path.join(checkpoint, f"rng_state_{process_index}.pth") + if not os.path.isfile(rng_file): + logger.info( + f"Didn't find an RNG file for process {process_index}, if you are resuming a training that " + "wasn't launched in a distributed fashion, reproducibility is not guaranteed." + ) + return + else: + rng_file = os.path.join(checkpoint, "rng_state.pth") + if not os.path.isfile(rng_file): + logger.info( + "Didn't find an RNG file, if you are resuming a training that was launched in a distributed " + "fashion, reproducibility is not guaranteed." 
+ ) + return + + checkpoint_rng_state = torch.load(rng_file) + random.setstate(checkpoint_rng_state["python"]) + np.random.set_state(checkpoint_rng_state["numpy"]) + torch.random.set_rng_state(checkpoint_rng_state["cpu"]) + if torch.cuda.is_available(): + if self.args.local_rank != -1: + torch.cuda.random.set_rng_state(checkpoint_rng_state["cuda"]) + else: + try: + torch.cuda.random.set_rng_state_all(checkpoint_rng_state["cuda"]) + except Exception as e: + logger.info( + f"Didn't manage to set back the RNG states of the GPU because of the following error:\n {e}" + "\nThis won't yield the same results as if the training had not been interrupted." + ) + if is_torch_tpu_available(): + xm.set_rng_state(checkpoint_rng_state["xla"]) + + def _save_checkpoint(self, model, trial, metrics=None): + # In all cases, including ddp/dp/deepspeed, self.model is always a reference to the model we + # want to save except FullyShardedDDP. + # assert unwrap_model(model) is self.model, "internal model should be a reference to self.model" + + # Save model checkpoint + checkpoint_folder = f"{PREFIX_CHECKPOINT_DIR}-{self.state.global_step}" + + if self.hp_search_backend is None and trial is None: + self.store_flos() + + run_dir = self._get_output_dir(trial=trial) + output_dir = os.path.join(run_dir, checkpoint_folder) + self.save_model(output_dir, _internal_call=True) + if self.deepspeed: + # under zero3 model file itself doesn't get saved since it's bogus! Unless deepspeed + # config `stage3_gather_16bit_weights_on_model_save` is True + self.deepspeed.save_checkpoint(output_dir) + + # Save optimizer and scheduler + if self.sharded_ddp == ShardedDDPOption.SIMPLE: + self.optimizer.consolidate_state_dict() + + if is_torch_tpu_available(): + xm.rendezvous("saving_optimizer_states") + xm.save(self.optimizer.state_dict(), os.path.join(output_dir, OPTIMIZER_NAME)) + with warnings.catch_warnings(record=True) as caught_warnings: + xm.save(self.lr_scheduler.state_dict(), os.path.join(output_dir, SCHEDULER_NAME)) + reissue_pt_warnings(caught_warnings) + elif is_sagemaker_mp_enabled(): + opt_state_dict = self.optimizer.local_state_dict(gather_if_shard=False) + smp.barrier() + if smp.rdp_rank() == 0 or smp.state.cfg.shard_optimizer_state: + smp.save( + opt_state_dict, + os.path.join(output_dir, OPTIMIZER_NAME), + partial=True, + v3=smp.state.cfg.shard_optimizer_state, + ) + if self.args.should_save: + with warnings.catch_warnings(record=True) as caught_warnings: + torch.save(self.lr_scheduler.state_dict(), os.path.join(output_dir, SCHEDULER_NAME)) + reissue_pt_warnings(caught_warnings) + if self.do_grad_scaling: + torch.save(self.scaler.state_dict(), os.path.join(output_dir, SCALER_NAME)) + elif self.args.should_save and not self.deepspeed: + # deepspeed.save_checkpoint above saves model/optim/sched + torch.save(self.optimizer.state_dict(), os.path.join(output_dir, OPTIMIZER_NAME)) + with warnings.catch_warnings(record=True) as caught_warnings: + torch.save(self.lr_scheduler.state_dict(), os.path.join(output_dir, SCHEDULER_NAME)) + reissue_pt_warnings(caught_warnings) + if self.do_grad_scaling: + torch.save(self.scaler.state_dict(), os.path.join(output_dir, SCALER_NAME)) + + # Determine the new best metric / best model checkpoint + if metrics is not None and self.args.metric_for_best_model is not None: + metric_to_check = self.args.metric_for_best_model + if not metric_to_check.startswith("eval_"): + metric_to_check = f"eval_{metric_to_check}" + metric_value = metrics[metric_to_check] + + operator = np.greater if 
self.args.greater_is_better else np.less + if ( + self.state.best_metric is None + or self.state.best_model_checkpoint is None + or operator(metric_value, self.state.best_metric) + ): + self.state.best_metric = metric_value + self.state.best_model_checkpoint = output_dir + + # Save the Trainer state + if self.args.should_save: + self.state.save_to_json(os.path.join(output_dir, TRAINER_STATE_NAME)) + + # Save RNG state in non-distributed training + rng_states = { + "python": random.getstate(), + "numpy": np.random.get_state(), + "cpu": torch.random.get_rng_state(), + } + if torch.cuda.is_available(): + if self.args.local_rank == -1: + # In non distributed, we save the global CUDA RNG state (will take care of DataParallel) + rng_states["cuda"] = torch.cuda.random.get_rng_state_all() + else: + rng_states["cuda"] = torch.cuda.random.get_rng_state() + + if is_torch_tpu_available(): + rng_states["xla"] = xm.get_rng_state() + + # A process can arrive here before the process 0 has a chance to save the model, in which case output_dir may + # not yet exist. + os.makedirs(output_dir, exist_ok=True) + + if self.args.world_size <= 1: + torch.save(rng_states, os.path.join(output_dir, "rng_state.pth")) + else: + torch.save(rng_states, os.path.join(output_dir, f"rng_state_{self.args.process_index}.pth")) + + if self.args.push_to_hub: + self._push_from_checkpoint(output_dir) + + # Maybe delete some older checkpoints. + if self.args.should_save: + self._rotate_checkpoints(use_mtime=True, output_dir=run_dir) + + def _load_optimizer_and_scheduler(self, checkpoint): + """If optimizer and scheduler states exist, load them.""" + if checkpoint is None: + return + + if self.deepspeed: + # deepspeed loads optimizer/lr_scheduler together with the model in deepspeed_init + return + + checkpoint_file_exists = ( + glob.glob(os.path.join(checkpoint, OPTIMIZER_NAME) + "_*") + if is_sagemaker_mp_enabled() + else os.path.isfile(os.path.join(checkpoint, OPTIMIZER_NAME)) + ) + if checkpoint_file_exists and os.path.isfile(os.path.join(checkpoint, SCHEDULER_NAME)): + # Load in optimizer and scheduler states + if is_torch_tpu_available(): + # On TPU we have to take some extra precautions to properly load the states on the right device. 
+ optimizer_state = torch.load(os.path.join(checkpoint, OPTIMIZER_NAME), map_location="cpu")
+ with warnings.catch_warnings(record=True) as caught_warnings:
+ lr_scheduler_state = torch.load(os.path.join(checkpoint, SCHEDULER_NAME), map_location="cpu")
+ reissue_pt_warnings(caught_warnings)
+
+ xm.send_cpu_data_to_device(optimizer_state, self.args.device)
+ xm.send_cpu_data_to_device(lr_scheduler_state, self.args.device)
+
+ self.optimizer.load_state_dict(optimizer_state)
+ self.lr_scheduler.load_state_dict(lr_scheduler_state)
+ else:
+ map_location = "cpu" if is_sagemaker_mp_enabled() else self.args.device
+ if is_sagemaker_mp_enabled():
+ if os.path.isfile(os.path.join(checkpoint, "user_content.pt")):
+ # Optimizer checkpoint was saved with smp >= 1.10
+ def opt_load_hook(mod, opt):
+ opt.load_state_dict(smp.load(os.path.join(checkpoint, OPTIMIZER_NAME), partial=True))
+
+ else:
+ # Optimizer checkpoint was saved with smp < 1.10
+ def opt_load_hook(mod, opt):
+ if IS_SAGEMAKER_MP_POST_1_10:
+ opt.load_state_dict(
+ smp.load(os.path.join(checkpoint, OPTIMIZER_NAME), partial=True, back_compat=True)
+ )
+ else:
+ opt.load_state_dict(smp.load(os.path.join(checkpoint, OPTIMIZER_NAME), partial=True))
+
+ self.model_wrapped.register_post_step_hook(opt_load_hook)
+ else:
+ self.optimizer.load_state_dict(
+ torch.load(os.path.join(checkpoint, OPTIMIZER_NAME), map_location=map_location)
+ )
+ with warnings.catch_warnings(record=True) as caught_warnings:
+ self.lr_scheduler.load_state_dict(torch.load(os.path.join(checkpoint, SCHEDULER_NAME)))
+ reissue_pt_warnings(caught_warnings)
+ if self.do_grad_scaling and os.path.isfile(os.path.join(checkpoint, SCALER_NAME)):
+ self.scaler.load_state_dict(torch.load(os.path.join(checkpoint, SCALER_NAME)))
+
+ def hyperparameter_search(
+ self,
+ hp_space: Optional[Callable[["optuna.Trial"], Dict[str, float]]] = None,
+ compute_objective: Optional[Callable[[Dict[str, float]], float]] = None,
+ n_trials: int = 20,
+ direction: str = "minimize",
+ backend: Optional[Union["str", HPSearchBackend]] = None,
+ hp_name: Optional[Callable[["optuna.Trial"], str]] = None,
+ **kwargs,
+ ) -> BestRun:
+ """
+ Launch a hyperparameter search using `optuna` or `Ray Tune` or `SigOpt`. The optimized quantity is determined
+ by `compute_objective`, which defaults to a function returning the evaluation loss when no metric is provided,
+ the sum of all metrics otherwise.
+
+
+
+ To use this method, you need to have provided a `model_init` when initializing your [`Trainer`]: we need to
+ reinitialize the model at each new run. This is incompatible with the `optimizers` argument, so you need to
+ subclass [`Trainer`] and override the method [`~Trainer.create_optimizer_and_scheduler`] for custom
+ optimizer/scheduler.
+
+
+
+ Args:
+ hp_space (`Callable[["optuna.Trial"], Dict[str, float]]`, *optional*):
+ A function that defines the hyperparameter search space. Will default to
+ [`~trainer_utils.default_hp_space_optuna`] or [`~trainer_utils.default_hp_space_ray`] or
+ [`~trainer_utils.default_hp_space_sigopt`] depending on your backend.
+ compute_objective (`Callable[[Dict[str, float]], float]`, *optional*):
+ A function computing the objective to minimize or maximize from the metrics returned by the `evaluate`
+ method. Will default to [`~trainer_utils.default_compute_objective`].
+ n_trials (`int`, *optional*, defaults to 20):
+ The number of trial runs to test.
+ direction (`str`, *optional*, defaults to `"minimize"`):
+ Whether to optimize for a greater or lower objective. 
Can be `"minimize"` or `"maximize"`, you should pick + `"minimize"` when optimizing the validation loss, `"maximize"` when optimizing one or several metrics. + backend (`str` or [`~training_utils.HPSearchBackend`], *optional*): + The backend to use for hyperparameter search. Will default to optuna or Ray Tune or SigOpt, depending + on which one is installed. If all are installed, will default to optuna. + hp_name (`Callable[["optuna.Trial"], str]]`, *optional*): + A function that defines the trial/run name. Will default to None. + kwargs (`Dict[str, Any]`, *optional*): + Additional keyword arguments passed along to `optuna.create_study` or `ray.tune.run`. For more + information see: + + - the documentation of + [optuna.create_study](https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html) + - the documentation of [tune.run](https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run) + - the documentation of [sigopt](https://app.sigopt.com/docs/endpoints/experiments/create) + + Returns: + [`trainer_utils.BestRun`]: All the information about the best run. Experiment summary can be found in + `run_summary` attribute for Ray backend. + """ + if backend is None: + backend = default_hp_search_backend() + if backend is None: + raise RuntimeError( + "At least one of optuna or ray should be installed. " + "To install optuna run `pip install optuna`. " + "To install ray run `pip install ray[tune]`. " + "To install sigopt run `pip install sigopt`." + ) + backend = HPSearchBackend(backend) + if backend == HPSearchBackend.OPTUNA and not is_optuna_available(): + raise RuntimeError("You picked the optuna backend, but it is not installed. Use `pip install optuna`.") + if backend == HPSearchBackend.RAY and not is_ray_tune_available(): + raise RuntimeError( + "You picked the Ray Tune backend, but it is not installed. Use `pip install 'ray[tune]'`." + ) + if backend == HPSearchBackend.SIGOPT and not is_sigopt_available(): + raise RuntimeError("You picked the sigopt backend, but it is not installed. Use `pip install sigopt`.") + if backend == HPSearchBackend.WANDB and not is_wandb_available(): + raise RuntimeError("You picked the wandb backend, but it is not installed. Use `pip install wandb`.") + self.hp_search_backend = backend + if self.model_init is None: + raise RuntimeError( + "To use hyperparameter search, you need to pass your model through a model_init function." + ) + + self.hp_space = default_hp_space[backend] if hp_space is None else hp_space + self.hp_name = hp_name + self.compute_objective = default_compute_objective if compute_objective is None else compute_objective + + backend_dict = { + HPSearchBackend.OPTUNA: run_hp_search_optuna, + HPSearchBackend.RAY: run_hp_search_ray, + HPSearchBackend.SIGOPT: run_hp_search_sigopt, + HPSearchBackend.WANDB: run_hp_search_wandb, + } + best_run = backend_dict[backend](self, n_trials, direction, **kwargs) + + self.hp_search_backend = None + return best_run + + def log(self, logs: Dict[str, float]) -> None: + """ + Log `logs` on the various objects watching training. + + Subclass and override this method to inject custom behavior. + + Args: + logs (`Dict[str, float]`): + The values to log. 
+ """ + if self.state.epoch is not None: + logs["epoch"] = round(self.state.epoch, 2) + + output = {**logs, **{"step": self.state.global_step}} + self.state.log_history.append(output) + self.control = self.callback_handler.on_log(self.args, self.state, self.control, logs) + + def _prepare_input(self, data: Union[torch.Tensor, Any]) -> Union[torch.Tensor, Any]: + """ + Prepares one `data` before feeding it to the model, be it a tensor or a nested list/dictionary of tensors. + """ + if isinstance(data, Mapping): + return type(data)({k: self._prepare_input(v) for k, v in data.items()}) + elif isinstance(data, (tuple, list)): + return type(data)(self._prepare_input(v) for v in data) + elif isinstance(data, torch.Tensor): + kwargs = {"device": self.args.device} + if self.deepspeed and (torch.is_floating_point(data) or torch.is_complex(data)): + # NLP models inputs are int/uint and those get adjusted to the right dtype of the + # embedding. Other models such as wav2vec2's inputs are already float and thus + # may need special handling to match the dtypes of the model + kwargs.update({"dtype": self.args.hf_deepspeed_config.dtype()}) + return data.to(**kwargs) + return data + + def _prepare_inputs(self, inputs: Dict[str, Union[torch.Tensor, Any]]) -> Dict[str, Union[torch.Tensor, Any]]: + """ + Prepare `inputs` before feeding them to the model, converting them to tensors if they are not already and + handling potential state. + """ + inputs = self._prepare_input(inputs) + if len(inputs) == 0: + raise ValueError( + "The batch received was empty, your model won't be able to train on it. Double-check that your " + f"training dataset contains keys expected by the model: {','.join(self._signature_columns)}." + ) + if self.args.past_index >= 0 and self._past is not None: + inputs["mems"] = self._past + + return inputs + + def compute_loss_context_manager(self): + """ + A helper wrapper to group together context managers. + """ + return self.autocast_smart_context_manager() + + def autocast_smart_context_manager(self, cache_enabled: Optional[bool] = True): + """ + A helper wrapper that creates an appropriate context manager for `autocast` while feeding it the desired + arguments, depending on the situation. + """ + if self.use_cuda_amp or self.use_cpu_amp: + if is_torch_greater_or_equal_than_1_10: + ctx_manager = ( + torch.cpu.amp.autocast(cache_enabled=cache_enabled, dtype=self.amp_dtype) + if self.use_cpu_amp + else torch.cuda.amp.autocast(cache_enabled=cache_enabled, dtype=self.amp_dtype) + ) + else: + ctx_manager = torch.cuda.amp.autocast() + else: + ctx_manager = contextlib.nullcontext() if sys.version_info >= (3, 7) else contextlib.suppress() + + return ctx_manager + + def training_step(self, model: nn.Module, inputs: Dict[str, Union[torch.Tensor, Any]]) -> torch.Tensor: + """ + Perform a training step on a batch of inputs. + + Subclass and override to inject custom behavior. + + Args: + model (`nn.Module`): + The model to train. + inputs (`Dict[str, Union[torch.Tensor, Any]]`): + The inputs and targets of the model. + + The dictionary will be unpacked before being fed to the model. Most models expect the targets under the + argument `labels`. Check your model's documentation for all accepted arguments. + + Return: + `torch.Tensor`: The tensor with training loss on this batch. 
+ """ + model.train() + inputs = self._prepare_inputs(inputs) + + if is_sagemaker_mp_enabled(): + loss_mb = smp_forward_backward(model, inputs, self.args.gradient_accumulation_steps) + return loss_mb.reduce_mean().detach().to(self.args.device) + + with self.compute_loss_context_manager(): + loss = self.compute_loss(model, inputs) + + if self.args.n_gpu > 1: + loss = loss.mean() # mean() to average on multi-gpu parallel training + + if self.args.gradient_accumulation_steps > 1 and not self.deepspeed: + # deepspeed handles loss scaling by gradient_accumulation_steps in its `backward` + loss = loss / self.args.gradient_accumulation_steps + + if self.do_grad_scaling: + self.scaler.scale(loss).backward() + elif self.use_apex: + with amp.scale_loss(loss, self.optimizer) as scaled_loss: + scaled_loss.backward() + elif self.deepspeed: + # loss gets scaled under gradient_accumulation_steps in deepspeed + loss = self.deepspeed.backward(loss) + else: + loss.backward() + + return loss.detach() + + def compute_loss(self, model, inputs, return_outputs=False): + """ + How the loss is computed by Trainer. By default, all models return the loss in the first element. + + Subclass and override for custom behavior. + """ + if self.label_smoother is not None and "labels" in inputs: + labels = inputs.pop("labels") + else: + labels = None + outputs = model(**inputs) + # Save past state if it exists + # TODO: this needs to be fixed and made cleaner later. + if self.args.past_index >= 0: + self._past = outputs[self.args.past_index] + + if labels is not None: + if unwrap_model(model)._get_name() in MODEL_FOR_CAUSAL_LM_MAPPING_NAMES.values(): + loss = self.label_smoother(outputs, labels, shift_labels=True) + else: + loss = self.label_smoother(outputs, labels) + else: + if isinstance(outputs, dict) and "loss" not in outputs: + raise ValueError( + "The model did not return a loss from the inputs, only the following keys: " + f"{','.join(outputs.keys())}. For reference, the inputs it received are {','.join(inputs.keys())}." + ) + # We don't use .loss here since the model may return tuples instead of ModelOutput. + loss = outputs["loss"] if isinstance(outputs, dict) else outputs[0] + + return (loss, outputs) if return_outputs else loss + + def is_local_process_zero(self) -> bool: + """ + Whether or not this process is the local (e.g., on one machine if training in a distributed fashion on several + machines) main process. + """ + return self.args.local_process_index == 0 + + def is_world_process_zero(self) -> bool: + """ + Whether or not this process is the global main process (when training in a distributed fashion on several + machines, this is only going to be `True` for one process). + """ + # Special case for SageMaker ModelParallel since there process_index is dp_process_index, not the global + # process index. + if is_sagemaker_mp_enabled(): + return smp.rank() == 0 + else: + return self.args.process_index == 0 + + def save_model(self, output_dir: Optional[str] = None, _internal_call: bool = False): + """ + Will save the model, so you can reload it using `from_pretrained()`. + + Will only save from the main process. + """ + + if output_dir is None: + output_dir = self.args.output_dir + + if is_torch_tpu_available(): + self._save_tpu(output_dir) + elif is_sagemaker_mp_enabled(): + # Calling the state_dict needs to be done on the wrapped model and on all processes. 
+ os.makedirs(output_dir, exist_ok=True) + state_dict = self.model_wrapped.state_dict() + if self.args.should_save: + self._save(output_dir, state_dict=state_dict) + if IS_SAGEMAKER_MP_POST_1_10: + # 'user_content.pt' indicates model state_dict saved with smp >= 1.10 + Path(os.path.join(output_dir, "user_content.pt")).touch() + elif ( + ShardedDDPOption.ZERO_DP_2 in self.args.sharded_ddp + or ShardedDDPOption.ZERO_DP_3 in self.args.sharded_ddp + or self.fsdp is not None + ): + state_dict = self.model.state_dict() + + if self.args.should_save: + self._save(output_dir, state_dict=state_dict) + elif self.deepspeed: + # this takes care of everything as long as we aren't under zero3 + if self.args.should_save: + self._save(output_dir) + + if is_deepspeed_zero3_enabled(): + # It's too complicated to try to override different places where the weights dump gets + # saved, so since under zero3 the file is bogus, simply delete it. The user should + # either user deepspeed checkpoint to resume or to recover full weights use + # zero_to_fp32.py stored in the checkpoint. + if self.args.should_save: + file = os.path.join(output_dir, WEIGHTS_NAME) + if os.path.isfile(file): + # logger.info(f"deepspeed zero3: removing {file}, see zero_to_fp32.py to recover weights") + os.remove(file) + + # now save the real model if stage3_gather_16bit_weights_on_model_save=True + # if false it will not be saved. + # This must be called on all ranks + if not self.deepspeed.save_16bit_model(output_dir, WEIGHTS_NAME): + logger.warning( + "deepspeed.save_16bit_model didn't save the model, since" + " stage3_gather_16bit_weights_on_model_save=false. Saving the full checkpoint instead, use" + " zero_to_fp32.py to recover weights" + ) + self.deepspeed.save_checkpoint(output_dir) + + elif self.args.should_save: + self._save(output_dir) + + # Push to the Hub when `save_model` is called by the user. + if self.args.push_to_hub and not _internal_call: + self.push_to_hub(commit_message="Model save") + + def _save_tpu(self, output_dir: Optional[str] = None): + output_dir = output_dir if output_dir is not None else self.args.output_dir + logger.info(f"Saving model checkpoint to {output_dir}") + + if xm.is_master_ordinal(): + os.makedirs(output_dir, exist_ok=True) + torch.save(self.args, os.path.join(output_dir, TRAINING_ARGS_NAME)) + + # Save a trained model and configuration using `save_pretrained()`. + # They can then be reloaded using `from_pretrained()` + xm.rendezvous("saving_checkpoint") + if not isinstance(self.model, PreTrainedModel): + if isinstance(unwrap_model(self.model), PreTrainedModel): + unwrap_model(self.model).save_pretrained( + output_dir, + is_main_process=self.args.should_save, + state_dict=self.model.state_dict(), + save_function=xm.save, + ) + else: + logger.info("Trainer.model is not a `PreTrainedModel`, only saving its state dict.") + state_dict = self.model.state_dict() + xm.save(state_dict, os.path.join(output_dir, WEIGHTS_NAME)) + else: + self.model.save_pretrained(output_dir, is_main_process=self.args.should_save, save_function=xm.save) + if self.tokenizer is not None and self.args.should_save: + self.tokenizer.save_pretrained(output_dir) + + def _save(self, output_dir: Optional[str] = None, state_dict=None): + # If we are executing this function, we are the process zero, so we don't check for that. 
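+        # The `save_prefixencoder` branch below is the ChatGLM-6B P-Tuning modification:
+        # when it is set, only parameters with `requires_grad=True` (the PrefixEncoder
+        # soft-prompt weights) are written to `pytorch_model.bin`, so the saved
+        # checkpoint contains just the trained prefix instead of the full model.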
+        output_dir = output_dir if output_dir is not None else self.args.output_dir
+        os.makedirs(output_dir, exist_ok=True)
+        logger.info(f"Saving model checkpoint to {output_dir}")
+        # Save a trained model and configuration using `save_pretrained()`.
+        # They can then be reloaded using `from_pretrained()`
+        if not isinstance(self.model, PreTrainedModel):
+            if isinstance(unwrap_model(self.model), PreTrainedModel):
+                if state_dict is None:
+                    state_dict = self.model.state_dict()
+                unwrap_model(self.model).save_pretrained(output_dir, state_dict=state_dict)
+            else:
+                logger.info("Trainer.model is not a `PreTrainedModel`, only saving its state dict.")
+                if state_dict is None:
+                    state_dict = self.model.state_dict()
+                torch.save(state_dict, os.path.join(output_dir, WEIGHTS_NAME))
+        else:
+            if self.save_prefixencoder:
+                print("Saving PrefixEncoder")
+                state_dict = self.model.state_dict()
+                filtered_state_dict = {}
+                for k, v in self.model.named_parameters():
+                    if v.requires_grad:
+                        filtered_state_dict[k] = state_dict[k]
+                self.model.save_pretrained(output_dir, state_dict=filtered_state_dict)
+            else:
+                print("Saving the whole model")
+                self.model.save_pretrained(output_dir, state_dict=state_dict)
+        if self.tokenizer is not None:
+            self.tokenizer.save_pretrained(output_dir)
+
+        # Good practice: save your training arguments together with the trained model
+        torch.save(self.args, os.path.join(output_dir, TRAINING_ARGS_NAME))
+
+    def store_flos(self):
+        # Storing the number of floating-point operations that went into the model
+        if self.args.local_rank != -1:
+            self.state.total_flos += (
+                distributed_broadcast_scalars([self.current_flos], device=self.args.device).sum().item()
+            )
+            self.current_flos = 0
+        else:
+            self.state.total_flos += self.current_flos
+            self.current_flos = 0
+
+    def _sorted_checkpoints(
+        self, output_dir=None, checkpoint_prefix=PREFIX_CHECKPOINT_DIR, use_mtime=False
+    ) -> List[str]:
+        ordering_and_checkpoint_path = []
+
+        glob_checkpoints = [str(x) for x in Path(output_dir).glob(f"{checkpoint_prefix}-*") if os.path.isdir(x)]
+
+        for path in glob_checkpoints:
+            if use_mtime:
+                ordering_and_checkpoint_path.append((os.path.getmtime(path), path))
+            else:
+                regex_match = re.match(f".*{checkpoint_prefix}-([0-9]+)", path)
+                if regex_match is not None and regex_match.groups() is not None:
+                    ordering_and_checkpoint_path.append((int(regex_match.groups()[0]), path))
+
+        checkpoints_sorted = sorted(ordering_and_checkpoint_path)
+        checkpoints_sorted = [checkpoint[1] for checkpoint in checkpoints_sorted]
+        # Make sure we don't delete the best model.
+        if self.state.best_model_checkpoint is not None:
+            best_model_index = checkpoints_sorted.index(str(Path(self.state.best_model_checkpoint)))
+            for i in range(best_model_index, len(checkpoints_sorted) - 2):
+                checkpoints_sorted[i], checkpoints_sorted[i + 1] = checkpoints_sorted[i + 1], checkpoints_sorted[i]
+        return checkpoints_sorted
+
+    def _rotate_checkpoints(self, use_mtime=False, output_dir=None) -> None:
+        if self.args.save_total_limit is None or self.args.save_total_limit <= 0:
+            return
+
+        # Check if we should delete older checkpoint(s)
+        checkpoints_sorted = self._sorted_checkpoints(use_mtime=use_mtime, output_dir=output_dir)
+        if len(checkpoints_sorted) <= self.args.save_total_limit:
+            return
+
+        # If save_total_limit=1 with load_best_model_at_end=True, we could end up deleting the last checkpoint, which
+        # we don't do to allow resuming.
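+        # Example: with save_total_limit=1 and a best checkpoint that is older than the
+        # latest one, the limit is temporarily raised to 2 so the most recent checkpoint
+        # survives and training can still be resumed from it.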
+ save_total_limit = self.args.save_total_limit + if ( + self.state.best_model_checkpoint is not None + and self.args.save_total_limit == 1 + and checkpoints_sorted[-1] != self.state.best_model_checkpoint + ): + save_total_limit = 2 + + number_of_checkpoints_to_delete = max(0, len(checkpoints_sorted) - save_total_limit) + checkpoints_to_be_deleted = checkpoints_sorted[:number_of_checkpoints_to_delete] + for checkpoint in checkpoints_to_be_deleted: + logger.info(f"Deleting older checkpoint [{checkpoint}] due to args.save_total_limit") + shutil.rmtree(checkpoint, ignore_errors=True) + + def evaluate( + self, + eval_dataset: Optional[Dataset] = None, + ignore_keys: Optional[List[str]] = None, + metric_key_prefix: str = "eval", + ) -> Dict[str, float]: + """ + Run evaluation and returns metrics. + + The calling script will be responsible for providing a method to compute metrics, as they are task-dependent + (pass it to the init `compute_metrics` argument). + + You can also subclass and override this method to inject custom behavior. + + Args: + eval_dataset (`Dataset`, *optional*): + Pass a dataset if you wish to override `self.eval_dataset`. If it is a [`~datasets.Dataset`], columns + not accepted by the `model.forward()` method are automatically removed. It must implement the `__len__` + method. + ignore_keys (`Lst[str]`, *optional*): + A list of keys in the output of your model (if it is a dictionary) that should be ignored when + gathering predictions. + metric_key_prefix (`str`, *optional*, defaults to `"eval"`): + An optional prefix to be used as the metrics key prefix. For example the metrics "bleu" will be named + "eval_bleu" if the prefix is "eval" (default) + + Returns: + A dictionary containing the evaluation loss and the potential metrics computed from the predictions. The + dictionary also contains the epoch number which comes from the training state. + """ + # memory metrics - must set up as early as possible + self._memory_tracker.start() + + eval_dataloader = self.get_eval_dataloader(eval_dataset) + start_time = time.time() + + eval_loop = self.prediction_loop if self.args.use_legacy_prediction_loop else self.evaluation_loop + output = eval_loop( + eval_dataloader, + description="Evaluation", + # No point gathering the predictions if there are no metrics, otherwise we defer to + # self.args.prediction_loss_only + prediction_loss_only=True if self.compute_metrics is None else None, + ignore_keys=ignore_keys, + metric_key_prefix=metric_key_prefix, + ) + + total_batch_size = self.args.eval_batch_size * self.args.world_size + if f"{metric_key_prefix}_jit_compilation_time" in output.metrics: + start_time += output.metrics[f"{metric_key_prefix}_jit_compilation_time"] + output.metrics.update( + speed_metrics( + metric_key_prefix, + start_time, + num_samples=output.num_samples, + num_steps=math.ceil(output.num_samples / total_batch_size), + ) + ) + + self.log(output.metrics) + + if DebugOption.TPU_METRICS_DEBUG in self.args.debug: + # tpu-comment: Logging debug metrics for PyTorch/XLA (compile, execute times, ops, etc.) + xm.master_print(met.metrics_report()) + + self.control = self.callback_handler.on_evaluate(self.args, self.state, self.control, output.metrics) + + self._memory_tracker.stop_and_update_metrics(output.metrics) + + return output.metrics + + def predict( + self, test_dataset: Dataset, ignore_keys: Optional[List[str]] = None, metric_key_prefix: str = "test" + ) -> PredictionOutput: + """ + Run prediction and returns predictions and potential metrics. 
+ + Depending on the dataset and your use case, your test dataset may contain labels. In that case, this method + will also return metrics, like in `evaluate()`. + + Args: + test_dataset (`Dataset`): + Dataset to run the predictions on. If it is an `datasets.Dataset`, columns not accepted by the + `model.forward()` method are automatically removed. Has to implement the method `__len__` + ignore_keys (`Lst[str]`, *optional*): + A list of keys in the output of your model (if it is a dictionary) that should be ignored when + gathering predictions. + metric_key_prefix (`str`, *optional*, defaults to `"test"`): + An optional prefix to be used as the metrics key prefix. For example the metrics "bleu" will be named + "test_bleu" if the prefix is "test" (default) + + + + If your predictions or labels have different sequence length (for instance because you're doing dynamic padding + in a token classification task) the predictions will be padded (on the right) to allow for concatenation into + one array. The padding index is -100. + + + + Returns: *NamedTuple* A namedtuple with the following keys: + + - predictions (`np.ndarray`): The predictions on `test_dataset`. + - label_ids (`np.ndarray`, *optional*): The labels (if the dataset contained some). + - metrics (`Dict[str, float]`, *optional*): The potential dictionary of metrics (if the dataset contained + labels). + """ + # memory metrics - must set up as early as possible + self._memory_tracker.start() + + test_dataloader = self.get_test_dataloader(test_dataset) + start_time = time.time() + + eval_loop = self.prediction_loop if self.args.use_legacy_prediction_loop else self.evaluation_loop + output = eval_loop( + test_dataloader, description="Prediction", ignore_keys=ignore_keys, metric_key_prefix=metric_key_prefix + ) + total_batch_size = self.args.eval_batch_size * self.args.world_size + if f"{metric_key_prefix}_jit_compilation_time" in output.metrics: + start_time += output.metrics[f"{metric_key_prefix}_jit_compilation_time"] + output.metrics.update( + speed_metrics( + metric_key_prefix, + start_time, + num_samples=output.num_samples, + num_steps=math.ceil(output.num_samples / total_batch_size), + ) + ) + + self.control = self.callback_handler.on_predict(self.args, self.state, self.control, output.metrics) + self._memory_tracker.stop_and_update_metrics(output.metrics) + + return PredictionOutput(predictions=output.predictions, label_ids=output.label_ids, metrics=output.metrics) + + def evaluation_loop( + self, + dataloader: DataLoader, + description: str, + prediction_loss_only: Optional[bool] = None, + ignore_keys: Optional[List[str]] = None, + metric_key_prefix: str = "eval", + ) -> EvalLoopOutput: + """ + Prediction/evaluation loop, shared by `Trainer.evaluate()` and `Trainer.predict()`. + + Works both with or without labels. 
+ """ + args = self.args + + prediction_loss_only = prediction_loss_only if prediction_loss_only is not None else args.prediction_loss_only + + # if eval is called w/o train init deepspeed here + if args.deepspeed and not self.deepspeed: + # XXX: eval doesn't have `resume_from_checkpoint` arg but we should be able to do eval + # from the checkpoint eventually + deepspeed_engine, _, _ = deepspeed_init( + self, num_training_steps=0, resume_from_checkpoint=None, inference=True + ) + self.model = deepspeed_engine.module + self.model_wrapped = deepspeed_engine + self.deepspeed = deepspeed_engine + + model = self._wrap_model(self.model, training=False, dataloader=dataloader) + + # if full fp16 or bf16 eval is wanted and this ``evaluation`` or ``predict`` isn't called + # while ``train`` is running, cast it to the right dtype first and then put on device + if not self.is_in_train: + if args.fp16_full_eval: + model = model.to(dtype=torch.float16, device=args.device) + elif args.bf16_full_eval: + model = model.to(dtype=torch.bfloat16, device=args.device) + + batch_size = self.args.eval_batch_size + + logger.info(f"***** Running {description} *****") + if has_length(dataloader): + logger.info(f" Num examples = {self.num_examples(dataloader)}") + else: + logger.info(" Num examples: Unknown") + logger.info(f" Batch size = {batch_size}") + + model.eval() + + self.callback_handler.eval_dataloader = dataloader + # Do this before wrapping. + eval_dataset = getattr(dataloader, "dataset", None) + + if is_torch_tpu_available(): + dataloader = pl.ParallelLoader(dataloader, [args.device]).per_device_loader(args.device) + + if args.past_index >= 0: + self._past = None + + # Initialize containers + # losses/preds/labels on GPU/TPU (accumulated for eval_accumulation_steps) + losses_host = None + preds_host = None + labels_host = None + inputs_host = None + + # losses/preds/labels on CPU (final containers) + all_losses = None + all_preds = None + all_labels = None + all_inputs = None + # Will be useful when we have an iterable dataset so don't know its length. + + observed_num_examples = 0 + # Main evaluation loop + for step, inputs in enumerate(dataloader): + # Update the observed num examples + observed_batch_size = find_batch_size(inputs) + if observed_batch_size is not None: + observed_num_examples += observed_batch_size + # For batch samplers, batch_size is not known by the dataloader in advance. 
+ if batch_size is None: + batch_size = observed_batch_size + + # Prediction step + loss, logits, labels = self.prediction_step(model, inputs, prediction_loss_only, ignore_keys=ignore_keys) + inputs_decode = self._prepare_input(inputs["input_ids"]) if args.include_inputs_for_metrics else None + + if is_torch_tpu_available(): + xm.mark_step() + + # Update containers on host + if loss is not None: + losses = self._nested_gather(loss.repeat(batch_size)) + losses_host = losses if losses_host is None else torch.cat((losses_host, losses), dim=0) + if labels is not None: + labels = self._pad_across_processes(labels) + labels = self._nested_gather(labels) + labels_host = labels if labels_host is None else nested_concat(labels_host, labels, padding_index=-100) + if inputs_decode is not None: + inputs_decode = self._pad_across_processes(inputs_decode) + inputs_decode = self._nested_gather(inputs_decode) + inputs_host = ( + inputs_decode + if inputs_host is None + else nested_concat(inputs_host, inputs_decode, padding_index=-100) + ) + if logits is not None: + logits = self._pad_across_processes(logits) + logits = self._nested_gather(logits) + if self.preprocess_logits_for_metrics is not None: + logits = self.preprocess_logits_for_metrics(logits, labels) + preds_host = logits if preds_host is None else nested_concat(preds_host, logits, padding_index=-100) + self.control = self.callback_handler.on_prediction_step(args, self.state, self.control) + + # Gather all tensors and put them back on the CPU if we have done enough accumulation steps. + if args.eval_accumulation_steps is not None and (step + 1) % args.eval_accumulation_steps == 0: + if losses_host is not None: + losses = nested_numpify(losses_host) + all_losses = losses if all_losses is None else np.concatenate((all_losses, losses), axis=0) + if preds_host is not None: + logits = nested_numpify(preds_host) + all_preds = logits if all_preds is None else nested_concat(all_preds, logits, padding_index=-100) + if inputs_host is not None: + inputs_decode = nested_numpify(inputs_host) + all_inputs = ( + inputs_decode + if all_inputs is None + else nested_concat(all_inputs, inputs_decode, padding_index=-100) + ) + if labels_host is not None: + labels = nested_numpify(labels_host) + all_labels = ( + labels if all_labels is None else nested_concat(all_labels, labels, padding_index=-100) + ) + + # Set back to None to begin a new accumulation + losses_host, preds_host, inputs_host, labels_host = None, None, None, None + + if args.past_index and hasattr(self, "_past"): + # Clean the state at the end of the evaluation loop + delattr(self, "_past") + + # Gather all remaining tensors and put them back on the CPU + if losses_host is not None: + losses = nested_numpify(losses_host) + all_losses = losses if all_losses is None else np.concatenate((all_losses, losses), axis=0) + if preds_host is not None: + logits = nested_numpify(preds_host) + all_preds = logits if all_preds is None else nested_concat(all_preds, logits, padding_index=-100) + if inputs_host is not None: + inputs_decode = nested_numpify(inputs_host) + all_inputs = ( + inputs_decode if all_inputs is None else nested_concat(all_inputs, inputs_decode, padding_index=-100) + ) + if labels_host is not None: + labels = nested_numpify(labels_host) + all_labels = labels if all_labels is None else nested_concat(all_labels, labels, padding_index=-100) + + # Number of samples + if has_length(eval_dataset): + num_samples = len(eval_dataset) + # The instance check is weird and does not actually check for the type, 
but whether the dataset has the right + # methods. Therefore we need to make sure it also has the attribute. + elif isinstance(eval_dataset, IterableDatasetShard) and getattr(eval_dataset, "num_examples", 0) > 0: + num_samples = eval_dataset.num_examples + else: + if has_length(dataloader): + num_samples = self.num_examples(dataloader) + else: # both len(dataloader.dataset) and len(dataloader) fail + num_samples = observed_num_examples + if num_samples == 0 and observed_num_examples > 0: + num_samples = observed_num_examples + + # Number of losses has been rounded to a multiple of batch_size and in a distributed training, the number of + # samplers has been rounded to a multiple of batch_size, so we truncate. + if all_losses is not None: + all_losses = all_losses[:num_samples] + if all_preds is not None: + all_preds = nested_truncate(all_preds, num_samples) + if all_labels is not None: + all_labels = nested_truncate(all_labels, num_samples) + if all_inputs is not None: + all_inputs = nested_truncate(all_inputs, num_samples) + + # Metrics! + if self.compute_metrics is not None and all_preds is not None and all_labels is not None: + if args.include_inputs_for_metrics: + metrics = self.compute_metrics( + EvalPrediction(predictions=all_preds, label_ids=all_labels, inputs=all_inputs) + ) + else: + metrics = self.compute_metrics(EvalPrediction(predictions=all_preds, label_ids=all_labels)) + else: + metrics = {} + + # To be JSON-serializable, we need to remove numpy types or zero-d tensors + metrics = denumpify_detensorize(metrics) + + if all_losses is not None: + metrics[f"{metric_key_prefix}_loss"] = all_losses.mean().item() + if hasattr(self, "jit_compilation_time"): + metrics[f"{metric_key_prefix}_jit_compilation_time"] = self.jit_compilation_time + + # Prefix all keys with metric_key_prefix + '_' + for key in list(metrics.keys()): + if not key.startswith(f"{metric_key_prefix}_"): + metrics[f"{metric_key_prefix}_{key}"] = metrics.pop(key) + + return EvalLoopOutput(predictions=all_preds, label_ids=all_labels, metrics=metrics, num_samples=num_samples) + + def _nested_gather(self, tensors, name=None): + """ + Gather value of `tensors` (tensor or list/tuple of nested tensors) and convert them to numpy before + concatenating them to `gathered` + """ + if tensors is None: + return + if is_torch_tpu_available(): + if name is None: + name = "nested_gather" + tensors = nested_xla_mesh_reduce(tensors, name) + elif is_sagemaker_mp_enabled(): + tensors = smp_gather(tensors) + elif self.args.local_rank != -1: + tensors = distributed_concat(tensors) + return tensors + + # Copied from Accelerate. + def _pad_across_processes(self, tensor, pad_index=-100): + """ + Recursively pad the tensors in a nested list/tuple/dictionary of tensors from all devices to the same size so + they can safely be gathered. + """ + if isinstance(tensor, (list, tuple)): + return type(tensor)(self._pad_across_processes(t, pad_index=pad_index) for t in tensor) + elif isinstance(tensor, dict): + return type(tensor)({k: self._pad_across_processes(v, pad_index=pad_index) for k, v in tensor.items()}) + elif not isinstance(tensor, torch.Tensor): + raise TypeError( + f"Can't pad the values of type {type(tensor)}, only of nested list/tuple/dicts of tensors." 
+ ) + + if len(tensor.shape) < 2: + return tensor + # Gather all sizes + size = torch.tensor(tensor.shape, device=tensor.device)[None] + sizes = self._nested_gather(size).cpu() + + max_size = max(s[1] for s in sizes) + # When extracting XLA graphs for compilation, max_size is 0, + # so use inequality to avoid errors. + if tensor.shape[1] >= max_size: + return tensor + + # Then pad to the maximum size + old_size = tensor.shape + new_size = list(old_size) + new_size[1] = max_size + new_tensor = tensor.new_zeros(tuple(new_size)) + pad_index + new_tensor[:, : old_size[1]] = tensor + return new_tensor + + def prediction_step( + self, + model: nn.Module, + inputs: Dict[str, Union[torch.Tensor, Any]], + prediction_loss_only: bool, + ignore_keys: Optional[List[str]] = None, + ) -> Tuple[Optional[torch.Tensor], Optional[torch.Tensor], Optional[torch.Tensor]]: + """ + Perform an evaluation step on `model` using `inputs`. + + Subclass and override to inject custom behavior. + + Args: + model (`nn.Module`): + The model to evaluate. + inputs (`Dict[str, Union[torch.Tensor, Any]]`): + The inputs and targets of the model. + + The dictionary will be unpacked before being fed to the model. Most models expect the targets under the + argument `labels`. Check your model's documentation for all accepted arguments. + prediction_loss_only (`bool`): + Whether or not to return the loss only. + ignore_keys (`Lst[str]`, *optional*): + A list of keys in the output of your model (if it is a dictionary) that should be ignored when + gathering predictions. + + Return: + Tuple[Optional[torch.Tensor], Optional[torch.Tensor], Optional[torch.Tensor]]: A tuple with the loss, + logits and labels (each being optional). + """ + has_labels = False if len(self.label_names) == 0 else all(inputs.get(k) is not None for k in self.label_names) + # For CLIP-like models capable of returning loss values. + # If `return_loss` is not specified or being `None` in `inputs`, we check if the default value of `return_loss` + # is `True` in `model.forward`. + return_loss = inputs.get("return_loss", None) + if return_loss is None: + return_loss = self.can_return_loss + loss_without_labels = True if len(self.label_names) == 0 and return_loss else False + + inputs = self._prepare_inputs(inputs) + if ignore_keys is None: + if hasattr(self.model, "config"): + ignore_keys = getattr(self.model.config, "keys_to_ignore_at_inference", []) + else: + ignore_keys = [] + + # labels may be popped when computing the loss (label smoothing for instance) so we grab them first. 
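+        # For most models `self.label_names` is simply `["labels"]`; `nested_detach`
+        # keeps the copies used for metrics from holding on to the autograd graph.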
+ if has_labels or loss_without_labels: + labels = nested_detach(tuple(inputs.get(name) for name in self.label_names)) + if len(labels) == 1: + labels = labels[0] + else: + labels = None + + with torch.no_grad(): + if is_sagemaker_mp_enabled(): + raw_outputs = smp_forward_only(model, inputs) + if has_labels or loss_without_labels: + if isinstance(raw_outputs, dict): + loss_mb = raw_outputs["loss"] + logits_mb = tuple(v for k, v in raw_outputs.items() if k not in ignore_keys + ["loss"]) + else: + loss_mb = raw_outputs[0] + logits_mb = raw_outputs[1:] + + loss = loss_mb.reduce_mean().detach().cpu() + logits = smp_nested_concat(logits_mb) + else: + loss = None + if isinstance(raw_outputs, dict): + logits_mb = tuple(v for k, v in raw_outputs.items() if k not in ignore_keys) + else: + logits_mb = raw_outputs + logits = smp_nested_concat(logits_mb) + else: + if has_labels or loss_without_labels: + with self.compute_loss_context_manager(): + loss, outputs = self.compute_loss(model, inputs, return_outputs=True) + loss = loss.mean().detach() + + if isinstance(outputs, dict): + logits = tuple(v for k, v in outputs.items() if k not in ignore_keys + ["loss"]) + else: + logits = outputs[1:] + else: + loss = None + with self.compute_loss_context_manager(): + outputs = model(**inputs) + if isinstance(outputs, dict): + logits = tuple(v for k, v in outputs.items() if k not in ignore_keys) + else: + logits = outputs + # TODO: this needs to be fixed and made cleaner later. + if self.args.past_index >= 0: + self._past = outputs[self.args.past_index - 1] + + if prediction_loss_only: + return (loss, None, None) + + logits = nested_detach(logits) + if len(logits) == 1: + logits = logits[0] + + return (loss, logits, labels) + + def floating_point_ops(self, inputs: Dict[str, Union[torch.Tensor, Any]]): + """ + For models that inherit from [`PreTrainedModel`], uses that method to compute the number of floating point + operations for every backward + forward pass. If using another model, either implement such a method in the + model or subclass and override this method. + + Args: + inputs (`Dict[str, Union[torch.Tensor, Any]]`): + The inputs and targets of the model. + + Returns: + `int`: The number of floating-point operations. + """ + if hasattr(self.model, "floating_point_ops"): + return self.model.floating_point_ops(inputs) + else: + return 0 + + def init_git_repo(self, at_init: bool = False): + """ + Initializes a git repo in `self.args.hub_model_id`. + + Args: + at_init (`bool`, *optional*, defaults to `False`): + Whether this function is called before any training or not. If `self.args.overwrite_output_dir` is + `True` and `at_init` is `True`, the path to the repo (which is `self.args.output_dir`) might be wiped + out. + """ + if not self.is_world_process_zero(): + return + if self.args.hub_model_id is None: + repo_name = Path(self.args.output_dir).absolute().name + else: + repo_name = self.args.hub_model_id + if "/" not in repo_name: + repo_name = get_full_repo_name(repo_name, token=self.args.hub_token) + + # Make sure the repo exists. 
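+        # `exist_ok=True` makes this call a no-op when the repo already exists on the Hub.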
+ create_repo(repo_name, token=self.args.hub_token, private=self.args.hub_private_repo, exist_ok=True) + try: + self.repo = Repository(self.args.output_dir, clone_from=repo_name, token=self.args.hub_token) + except EnvironmentError: + if self.args.overwrite_output_dir and at_init: + # Try again after wiping output_dir + shutil.rmtree(self.args.output_dir) + self.repo = Repository(self.args.output_dir, clone_from=repo_name, token=self.args.hub_token) + else: + raise + + self.repo.git_pull() + + # By default, ignore the checkpoint folders + if ( + not os.path.exists(os.path.join(self.args.output_dir, ".gitignore")) + and self.args.hub_strategy != HubStrategy.ALL_CHECKPOINTS + ): + with open(os.path.join(self.args.output_dir, ".gitignore"), "w", encoding="utf-8") as writer: + writer.writelines(["checkpoint-*/"]) + + # Add "*.sagemaker" to .gitignore if using SageMaker + if os.environ.get("SM_TRAINING_ENV"): + self._add_sm_patterns_to_gitignore() + + self.push_in_progress = None + + def create_model_card( + self, + language: Optional[str] = None, + license: Optional[str] = None, + tags: Union[str, List[str], None] = None, + model_name: Optional[str] = None, + finetuned_from: Optional[str] = None, + tasks: Union[str, List[str], None] = None, + dataset_tags: Union[str, List[str], None] = None, + dataset: Union[str, List[str], None] = None, + dataset_args: Union[str, List[str], None] = None, + ): + """ + Creates a draft of a model card using the information available to the `Trainer`. + + Args: + language (`str`, *optional*): + The language of the model (if applicable) + license (`str`, *optional*): + The license of the model. Will default to the license of the pretrained model used, if the original + model given to the `Trainer` comes from a repo on the Hub. + tags (`str` or `List[str]`, *optional*): + Some tags to be included in the metadata of the model card. + model_name (`str`, *optional*): + The name of the model. + finetuned_from (`str`, *optional*): + The name of the model used to fine-tune this one (if applicable). Will default to the name of the repo + of the original model given to the `Trainer` (if it comes from the Hub). + tasks (`str` or `List[str]`, *optional*): + One or several task identifiers, to be included in the metadata of the model card. + dataset_tags (`str` or `List[str]`, *optional*): + One or several dataset tags, to be included in the metadata of the model card. + dataset (`str` or `List[str]`, *optional*): + One or several dataset identifiers, to be included in the metadata of the model card. + dataset_args (`str` or `List[str]`, *optional*): + One or several dataset arguments, to be included in the metadata of the model card. + """ + if not self.is_world_process_zero(): + return + + training_summary = TrainingSummary.from_trainer( + self, + language=language, + license=license, + tags=tags, + model_name=model_name, + finetuned_from=finetuned_from, + tasks=tasks, + dataset_tags=dataset_tags, + dataset=dataset, + dataset_args=dataset_args, + ) + model_card = training_summary.to_model_card() + with open(os.path.join(self.args.output_dir, "README.md"), "w") as f: + f.write(model_card) + + def _push_from_checkpoint(self, checkpoint_folder): + # Only push from one node. + if not self.is_world_process_zero() or self.args.hub_strategy == HubStrategy.END: + return + # If we haven't finished the last push, we don't do this one. 
+ if self.push_in_progress is not None and not self.push_in_progress.is_done: + return + + output_dir = self.args.output_dir + # To avoid a new synchronization of all model weights, we just copy the file from the checkpoint folder + modeling_files = [CONFIG_NAME, WEIGHTS_NAME] + for modeling_file in modeling_files: + if os.path.isfile(os.path.join(checkpoint_folder, modeling_file)): + shutil.copy(os.path.join(checkpoint_folder, modeling_file), os.path.join(output_dir, modeling_file)) + # Saving the tokenizer is fast and we don't know how many files it may have spawned, so we resave it to be sure. + if self.tokenizer is not None: + self.tokenizer.save_pretrained(output_dir) + # Same for the training arguments + torch.save(self.args, os.path.join(output_dir, TRAINING_ARGS_NAME)) + + try: + if self.args.hub_strategy == HubStrategy.CHECKPOINT: + # Temporarily move the checkpoint just saved for the push + tmp_checkpoint = os.path.join(output_dir, "last-checkpoint") + # We have to remove the "last-checkpoint" dir if it exists, otherwise the checkpoint is moved as a + # subfolder. + if os.path.isdir(tmp_checkpoint): + shutil.rmtree(tmp_checkpoint) + shutil.move(checkpoint_folder, tmp_checkpoint) + + if self.args.save_strategy == IntervalStrategy.STEPS: + commit_message = f"Training in progress, step {self.state.global_step}" + else: + commit_message = f"Training in progress, epoch {int(self.state.epoch)}" + _, self.push_in_progress = self.repo.push_to_hub( + commit_message=commit_message, blocking=False, auto_lfs_prune=True + ) + finally: + if self.args.hub_strategy == HubStrategy.CHECKPOINT: + # Move back the checkpoint to its place + shutil.move(tmp_checkpoint, checkpoint_folder) + + def push_to_hub(self, commit_message: Optional[str] = "End of training", blocking: bool = True, **kwargs) -> str: + """ + Upload *self.model* and *self.tokenizer* to the 🤗 model hub on the repo *self.args.hub_model_id*. + + Parameters: + commit_message (`str`, *optional*, defaults to `"End of training"`): + Message to commit while pushing. + blocking (`bool`, *optional*, defaults to `True`): + Whether the function should return only when the `git push` has finished. + kwargs: + Additional keyword arguments passed along to [`~Trainer.create_model_card`]. + + Returns: + The url of the commit of your model in the given repository if `blocking=False`, a tuple with the url of + the commit and an object to track the progress of the commit if `blocking=True` + """ + # If a user calls manually `push_to_hub` with `self.args.push_to_hub = False`, we try to create the repo but + # it might fail. + if not hasattr(self, "repo"): + self.init_git_repo() + + model_name = kwargs.pop("model_name", None) + if model_name is None and self.args.should_save: + if self.args.hub_model_id is None: + model_name = Path(self.args.output_dir).name + else: + model_name = self.args.hub_model_id.split("/")[-1] + + # Needs to be executed on all processes for TPU training, but will only save on the processed determined by + # self.args.should_save. + self.save_model(_internal_call=True) + + # Only push from one node. + if not self.is_world_process_zero(): + return + + # Cancel any async push in progress if blocking=True. The commits will all be pushed together. 
+ if blocking and self.push_in_progress is not None and not self.push_in_progress.is_done: + self.push_in_progress._process.kill() + self.push_in_progress = None + + git_head_commit_url = self.repo.push_to_hub( + commit_message=commit_message, blocking=blocking, auto_lfs_prune=True + ) + # push separately the model card to be independant from the rest of the model + if self.args.should_save: + self.create_model_card(model_name=model_name, **kwargs) + try: + self.repo.push_to_hub( + commit_message="update model card README.md", blocking=blocking, auto_lfs_prune=True + ) + except EnvironmentError as exc: + logger.error(f"Error pushing update to the model card. Please read logs and retry.\n${exc}") + + return git_head_commit_url + + # + # Deprecated code + # + + def prediction_loop( + self, + dataloader: DataLoader, + description: str, + prediction_loss_only: Optional[bool] = None, + ignore_keys: Optional[List[str]] = None, + metric_key_prefix: str = "eval", + ) -> EvalLoopOutput: + """ + Prediction/evaluation loop, shared by `Trainer.evaluate()` and `Trainer.predict()`. + + Works both with or without labels. + """ + args = self.args + + if not has_length(dataloader): + raise ValueError("dataloader must implement a working __len__") + + prediction_loss_only = prediction_loss_only if prediction_loss_only is not None else args.prediction_loss_only + + # if eval is called w/o train init deepspeed here + if args.deepspeed and not self.deepspeed: + # XXX: eval doesn't have `resume_from_checkpoint` arg but we should be able to do eval + # from the checkpoint eventually + deepspeed_engine, _, _ = deepspeed_init(self, num_training_steps=0, resume_from_checkpoint=None) + self.model = deepspeed_engine.module + self.model_wrapped = deepspeed_engine + self.deepspeed = deepspeed_engine + # XXX: we don't need optim/sched for inference, but this needs to be sorted out, since + # for example the Z3-optimizer is a must for zero3 to work even for inference - what we + # don't need is the deepspeed basic optimizer which is self.optimizer.optimizer + deepspeed_engine.optimizer.optimizer = None + deepspeed_engine.lr_scheduler = None + + model = self._wrap_model(self.model, training=False, dataloader=dataloader) + + # if full fp16 or bf16 eval is wanted and this ``evaluation`` or ``predict`` isn't called + # while ``train`` is running, cast it to the right dtype first and then put on device + if not self.is_in_train: + if args.fp16_full_eval: + model = model.to(dtype=torch.float16, device=args.device) + elif args.bf16_full_eval: + model = model.to(dtype=torch.bfloat16, device=args.device) + + batch_size = dataloader.batch_size + num_examples = self.num_examples(dataloader) + logger.info(f"***** Running {description} *****") + logger.info(f" Num examples = {num_examples}") + logger.info(f" Batch size = {batch_size}") + losses_host: torch.Tensor = None + preds_host: Union[torch.Tensor, List[torch.Tensor]] = None + labels_host: Union[torch.Tensor, List[torch.Tensor]] = None + inputs_host: Union[torch.Tensor, List[torch.Tensor]] = None + + world_size = max(1, args.world_size) + + eval_losses_gatherer = DistributedTensorGatherer(world_size, num_examples, make_multiple_of=batch_size) + if not prediction_loss_only: + # The actual number of eval_sample can be greater than num_examples in distributed settings (when we pass + # a batch size to the sampler) + make_multiple_of = None + if hasattr(dataloader, "sampler") and isinstance(dataloader.sampler, SequentialDistributedSampler): + make_multiple_of = 
dataloader.sampler.batch_size + preds_gatherer = DistributedTensorGatherer(world_size, num_examples, make_multiple_of=make_multiple_of) + labels_gatherer = DistributedTensorGatherer(world_size, num_examples, make_multiple_of=make_multiple_of) + inputs_gatherer = DistributedTensorGatherer(world_size, num_examples, make_multiple_of=make_multiple_of) + + model.eval() + + if is_torch_tpu_available(): + dataloader = pl.ParallelLoader(dataloader, [args.device]).per_device_loader(args.device) + + if args.past_index >= 0: + self._past = None + + self.callback_handler.eval_dataloader = dataloader + + for step, inputs in enumerate(dataloader): + loss, logits, labels = self.prediction_step(model, inputs, prediction_loss_only, ignore_keys=ignore_keys) + inputs_decode = self._prepare_input(inputs["input_ids"]) if args.include_inputs_for_metrics else None + + if loss is not None: + losses = loss.repeat(batch_size) + losses_host = losses if losses_host is None else torch.cat((losses_host, losses), dim=0) + if logits is not None: + preds_host = logits if preds_host is None else nested_concat(preds_host, logits, padding_index=-100) + if labels is not None: + labels_host = labels if labels_host is None else nested_concat(labels_host, labels, padding_index=-100) + if inputs_decode is not None: + inputs_host = ( + inputs_decode + if inputs_host is None + else nested_concat(inputs_host, inputs_decode, padding_index=-100) + ) + self.control = self.callback_handler.on_prediction_step(args, self.state, self.control) + + # Gather all tensors and put them back on the CPU if we have done enough accumulation steps. + if args.eval_accumulation_steps is not None and (step + 1) % args.eval_accumulation_steps == 0: + eval_losses_gatherer.add_arrays(self._gather_and_numpify(losses_host, "eval_losses")) + if not prediction_loss_only: + preds_gatherer.add_arrays(self._gather_and_numpify(preds_host, "eval_preds")) + labels_gatherer.add_arrays(self._gather_and_numpify(labels_host, "eval_label_ids")) + inputs_gatherer.add_arrays(self._gather_and_numpify(inputs_host, "eval_inputs_ids")) + + # Set back to None to begin a new accumulation + losses_host, preds_host, labels_host, inputs_host = None, None, None, None + + if args.past_index and hasattr(self, "_past"): + # Clean the state at the end of the evaluation loop + delattr(self, "_past") + + # Gather all remaining tensors and put them back on the CPU + eval_losses_gatherer.add_arrays(self._gather_and_numpify(losses_host, "eval_losses")) + if not prediction_loss_only: + preds_gatherer.add_arrays(self._gather_and_numpify(preds_host, "eval_preds")) + labels_gatherer.add_arrays(self._gather_and_numpify(labels_host, "eval_label_ids")) + inputs_gatherer.add_arrays(self._gather_and_numpify(inputs_host, "eval_inputs_ids")) + + eval_loss = eval_losses_gatherer.finalize() + preds = preds_gatherer.finalize() if not prediction_loss_only else None + label_ids = labels_gatherer.finalize() if not prediction_loss_only else None + inputs_ids = inputs_gatherer.finalize() if not prediction_loss_only else None + + if self.compute_metrics is not None and preds is not None and label_ids is not None: + if args.include_inputs_for_metrics: + metrics = self.compute_metrics( + EvalPrediction(predictions=preds, label_ids=label_ids, inputs=inputs_ids) + ) + else: + metrics = self.compute_metrics(EvalPrediction(predictions=preds, label_ids=label_ids)) + else: + metrics = {} + + # To be JSON-serializable, we need to remove numpy types or zero-d tensors + metrics = denumpify_detensorize(metrics) + + if 
eval_loss is not None: + metrics[f"{metric_key_prefix}_loss"] = eval_loss.mean().item() + + # Prefix all keys with metric_key_prefix + '_' + for key in list(metrics.keys()): + if not key.startswith(f"{metric_key_prefix}_"): + metrics[f"{metric_key_prefix}_{key}"] = metrics.pop(key) + + return EvalLoopOutput(predictions=preds, label_ids=label_ids, metrics=metrics, num_samples=num_examples) + + def _gather_and_numpify(self, tensors, name): + """ + Gather value of `tensors` (tensor or list/tuple of nested tensors) and convert them to numpy before + concatenating them to `gathered` + """ + if tensors is None: + return + if is_torch_tpu_available(): + tensors = nested_xla_mesh_reduce(tensors, name) + elif is_sagemaker_mp_enabled(): + tensors = smp_gather(tensors) + elif self.args.local_rank != -1: + tensors = distributed_concat(tensors) + + return nested_numpify(tensors) + + def _add_sm_patterns_to_gitignore(self) -> None: + """Add SageMaker Checkpointing patterns to .gitignore file.""" + # Make sure we only do this on the main process + if not self.is_world_process_zero(): + return + + patterns = ["*.sagemaker-uploading", "*.sagemaker-uploaded"] + + # Get current .gitignore content + if os.path.exists(os.path.join(self.repo.local_dir, ".gitignore")): + with open(os.path.join(self.repo.local_dir, ".gitignore"), "r") as f: + current_content = f.read() + else: + current_content = "" + + # Add the patterns to .gitignore + content = current_content + for pattern in patterns: + if pattern not in content: + if content.endswith("\n"): + content += pattern + else: + content += f"\n{pattern}" + + # Write the .gitignore file if it has changed + if content != current_content: + with open(os.path.join(self.repo.local_dir, ".gitignore"), "w") as f: + logger.debug(f"Writing .gitignore file. Content: {content}") + f.write(content) + + self.repo.git_add(".gitignore") + + # avoid race condition with git status + time.sleep(0.5) + + if not self.repo.is_repo_clean(): + self.repo.git_commit("Add *.sagemaker patterns to .gitignore.") + self.repo.git_push() diff --git a/ptuning/trainer_seq2seq.py b/ptuning/trainer_seq2seq.py new file mode 100644 index 0000000000000000000000000000000000000000..19d5cf12a274944a3ea3ce689414eab72636e0bd --- /dev/null +++ b/ptuning/trainer_seq2seq.py @@ -0,0 +1,247 @@ +# Copyright 2020 The HuggingFace Team. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
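+
+# This Seq2SeqTrainer subclasses the locally modified Trainer (note the
+# `from trainer import Trainer` below, not `transformers.Trainer`), so checkpoint
+# saving goes through the PrefixEncoder-aware `_save` in `trainer.py`. Its
+# `prediction_step` also strips the prompt tokens from the `generate` output,
+# since the decoder-only model returns the prompt together with the response.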
+ +from typing import Any, Dict, List, Optional, Tuple, Union + +import torch +from torch import nn +from torch.utils.data import Dataset + +from transformers.deepspeed import is_deepspeed_zero3_enabled +from trainer import Trainer +from transformers.trainer_utils import PredictionOutput +from transformers.utils import logging + + +logger = logging.get_logger(__name__) + + +class Seq2SeqTrainer(Trainer): + def evaluate( + self, + eval_dataset: Optional[Dataset] = None, + ignore_keys: Optional[List[str]] = None, + metric_key_prefix: str = "eval", + **gen_kwargs + ) -> Dict[str, float]: + """ + Run evaluation and returns metrics. + + The calling script will be responsible for providing a method to compute metrics, as they are task-dependent + (pass it to the init `compute_metrics` argument). + + You can also subclass and override this method to inject custom behavior. + + Args: + eval_dataset (`Dataset`, *optional*): + Pass a dataset if you wish to override `self.eval_dataset`. If it is an [`~datasets.Dataset`], columns + not accepted by the `model.forward()` method are automatically removed. It must implement the `__len__` + method. + ignore_keys (`List[str]`, *optional*): + A list of keys in the output of your model (if it is a dictionary) that should be ignored when + gathering predictions. + metric_key_prefix (`str`, *optional*, defaults to `"eval"`): + An optional prefix to be used as the metrics key prefix. For example the metrics "bleu" will be named + "eval_bleu" if the prefix is `"eval"` (default) + max_length (`int`, *optional*): + The maximum target length to use when predicting with the generate method. + num_beams (`int`, *optional*): + Number of beams for beam search that will be used when predicting with the generate method. 1 means no + beam search. + gen_kwargs: + Additional `generate` specific kwargs. + + Returns: + A dictionary containing the evaluation loss and the potential metrics computed from the predictions. The + dictionary also contains the epoch number which comes from the training state. + """ + + gen_kwargs = gen_kwargs.copy() + if gen_kwargs.get("max_length") is None and gen_kwargs.get("max_new_tokens") is None: + gen_kwargs["max_length"] = self.args.generation_max_length + gen_kwargs["num_beams"] = ( + gen_kwargs["num_beams"] if gen_kwargs.get("num_beams") is not None else self.args.generation_num_beams + ) + self._gen_kwargs = gen_kwargs + + return super().evaluate(eval_dataset, ignore_keys=ignore_keys, metric_key_prefix=metric_key_prefix) + + def predict( + self, + test_dataset: Dataset, + ignore_keys: Optional[List[str]] = None, + metric_key_prefix: str = "test", + **gen_kwargs + ) -> PredictionOutput: + """ + Run prediction and returns predictions and potential metrics. + + Depending on the dataset and your use case, your test dataset may contain labels. In that case, this method + will also return metrics, like in `evaluate()`. + + Args: + test_dataset (`Dataset`): + Dataset to run the predictions on. If it is a [`~datasets.Dataset`], columns not accepted by the + `model.forward()` method are automatically removed. Has to implement the method `__len__` + ignore_keys (`List[str]`, *optional*): + A list of keys in the output of your model (if it is a dictionary) that should be ignored when + gathering predictions. + metric_key_prefix (`str`, *optional*, defaults to `"eval"`): + An optional prefix to be used as the metrics key prefix. 
For example the metrics "bleu" will be named + "eval_bleu" if the prefix is `"eval"` (default) + max_length (`int`, *optional*): + The maximum target length to use when predicting with the generate method. + num_beams (`int`, *optional*): + Number of beams for beam search that will be used when predicting with the generate method. 1 means no + beam search. + gen_kwargs: + Additional `generate` specific kwargs. + + + + If your predictions or labels have different sequence lengths (for instance because you're doing dynamic + padding in a token classification task) the predictions will be padded (on the right) to allow for + concatenation into one array. The padding index is -100. + + + + Returns: *NamedTuple* A namedtuple with the following keys: + + - predictions (`np.ndarray`): The predictions on `test_dataset`. + - label_ids (`np.ndarray`, *optional*): The labels (if the dataset contained some). + - metrics (`Dict[str, float]`, *optional*): The potential dictionary of metrics (if the dataset contained + labels). + """ + + gen_kwargs = gen_kwargs.copy() + if gen_kwargs.get("max_length") is None and gen_kwargs.get("max_new_tokens") is None: + gen_kwargs["max_length"] = self.args.generation_max_length + gen_kwargs["num_beams"] = ( + gen_kwargs["num_beams"] if gen_kwargs.get("num_beams") is not None else self.args.generation_num_beams + ) + self._gen_kwargs = gen_kwargs + + + return super().predict(test_dataset, ignore_keys=ignore_keys, metric_key_prefix=metric_key_prefix) + + def prediction_step( + self, + model: nn.Module, + inputs: Dict[str, Union[torch.Tensor, Any]], + prediction_loss_only: bool, + ignore_keys: Optional[List[str]] = None, + ) -> Tuple[Optional[float], Optional[torch.Tensor], Optional[torch.Tensor]]: + """ + Perform an evaluation step on `model` using `inputs`. + + Subclass and override to inject custom behavior. + + Args: + model (`nn.Module`): + The model to evaluate. + inputs (`Dict[str, Union[torch.Tensor, Any]]`): + The inputs and targets of the model. + + The dictionary will be unpacked before being fed to the model. Most models expect the targets under the + argument `labels`. Check your model's documentation for all accepted arguments. + prediction_loss_only (`bool`): + Whether or not to return the loss only. + + Return: + Tuple[Optional[float], Optional[torch.Tensor], Optional[torch.Tensor]]: A tuple with the loss, logits and + labels (each being optional). 
+ """ + + if not self.args.predict_with_generate or prediction_loss_only: + return super().prediction_step( + model, inputs, prediction_loss_only=prediction_loss_only, ignore_keys=ignore_keys + ) + + has_labels = "labels" in inputs + inputs = self._prepare_inputs(inputs) + + # XXX: adapt synced_gpus for fairscale as well + gen_kwargs = self._gen_kwargs.copy() + if gen_kwargs.get("max_length") is None and gen_kwargs.get("max_new_tokens") is None: + gen_kwargs["max_length"] = self.model.config.max_length + gen_kwargs["num_beams"] = ( + gen_kwargs["num_beams"] if gen_kwargs.get("num_beams") is not None else self.model.config.num_beams + ) + default_synced_gpus = True if is_deepspeed_zero3_enabled() else False + gen_kwargs["synced_gpus"] = ( + gen_kwargs["synced_gpus"] if gen_kwargs.get("synced_gpus") is not None else default_synced_gpus + ) + + if "attention_mask" in inputs: + gen_kwargs["attention_mask"] = inputs.get("attention_mask", None) + if "position_ids" in inputs: + gen_kwargs["position_ids"] = inputs.get("position_ids", None) + if "global_attention_mask" in inputs: + gen_kwargs["global_attention_mask"] = inputs.get("global_attention_mask", None) + + # prepare generation inputs + # some encoder-decoder models can have varying encoder's and thus + # varying model input names + if hasattr(self.model, "encoder") and self.model.encoder.main_input_name != self.model.main_input_name: + generation_inputs = inputs[self.model.encoder.main_input_name] + else: + generation_inputs = inputs[self.model.main_input_name] + + gen_kwargs["input_ids"] = generation_inputs + generated_tokens = self.model.generate(**gen_kwargs) + generated_tokens = generated_tokens[:, generation_inputs.size()[-1]:] + + # in case the batch is shorter than max length, the output should be padded + if gen_kwargs.get("max_length") is not None and generated_tokens.shape[-1] < gen_kwargs["max_length"]: + generated_tokens = self._pad_tensors_to_max_len(generated_tokens, gen_kwargs["max_length"]) + elif gen_kwargs.get("max_new_tokens") is not None and generated_tokens.shape[-1] < ( + gen_kwargs["max_new_tokens"] + 1 + ): + generated_tokens = self._pad_tensors_to_max_len(generated_tokens, gen_kwargs["max_new_tokens"] + 1) + + loss = None + + if self.args.prediction_loss_only: + return (loss, None, None) + + if has_labels: + labels = inputs["labels"] + if gen_kwargs.get("max_length") is not None and labels.shape[-1] < gen_kwargs["max_length"]: + labels = self._pad_tensors_to_max_len(labels, gen_kwargs["max_length"]) + elif gen_kwargs.get("max_new_tokens") is not None and labels.shape[-1] < ( + gen_kwargs["max_new_tokens"] + 1 + ): + labels = self._pad_tensors_to_max_len(labels, (gen_kwargs["max_new_tokens"] + 1)) + else: + labels = None + + return (loss, generated_tokens, labels) + + def _pad_tensors_to_max_len(self, tensor, max_length): + if self.tokenizer is not None and hasattr(self.tokenizer, "pad_token_id"): + # If PAD token is not defined at least EOS token has to be defined + pad_token_id = ( + self.tokenizer.pad_token_id if self.tokenizer.pad_token_id is not None else self.tokenizer.eos_token_id + ) + else: + if self.model.config.pad_token_id is not None: + pad_token_id = self.model.config.pad_token_id + else: + raise ValueError("Pad_token_id must be set in the configuration of the model, in order to pad tensors") + + padded_tensor = pad_token_id * torch.ones( + (tensor.shape[0], max_length), dtype=tensor.dtype, device=tensor.device + ) + padded_tensor[:, : tensor.shape[-1]] = tensor + return padded_tensor diff --git 
'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'v2'} +2023-04-21 03:45:22,388 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: defer +2023-04-21 03:45:22,388 INFO SenderThread:39652 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 03:45:22,388 INFO SenderThread:39652 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 03:45:22,811 INFO SenderThread:39652 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files +2023-04-21 03:45:22,812 INFO SenderThread:39652 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\config.yaml config.yaml +2023-04-21 03:45:22,813 INFO SenderThread:39652 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\output.log output.log +2023-04-21 03:45:22,816 INFO SenderThread:39652 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\requirements.txt requirements.txt +2023-04-21 03:45:22,820 INFO SenderThread:39652 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\wandb-metadata.json wandb-metadata.json +2023-04-21 03:45:22,820 INFO SenderThread:39652 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\wandb-summary.json wandb-summary.json +2023-04-21 03:45:22,826 INFO SenderThread:39652 [sender.py:transition_state():626] send defer: 10 +2023-04-21 03:45:22,827 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: defer +2023-04-21 03:45:22,827 INFO HandlerThread:39652 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 03:45:22,831 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: defer +2023-04-21 03:45:22,831 INFO SenderThread:39652 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 03:45:22,832 INFO SenderThread:39652 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 03:45:23,476 INFO wandb-upload_0:39652 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\config.yaml +2023-04-21 03:45:23,645 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 03:45:23,804 INFO wandb-upload_1:39652 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\output.log +2023-04-21 03:45:24,034 INFO wandb-upload_3:39652 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\wandb-summary.json +2023-04-21 03:45:24,096 INFO wandb-upload_2:39652 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\files\requirements.txt +2023-04-21 03:45:24,299 INFO Thread-15 :39652 [sender.py:transition_state():626] send defer: 11 +2023-04-21 03:45:24,299 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: defer +2023-04-21 03:45:24,299 INFO HandlerThread:39652 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 03:45:24,300 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: defer +2023-04-21 03:45:24,300 INFO SenderThread:39652 [sender.py:send_request_defer():622] 
handle sender defer: 11 +2023-04-21 03:45:24,300 INFO SenderThread:39652 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 03:45:24,300 INFO SenderThread:39652 [sender.py:transition_state():626] send defer: 12 +2023-04-21 03:45:24,300 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: defer +2023-04-21 03:45:24,300 INFO HandlerThread:39652 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 03:45:24,300 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: defer +2023-04-21 03:45:24,300 INFO SenderThread:39652 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 03:45:25,531 INFO SenderThread:39652 [sender.py:transition_state():626] send defer: 13 +2023-04-21 03:45:25,531 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: defer +2023-04-21 03:45:25,531 INFO HandlerThread:39652 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 03:45:25,531 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: defer +2023-04-21 03:45:25,531 INFO SenderThread:39652 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 03:45:25,531 INFO SenderThread:39652 [sender.py:transition_state():626] send defer: 14 +2023-04-21 03:45:25,532 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: defer +2023-04-21 03:45:25,532 DEBUG SenderThread:39652 [sender.py:send():375] send: final +2023-04-21 03:45:25,532 INFO HandlerThread:39652 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 03:45:25,532 DEBUG SenderThread:39652 [sender.py:send():375] send: footer +2023-04-21 03:45:25,532 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: defer +2023-04-21 03:45:25,532 INFO SenderThread:39652 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 03:45:25,533 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 03:45:25,533 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 03:45:25,533 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 03:45:25,533 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 03:45:25,533 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 03:45:25,533 DEBUG SenderThread:39652 [sender.py:send_request():402] send_request: server_info +2023-04-21 03:45:25,774 INFO MainThread:39652 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 03:45:25,774 INFO MainThread:39652 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 03:45:25,774 INFO MainThread:39652 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 03:45:25,776 DEBUG HandlerThread:39652 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 03:45:25,776 INFO HandlerThread:39652 [handler.py:finish():845] shutting down handler +2023-04-21 03:45:26,548 INFO WriterThread:39652 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_034032-iwfl2ym1\run-iwfl2ym1.wandb +2023-04-21 03:45:26,786 INFO SenderThread:39652 [sender.py:finish():1550] shutting down sender +2023-04-21 03:45:26,786 INFO SenderThread:39652 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 03:45:26,786 INFO SenderThread:39652 [file_pusher.py:join():173] waiting for file 
+2023-04-21 04:00:45,451 DEBUG SenderThread:38724 [sender.py:send():375] send: stats +2023-04-21 04:00:45,670 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:00:45,671 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:00:46,577 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:00:46,949 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:00:48,705 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:00:50,720 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:00:51,993 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:00:52,755 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:00:54,790 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:00:54,797 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:00:56,849 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:00:57,625 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:00:58,902 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:00,680 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:01:00,681 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:01:00,943 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:02,989 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:02,997 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:05,037 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:07,105 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:08,826 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:09,148 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:01:09,160 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:11,225 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:13,283 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:13,867 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:15,365 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:15,463 DEBUG SenderThread:38724 [sender.py:send():375] send: stats +2023-04-21 04:01:15,693 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:01:15,694 DEBUG SenderThread:38724 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 04:01:17,413 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:18,961 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:19,535 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:21,557 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:01:21,574 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:23,628 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:24,359 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:25,715 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:27,806 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:29,433 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:29,872 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:30,700 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:01:30,700 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:01:31,961 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:34,012 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:35,005 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:36,083 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:38,130 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:01:38,139 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:39,136 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:01:40,204 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:40,469 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:42,267 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:44,324 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:45,464 DEBUG SenderThread:38724 [sender.py:send():375] send: stats +2023-04-21 04:01:45,714 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:01:45,714 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:45,715 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:01:46,376 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
04:01:48,446 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:50,552 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:51,005 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:52,570 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:53,539 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:01:54,584 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:56,633 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:01:56,701 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:01:58,686 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:00,727 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:02:00,728 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:02:00,733 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:01,991 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:02,776 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:04,829 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:06,866 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:07,034 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:08,929 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:09,923 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:11,007 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:12,654 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:13,041 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:15,129 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:15,467 DEBUG SenderThread:38724 [sender.py:send():375] send: stats +2023-04-21 04:02:15,728 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:02:15,729 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:02:17,177 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:18,030 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:19,221 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:21,336 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:23,179 DEBUG HandlerThread:38724 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:23,330 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:23,359 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:25,388 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:27,441 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:28,226 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:29,496 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:30,753 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:02:30,753 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:02:31,555 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:33,589 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:34,032 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:35,639 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:37,680 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:39,720 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:39,721 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:39,730 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:41,781 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:43,830 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:44,830 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:45,468 DEBUG SenderThread:38724 [sender.py:send():375] send: stats +2023-04-21 04:02:45,770 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:02:45,771 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:02:45,866 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:47,956 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:49,987 ERROR gpu :38724 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:02:50,048 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:51,159 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:02:51,160 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,161 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,161 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,161 DEBUG SenderThread:38724 
[sender.py:send():375] send: history +2023-04-21 04:02:51,161 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:02:51,162 INFO SenderThread:38724 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:02:51,713 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:02:51,716 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,716 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,717 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,717 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,718 DEBUG SenderThread:38724 [sender.py:send():375] send: metric +2023-04-21 04:02:51,718 DEBUG SenderThread:38724 [sender.py:send():375] send: history +2023-04-21 04:02:51,720 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:02:51,721 INFO SenderThread:38724 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:02:51,743 DEBUG SenderThread:38724 [sender.py:send():375] send: exit +2023-04-21 04:02:51,743 INFO SenderThread:38724 [sender.py:send_exit():598] handling exit code: 0 +2023-04-21 04:02:51,743 INFO SenderThread:38724 [sender.py:send_exit():600] handling runtime: 246 +2023-04-21 04:02:51,744 INFO SenderThread:38724 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:02:51,744 INFO SenderThread:38724 [sender.py:send_exit():606] send defer +2023-04-21 04:02:51,745 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,745 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 04:02:51,745 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,745 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 04:02:51,745 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 1 +2023-04-21 04:02:51,745 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,746 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 04:02:51,746 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,746 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 04:02:51,746 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 2 +2023-04-21 04:02:51,746 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,746 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 04:02:51,747 INFO HandlerThread:38724 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 04:02:51,747 DEBUG SystemMonitor:38724 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 04:02:51,747 INFO HandlerThread:38724 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 04:02:51,757 DEBUG SystemMonitor:38724 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 04:02:51,758 INFO HandlerThread:38724 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 04:02:51,806 INFO HandlerThread:38724 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 
04:02:51,806 INFO HandlerThread:38724 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 04:02:51,806 INFO HandlerThread:38724 [interfaces.py:finish():202] Joined network monitor +2023-04-21 04:02:51,806 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,807 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 04:02:51,807 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 3 +2023-04-21 04:02:51,807 DEBUG SenderThread:38724 [sender.py:send():375] send: stats +2023-04-21 04:02:51,807 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,807 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 04:02:51,808 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,808 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 04:02:51,808 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 4 +2023-04-21 04:02:51,809 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,809 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 04:02:51,809 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,809 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 04:02:51,809 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 5 +2023-04-21 04:02:51,810 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,810 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 04:02:51,810 DEBUG SenderThread:38724 [sender.py:send():375] send: summary +2023-04-21 04:02:51,810 INFO SenderThread:38724 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:02:51,811 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,811 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 04:02:51,811 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 6 +2023-04-21 04:02:51,811 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:51,811 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 04:02:51,812 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:51,812 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 04:02:51,816 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:02:51,985 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\wandb-summary.json +2023-04-21 04:02:52,432 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 7 +2023-04-21 04:02:52,433 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:52,433 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 04:02:52,433 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:52,433 INFO 
SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 04:02:52,812 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 04:02:52,998 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\config.yaml +2023-04-21 04:02:52,999 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:54,014 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:54,311 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 8 +2023-04-21 04:02:54,311 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 04:02:54,311 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:54,311 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 04:02:54,311 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:54,311 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 04:02:54,338 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 9 +2023-04-21 04:02:54,338 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:54,338 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 04:02:54,338 DEBUG SenderThread:38724 [sender.py:send():375] send: artifact +2023-04-21 04:02:55,024 INFO Thread-16 :38724 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:55,676 INFO SenderThread:38724 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5MTMzMDE1', 'digest': '651bcc3a27fe0b9435b558b5f4a1dbed', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v1'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'v1'} +2023-04-21 04:02:55,677 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:55,677 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 04:02:55,677 INFO SenderThread:38724 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 04:02:56,025 INFO SenderThread:38724 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files +2023-04-21 04:02:56,026 INFO SenderThread:38724 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\config.yaml config.yaml +2023-04-21 04:02:56,026 INFO SenderThread:38724 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log output.log +2023-04-21 04:02:56,029 INFO 
SenderThread:38724 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\requirements.txt requirements.txt +2023-04-21 04:02:56,033 INFO SenderThread:38724 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\wandb-metadata.json wandb-metadata.json +2023-04-21 04:02:56,033 INFO SenderThread:38724 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\wandb-summary.json wandb-summary.json +2023-04-21 04:02:56,036 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 10 +2023-04-21 04:02:56,036 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:56,037 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 04:02:56,037 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:56,039 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 04:02:56,039 INFO SenderThread:38724 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 04:02:56,691 INFO wandb-upload_1:38724 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\output.log +2023-04-21 04:02:57,069 INFO wandb-upload_0:38724 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\config.yaml +2023-04-21 04:02:57,249 INFO wandb-upload_2:38724 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\requirements.txt +2023-04-21 04:02:57,296 INFO wandb-upload_3:38724 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\files\wandb-summary.json +2023-04-21 04:02:57,503 INFO Thread-15 :38724 [sender.py:transition_state():626] send defer: 11 +2023-04-21 04:02:57,503 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:57,503 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 04:02:57,504 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:57,504 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 04:02:57,504 INFO SenderThread:38724 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 04:02:57,504 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 12 +2023-04-21 04:02:57,504 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:57,504 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 04:02:57,504 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:57,504 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 04:02:57,864 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 04:02:58,146 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 13 +2023-04-21 04:02:58,146 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:58,146 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 13 
+2023-04-21 04:02:58,146 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:58,146 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 04:02:58,146 INFO SenderThread:38724 [sender.py:transition_state():626] send defer: 14 +2023-04-21 04:02:58,147 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:02:58,147 DEBUG SenderThread:38724 [sender.py:send():375] send: final +2023-04-21 04:02:58,147 INFO HandlerThread:38724 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 04:02:58,147 DEBUG SenderThread:38724 [sender.py:send():375] send: footer +2023-04-21 04:02:58,147 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: defer +2023-04-21 04:02:58,147 INFO SenderThread:38724 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 04:02:58,148 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 04:02:58,148 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 04:02:58,148 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 04:02:58,148 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 04:02:58,148 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 04:02:58,148 DEBUG SenderThread:38724 [sender.py:send_request():402] send_request: server_info +2023-04-21 04:02:58,399 INFO MainThread:38724 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 04:02:58,400 INFO MainThread:38724 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 04:02:58,410 INFO MainThread:38724 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 04:02:58,412 DEBUG HandlerThread:38724 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 04:02:58,412 INFO HandlerThread:38724 [handler.py:finish():845] shutting down handler +2023-04-21 04:02:59,160 INFO WriterThread:38724 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\run-kyzl8vjo.wandb +2023-04-21 04:02:59,411 INFO SenderThread:38724 [sender.py:finish():1550] shutting down sender +2023-04-21 04:02:59,411 INFO SenderThread:38724 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 04:02:59,411 INFO SenderThread:38724 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_035843-kyzl8vjo/logs/debug.log b/ptuning/wandb/run-20230421_035843-kyzl8vjo/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..fc7ea607913c022eabf57cbe8b8f9ee11b423659 --- /dev/null +++ b/ptuning/wandb/run-20230421_035843-kyzl8vjo/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_setup.py:_flush():76] Configure stats pid to 34144 +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_setup.py:_flush():76] Applying setup settings: 
{'_disable_service': False} +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\logs\debug.log +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_035843-kyzl8vjo\logs\debug-internal.log +2023-04-21 03:58:43,947 INFO MainThread:34144 [wandb_init.py:init():547] calling init triggers +2023-04-21 03:58:43,948 INFO MainThread:34144 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 03:58:43,948 INFO MainThread:34144 [wandb_init.py:init():595] starting backend +2023-04-21 03:58:43,948 INFO MainThread:34144 [wandb_init.py:init():599] setting up manager +2023-04-21 03:58:43,950 INFO MainThread:34144 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 03:58:43,954 INFO MainThread:34144 [wandb_init.py:init():605] backend started and connected +2023-04-21 03:58:43,955 INFO MainThread:34144 [wandb_init.py:init():695] updated telemetry +2023-04-21 03:58:44,023 INFO MainThread:34144 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 03:58:44,778 INFO MainThread:34144 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 03:58:45,306 INFO MainThread:34144 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 03:58:45,306 INFO MainThread:34144 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 03:58:45,584 INFO MainThread:34144 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 03:58:45,584 INFO MainThread:34144 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 03:58:45,585 INFO MainThread:34144 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 03:58:45,585 INFO MainThread:34144 [wandb_run.py:_redirect():2102] Redirects installed. 
+2023-04-21 03:58:45,587 INFO MainThread:34144 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 03:58:45,591 INFO MainThread:34144 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 10, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_03-57-00_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 5, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': 
False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 04:03:00,289 WARNING MsgRouterThr:34144 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_035843-kyzl8vjo/run-kyzl8vjo.wandb b/ptuning/wandb/run-20230421_035843-kyzl8vjo/run-kyzl8vjo.wandb new file mode 100644 index 0000000000000000000000000000000000000000..a54f1d67a69649a4c7841cc7d17ed5228dc5fa1f Binary files /dev/null and b/ptuning/wandb/run-20230421_035843-kyzl8vjo/run-kyzl8vjo.wandb differ diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/files/config.yaml b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..d996b3b9866565a9eebfa908c1792fcc952b65d2 --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/config.yaml @@ -0,0 +1,604 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682021103.269339 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: 
true
[... several hundred intervening config.yaml entries omitted: they serialize the model configuration and HuggingFace TrainingArguments for this run, essentially duplicating the dict already dumped in debug.log above ...]
+dataloader_pin_memory: + 
desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/files/output.log b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..41f31edf7e03b9f9438efefca4593e3f7c3d96aa --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/output.log @@ -0,0 +1,143 @@ + + 0%| | 0/10 [00:00> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\config.json +[INFO|configuration_utils.py:362] 2023-04-21 04:06:46,591 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 04:06:46,881 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 04:06:46,885 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 04:06:46,886 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-5\special_tokens_map.json + + 70%|██████████████████████████████████████████████████▍ | 7/10 [02:11<00:47, 15.79s/it]Traceback (most recent call last): + File "main.py", line 437, in + main() + File "main.py", line 376, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2647, in training_step + loss = self.compute_loss(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2679, in compute_loss + outputs = model(**inputs) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 1191, in forward + 
transformer_outputs = self.transformer( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 986, in forward + layer_ret = torch.utils.checkpoint.checkpoint( + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 249, in checkpoint + return CheckpointFunction.apply(function, preserve, *args) + File "D:\Program\Python38\lib\site-packages\torch\autograd\function.py", line 506, in apply + return super().apply(*args, **kwargs) # type: ignore[misc] + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 107, in forward + outputs = run_function(*args) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 627, in forward + attention_outputs = self.attention( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 460, in forward + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 201, in forward + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File 
"D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 632, in get_tokens_unprocessed + m = rexmatch(text, pos) +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 437, in + main() + File "main.py", line 376, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2647, in training_step + loss = self.compute_loss(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2679, in compute_loss + outputs = model(**inputs) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 1191, in forward + transformer_outputs = self.transformer( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 986, in forward + layer_ret = torch.utils.checkpoint.checkpoint( + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 249, in checkpoint + return CheckpointFunction.apply(function, preserve, *args) + File "D:\Program\Python38\lib\site-packages\torch\autograd\function.py", line 506, in apply + return super().apply(*args, **kwargs) # type: ignore[misc] + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 107, in forward + outputs = run_function(*args) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File 
"C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 627, in forward + attention_outputs = self.attention( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 460, in forward + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\e02ba894cf18f3fd9b2526c795f983683c4ec732\modeling_chatglm.py", line 201, in forward + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/files/requirements.txt b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..52967c83d0025866df64e32b3fc9aac41769cc26 --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/requirements.txt @@ -0,0 +1,445 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 
+idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 
+pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/files/wandb-metadata.json b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..18b47b1abca235052c860a68062b25d1f1a7158f --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-20T20:05:04.864612", + "startedAt": "2023-04-20T20:05:03.256341", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + "..\\AdvertiseGen\\train.json", + "--validation_file", + "..\\AdvertiseGen\\dev.json", + "--prompt_column", + 
"content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "10", + "--logging_steps", + "10", + "--save_steps", + "5", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 495.093204498291 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/files/wandb-summary.json b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..3bac1d1593d94a4573a818e30c1835561c3756ac --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/files/wandb-summary.json @@ -0,0 +1 @@ +{"_wandb": {"runtime": 135}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/logs/debug-internal.log b/ptuning/wandb/run-20230421_040503-ofh2ksim/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..8f9afe077a95899dc6b75c24b9a8b1e44503a16e --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/logs/debug-internal.log @@ -0,0 +1,301 @@ +2023-04-21 04:05:03,268 INFO StreamThr :44364 [internal.py:wandb_internal():86] W&B internal server running at pid: 44364, started at: 2023-04-21 04:05:03.268342 +2023-04-21 04:05:03,270 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status +2023-04-21 04:05:03,271 INFO WriterThread:44364 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\run-ofh2ksim.wandb +2023-04-21 04:05:03,275 DEBUG SenderThread:44364 [sender.py:send():375] send: header +2023-04-21 04:05:03,345 DEBUG SenderThread:44364 [sender.py:send():375] send: run +2023-04-21 04:05:04,150 INFO SenderThread:44364 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files +2023-04-21 04:05:04,150 INFO SenderThread:44364 [sender.py:_start_run_threads():1124] run started: ofh2ksim with start time 1682021103.269339 +2023-04-21 04:05:04,150 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:05:04,151 INFO SenderThread:44364 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:05:04,151 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 04:05:04,152 DEBUG SenderThread:44364 [sender.py:send_request():402] 
send_request: check_version +2023-04-21 04:05:04,728 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 04:05:04,781 DEBUG HandlerThread:44364 [system_info.py:__init__():31] System info init +2023-04-21 04:05:04,781 DEBUG HandlerThread:44364 [system_info.py:__init__():46] System info init done +2023-04-21 04:05:04,781 INFO HandlerThread:44364 [system_monitor.py:start():181] Starting system monitor +2023-04-21 04:05:04,782 INFO SystemMonitor:44364 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 04:05:04,782 INFO HandlerThread:44364 [system_monitor.py:probe():201] Collecting system info +2023-04-21 04:05:04,790 INFO SystemMonitor:44364 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 04:05:04,790 INFO SystemMonitor:44364 [interfaces.py:start():190] Started disk monitoring +2023-04-21 04:05:04,791 INFO SystemMonitor:44364 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 04:05:04,808 INFO SystemMonitor:44364 [interfaces.py:start():190] Started memory monitoring +2023-04-21 04:05:04,828 INFO SystemMonitor:44364 [interfaces.py:start():190] Started network monitoring +2023-04-21 04:05:04,854 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:04,863 DEBUG HandlerThread:44364 [system_info.py:probe():195] Probing system +2023-04-21 04:05:04,866 DEBUG HandlerThread:44364 [system_info.py:_probe_git():180] Probing git +2023-04-21 04:05:04,971 DEBUG HandlerThread:44364 [system_info.py:_probe_git():188] Probing git done +2023-04-21 04:05:04,972 DEBUG HandlerThread:44364 [system_info.py:probe():240] Probing system done +2023-04-21 04:05:04,972 DEBUG HandlerThread:44364 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-20T20:05:04.864612', 'startedAt': '2023-04-20T20:05:03.256341', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '..\\AdvertiseGen\\train.json', '--validation_file', '..\\AdvertiseGen\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '10', '--logging_steps', '10', '--save_steps', '5', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 495.093204498291}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 04:05:04,972 INFO HandlerThread:44364 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 04:05:04,972 INFO HandlerThread:44364 [system_monitor.py:probe():214] Publishing system info +2023-04-21 
04:05:04,972 DEBUG HandlerThread:44364 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 04:05:04,973 DEBUG HandlerThread:44364 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 04:05:04,974 INFO HandlerThread:44364 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 04:05:04,986 DEBUG SenderThread:44364 [sender.py:send():375] send: files +2023-04-21 04:05:04,986 INFO SenderThread:44364 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 04:05:04,998 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:05:04,999 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:05:05,154 INFO Thread-16 :44364 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\requirements.txt +2023-04-21 04:05:05,156 INFO Thread-16 :44364 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\wandb-metadata.json +2023-04-21 04:05:05,156 INFO Thread-16 :44364 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\wandb-summary.json +2023-04-21 04:05:05,397 DEBUG SenderThread:44364 [sender.py:send():375] send: telemetry +2023-04-21 04:05:05,397 DEBUG SenderThread:44364 [sender.py:send():375] send: config +2023-04-21 04:05:05,398 DEBUG SenderThread:44364 [sender.py:send():375] send: metric +2023-04-21 04:05:05,398 DEBUG SenderThread:44364 [sender.py:send():375] send: telemetry +2023-04-21 04:05:05,399 DEBUG SenderThread:44364 [sender.py:send():375] send: metric +2023-04-21 04:05:05,399 WARNING SenderThread:44364 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 04:05:05,941 INFO wandb-upload_0:44364 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpttogkq_swandb\3u4h3e8i-wandb-metadata.json +2023-04-21 04:05:06,166 INFO Thread-16 :44364 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:05:06,908 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:08,176 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:05:08,689 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:08,965 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:09,177 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:05:11,007 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:13,046 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:13,751 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:15,087 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:17,132 ERROR gpu 
:44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:18,785 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:19,178 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:19,994 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:05:19,995 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:05:21,226 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:23,274 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:24,280 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:25,322 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:27,404 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:29,337 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:29,452 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:31,505 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:33,543 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:34,384 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:35,001 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:05:35,001 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:05:35,609 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\config.yaml +2023-04-21 04:05:35,654 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:37,684 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:39,734 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:40,296 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:41,771 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:43,813 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:45,321 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:45,862 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:47,944 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:49,991 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:50,006 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:05:50,007 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:05:51,257 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:52,035 INFO 
Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:05:52,042 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:54,087 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:56,126 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:05:56,310 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:05:58,165 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:00,232 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:01,351 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:02,296 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:04,353 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:04,839 DEBUG SystemMonitor:44364 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 04:06:04,840 DEBUG SenderThread:44364 [sender.py:send():375] send: stats +2023-04-21 04:06:05,014 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:06:05,014 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:06:06,429 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:06:06,467 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:07,292 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:08,479 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:10,502 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:12,327 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:12,555 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:14,602 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:16,651 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:17,366 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:18,708 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:20,019 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:06:20,019 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:06:20,751 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:06:20,764 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:22,818 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-21 04:06:23,289 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:24,854 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:26,905 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:28,330 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:28,949 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:31,011 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:33,058 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:33,528 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:34,052 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:06:34,843 DEBUG SenderThread:44364 [sender.py:send():375] send: stats +2023-04-21 04:06:35,028 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:06:35,029 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:06:35,108 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:37,225 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:39,247 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:39,316 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:41,277 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:43,328 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:44,369 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:45,378 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:47,431 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:06:47,445 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:48,439 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:06:49,501 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:49,912 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:50,037 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:06:50,037 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:06:51,566 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:53,615 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:55,327 DEBUG HandlerThread:44364 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:06:55,677 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:57,738 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:06:59,789 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:00,363 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:07:01,838 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:03,881 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:04,857 DEBUG SenderThread:44364 [sender.py:send():375] send: stats +2023-04-21 04:07:04,872 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:07:05,039 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:07:05,040 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:07:05,918 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:06,301 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:07:08,065 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:10,076 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:11,325 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:07:12,118 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:14,171 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:16,214 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:16,387 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:07:18,275 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:07:18,284 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:20,046 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:07:20,047 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:07:20,339 ERROR gpu :44364 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:07:20,714 DEBUG SenderThread:44364 [sender.py:send():375] send: exit +2023-04-21 04:07:20,714 INFO SenderThread:44364 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 04:07:20,714 INFO SenderThread:44364 [sender.py:send_exit():600] handling runtime: 135 +2023-04-21 04:07:20,715 INFO SenderThread:44364 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:07:20,715 INFO SenderThread:44364 [sender.py:send_exit():606] send defer +2023-04-21 04:07:20,715 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,716 INFO HandlerThread:44364 
[handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 04:07:20,716 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,716 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 04:07:20,716 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 1 +2023-04-21 04:07:20,716 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,716 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 04:07:20,716 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,716 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 04:07:20,717 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 2 +2023-04-21 04:07:20,717 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,717 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 04:07:20,717 INFO HandlerThread:44364 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 04:07:20,717 DEBUG SystemMonitor:44364 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 04:07:20,717 DEBUG SystemMonitor:44364 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 04:07:20,717 INFO HandlerThread:44364 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 04:07:20,730 INFO HandlerThread:44364 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 04:07:20,787 INFO HandlerThread:44364 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 04:07:20,787 INFO HandlerThread:44364 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 04:07:20,787 INFO HandlerThread:44364 [interfaces.py:finish():202] Joined network monitor +2023-04-21 04:07:20,788 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,788 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 04:07:20,788 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 3 +2023-04-21 04:07:20,789 DEBUG SenderThread:44364 [sender.py:send():375] send: stats +2023-04-21 04:07:20,789 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,789 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 04:07:20,790 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,790 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 04:07:20,790 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 4 +2023-04-21 04:07:20,790 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,791 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 04:07:20,792 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,792 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 04:07:20,792 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 5 +2023-04-21 04:07:20,792 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,792 INFO HandlerThread:44364 
[handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 04:07:20,793 DEBUG SenderThread:44364 [sender.py:send():375] send: summary +2023-04-21 04:07:20,793 INFO SenderThread:44364 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:07:20,794 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,794 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 04:07:20,794 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 6 +2023-04-21 04:07:20,794 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,794 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 04:07:20,795 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,795 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 04:07:20,795 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 7 +2023-04-21 04:07:20,795 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:07:20,795 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:20,795 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 04:07:20,795 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:20,795 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 04:07:21,345 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\wandb-summary.json +2023-04-21 04:07:21,803 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 04:07:22,356 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:07:23,275 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 8 +2023-04-21 04:07:23,275 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 04:07:23,275 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:23,276 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 04:07:23,276 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:23,276 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 04:07:23,294 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 9 +2023-04-21 04:07:23,295 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:23,295 DEBUG SenderThread:44364 [sender.py:send():375] send: artifact +2023-04-21 04:07:23,295 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 04:07:23,370 INFO Thread-16 :44364 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:07:24,678 INFO SenderThread:44364 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 
'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'digest': '53476254d59d98151858729a5d45f45c', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v2'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'v2'} +2023-04-21 04:07:24,679 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:24,679 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 04:07:24,679 INFO SenderThread:44364 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 04:07:25,388 INFO SenderThread:44364 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files +2023-04-21 04:07:25,388 INFO SenderThread:44364 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\config.yaml config.yaml +2023-04-21 04:07:25,389 INFO SenderThread:44364 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log output.log +2023-04-21 04:07:25,392 INFO SenderThread:44364 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\requirements.txt requirements.txt +2023-04-21 04:07:25,395 INFO SenderThread:44364 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\wandb-metadata.json wandb-metadata.json +2023-04-21 04:07:25,395 INFO SenderThread:44364 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\wandb-summary.json wandb-summary.json +2023-04-21 04:07:25,397 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 10 +2023-04-21 04:07:25,398 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:25,398 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 04:07:25,400 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:25,401 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 04:07:25,401 INFO SenderThread:44364 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 04:07:26,054 INFO wandb-upload_0:44364 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\config.yaml +2023-04-21 04:07:26,501 INFO wandb-upload_1:44364 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\output.log +2023-04-21 04:07:26,601 INFO wandb-upload_3:44364 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\wandb-summary.json +2023-04-21 04:07:26,682 INFO wandb-upload_2:44364 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\files\requirements.txt +2023-04-21 04:07:26,840 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 04:07:26,886 INFO Thread-15 :44364 [sender.py:transition_state():626] send 
defer: 11 +2023-04-21 04:07:26,886 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:26,886 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 04:07:26,886 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:26,887 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 04:07:26,887 INFO SenderThread:44364 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 04:07:26,887 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 12 +2023-04-21 04:07:26,887 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:26,887 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 04:07:26,887 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:26,887 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 04:07:27,497 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 13 +2023-04-21 04:07:27,497 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:27,497 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 04:07:27,497 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:27,497 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 04:07:27,497 INFO SenderThread:44364 [sender.py:transition_state():626] send defer: 14 +2023-04-21 04:07:27,498 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: defer +2023-04-21 04:07:27,498 DEBUG SenderThread:44364 [sender.py:send():375] send: final +2023-04-21 04:07:27,498 INFO HandlerThread:44364 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 04:07:27,498 DEBUG SenderThread:44364 [sender.py:send():375] send: footer +2023-04-21 04:07:27,498 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: defer +2023-04-21 04:07:27,498 INFO SenderThread:44364 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 04:07:27,499 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 04:07:27,499 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 04:07:27,499 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 04:07:27,499 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 04:07:27,499 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 04:07:27,499 DEBUG SenderThread:44364 [sender.py:send_request():402] send_request: server_info +2023-04-21 04:07:27,738 INFO MainThread:44364 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 04:07:27,738 INFO MainThread:44364 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 04:07:27,738 INFO MainThread:44364 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 04:07:27,739 DEBUG HandlerThread:44364 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 04:07:27,739 INFO HandlerThread:44364 [handler.py:finish():845] shutting down handler +2023-04-21 04:07:28,504 INFO WriterThread:44364 
[datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\run-ofh2ksim.wandb +2023-04-21 04:07:28,739 INFO SenderThread:44364 [sender.py:finish():1550] shutting down sender +2023-04-21 04:07:28,739 INFO SenderThread:44364 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 04:07:28,739 INFO SenderThread:44364 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/logs/debug.log b/ptuning/wandb/run-20230421_040503-ofh2ksim/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..e6e1d793fe81f34c0423c7b0e5082b87754c41e1 --- /dev/null +++ b/ptuning/wandb/run-20230421_040503-ofh2ksim/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 04:05:03,261 INFO MainThread:4528 [wandb_setup.py:_flush():76] Configure stats pid to 4528 +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\logs\debug.log +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040503-ofh2ksim\logs\debug-internal.log +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_init.py:init():547] calling init triggers +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_init.py:init():595] starting backend +2023-04-21 04:05:03,262 INFO MainThread:4528 [wandb_init.py:init():599] setting up manager +2023-04-21 04:05:03,265 INFO MainThread:4528 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 04:05:03,269 INFO MainThread:4528 [wandb_init.py:init():605] backend started and connected +2023-04-21 04:05:03,270 INFO MainThread:4528 [wandb_init.py:init():695] updated telemetry +2023-04-21 04:05:03,344 INFO MainThread:4528 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 04:05:04,151 INFO MainThread:4528 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 04:05:04,718 INFO MainThread:4528 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! 
To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 04:05:04,718 INFO MainThread:4528 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 04:05:04,999 INFO MainThread:4528 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 04:05:04,999 INFO MainThread:4528 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 04:05:05,000 INFO MainThread:4528 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 04:05:05,000 INFO MainThread:4528 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-21 04:05:05,001 INFO MainThread:4528 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 04:05:05,003 INFO MainThread:4528 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 10, 
'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_04-03-18_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 5, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 04:07:29,463 WARNING MsgRouterThr:4528 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_040503-ofh2ksim/run-ofh2ksim.wandb b/ptuning/wandb/run-20230421_040503-ofh2ksim/run-ofh2ksim.wandb new file mode 100644 index 0000000000000000000000000000000000000000..61f170576b4ca9d681267da96a2a666a354bd12c Binary files /dev/null and b/ptuning/wandb/run-20230421_040503-ofh2ksim/run-ofh2ksim.wandb differ diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/files/config.yaml b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..a9778cb21af7daa22b944ec53230ac94e931d487 --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/config.yaml @@ -0,0 +1,636 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682021367.554957 + t: + 1: + - 1 + - 
5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 + - 1: train/loss + 5: 1 + 6: + - 1 + - 1: train/learning_rate + 5: 1 + 6: + - 1 + - 1: train/epoch + 5: 1 + 6: + - 1 + - 1: train/train_runtime + 5: 1 + 6: + - 1 + - 1: train/train_samples_per_second + 5: 1 + 6: + - 1 + - 1: train/train_steps_per_second + 5: 1 + 6: + - 1 + - 1: train/total_flos + 5: 1 + 6: + - 1 + - 1: train/train_loss + 5: 1 + 6: + - 1 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null 
+decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4-1\models--THUDM--chatglm-6b-int4\snapshots\e02ba894cf18f3fd9b2526c795f983683c4ec732 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_04-07-46_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false 
+metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/files/output.log b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..20e3e6ea04d21ddc570e590e61e3cbe378d08e8f --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/output.log @@ -0,0 +1,1152 @@ + + 0%| | 0/1000 [00:00> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\config.json +[INFO|configuration_utils.py:362] 2023-04-21 04:36:10,009 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 04:36:10,239 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 04:36:10,244 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 04:36:10,245 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\special_tokens_map.json +{'loss': 4.3098, 'learning_rate': 0.018000000000000002, 'epoch': 0.01} +Saving PrefixEncoder + + + + + + + + + + 11%|███████▎ 
| 110/1000 [28:51<3:12:44, 12.99s/it]
+[INFO|modeling_utils.py:1762] 2023-04-21 04:57:46,459 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-200\pytorch_model.bin
+{'loss': 4.1744, 'learning_rate': 0.016, 'epoch': 0.03}
+Saving PrefixEncoder
+[INFO|modeling_utils.py:1762] 2023-04-21 05:20:13,764 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-300\pytorch_model.bin
+{'loss': 4.0542, 'learning_rate': 0.013999999999999999, 'epoch': 0.04}
+Saving PrefixEncoder
+{'loss': 4.0051, 'learning_rate': 0.012, 'epoch': 0.06}
+[INFO|modeling_utils.py:1762] 2023-04-21 05:42:00,099 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-400\pytorch_model.bin
+[INFO|modeling_utils.py:1762] 2023-04-21 06:03:36,203 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\pytorch_model.bin
+{'loss': 4.0684, 'learning_rate': 0.01, 'epoch': 0.07}
+Saving PrefixEncoder
+{'loss': 4.0509, 'learning_rate': 0.008, 'epoch': 0.08}
+[INFO|modeling_utils.py:1762] 2023-04-21 06:25:09,944 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-600\pytorch_model.bin
+{'loss': 4.0222, 'learning_rate': 0.006, 'epoch': 0.1}
+[INFO|modeling_utils.py:1762] 2023-04-21 06:46:44,527 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-700\pytorch_model.bin
+[INFO|modeling_utils.py:1762] 2023-04-21 07:08:19,367 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-800\pytorch_model.bin
+{'loss': 3.976, 'learning_rate': 0.004, 'epoch': 0.11}
+Saving PrefixEncoder
+{'loss': 3.9967, 'learning_rate': 0.002, 'epoch': 0.13}
+[INFO|modeling_utils.py:1762] 2023-04-21 07:29:54,337 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\pytorch_model.bin
+{'loss': 3.9854, 'learning_rate': 0.0, 'epoch': 0.14}
+[INFO|modeling_utils.py:1762] 2023-04-21 07:51:27,420 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-1000\pytorch_model.bin
+100%|█████████████████████████████████████████████████████████████████| 1000/1000 [3:41:58<00:00, 13.32s/it]
+{'train_runtime': 13324.3969, 'train_samples_per_second': 1.201, 'train_steps_per_second': 0.075, 'train_loss': 4.110057983398438, 'epoch': 0.14}
+***** train metrics *****
+ epoch = 0.14
+ train_loss = 4.1101
train_runtime = 3:42:04.39 + train_samples = 114599 + train_samples_per_second = 1.201 + train_steps_per_second = 0.075 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/files/requirements.txt b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..52967c83d0025866df64e32b3fc9aac41769cc26 --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/requirements.txt @@ -0,0 +1,445 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 
+jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 
+scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/files/wandb-metadata.json b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..48390f0e2569ac110879b5ae00a752f40fe9aed2 --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-20T20:09:29.061841", + "startedAt": "2023-04-20T20:09:27.544065", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + "..\\AdvertiseGen\\train.json", + "--validation_file", + "..\\AdvertiseGen\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": 
"https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 495.0933532714844 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/files/wandb-summary.json b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..6cdac286aa5e67d237dd1522d4f1d4ef9dcd0daa --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/files/wandb-summary.json @@ -0,0 +1 @@ +{"train/loss": 3.9854, "train/learning_rate": 0.0, "train/epoch": 0.14, "train/global_step": 1000, "_timestamp": 1682034687.8949578, "_runtime": 13320.340000867844, "_step": 100, "train/train_runtime": 13324.3969, "train/train_samples_per_second": 1.201, "train/train_steps_per_second": 0.075, "train/total_flos": 3.4665721233408e+16, "train/train_loss": 4.110057983398438, "_wandb": {"runtime": 13319}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/logs/debug-internal.log b/ptuning/wandb/run-20230421_040927-o7wbvva1/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..e30688333ed98799bbb05601d92c1d6425a1ee89 --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/logs/debug-internal.log @@ -0,0 +1,12919 @@ +2023-04-21 04:09:27,554 INFO StreamThr :37392 [internal.py:wandb_internal():86] W&B internal server running at pid: 37392, started at: 2023-04-21 04:09:27.553643 +2023-04-21 04:09:27,555 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status +2023-04-21 04:09:27,556 INFO WriterThread:37392 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\run-o7wbvva1.wandb +2023-04-21 04:09:27,560 DEBUG SenderThread:37392 [sender.py:send():375] send: header +2023-04-21 04:09:27,623 DEBUG SenderThread:37392 [sender.py:send():375] send: run +2023-04-21 04:09:28,341 INFO SenderThread:37392 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files +2023-04-21 04:09:28,342 INFO SenderThread:37392 [sender.py:_start_run_threads():1124] run started: o7wbvva1 with start time 1682021367.554957 +2023-04-21 04:09:28,342 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:09:28,343 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:09:28,344 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 04:09:28,344 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: check_version +2023-04-21 04:09:28,936 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 04:09:28,989 DEBUG HandlerThread:37392 [system_info.py:__init__():31] System info init +2023-04-21 04:09:28,989 DEBUG HandlerThread:37392 [system_info.py:__init__():46] System info init done +2023-04-21 04:09:28,989 INFO HandlerThread:37392 [system_monitor.py:start():181] 
Starting system monitor +2023-04-21 04:09:28,989 INFO SystemMonitor:37392 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 04:09:28,989 INFO HandlerThread:37392 [system_monitor.py:probe():201] Collecting system info +2023-04-21 04:09:28,998 INFO SystemMonitor:37392 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 04:09:28,998 INFO SystemMonitor:37392 [interfaces.py:start():190] Started disk monitoring +2023-04-21 04:09:28,998 INFO SystemMonitor:37392 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 04:09:29,011 INFO SystemMonitor:37392 [interfaces.py:start():190] Started memory monitoring +2023-04-21 04:09:29,053 INFO SystemMonitor:37392 [interfaces.py:start():190] Started network monitoring +2023-04-21 04:09:29,060 DEBUG HandlerThread:37392 [system_info.py:probe():195] Probing system +2023-04-21 04:09:29,063 DEBUG HandlerThread:37392 [system_info.py:_probe_git():180] Probing git +2023-04-21 04:09:29,063 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:29,161 DEBUG HandlerThread:37392 [system_info.py:_probe_git():188] Probing git done +2023-04-21 04:09:29,162 DEBUG HandlerThread:37392 [system_info.py:probe():240] Probing system done +2023-04-21 04:09:29,162 DEBUG HandlerThread:37392 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-20T20:09:29.061841', 'startedAt': '2023-04-20T20:09:27.544065', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '..\\AdvertiseGen\\train.json', '--validation_file', '..\\AdvertiseGen\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 495.0933532714844}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 04:09:29,162 INFO HandlerThread:37392 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 04:09:29,162 INFO HandlerThread:37392 [system_monitor.py:probe():214] Publishing system info +2023-04-21 04:09:29,162 DEBUG HandlerThread:37392 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 04:09:29,163 DEBUG HandlerThread:37392 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 04:09:29,164 INFO HandlerThread:37392 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 04:09:29,175 DEBUG SenderThread:37392 
[sender.py:send():375] send: files +2023-04-21 04:09:29,175 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 04:09:29,188 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:09:29,188 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:09:29,345 INFO Thread-16 :37392 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:09:29,345 INFO Thread-16 :37392 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\requirements.txt +2023-04-21 04:09:29,346 INFO Thread-16 :37392 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-metadata.json +2023-04-21 04:09:29,570 DEBUG SenderThread:37392 [sender.py:send():375] send: telemetry +2023-04-21 04:09:29,571 DEBUG SenderThread:37392 [sender.py:send():375] send: config +2023-04-21 04:09:29,573 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 04:09:29,573 DEBUG SenderThread:37392 [sender.py:send():375] send: telemetry +2023-04-21 04:09:29,573 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 04:09:29,574 WARNING SenderThread:37392 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 04:09:30,223 INFO wandb-upload_0:37392 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpd3er_52ywandb\dcltf6ie-wandb-metadata.json +2023-04-21 04:09:30,349 INFO Thread-16 :37392 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:09:31,108 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:32,366 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:09:32,666 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:09:33,152 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:33,369 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:09:35,212 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:37,269 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:37,714 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:09:39,309 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:41,371 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:42,755 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:09:43,411 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:09:44,183 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: 
stop_status
+2023-04-21 04:09:44,183 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status
+2023-04-21 04:09:59,835 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\config.yaml
+2023-04-21 04:10:29,063 DEBUG SystemMonitor:37392 [system_monitor.py:_start():159] Starting system metrics aggregation loop
+2023-04-21 04:10:29,064 DEBUG SenderThread:37392 [sender.py:send():375] send: stats
+2023-04-21 04:11:00,406 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log
+2023-04-21 04:11:34,476 ERROR
gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:35,568 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:11:36,497 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:38,541 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:40,583 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:40,614 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:11:42,631 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:44,263 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:11:44,263 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:11:44,678 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:46,535 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:11:46,729 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:48,766 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:50,826 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:51,576 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:11:52,882 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:54,937 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:56,633 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:11:57,014 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:59,069 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:11:59,085 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:11:59,272 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:11:59,272 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:12:01,109 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:01,945 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:03,185 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:12:03,227 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:05,250 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:06,991 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:07,266 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:09,314 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:11,354 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:12,020 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:13,406 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:14,283 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:12:14,285 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:12:15,468 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:17,537 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:17,549 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:19,582 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:21,641 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:22,612 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:23,693 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:25,737 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:27,644 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:27,789 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:29,092 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:12:29,293 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:12:29,293 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:12:29,844 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:31,886 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:33,568 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:34,011 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:36,035 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:38,062 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:38,611 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:40,108 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:42,182 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:43,661 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:44,236 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:44,296 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:12:44,297 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:12:46,289 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:48,337 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:49,598 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:50,398 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:52,437 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:54,492 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:54,635 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:12:56,541 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:58,585 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:12:59,096 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:12:59,312 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:12:59,313 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:13:00,630 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:00,643 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:02,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:04,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:05,607 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:06,824 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:08,866 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:10,914 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:11,540 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:12,969 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:13:12,977 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:14,304 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:13:14,304 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:13:15,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:16,565 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:17,088 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:19,126 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:21,169 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:21,627 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:23,216 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:25,257 INFO Thread-16 :37392 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:13:25,264 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:27,049 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:27,318 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:29,104 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:13:29,373 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:13:29,374 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:13:29,387 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:31,428 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:32,643 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:33,477 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:35,579 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:37,593 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:37,664 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:39,619 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:13:39,629 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:41,689 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:42,732 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:43,731 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:44,338 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:13:44,339 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:13:45,774 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:47,857 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:48,605 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:49,928 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:51,977 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:52,968 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:13:54,022 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:54,097 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:56,070 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
04:13:58,120 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:13:59,110 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:13:59,111 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:13:59,358 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:13:59,359 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:14:00,182 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:02,236 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:04,289 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:04,650 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:06,412 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:08,426 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:09,394 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:14:10,059 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:10,453 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:12,504 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:14,380 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:14:14,381 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:14:14,564 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:15,641 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:16,616 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:18,676 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:20,730 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:20,760 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:22,800 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:24,860 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:25,788 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:26,923 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:28,964 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:29,117 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:14:29,392 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:14:29,393 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:14:31,016 ERROR gpu :37392 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-21 04:14:31,663 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:33,073 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:35,116 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:14:35,127 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:36,729 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:37,238 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:39,259 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:41,284 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:41,791 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:43,346 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:44,406 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:14:44,406 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:14:45,416 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:47,476 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:47,673 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:49,518 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:51,579 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:52,722 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:53,621 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:55,675 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:14:55,685 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:57,734 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:14:57,770 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:14:59,125 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:14:59,402 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:14:59,402 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:14:59,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:01,867 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:03,682 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:03,930 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
04:15:05,985 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:08,113 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:08,742 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:10,130 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:12,154 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:13,046 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:15:13,048 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 04:15:13,049 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 04:15:13,049 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 04:15:13,049 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:15:13,049 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:15:13,050 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:15:13,159 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:15:14,064 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:14,214 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:15:14,230 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:14,415 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:15:14,416 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:15:15,224 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:15:16,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:18,342 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:19,709 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:20,395 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\config.yaml +2023-04-21 04:15:20,396 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:22,447 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:24,507 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:25,239 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:26,619 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:28,660 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:29,130 DEBUG SenderThread:37392 
[sender.py:send():375] send: stats +2023-04-21 04:15:29,423 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:15:29,423 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:15:30,693 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:30,710 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:32,757 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:34,808 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:35,716 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:35,810 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:15:36,874 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:39,034 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:40,780 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:41,056 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:43,077 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:44,431 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:15:44,431 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:15:45,128 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:46,687 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:47,179 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:49,225 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:51,268 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:51,742 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:53,350 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:55,389 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:56,391 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:15:57,443 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:15:57,714 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:15:59,133 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:15:59,473 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:15:59,498 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:15:59,507 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
04:16:01,560 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:02,785 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:03,620 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:05,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:07,725 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:07,813 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:09,863 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:11,881 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:12,871 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:13,918 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:16:13,927 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:14,464 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:16:14,465 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:16:15,981 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:18,062 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:18,743 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:20,100 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:22,156 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:23,800 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:24,196 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:26,256 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:28,305 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:28,848 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:29,147 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:16:29,475 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:16:29,475 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:16:30,365 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:32,426 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:34,516 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:34,776 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:36,596 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:38,661 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:39,664 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:16:40,347 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:40,801 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:42,826 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:44,482 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:16:44,482 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:16:44,886 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:45,728 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:46,931 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:48,990 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:50,772 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:51,045 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:53,148 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:55,196 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:55,817 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:16:57,253 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:59,160 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:16:59,312 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:16:59,492 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:16:59,493 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:17:00,308 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:17:01,351 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:01,757 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:03,422 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:05,466 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:06,810 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:07,532 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:09,569 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:11,685 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:11,854 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 04:17:13,711 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:14,507 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:17:14,508 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:17:14,665 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:17:15,738 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:17,774 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:17,796 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:19,841 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:21,903 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:22,909 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:23,939 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:25,995 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:28,055 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:17:28,067 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:28,082 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:29,165 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:17:29,514 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:17:29,515 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:17:30,125 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:32,177 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:33,784 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:34,233 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:36,280 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:38,329 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:38,825 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:40,381 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:42,461 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:17:42,495 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:44,505 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:44,512 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:44,527 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:17:44,528 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:17:46,553 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:48,605 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:49,803 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:50,653 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:52,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:54,753 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:55,531 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:17:56,796 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:17:56,805 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:58,853 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:17:59,169 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:17:59,546 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:17:59,546 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:18:00,806 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:00,914 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:02,970 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:05,028 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:05,864 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:07,075 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:08,244 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:18:08,250 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:18:08,250 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:18:08,251 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:18:09,110 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:18:09,110 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:18:09,121 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:10,113 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:18:11,168 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:11,280 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:13,287 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:14,561 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:18:14,562 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:18:15,302 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:16,824 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:17,373 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:19,421 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:21,471 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:21,890 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:23,521 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:24,516 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:18:25,578 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:26,950 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:27,634 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:29,184 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:18:29,556 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:18:29,557 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:18:29,686 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:31,748 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:32,820 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:33,787 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:35,835 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:36,835 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:18:37,886 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:38,583 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:18:39,923 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:41,973 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:18:43,627 DEBUG HandlerThread:37392 
metric: Not Supported +2023-04-21 04:29:35,369 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:29:35,432 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:29:36,487 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:38,527 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:40,401 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:29:40,596 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:42,637 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:44,685 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:45,011 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:29:45,012 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:29:46,272 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:29:46,736 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:48,798 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:49,812 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:29:50,844 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:52,152 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:29:52,897 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:54,939 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:56,993 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:57,191 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:29:59,035 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:29:59,376 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:30:00,014 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:30:00,014 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:30:01,144 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:02,103 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:30:02,799 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:03,158 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:05,206 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:07,257 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:07,838 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:09,304 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:11,352 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:12,879 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:13,400 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:15,025 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:30:15,025 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:30:15,438 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:30:15,447 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:17,498 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:18,290 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:19,548 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:21,592 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:23,331 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:23,645 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:25,690 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:27,735 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:29,103 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:29,385 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:30:29,770 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:30:29,782 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:30,043 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:30:30,043 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:30:31,914 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:33,935 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:34,318 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:35,958 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:38,009 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:39,338 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:40,065 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 04:30:42,112 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:44,158 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:30:44,165 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:44,806 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:45,056 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:30:45,057 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:30:46,225 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:48,273 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:50,324 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:50,425 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:52,373 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:54,424 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:56,462 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:30:56,473 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:58,516 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:30:58,528 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:30:59,392 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:31:00,076 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:31:00,076 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:31:00,582 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:02,342 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:02,715 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:04,743 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:06,773 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:07,380 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:08,803 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:09,806 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:31:10,872 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:12,923 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:13,080 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:14,969 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:15,080 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:31:15,080 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:31:17,029 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:18,335 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:19,077 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:21,124 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:23,173 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:23,778 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:24,167 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:31:25,219 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:27,274 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:28,819 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:29,324 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:29,402 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:31:30,091 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:31:30,092 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:31:31,378 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:33,487 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:34,417 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:35,503 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:37,543 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:38,534 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:31:39,564 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:39,602 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:41,663 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:43,708 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:44,613 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:45,102 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:31:45,102 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: 
stop_status +2023-04-21 04:31:45,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:47,830 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:49,880 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:50,240 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:50,254 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:31:50,255 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:31:50,255 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:31:50,256 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:31:50,876 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:31:50,876 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:31:51,912 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:31:51,920 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:53,966 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:55,310 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:31:56,028 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:58,061 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:31:59,403 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:32:00,111 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:00,129 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:32:00,129 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:32:00,369 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:02,156 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:04,272 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:05,857 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:06,259 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:32:06,291 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:08,320 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:10,367 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:10,908 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 04:32:12,414 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:14,476 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:15,131 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:32:15,132 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:32:16,390 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:16,518 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:18,566 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:32:18,576 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:20,622 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:21,890 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:22,666 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:24,705 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:26,757 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:26,937 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:28,802 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:29,405 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:32:30,148 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:32:30,148 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:32:30,871 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:32,403 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:32,901 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:32:32,915 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:35,018 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:37,031 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:37,445 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:39,054 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:41,119 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:42,476 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:43,149 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:44,152 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:32:45,180 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:32:45,180 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:32:45,187 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:47,233 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:48,438 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:49,270 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:51,320 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:53,373 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:53,481 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:55,422 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:56,428 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:32:57,481 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:32:58,626 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:32:59,415 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:32:59,557 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:00,181 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:33:00,182 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:33:01,616 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:03,669 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:04,471 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:05,780 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:07,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:09,564 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:09,819 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:10,813 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:33:11,877 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:13,926 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:14,610 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:15,192 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:33:15,193 DEBUG SenderThread:37392 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 04:33:15,977 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:18,039 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:20,094 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:20,485 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:22,134 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:23,128 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:33:24,200 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:25,494 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:26,240 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:28,277 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:29,419 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:33:30,189 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:33:30,190 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:33:30,335 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:31,450 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:32,375 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:34,444 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:36,479 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:36,515 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:33:36,558 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:38,579 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:40,611 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:41,518 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:42,647 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:44,696 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:45,197 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:33:45,197 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:33:46,749 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:47,271 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:48,779 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:33:48,788 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:50,852 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:52,314 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:52,895 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:54,934 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:56,988 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:57,358 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:33:59,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:33:59,435 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:34:00,218 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:34:00,219 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:34:00,219 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:34:00,220 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:34:00,221 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:34:00,230 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:34:01,074 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:34:01,074 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:34:01,083 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:02,079 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:34:02,502 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:03,136 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:05,177 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:07,271 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:07,530 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:09,290 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:11,315 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:12,573 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:13,359 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:15,230 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: stop_status +2023-04-21 04:34:15,230 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:34:15,411 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:34:15,421 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:17,475 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:18,497 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:19,521 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:21,582 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:23,543 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:23,626 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:25,678 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:26,677 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:34:27,737 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:29,143 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:29,443 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:34:29,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:30,248 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:34:30,248 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:34:31,841 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:33,888 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:34,535 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:35,927 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:38,034 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:40,058 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:40,154 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:41,036 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:34:42,082 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:44,131 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:45,259 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:34:45,259 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:34:45,510 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:46,184 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:48,240 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:50,290 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:50,556 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:52,333 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:53,335 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:34:54,384 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:56,194 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:34:56,432 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:58,485 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:34:59,446 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:35:00,264 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:35:00,265 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:35:00,548 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:01,551 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:02,593 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:04,636 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:06,690 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:07,116 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:07,697 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:35:08,801 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:10,818 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:12,165 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:12,842 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:14,882 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:15,266 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:35:15,266 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:35:16,921 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:17,536 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:18,957 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir 
modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:35:18,967 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:21,028 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:23,073 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:23,122 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:25,123 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:27,170 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:28,174 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:29,212 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:29,460 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:35:30,277 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:35:30,277 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:35:31,259 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:33,296 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:35:33,307 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:34,046 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:35,360 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:37,420 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:39,102 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:39,538 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:41,555 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:43,573 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:44,968 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:45,285 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:35:45,285 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:35:45,614 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:35:45,625 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:47,707 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:49,746 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:50,581 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:51,801 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:53,846 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:55,631 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:35:55,900 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:57,966 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:35:59,476 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:36:00,016 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:36:00,032 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:00,292 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:36:00,293 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:36:01,542 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:02,087 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:04,145 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:06,194 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:06,599 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:08,235 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:09,988 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:36:09,989 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:36:09,990 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:36:09,991 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:36:10,353 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:36:10,393 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:11,353 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:36:12,377 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:12,378 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:36:12,408 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:14,460 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:15,292 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:36:15,292 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:36:16,518 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:17,560 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:18,554 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:20,610 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:22,641 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:22,656 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:24,706 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:25,703 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:36:26,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:28,634 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:28,793 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:29,483 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:36:30,319 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:36:30,319 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:36:30,845 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:32,890 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:34,606 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:34,939 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:36,984 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:37,977 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:36:39,032 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:40,491 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:41,151 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:43,174 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:45,197 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:45,314 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:36:45,315 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:36:45,563 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:36:47,258 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:49,313 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:36:50,568 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
[wandb debug-internal log, run-20230421_040927-o7wbvva1: repeated DEBUG `status_report` / `stop_status` handler requests, INFO file-watcher notifications for `output.log` and `wandb-summary.json`, and recurring ERROR "Failed to sample metric: Not Supported" messages from the GPU monitor.]
+2023-04-21 04:47:14,063 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:47:14,912 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:15,106 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:15,754 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:47:15,754 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:47:17,154 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:19,202 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:20,028 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:21,267 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:23,309 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:25,363 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:25,835 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:26,361 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:47:27,484 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:29,510 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:29,644 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:47:30,756 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:47:30,756 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:47:30,997 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:31,537 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:33,585 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:35,634 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:36,048 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:37,678 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:39,729 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:40,732 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:47:41,773 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:41,791 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:43,815 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:45,768 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:47:45,769 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:47:45,877 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:47,047 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:47,962 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:49,990 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:52,050 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:52,644 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:53,038 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:47:54,114 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:56,184 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:57,677 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:47:58,285 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:47:59,654 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:48:00,304 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:00,777 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:48:00,777 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:48:02,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:03,048 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:04,349 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:48:04,357 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:06,405 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:08,448 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:08,551 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:10,496 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:12,548 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:13,581 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:14,602 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:15,795 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:48:15,795 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:48:16,651 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 04:48:18,692 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:48:18,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:19,550 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:20,744 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:22,783 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:24,583 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:24,842 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:26,886 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:28,998 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:29,664 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:48:29,665 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:30,806 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:48:30,806 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:48:31,000 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:48:31,020 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:33,065 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:35,106 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:35,115 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:37,161 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:39,210 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:40,153 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:41,255 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:43,297 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:45,348 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:48:45,365 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:45,558 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:45,825 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:48:45,826 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:48:47,410 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:49,456 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:51,108 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:51,513 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:53,563 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:55,606 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:56,450 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:48:56,603 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:48:57,658 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:48:59,701 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:48:59,754 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:00,851 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:49:00,851 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:49:01,768 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:02,117 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:03,806 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:05,856 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:07,166 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:07,921 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:08,306 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:49:08,308 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:49:08,308 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:49:08,309 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:49:08,927 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:49:09,965 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:49:09,969 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:10,981 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:49:12,042 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:12,352 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:14,111 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:15,840 
DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:49:15,840 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:49:16,156 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:18,189 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:18,235 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:20,288 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:22,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:23,319 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:49:23,445 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:24,370 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:26,434 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:28,485 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:28,501 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:29,714 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:49:30,605 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:30,853 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:49:30,854 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:49:32,631 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:34,136 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:34,660 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:36,702 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:49:36,713 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:38,762 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:39,709 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:40,820 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:42,876 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:44,748 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:44,921 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:45,867 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:49:45,868 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:49:46,976 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to 
sample metric: Not Supported +2023-04-21 04:49:49,034 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:49:49,045 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:50,633 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:51,092 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:53,137 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:55,191 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:55,663 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:49:57,234 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:59,283 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:49:59,715 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:50:00,880 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:50:00,880 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:50:01,130 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:01,425 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:03,401 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:50:03,452 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:05,482 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:06,179 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:07,513 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:09,565 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:11,226 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:11,626 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:13,678 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:15,717 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:50:15,724 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:15,885 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:50:15,885 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:50:17,142 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:17,793 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:19,843 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:21,887 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:22,187 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:23,933 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:25,972 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:27,516 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:28,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:29,038 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:50:29,717 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:50:30,074 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:30,899 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:50:30,899 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:50:32,174 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:33,164 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:34,196 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:36,211 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:38,268 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:38,282 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:40,332 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:41,334 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:50:42,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:43,425 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:44,427 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:45,924 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:50:45,925 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:50:46,477 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:48,520 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:49,201 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:50,581 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:52,630 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:53,630 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir 
modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:50:54,358 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:54,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:56,734 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:58,791 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:50:59,401 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:50:59,719 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:51:00,837 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:00,926 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:51:00,926 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:51:02,947 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:04,961 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:05,200 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:07,011 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:08,012 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:51:09,050 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:10,263 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:11,091 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:13,145 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:15,204 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:15,307 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:15,918 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:51:15,918 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:51:17,255 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:18,273 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:51:18,274 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:51:18,275 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 04:51:18,275 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:51:19,294 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:51:19,295 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:51:19,306 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:20,297 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:51:21,308 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:21,371 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:23,410 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:25,464 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:26,359 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:27,515 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:29,560 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:29,727 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:51:30,927 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:51:30,928 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:51:31,626 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:32,172 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:33,685 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:51:33,716 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:35,736 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:37,222 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:37,770 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:39,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:41,846 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:42,273 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:43,905 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:45,939 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:51:45,939 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:51:45,941 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:51:45,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:48,029 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:48,194 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 04:51:50,074 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:52,123 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:53,243 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:54,183 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:56,229 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:58,267 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:51:59,196 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:51:59,734 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:52:00,310 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:52:00,320 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:00,947 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:52:00,948 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:52:02,369 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:04,229 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:04,491 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:06,509 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:08,525 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:09,285 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:10,573 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:11,578 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:52:12,624 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:14,695 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:15,143 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:15,957 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:52:15,957 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:52:16,753 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:18,795 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:20,226 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:20,849 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:22,899 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:23,906 
INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:52:24,944 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:26,041 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:26,997 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:29,052 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:29,743 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:52:30,959 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:52:30,960 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:52:31,100 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:31,210 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:33,159 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:35,271 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:36,952 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:37,293 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:38,265 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:52:39,337 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:41,416 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:42,006 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:43,478 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:45,530 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:45,975 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:52:45,976 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:52:47,231 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:47,577 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:49,627 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:52:49,636 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:51,685 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:52,951 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:53,737 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:55,793 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 04:52:57,838 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:52:57,984 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:52:59,753 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:52:59,885 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:00,996 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:53:00,996 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:53:01,936 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:03,841 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:03,974 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:53:03,983 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:06,106 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:08,133 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:08,867 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:10,163 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:12,209 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:13,905 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:14,256 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:16,005 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:53:16,005 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:53:16,300 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:53:16,309 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:18,375 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:19,275 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:20,428 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:22,473 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:24,330 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:24,520 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:26,577 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:27,771 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 04:53:27,772 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 04:53:27,772 DEBUG SenderThread:37392 
[sender.py:send_request():402] send_request: summary_record +2023-04-21 04:53:27,773 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 04:53:28,609 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 04:53:28,621 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:29,766 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 04:53:29,767 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:30,657 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:53:30,668 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:31,009 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:53:31,009 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:53:32,720 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:34,778 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:35,296 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:36,874 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:38,883 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:40,330 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:40,906 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:41,909 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 04:53:42,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:45,014 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:45,726 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:46,026 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 04:53:46,026 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 04:53:47,064 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:49,110 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:51,147 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:51,320 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 04:53:53,205 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 04:53:54,198 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 
[wandb debug-internal log, run-20230421_040927-o7wbvva1, 2023-04-21 04:53–05:04: repeated "ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" entries, interleaved with routine HandlerThread/SenderThread status_report and stop_status requests, periodic "send: stats" messages, and dir_watcher notices for modifications to output.log and wandb-summary.json under E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\.]
Supported +2023-04-21 05:04:53,935 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:04:55,426 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:04:55,961 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:04:56,940 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:04:57,986 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:04:59,968 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:05:00,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:00,971 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:01,429 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:05:01,430 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:05:02,080 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:04,130 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:06,178 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:06,725 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:08,219 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:09,221 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:05:10,278 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:12,331 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:12,436 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:14,394 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:16,437 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:05:16,437 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:05:16,443 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:17,723 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:18,482 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:20,531 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:21,532 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:05:22,575 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:23,403 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:24,690 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:26,711 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:28,452 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:28,731 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:29,971 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:05:30,772 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:31,446 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:05:31,446 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:05:32,821 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:34,401 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:34,870 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:05:34,871 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:36,918 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:38,975 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:39,443 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:41,021 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:43,084 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:44,490 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:45,130 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:46,473 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:05:46,474 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:05:47,175 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:05:47,182 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:49,238 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:49,748 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:51,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:53,337 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:54,802 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:05:55,450 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:57,468 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:05:59,486 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-21 05:05:59,980 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:05:59,981 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:01,523 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:06:01,524 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:06:01,527 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:06:01,533 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:03,579 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:05,634 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:05,815 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:07,692 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:09,742 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:10,854 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:11,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:13,833 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:06:13,843 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:15,905 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:16,177 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:16,492 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:06:16,492 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:06:17,970 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:20,019 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:21,786 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:22,075 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:24,116 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:25,096 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:06:25,097 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:06:25,098 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:06:25,098 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:06:25,114 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:06:26,187 INFO Thread-16 :37392 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:06:26,231 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:27,114 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:27,194 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:06:28,243 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:29,997 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:06:30,278 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:31,505 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:06:31,506 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:06:32,324 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:32,749 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:34,372 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:36,408 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:37,789 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:38,462 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:39,465 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:06:40,509 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:42,561 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:43,173 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:44,608 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:46,496 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:06:46,497 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:06:46,662 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:48,705 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:48,775 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:50,754 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:52,802 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:53,808 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:06:54,075 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:06:54,860 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:56,976 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:58,997 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:06:59,122 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:00,007 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:07:01,044 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:01,517 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:07:01,517 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:07:03,100 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:05,079 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:05,143 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:06,147 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:07:07,203 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:09,242 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:10,114 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:11,294 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:13,355 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:15,162 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:15,425 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:16,521 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:07:16,521 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:07:17,472 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:07:17,481 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:19,531 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:21,037 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:21,581 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:23,630 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:25,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:26,066 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:27,806 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:29,838 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 05:07:30,021 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:07:31,260 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:31,511 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:07:31,511 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:07:31,851 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:33,874 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:07:33,883 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:35,938 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:36,594 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:38,008 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:40,043 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:41,634 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:42,084 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:44,131 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:46,190 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:46,527 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:07:46,528 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:07:46,826 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:48,241 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:07:48,249 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:50,298 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:51,862 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:52,361 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:54,412 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:56,463 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:07:56,913 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:07:58,573 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:00,034 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:08:00,589 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:01,531 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:08:01,531 DEBUG SenderThread:37392 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 05:08:01,562 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:08:02,612 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:02,776 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:04,668 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:06,716 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:07,809 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:08,759 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:10,804 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:12,849 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:12,852 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:14,898 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:15,901 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:08:16,539 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:08:16,539 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:08:16,949 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:17,862 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:19,008 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:21,063 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:22,915 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:23,107 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:25,154 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:27,201 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:27,944 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:29,329 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:30,038 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:08:30,283 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:08:31,355 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:31,566 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:08:31,567 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 
05:08:33,369 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:33,811 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:35,404 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:37,449 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:38,850 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:39,492 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:41,563 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:41,904 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:08:41,905 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:08:41,905 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:08:41,906 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:08:42,558 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:08:43,622 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:43,921 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:44,623 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:08:45,685 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:46,571 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:08:46,571 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:08:47,760 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:49,807 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:49,837 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:51,852 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:53,903 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:54,897 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:08:55,960 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:08:58,030 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:08:58,044 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:00,082 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:09:00,083 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:00,138 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to 
sample metric: Not Supported +2023-04-21 05:09:01,591 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:09:01,592 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:09:02,155 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:04,181 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:05,875 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:06,230 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:08,276 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:10,304 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:09:10,314 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:11,585 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:12,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:14,439 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:16,488 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:16,595 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:09:16,596 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:09:16,844 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:18,534 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:20,575 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:21,873 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:22,641 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:24,683 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:09:24,694 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:26,748 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:27,373 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:28,801 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:30,060 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:09:30,927 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:31,606 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:09:31,607 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:09:32,896 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:32,944 
ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:34,983 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:37,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:38,044 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:09:38,216 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:39,100 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:41,162 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:43,203 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:43,247 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:45,262 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:46,613 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:09:46,613 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:09:47,310 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:49,157 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:49,363 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:51,418 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:52,412 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:09:53,469 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:55,144 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:09:55,513 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:57,579 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:09:59,636 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:00,070 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:10:01,073 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:01,627 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:10:01,627 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:10:01,747 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:03,754 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:05,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:06,683 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 05:10:06,833 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:07,866 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:08,868 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:10:09,906 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:11,889 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:11,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:14,014 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:16,069 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:16,637 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:10:16,637 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:10:16,902 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:18,135 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:20,175 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:21,172 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:10:22,237 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:22,857 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:24,289 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:26,352 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:27,888 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:28,412 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:30,076 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:10:30,464 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:31,651 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:10:31,651 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:10:32,583 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:33,750 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:34,568 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:10:34,597 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:36,627 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:38,668 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
05:10:38,793 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:40,715 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:42,782 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:43,841 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:44,835 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:46,654 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:10:46,654 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:10:46,875 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:48,916 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:10:48,924 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:48,926 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:50,972 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:53,022 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:53,959 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:55,064 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:57,112 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:10:59,002 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:10:59,175 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:00,092 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:11:00,620 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:11:00,621 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:11:00,622 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:11:00,622 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:11:01,217 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:11:01,228 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:01,672 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:11:01,672 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:11:03,294 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:11:03,333 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:04,934 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 05:11:05,350 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:07,369 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:09,422 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:09,974 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:11:11,470 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:13,519 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:15,439 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:11:15,568 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:16,571 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:11:16,666 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:11:16,666 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:11:17,628 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:19,673 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:20,930 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:11:21,729 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:23,778 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:25,825 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:25,967 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:11:27,872 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:29,922 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:30,092 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:11:30,918 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:11:31,106 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:11:31,682 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:11:31,683 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:11:31,969 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:34,067 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:36,094 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:36,998 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:11:38,139 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:11:40,191 
05:22:38,760 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:22:38,765 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:40,822 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:41,552 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:22:42,880 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:44,943 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:46,683 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:22:46,985 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:47,185 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:22:47,186 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:22:47,984 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:22:49,034 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:51,140 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:52,456 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:22:53,168 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:55,186 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:57,221 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:22:57,507 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:22:59,272 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:00,271 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:23:00,287 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:23:01,317 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:02,198 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:23:02,199 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:23:03,365 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:03,473 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:05,412 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:07,459 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:08,518 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:09,523 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 05:23:11,556 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:13,604 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:13,974 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:14,605 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:23:15,650 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:17,220 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:23:17,220 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:23:17,726 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:19,492 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:19,792 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:21,909 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:23,929 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:24,537 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:25,925 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:23:25,953 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:27,980 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:30,023 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:30,031 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:30,288 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:23:32,087 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:32,231 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:23:32,231 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:23:34,133 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:35,512 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:36,181 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:38,235 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:40,269 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:23:40,277 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:40,855 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:42,325 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:44,377 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:45,894 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:46,448 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:47,235 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:23:47,236 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:23:48,492 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:50,534 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:51,812 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:52,601 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:23:52,672 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:54,703 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:56,730 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:23:56,867 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:23:58,780 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:00,295 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:24:00,842 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:02,253 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:24:02,254 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:24:02,528 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:02,891 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:04,924 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:24:04,938 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:06,987 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:07,854 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:09,025 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:11,068 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:12,906 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:13,134 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:15,181 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:17,226 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-21 05:24:17,253 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:24:17,254 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:24:18,214 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:24:18,498 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:19,277 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:21,329 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:23,449 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:23,560 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:25,463 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:27,489 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:28,616 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:29,540 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:30,307 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:24:31,590 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:32,250 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:24:32,250 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:24:32,599 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:24:33,638 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:34,509 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:35,692 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:37,756 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:39,551 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:39,820 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:41,865 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:42,643 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:24:42,645 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:24:42,645 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:24:42,646 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:24:42,868 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:24:43,912 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:44,664 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:44,901 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:24:45,965 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:47,263 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:24:47,263 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:24:48,025 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:50,078 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:50,544 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:52,130 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:54,230 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:55,582 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:24:56,219 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:24:56,252 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:24:58,277 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:00,316 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:25:00,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:01,335 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:02,260 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:25:02,260 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:25:02,372 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:04,415 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:06,484 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:06,547 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:08,530 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:10,566 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:25:10,575 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:11,584 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:12,624 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:14,669 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 05:25:16,627 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:16,718 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:17,268 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:25:17,268 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:25:18,767 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:20,817 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:22,487 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:22,860 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:25:22,867 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:24,974 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:27,000 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:27,525 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:29,020 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:30,324 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:25:31,051 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:32,271 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:25:32,271 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:25:33,111 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:33,535 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:35,147 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:37,183 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:25:37,192 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:39,238 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:39,492 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:41,285 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:43,343 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:44,529 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:45,391 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:47,282 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:25:47,283 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:25:47,438 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:48,435 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:25:49,508 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:49,554 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:51,551 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:53,604 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:54,604 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:55,710 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:57,730 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:25:59,750 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:25:59,761 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:00,328 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:26:00,755 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:26:01,808 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:02,287 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:26:02,287 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:26:03,874 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:05,561 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:05,925 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:07,975 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:10,036 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:10,612 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:12,086 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:14,136 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:15,142 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:26:16,194 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:16,433 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:17,313 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:26:17,313 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:26:18,230 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 05:26:20,275 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:21,589 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:22,317 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:24,369 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:26,506 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:27,443 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:28,530 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:29,503 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:26:30,338 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:26:30,561 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:32,321 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:26:32,321 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:26:32,595 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:32,604 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:34,650 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:36,725 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:37,646 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:38,778 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:40,827 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:26:40,829 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:42,874 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:43,402 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:44,929 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:46,974 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:47,337 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:26:47,338 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:26:48,595 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:49,026 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:51,077 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:52,374 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:26:52,375 DEBUG 
SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:26:52,375 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:26:52,376 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:26:53,132 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:26:53,133 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:26:53,143 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:54,390 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:26:55,183 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:57,297 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:59,308 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:26:59,441 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:00,354 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:27:01,334 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:02,352 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:27:02,352 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:27:03,395 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:04,675 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:05,430 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:07,473 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:27:07,484 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:09,522 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:10,520 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:11,577 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:13,633 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:15,677 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:15,688 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:17,375 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:27:17,375 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:27:17,767 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:19,823 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 05:27:21,428 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:21,856 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:27:21,864 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:23,907 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:25,952 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:26,489 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:28,057 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:30,076 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:30,369 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:27:31,883 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:32,109 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:32,387 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:27:32,387 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:27:33,100 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:27:34,167 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:36,226 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:37,702 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:38,274 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:40,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:42,373 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:42,747 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:44,426 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:45,420 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:27:46,489 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:47,405 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:27:47,405 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:27:48,537 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:48,672 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:50,596 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:52,647 ERROR gpu :37392 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-21 05:27:53,726 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:54,693 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:56,759 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:58,856 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:27:59,371 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:27:59,814 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:28:00,377 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:28:00,867 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:02,409 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:28:02,409 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:28:02,901 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:04,688 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:04,935 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:06,991 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:09,040 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:09,727 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:11,075 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:28:11,087 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:13,145 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:15,186 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:15,404 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:17,230 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:17,402 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:28:17,402 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:28:19,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:20,690 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:21,330 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:23,406 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:25,448 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:28:25,461 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:26,319 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:27,513 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:29,626 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:30,380 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:28:31,394 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:31,638 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:32,416 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:28:32,416 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:28:33,669 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:35,708 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:37,283 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:37,748 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:28:37,756 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:39,831 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:41,865 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:42,320 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:43,914 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:45,960 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:47,433 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:28:47,433 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:28:47,676 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:48,036 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:50,105 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:52,171 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:28:52,172 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:53,265 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:54,241 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:56,296 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:28:58,346 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:28:58,354 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to 
sample metric: Not Supported +2023-04-21 05:29:00,413 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:29:00,480 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:02,162 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:29:02,163 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:29:02,163 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:29:02,164 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:29:02,457 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:29:02,458 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:29:02,459 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:29:02,490 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:03,474 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:29:03,710 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:29:04,511 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:06,567 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:08,611 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:08,733 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:29:10,660 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:12,707 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:13,782 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:29:14,772 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:15,758 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:29:16,821 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:17,465 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:29:17,466 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:29:18,873 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:19,723 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:29:20,926 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:22,974 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:29:24,786 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:29:25,027 ERROR gpu :37392 
+[wandb debug-internal.log, run-20230421_040927-o7wbvva1: repeated "ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" entries interleaved with routine status_report/stop_status requests, periodic "send: stats" messages, and file-modified notices for output.log and wandb-summary.json]
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:23,345 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:25,396 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:27,228 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:27,438 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:29,487 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:30,481 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:40:30,575 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:40:31,538 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:32,360 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:32,895 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:40:32,896 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:40:33,591 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:35,643 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:37,692 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:38,177 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:39,749 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:41,805 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:42,812 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:40:43,849 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:44,073 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:45,895 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:47,932 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:40:47,979 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:40:48,053 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:49,245 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:50,063 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:52,099 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:54,158 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:40:54,294 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:40:56,200 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 05:40:57,203 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:40:58,246 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:00,115 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:00,311 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:00,588 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:41:02,345 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:02,916 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:41:02,916 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:41:04,424 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:05,186 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:06,469 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:08,495 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:41:08,503 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:10,569 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:11,011 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:12,635 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:14,680 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:16,058 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:16,719 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:17,922 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:41:17,923 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:41:18,838 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:20,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:21,934 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:22,872 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:41:22,883 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:24,930 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:26,973 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:26,990 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:29,011 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:30,597 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:41:31,053 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:32,395 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:32,928 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:41:32,928 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:41:33,101 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:35,156 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:41:35,163 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:37,224 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:37,938 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:39,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:41,344 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:42,982 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:43,393 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:45,451 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:47,497 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:47,942 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:41:47,943 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:41:48,203 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:49,574 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:41:49,608 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:51,630 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:53,260 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:53,672 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:55,710 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:57,765 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:58,282 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:41:59,806 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:41:59,884 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:41:59,884 DEBUG SenderThread:37392 [sender.py:send():375] send: history 
+2023-04-21 05:41:59,885 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:41:59,886 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:42:00,602 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:42:00,801 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:42:00,801 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:42:01,860 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:02,944 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:42:02,944 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:42:03,920 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:04,199 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:05,961 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:08,024 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:09,246 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:10,072 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:12,118 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:14,162 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:14,603 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:15,159 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:42:16,218 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:17,953 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:42:17,954 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:42:18,267 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:20,224 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:20,383 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:22,398 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:24,430 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:25,284 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:26,484 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:28,531 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:29,535 INFO Thread-16 :37392 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:42:30,580 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:30,609 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:30,609 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:42:32,626 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:32,958 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:42:32,958 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:42:34,695 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:36,237 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:36,742 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:38,799 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:40,837 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:42:40,847 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:41,458 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:42,898 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:44,953 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:46,505 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:46,990 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:47,971 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:42:47,971 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:42:49,036 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:51,148 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:52,254 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:53,144 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:42:53,161 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:55,198 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:57,240 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:42:57,513 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:42:59,301 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:00,610 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:43:01,346 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:02,974 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:43:02,974 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:43:03,237 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:03,392 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:05,439 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:07,473 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:43:07,487 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:08,477 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:09,522 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:11,574 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:13,605 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:13,616 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:15,659 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:17,728 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:17,986 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:43:17,987 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:43:19,376 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:19,763 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:43:19,773 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:21,886 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:23,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:24,421 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:25,935 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:27,977 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:29,457 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:30,024 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:30,620 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:43:32,079 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:32,991 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:43:32,991 DEBUG SenderThread:37392 [sender.py:send_request():402] 
send_request: stop_status +2023-04-21 05:43:33,084 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:43:34,144 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:35,243 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:36,197 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:38,275 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:40,285 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:40,329 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:42,406 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:44,456 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:45,452 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:43:46,208 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:46,509 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:47,994 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:43:47,994 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:43:48,557 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:50,612 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:51,264 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:52,719 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:54,746 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:56,306 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:43:56,771 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:58,821 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:43:59,813 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:44:00,633 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:44:00,874 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:01,637 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:02,916 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:02,997 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:44:02,998 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:44:04,983 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:07,047 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:07,283 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:09,085 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:10,074 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:44:10,076 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:44:10,076 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:44:10,077 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:44:10,079 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:44:11,130 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:44:11,143 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:13,178 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:13,188 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:15,240 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:17,285 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:18,012 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:44:18,012 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:44:18,252 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:19,340 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:21,393 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:23,499 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:24,076 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:25,498 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:44:25,513 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:27,556 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:29,116 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:29,607 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:30,644 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:44:31,668 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:33,032 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status 
+2023-04-21 05:44:33,032 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:44:33,716 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:34,295 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:35,770 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:37,810 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:44:37,823 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:39,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:40,066 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:41,919 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:43,972 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:45,111 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:46,016 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:48,065 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:44:48,065 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:44:48,082 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:50,118 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:44:50,131 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:51,078 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:52,175 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:54,286 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:56,120 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:44:56,306 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:44:58,338 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:00,385 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:00,652 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:45:01,664 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:02,424 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:03,068 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:45:03,068 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:45:03,418 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:45:04,476 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:06,530 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:07,340 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:08,578 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:10,615 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:12,382 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:12,665 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:14,713 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:15,717 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:45:16,762 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:17,475 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:18,071 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:45:18,071 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:45:18,820 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:20,856 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:22,906 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:23,365 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:25,034 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:27,055 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:28,843 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:29,078 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:30,061 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:45:30,660 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:45:31,112 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:33,081 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:45:33,081 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:45:33,157 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:34,339 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:35,220 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:37,268 ERROR gpu :37392 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-21 05:45:39,318 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:39,379 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:41,372 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:42,380 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:45:43,428 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:44,782 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:45,477 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:47,522 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:48,090 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:45:48,090 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:45:49,568 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:50,354 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:51,633 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:53,683 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:55,703 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:45:55,789 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:45:55,827 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:57,841 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:45:59,872 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:00,673 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:46:01,681 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:01,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:03,105 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:46:03,106 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:46:03,997 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:06,038 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:07,653 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:08,075 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:46:08,088 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:10,132 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:12,182 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:12,716 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:14,227 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:16,259 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:18,120 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:46:18,120 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:46:18,325 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:18,360 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:19,606 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:46:19,607 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:46:19,607 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:46:19,608 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:46:20,366 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:46:20,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:22,413 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:46:22,426 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:23,640 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:24,476 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:26,571 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:28,591 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:28,689 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:30,621 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:30,681 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:46:32,672 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:33,128 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:46:33,128 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:46:33,666 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:46:34,376 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:34,721 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 05:46:36,798 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:38,854 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:39,429 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:40,904 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:42,960 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:44,448 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:45,029 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:47,076 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:48,071 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:46:48,132 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:46:48,133 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:46:49,124 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:50,401 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:51,183 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:53,236 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:55,301 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:55,459 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:46:57,407 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:46:59,426 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:47:00,384 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:47:00,682 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:47:00,683 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:47:01,451 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:47:03,143 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:47:03,143 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:47:03,498 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:47:05,547 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:47:06,399 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:47:07,592 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:47:09,635 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:47:11,428 DEBUG HandlerThread:37392 
handle_request: status_report +2023-04-21 05:58:06,211 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:08,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:10,328 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:10,894 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:12,371 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:13,370 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:58:14,480 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:16,468 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:16,500 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:18,526 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:18,630 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:58:18,631 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:58:20,574 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:21,889 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:22,606 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:24,665 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:26,713 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:27,341 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:27,715 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:58:28,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:30,804 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:30,880 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:58:32,853 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:32,900 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:33,636 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:58:33,636 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:58:34,919 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:36,967 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:37,962 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:39,003 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:39,999 
INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:58:41,056 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:43,106 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:43,249 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:45,216 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:47,243 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:48,650 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:58:48,650 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:58:48,899 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:49,252 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:51,307 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:53,349 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:58:53,357 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:54,249 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:55,396 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:57,440 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:58:59,285 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:58:59,499 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:00,886 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:59:01,555 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:03,601 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:03,651 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:59:03,651 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:59:05,117 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:05,643 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:59:05,650 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:07,708 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:09,757 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:10,145 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:11,820 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 05:59:13,880 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:15,197 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:15,983 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:17,121 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 05:59:17,123 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 05:59:17,123 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 05:59:17,125 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 05:59:17,985 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 05:59:18,014 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:18,668 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:59:18,669 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:59:20,036 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:59:20,046 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:20,938 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:22,089 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:24,135 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:25,982 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:26,184 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:28,234 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:30,278 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:30,896 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 05:59:31,286 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:59:31,911 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:32,333 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:33,686 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:59:33,687 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:59:34,378 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:36,432 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:36,964 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:38,499 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:40,543 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:42,001 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:42,599 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:43,598 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:59:44,638 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:46,741 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:47,060 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:48,724 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 05:59:48,724 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 05:59:48,757 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:50,794 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:52,845 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:53,001 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:54,888 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:56,934 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 05:59:57,935 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 05:59:58,974 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 05:59:58,982 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:00,910 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:00:01,016 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:03,070 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:03,712 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:00:03,712 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:00:04,976 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:05,138 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:07,184 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:09,228 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:10,230 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:00:10,869 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report 
+2023-04-21 06:00:11,286 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:13,333 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:15,379 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:15,890 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:17,485 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:18,731 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:00:18,731 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:00:19,509 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:21,012 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:21,547 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:23,590 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:00:23,601 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:25,662 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:26,843 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:27,712 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:29,779 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:30,916 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:00:31,850 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:31,922 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:33,753 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:00:33,753 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:00:33,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:35,957 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:00:35,965 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:37,740 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:38,043 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:40,103 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:42,168 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:42,760 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:44,223 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:46,273 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:48,000 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:48,369 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:48,768 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:00:48,768 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:00:50,371 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:00:50,389 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:52,460 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:53,034 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:54,503 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:56,554 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:00:58,107 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:00:58,600 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:00,662 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:00,918 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:01:01,661 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:01:02,712 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:03,763 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:01:03,764 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:01:04,020 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:04,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:06,800 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:08,856 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:09,058 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:10,901 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:12,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:14,598 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:14,994 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:15,993 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:01:17,053 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 06:01:18,771 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:01:18,771 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:01:19,166 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:20,016 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:21,174 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:23,236 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:25,060 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:25,286 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:26,498 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:01:26,499 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:01:26,499 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:01:26,500 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:01:27,324 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:01:27,333 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:28,337 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:01:29,378 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:30,526 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:30,918 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:01:31,426 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:33,473 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:33,767 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:01:33,768 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:01:35,513 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:36,028 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:37,571 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:39,605 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:40,610 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:01:41,405 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:41,661 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:43,711 ERROR gpu :37392 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-21 06:01:45,767 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:46,460 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:47,825 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:48,780 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:01:48,782 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:01:49,930 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:51,957 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:52,055 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:53,976 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:01:53,984 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:56,031 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:01:57,382 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:01:58,082 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:00,129 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:00,927 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:02:02,195 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:02,939 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:03,785 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:02:03,786 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:02:04,263 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:06,291 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:02:06,299 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:08,349 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:08,379 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:10,411 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:12,460 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:13,432 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:14,510 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:16,549 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:18,592 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
06:02:18,781 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:02:18,781 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:02:19,024 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:20,677 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:02:20,715 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:22,741 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:24,060 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:24,775 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:26,817 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:28,866 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:29,098 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:30,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:30,927 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:02:32,959 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:02:32,962 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:33,778 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:02:33,778 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:02:35,021 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:35,051 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:37,074 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:39,123 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:40,125 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:41,197 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:43,248 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:45,157 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:45,292 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:46,299 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:02:47,338 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:48,789 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:02:48,789 DEBUG SenderThread:37392 [sender.py:send_request():402] 
send_request: stop_status +2023-04-21 06:02:49,390 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:51,054 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:51,502 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:53,524 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:55,562 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:56,081 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:02:57,605 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:02:58,604 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:02:59,648 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:00,938 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:03:01,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:01,949 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:03,746 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:03,794 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:03:03,794 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:03:05,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:07,091 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:07,838 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:09,882 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:11,931 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:12,927 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:03:13,051 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:13,992 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:16,046 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:18,095 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:18,107 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:18,815 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:03:18,815 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:03:20,151 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:22,259 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:23,933 
DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:24,261 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:03:24,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:26,336 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:28,400 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:28,985 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:30,454 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:30,948 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:03:32,508 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:33,820 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:03:33,821 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:03:34,073 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:34,550 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:35,979 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:03:35,980 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:03:35,981 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:03:35,982 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:03:36,606 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:03:36,621 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:38,653 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:03:38,661 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:39,240 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:40,716 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:42,766 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:44,283 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:44,812 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:46,856 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:48,828 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:03:48,828 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:03:48,888 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 06:03:49,465 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:50,932 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:03:50,941 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:53,068 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:54,518 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:03:55,091 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:57,110 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:59,145 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:03:59,555 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:00,964 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:04:01,184 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:03,233 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:03,839 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:04:03,839 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:04:05,099 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:05,282 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:04:05,284 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:07,332 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:09,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:10,154 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:11,423 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:13,465 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:15,186 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:15,518 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:16,515 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:04:17,576 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:18,840 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:04:18,840 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:04:19,633 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:21,105 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:21,686 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:23,801 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:25,822 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:26,157 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:27,846 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:28,825 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:04:29,879 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:30,968 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:04:31,925 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:31,970 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:33,861 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:04:33,861 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:04:33,973 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:36,022 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:37,125 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:38,086 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:40,133 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:42,182 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:42,194 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:43,190 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:04:44,239 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:46,293 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:47,218 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:48,352 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:48,854 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:04:48,855 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:04:50,396 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:52,458 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:04:53,139 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:04:54,572 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 06:15:50,932 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:15:52,550 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:15:52,987 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:15:53,996 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:15:55,037 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:15:57,082 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:15:58,403 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:15:59,136 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:01,185 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:16:01,197 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:03,238 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:03,550 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:04,272 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:16:04,272 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:16:05,290 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:07,337 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:08,326 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:16:09,298 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:09,373 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:11,489 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:13,511 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:14,335 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:15,541 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:17,591 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:19,298 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:16:19,299 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:16:19,540 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:19,655 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:20,656 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 
06:16:21,710 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:23,766 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:24,580 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:25,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:27,863 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:29,613 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:29,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:31,189 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:16:31,955 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:32,279 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:16:32,280 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:16:32,281 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:16:32,281 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:16:32,946 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:16:33,995 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:16:34,003 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:34,304 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:16:34,305 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:16:35,562 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:36,061 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:38,106 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:40,162 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:40,608 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:42,265 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:44,287 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:46,248 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:46,315 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:16:46,327 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:48,360 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:49,325 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: stop_status +2023-04-21 06:16:49,326 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:16:50,415 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:51,624 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:52,483 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:54,519 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:56,561 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:16:56,657 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:16:58,613 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:00,657 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:17:00,659 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:01,202 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:17:02,208 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:02,710 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:04,323 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:17:04,323 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:17:04,772 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:06,815 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:07,577 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:08,863 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:10,905 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:12,976 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:17:13,050 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:13,096 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:15,062 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:17,084 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:18,145 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:19,110 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:19,339 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:17:19,339 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:17:21,163 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:23,201 ERROR 
gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:23,616 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:25,242 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:26,242 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:17:27,304 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:29,123 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:29,354 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:31,219 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:17:31,428 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:33,467 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:34,358 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:17:34,359 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:17:34,604 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:35,537 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:37,598 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:38,591 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:17:39,630 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:39,995 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:41,675 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:43,796 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:45,032 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:45,812 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:47,870 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:49,366 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:17:49,367 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:17:49,935 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:50,932 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:17:51,009 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:51,980 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:54,029 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 06:17:56,067 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:17:56,076 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:17:58,123 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:00,170 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:01,171 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:01,220 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:18:02,215 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:04,272 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:04,397 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:18:04,398 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:18:05,279 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:18:06,336 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:06,653 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:08,384 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:10,447 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:11,677 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:12,500 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:14,619 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:16,595 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:18:16,633 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:16,860 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:18,651 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:19,390 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:18:19,390 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:18:20,690 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:22,646 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:22,748 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:24,796 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:26,858 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:27,673 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:28,903 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:30,939 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:18:30,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:31,227 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:18:33,000 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:33,245 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:34,396 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:18:34,396 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:18:35,041 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:37,097 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:38,680 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:39,159 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:41,198 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:41,744 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:18:41,746 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:18:41,746 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:18:41,747 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:18:42,193 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:18:43,247 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:18:43,257 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:43,763 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:45,360 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:47,378 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:49,400 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:18:49,400 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:18:49,403 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:49,642 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:18:51,466 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:53,529 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:54,701 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report 
+2023-04-21 06:18:55,570 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:57,607 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:18:57,618 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:59,663 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:18:59,786 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:01,240 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:19:01,712 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:03,760 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:04,435 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:19:04,435 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:19:05,689 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:05,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:07,858 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:08,853 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:19:09,902 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:10,704 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:11,959 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:14,005 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:15,737 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:16,124 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:18,148 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:19,432 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:19:19,433 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:19:20,186 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:21,606 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:22,226 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:23,233 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:19:24,280 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:26,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:26,653 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:28,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:30,422 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:31,250 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:19:32,255 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:32,459 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:34,448 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:19:34,449 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:19:34,514 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:35,511 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:19:36,566 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:37,714 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:38,609 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:40,663 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:42,705 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:42,752 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:44,747 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:46,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:47,828 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:19:48,433 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:48,870 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:49,447 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:19:49,448 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:19:50,902 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:52,957 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:53,720 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:55,015 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:57,084 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:19:58,760 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:19:59,134 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:01,184 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir 
modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:20:01,196 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:01,256 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:20:03,246 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:04,468 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:20:04,469 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:20:04,711 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:05,304 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:07,355 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:09,414 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:09,756 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:11,463 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:13,517 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:20:13,528 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:15,254 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:15,566 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:17,692 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:19,463 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:20:19,463 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:20:19,711 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:20,739 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:21,727 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:23,775 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:25,817 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:26,255 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:27,849 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:20:27,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:29,900 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:31,260 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:20:31,261 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:31,947 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 06:20:34,024 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:34,473 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:20:34,473 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:20:36,069 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:36,737 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:38,119 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:39,126 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:20:40,173 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:42,199 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:42,232 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:44,270 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:46,326 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:47,251 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:48,430 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:49,469 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:20:49,470 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:20:50,446 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:51,086 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:20:51,087 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:20:51,087 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:20:51,088 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:20:51,416 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:20:52,471 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:53,097 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:53,476 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:20:54,530 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:56,566 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:20:58,147 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:20:58,614 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:00,672 
ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:01,276 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:21:02,718 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:03,286 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:04,469 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:21:04,470 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:21:04,769 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:05,774 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:21:06,811 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:08,750 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:08,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:10,922 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:12,970 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:13,786 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:15,007 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:17,049 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:19,160 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:19,469 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:21:19,469 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:21:19,715 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:20,129 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:21:21,183 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:23,212 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:24,770 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:25,287 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:27,337 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:29,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:29,822 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:31,287 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:21:31,426 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 
06:21:31,433 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:33,480 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:34,476 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:21:34,476 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:21:35,535 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:35,721 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:37,591 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:39,636 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:40,752 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:41,672 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:43,713 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:21:43,721 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:45,779 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:45,834 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:47,867 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:49,497 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:21:49,497 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:21:49,961 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:51,756 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:51,976 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:54,019 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:56,062 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:21:57,700 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:21:58,109 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:21:58,118 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:00,161 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:01,293 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:22:02,205 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:03,313 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:04,271 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:04,516 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:22:04,516 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:22:06,317 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:08,362 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:08,578 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:10,405 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:22:10,413 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:12,473 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:13,612 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:14,531 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:16,581 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:18,636 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:18,664 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:19,513 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:22:19,514 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:22:20,735 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:22,757 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:23,733 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:22:24,601 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:24,777 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:26,823 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:28,888 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:29,647 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:30,959 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:31,306 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:22:33,000 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:34,518 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:22:34,519 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:22:34,760 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:35,040 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:36,043 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:22:37,103 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:39,148 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:39,790 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:41,209 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:43,260 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:44,837 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:45,308 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:47,365 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:48,358 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:22:49,414 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:49,519 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:22:49,519 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:22:50,756 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:51,549 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:53,570 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:55,587 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:55,796 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:22:57,624 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:22:59,670 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:23:00,275 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:23:00,275 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:23:00,276 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:23:00,277 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:23:00,671 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:23:01,283 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:23:01,316 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:23:01,716 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:23:01,718 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:23:03,781 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
+[wandb debug-internal log omitted: several minutes of repeated "ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" entries, interleaved with routine status_report/stop_status handling, periodic stats sends, and file-watcher updates to output.log and wandb-summary.json for run-20230421_040927-o7wbvva1]
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:36,270 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:33:36,707 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:33:37,816 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:39,832 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:41,287 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:33:41,852 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:43,885 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:45,934 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:46,329 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:33:47,053 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:33:47,054 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:33:47,055 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:33:47,056 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:33:48,008 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:33:48,014 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:49,011 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:33:50,074 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:33:50,075 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:33:50,083 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:52,128 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:52,320 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:33:54,180 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:56,228 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:33:57,363 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:33:58,273 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:00,341 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:01,343 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:34:01,496 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:34:02,392 
ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:02,500 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:04,452 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:05,044 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:34:05,045 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:34:06,503 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:08,315 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:08,594 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:10,611 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:12,649 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:13,911 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:14,691 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:34:14,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:16,744 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:18,805 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:18,945 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:20,052 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:34:20,052 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:34:20,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:22,912 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:24,332 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:24,962 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:27,017 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:34:27,025 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:29,078 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:29,771 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:31,127 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:31,507 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:34:33,180 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:35,063 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:34:35,064 DEBUG SenderThread:37392 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 06:34:35,236 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:35,299 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:37,304 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:39,409 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:40,647 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:41,403 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:34:41,423 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:43,458 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:45,503 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:45,691 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:47,545 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:49,608 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:50,073 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:34:50,074 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:34:51,332 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:51,651 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:53,700 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:34:53,716 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:55,757 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:56,625 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:34:57,809 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:34:59,859 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:01,515 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:35:01,905 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:02,527 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:03,959 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:05,085 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:35:05,086 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:35:06,004 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:06,997 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:35:08,052 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:08,343 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:10,154 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:12,166 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:13,390 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:14,211 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:16,255 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:18,298 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:18,498 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:19,301 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:35:20,099 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:35:20,100 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:35:20,338 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:22,391 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:24,376 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:24,439 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:26,488 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:28,544 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:29,420 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:30,592 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:31,529 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:35:31,591 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:35:32,632 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:34,693 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:35,110 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:35:35,111 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:35:35,352 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:36,736 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:38,789 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:40,388 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:40,898 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:42,919 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:44,945 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:45,920 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:35:46,342 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:46,977 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:49,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:50,108 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:35:50,108 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:35:51,071 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:51,361 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:53,134 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:55,186 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:56,284 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:35:56,285 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:35:56,285 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:35:56,286 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:35:57,240 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:35:57,240 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:35:57,252 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:35:57,302 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:35:59,300 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:01,350 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:01,543 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:36:02,553 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:03,396 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:05,108 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:36:05,108 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:36:05,449 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
06:36:07,524 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:08,359 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:09,566 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:11,637 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:36:11,674 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:13,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:14,195 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:15,727 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:17,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:19,844 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:20,123 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:36:20,124 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:36:20,368 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:21,882 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:23,934 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:36:23,948 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:25,995 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:26,033 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:28,043 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:30,088 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:31,092 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:31,547 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:36:32,133 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:34,200 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:35,125 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:36:35,125 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:36:36,239 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:36,362 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:37,232 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:36:38,299 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 06:36:40,355 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:41,418 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:42,467 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:44,494 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:46,524 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:46,556 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:48,599 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:49,601 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:36:50,133 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:36:50,134 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:36:50,639 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:52,390 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:52,702 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:54,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:56,821 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:36:57,423 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:36:58,884 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:00,932 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:01,556 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:37:01,934 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:37:02,565 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:02,976 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:05,030 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:05,156 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:37:05,157 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:37:07,088 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:08,436 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:09,133 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:11,185 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:13,292 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:13,470 DEBUG 
HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:15,316 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:16,286 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:37:17,332 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:18,672 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:19,376 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:20,165 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:37:20,166 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:37:21,434 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:23,485 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:24,426 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:25,547 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:27,603 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:37:27,614 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:29,539 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:29,663 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:31,565 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:37:31,722 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:33,776 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:35,181 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:37:35,181 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:37:35,427 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:35,825 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:37,897 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:39,939 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:41,410 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:41,984 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:37:41,994 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:44,096 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:46,125 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 06:37:46,456 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:48,152 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:50,194 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:37:50,194 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:37:50,200 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:52,248 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:52,301 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:54,284 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:37:54,292 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:56,344 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:37:57,349 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:37:58,394 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:00,448 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:01,568 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:38:02,492 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:02,580 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:04,548 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:05,211 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:38:05,211 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:38:05,225 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:38:05,445 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:38:05,445 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:38:05,446 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:38:05,540 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:38:06,587 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:38:06,594 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:07,594 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:38:08,476 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:08,638 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 06:38:10,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:12,732 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:13,507 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:14,836 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:16,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:18,888 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:19,219 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:19,875 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:38:20,217 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:38:20,218 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:38:20,920 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:22,971 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:24,488 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:25,028 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:27,076 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:29,112 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:29,521 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:31,176 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:31,570 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:38:32,170 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:38:33,217 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:35,238 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:38:35,259 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:38:35,268 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:35,501 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:37,315 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:39,355 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:40,531 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:41,406 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:43,440 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:45,546 ERROR 
gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:46,024 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:46,524 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:38:47,567 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:49,612 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:50,237 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:38:50,237 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:38:51,480 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:51,653 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:53,693 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:55,743 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:56,511 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:38:57,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:38:58,801 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:38:59,845 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:01,573 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:39:01,573 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:01,893 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:03,938 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:05,248 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:39:05,249 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:39:06,002 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:07,519 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:08,070 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:10,122 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:12,159 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:39:12,167 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:12,800 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:14,213 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:16,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 06:39:17,846 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:18,347 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:20,257 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:39:20,258 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:39:20,363 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:22,401 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:23,724 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:24,442 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:39:24,453 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:26,501 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:28,555 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:28,757 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:30,607 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:31,577 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:39:32,664 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:34,292 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:34,728 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:35,269 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:39:35,269 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:39:36,775 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:39:36,776 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:38,829 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:39,608 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:40,875 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:42,926 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:44,665 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:45,003 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:47,103 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:49,125 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:50,267 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:39:50,268 DEBUG 
SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:39:50,519 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:51,131 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:39:51,148 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:53,194 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:55,229 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:55,570 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:39:57,290 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:39:59,346 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:00,617 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:01,394 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:01,579 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:40:02,388 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:40:03,445 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:05,276 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:40:05,276 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:40:05,496 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:06,522 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:07,545 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:09,598 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:11,646 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:11,656 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:13,704 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:14,322 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:40:14,324 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:40:14,324 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:40:14,325 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:40:14,712 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:40:15,743 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:16,741 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:40:17,355 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:17,880 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:19,892 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:20,290 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:40:20,291 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:40:21,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:22,538 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:23,962 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:26,010 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:28,065 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:28,262 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:29,067 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:40:30,113 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:31,593 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:40:32,173 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:33,610 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:34,206 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:35,314 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:40:35,314 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:40:36,267 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:38,334 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:39,585 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:40,368 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:42,411 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:40:42,423 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:44,473 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:45,157 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:40:46,529 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:48,641 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:40:50,318 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:10,565 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:12,612 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:14,651 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:15,032 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:16,696 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:17,699 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:51:18,746 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:20,698 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:20,772 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:51:20,798 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:51:20,811 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:22,848 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:24,905 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:26,074 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:26,949 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:29,003 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:31,055 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:31,400 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:31,809 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:51:32,060 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:51:33,096 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:35,205 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:35,754 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:51:35,754 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:51:36,998 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:37,229 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:39,254 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:41,298 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:42,036 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:43,340 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir 
modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:51:43,350 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:45,404 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:47,454 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:47,466 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:49,509 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:50,756 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:51:50,757 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:51:51,555 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:53,004 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:53,604 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:55,651 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:57,698 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:51:57,706 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:51:58,304 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:51:59,751 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:01,798 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:01,818 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:52:03,844 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:03,845 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:05,773 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:52:05,773 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:52:05,959 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:07,976 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:09,167 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:10,003 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:52:10,018 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:12,065 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:14,119 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:14,231 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:16,167 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:18,210 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:19,267 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:20,266 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:20,789 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:52:20,789 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:52:22,305 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:52:22,320 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:24,365 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:25,145 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:26,413 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:28,459 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:30,174 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:30,506 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:31,821 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:52:32,548 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:34,593 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:35,601 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:52:35,726 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:35,803 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:52:35,804 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:52:36,707 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:38,729 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:40,745 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:41,087 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:42,785 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:44,827 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:46,125 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:46,867 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:47,869 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log 
+2023-04-21 06:52:48,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:50,816 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:52:50,816 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:52:50,970 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:52,066 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:53,012 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:55,060 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:57,095 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:52:57,112 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:52:59,154 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:01,203 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:01,830 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:53:02,206 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:53:02,840 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:03,250 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:05,297 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:05,830 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:53:05,831 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:53:07,403 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:08,090 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:09,418 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:11,430 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:12,810 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:53:12,811 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:53:12,811 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:53:12,812 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:53:13,462 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:53:13,476 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:13,827 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:14,466 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:53:15,542 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:17,627 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:18,876 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:19,683 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:20,830 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:53:20,830 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:53:21,725 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:23,762 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:24,095 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:25,803 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:27,843 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:53:27,851 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:29,877 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:29,903 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:31,842 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:53:31,950 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:34,018 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:35,755 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:35,833 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:53:35,833 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:53:36,050 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:38,160 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:40,145 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:53:40,177 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:41,684 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:42,221 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:44,300 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:46,349 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:46,723 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:48,404 ERROR gpu :37392 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-21 06:53:50,448 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:50,838 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:53:50,839 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:53:52,492 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:52,614 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:54,542 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:53:54,553 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:56,596 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:53:57,646 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:53:58,642 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:00,687 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:01,850 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:54:02,736 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:02,861 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:04,783 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:05,847 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:54:05,847 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:54:06,819 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:54:06,827 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:08,104 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:08,935 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:10,949 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:12,988 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:13,153 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:15,032 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:17,079 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:18,084 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:54:18,360 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:19,143 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:20,843 DEBUG 
HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:54:20,844 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:54:21,189 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:23,227 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:24,128 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:25,282 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:27,329 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:29,175 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:29,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:31,431 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:31,864 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:54:32,425 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:54:33,479 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:34,768 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:35,532 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:35,854 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:54:35,854 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:54:37,579 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:39,694 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:40,126 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:41,717 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:43,734 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:44,721 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:54:45,236 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:45,781 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:47,862 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:49,906 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:50,785 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:50,863 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:54:50,864 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:54:51,952 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to 
sample metric: Not Supported +2023-04-21 06:54:53,990 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:56,031 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:56,156 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:54:58,075 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:54:59,078 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:55:00,130 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:01,209 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:01,873 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:55:02,177 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:04,232 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:05,863 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:55:05,863 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:55:06,287 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:07,101 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:08,336 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:10,392 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:55:10,441 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:12,459 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:13,079 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:14,481 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:16,529 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:18,115 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:18,589 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:20,638 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:20,871 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:55:20,872 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:55:21,920 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:55:21,921 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:55:21,922 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:55:21,922 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:55:22,685 INFO 
Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:55:22,686 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:55:22,699 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:23,925 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:24,747 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:55:24,759 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:26,800 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:28,845 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:28,958 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:30,892 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:31,888 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:55:32,943 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:34,810 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:34,993 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:35,874 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:55:35,874 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:55:37,036 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:55:37,046 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:39,094 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:40,149 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:41,213 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:43,233 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:45,248 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:45,256 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:47,301 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:48,297 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:55:49,347 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:50,820 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:50,883 DEBUG 
HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:55:50,884 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:55:51,401 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:53,447 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:55,490 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:56,175 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:55:57,538 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:55:59,582 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:01,626 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:01,796 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:01,890 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:56:02,622 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:56:03,676 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:05,725 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:05,897 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:56:05,899 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:56:07,154 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:07,766 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:09,820 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:11,934 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:12,196 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:13,954 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:14,925 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:56:15,997 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:17,770 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:18,067 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:20,126 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:20,914 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:56:20,915 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:56:22,177 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:23,186 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 06:56:24,226 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:26,275 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:28,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:28,659 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:29,326 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:56:30,371 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:31,903 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:56:32,430 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:33,925 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:34,477 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:35,943 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:56:35,943 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:56:36,531 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:38,584 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:39,210 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:40,625 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:56:40,635 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:42,746 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:44,551 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:44,767 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:46,792 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:48,831 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:49,581 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:50,870 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:50,964 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:56:50,964 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:56:52,914 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:54,961 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:56:54,971 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:55,418 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:56:57,017 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:56:59,063 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:00,456 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:01,111 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:01,915 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:57:03,159 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:05,214 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:05,979 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:57:05,979 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:57:07,259 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:57:07,272 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:07,279 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:09,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:11,386 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:12,356 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:13,514 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:15,540 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:17,394 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:17,565 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:19,579 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:57:19,592 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:20,998 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:57:20,998 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:57:21,657 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:23,252 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:23,710 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:25,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:27,824 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:28,278 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:29,864 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 06:57:31,298 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 06:57:31,300 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 06:57:31,300 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 06:57:31,301 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 06:57:31,891 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 06:57:31,900 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:31,919 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 06:57:32,901 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:57:33,942 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:33,946 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:35,995 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:36,009 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:57:36,010 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:57:38,042 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:39,288 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:40,090 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:42,140 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:44,250 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:45,213 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 06:57:45,243 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:46,270 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:48,317 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:50,367 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:50,909 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:51,017 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 06:57:51,017 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 06:57:52,417 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:54,457 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 06:57:56,295 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 06:57:56,498 ERROR gpu :37392 
+[... wandb debug-internal.log for run-20230421_040927-o7wbvva1: repetitive entries from 2023-04-21 06:57:58 through 07:08:48 omitted. Throughout this span the GPU monitor logs "ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" roughly every 2 seconds, while HandlerThread and SenderThread continue servicing status_report/stop_status requests, SenderThread sends stats about every 30 seconds, and Thread-16 records periodic modifications to output.log and wandb-summary.json as training proceeds ...]
handle_request: status_report +2023-04-21 07:08:50,742 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:08:51,401 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:08:51,402 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:08:52,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:08:54,670 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:08:54,845 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:08:56,890 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:08:58,954 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:08:59,802 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:01,015 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:09:01,065 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:02,092 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:09:03,081 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:05,108 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:05,126 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:06,408 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:09:06,409 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:09:07,174 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:09,225 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:10,687 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:11,270 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:12,272 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:09:13,315 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:15,379 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:15,734 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:17,424 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:19,460 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:21,254 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:21,411 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:09:21,411 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:09:21,506 
ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:23,565 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:25,613 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:26,615 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:09:26,661 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:27,661 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:29,709 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:31,744 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:31,820 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:32,097 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:09:33,840 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:35,867 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:36,421 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:09:36,422 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:09:37,611 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:37,907 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:38,905 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:09:39,947 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:41,998 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:42,659 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:44,057 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:46,112 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:47,706 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:09:48,173 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:50,217 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:51,429 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:09:51,429 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:09:52,262 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:53,262 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:09:53,687 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: 
status_report +2023-04-21 07:09:54,309 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:56,358 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:58,411 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:09:58,730 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:00,466 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:02,100 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:10:02,605 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:04,420 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:04,572 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:10:04,622 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:06,440 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:10:06,440 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:10:06,637 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:08,683 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:09,735 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:10,726 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:12,784 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:14,817 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:14,825 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:16,881 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:18,921 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:10:18,929 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:20,284 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:20,977 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:21,444 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:10:21,444 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:10:23,027 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:25,068 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:25,981 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:27,116 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:29,177 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:29,326 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:10:29,328 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:10:29,329 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:10:29,330 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:10:30,170 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:10:30,170 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:10:31,208 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:10:31,217 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:31,355 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:32,111 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:10:33,323 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:35,333 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:36,463 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:10:36,463 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:10:36,710 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:37,366 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:39,425 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:41,476 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:41,744 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:42,474 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:10:43,519 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:45,580 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:47,286 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:47,630 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:49,676 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:51,466 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:10:51,467 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:10:51,718 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:52,710 
DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:53,760 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:55,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:56,810 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:10:57,873 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:10:58,183 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:10:59,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:01,966 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:02,120 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:11:04,093 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:04,133 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:06,125 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:06,473 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:11:06,473 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:11:08,150 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:09,117 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:11:10,171 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:10,196 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:12,260 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:14,329 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:15,189 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:16,377 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:18,424 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:20,230 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:20,472 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:21,475 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:11:21,475 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:11:22,521 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:23,517 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:11:24,562 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 07:11:25,764 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:26,617 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:28,661 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:30,723 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:30,812 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:32,136 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:11:32,780 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:34,848 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:11:34,905 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:36,344 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:36,471 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:11:36,471 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:11:36,924 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:38,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:40,995 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:41,749 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:43,043 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:45,089 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:46,782 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:47,134 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:49,164 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:11:49,174 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:51,223 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:51,471 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:11:51,472 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:11:52,717 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:53,271 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:55,311 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:57,362 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:11:57,766 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:11:59,435 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:01,469 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:12:01,483 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:02,142 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:12:03,155 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:03,521 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:05,643 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:06,468 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:12:06,468 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:12:07,665 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:08,722 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:09,676 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:11,717 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:13,763 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:13,885 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:15,808 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:12:15,817 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:17,895 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:18,928 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:19,933 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:21,477 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:12:21,477 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:12:21,988 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:24,037 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:24,749 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:26,103 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:27,103 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:12:28,149 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:29,889 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:30,192 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 07:12:32,155 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:12:32,239 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:34,283 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:35,197 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:36,416 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:36,505 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:12:36,505 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:12:38,434 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:38,816 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:12:38,818 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:12:38,818 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:12:38,819 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:12:39,418 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:12:40,462 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:40,836 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:41,454 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:12:42,508 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:44,557 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:45,867 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:46,610 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:48,656 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:50,713 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:51,392 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:51,502 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:12:51,502 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:12:52,759 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:53,751 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:12:54,795 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:12:56,846 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:12:56,857 ERROR gpu :37392 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-21 07:12:58,903 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:00,954 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:01,867 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:02,169 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:13:03,020 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:05,053 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:13:05,061 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:06,511 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:13:06,512 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:13:07,185 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:07,806 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:09,208 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:11,242 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:12,846 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:13,288 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:15,331 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:17,374 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:18,548 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:19,411 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:13:19,425 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:21,467 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:21,528 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:13:21,528 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:13:23,521 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:23,783 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:25,571 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:27,628 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:28,833 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:29,703 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:31,748 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:13:31,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:32,181 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:13:33,812 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:34,192 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:35,868 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:36,543 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:13:36,543 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:13:37,976 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:39,823 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:39,996 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:42,020 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:44,061 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:45,590 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:46,097 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:13:46,106 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:48,168 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:50,202 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:51,414 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:51,553 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:13:51,554 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:13:52,242 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:54,303 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:56,350 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:13:56,516 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:13:57,359 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:13:58,397 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:00,448 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:01,570 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:02,185 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:14:02,502 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-21 07:14:04,549 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:06,590 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:14:06,591 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:14:06,600 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:06,832 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:08,722 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:10,743 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:11,701 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:14:12,467 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:12,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:14,804 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:16,849 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:17,504 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:18,899 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:20,952 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:21,574 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:14:21,574 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:14:23,015 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:23,406 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:24,008 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:14:25,059 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:27,110 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:28,451 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:29,156 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:31,204 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:32,190 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:14:33,252 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:34,212 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:35,302 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:36,597 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:14:36,598 DEBUG 
SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:14:37,347 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:38,348 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:14:39,470 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:39,885 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:41,489 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:43,506 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:44,930 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:45,556 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:47,621 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:48,279 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:14:48,281 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:14:48,281 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:14:48,283 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:14:48,614 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:14:49,675 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:14:49,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:50,304 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:51,601 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:14:51,601 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:14:51,727 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:53,783 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:55,824 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:55,886 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:14:57,875 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:14:59,934 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:00,939 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:01,960 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:15:01,969 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 07:15:02,205 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:15:04,022 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:06,080 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:06,453 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:06,626 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:15:06,626 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:15:08,140 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:10,248 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:11,912 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:12,271 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:14,296 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:16,324 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:15:16,339 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:17,062 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:18,508 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:20,563 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:21,626 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:15:21,627 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:15:22,613 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:22,875 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:24,672 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:26,715 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:27,709 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:15:28,051 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:28,760 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:30,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:32,206 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:15:32,864 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:33,211 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:15:34,913 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:15:36,620 DEBUG HandlerThread:37392 
[wandb internal debug log for run-20230421_040927-o7wbvva1, 2023-04-21 07:15–07:26; the entries below repeat every few seconds throughout this interval]
+2023-04-21 07:15:36,621 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status
+2023-04-21 07:15:36,965 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported
+2023-04-21 07:15:38,864 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report
+2023-04-21 07:15:42,090 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log
+2023-04-21 07:15:51,618 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status
+2023-04-21 07:16:02,208 DEBUG SenderThread:37392 [sender.py:send():375] send: stats
+2023-04-21 07:16:57,690 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history
+2023-04-21 07:16:57,691 DEBUG SenderThread:37392 [sender.py:send():375] send: history
+2023-04-21 07:16:57,692 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record
+2023-04-21 07:16:57,693 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end
+2023-04-21 07:16:57,969 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json
ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:27,143 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:29,122 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:26:29,159 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:30,353 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:26:31,186 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:32,379 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:26:33,227 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:35,269 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:35,411 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:26:37,130 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:26:37,130 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:26:37,315 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:39,369 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:41,281 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:26:41,425 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:43,470 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:26:43,477 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:45,536 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:46,337 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:26:47,571 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:49,618 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:51,659 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:51,878 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:26:52,131 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:26:52,132 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:26:53,707 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:54,699 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:26:55,755 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:57,184 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: 
status_report +2023-04-21 07:26:57,890 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:26:59,912 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:01,924 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:02,230 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:02,388 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:27:03,957 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:06,011 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:07,006 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:27:07,144 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:27:07,145 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:27:07,392 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:08,061 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:10,104 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:12,156 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:12,439 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:14,202 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:16,255 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:17,485 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:18,294 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:20,338 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:21,330 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:27:22,150 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:27:22,151 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:27:22,387 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:23,404 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:24,442 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:26,504 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:28,445 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:28,614 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:30,636 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:32,394 DEBUG 
SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:27:32,659 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:33,655 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:27:34,412 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:34,723 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:36,763 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:37,172 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:27:37,172 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:27:38,809 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:39,421 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:40,855 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:42,909 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:44,468 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:44,747 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:27:44,748 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:27:44,748 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:27:44,749 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:27:44,946 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:27:44,960 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:46,992 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:27:47,005 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:49,044 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:49,787 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:51,100 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:52,164 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:27:52,164 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:27:53,140 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:55,196 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:27:55,427 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:27:57,245 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 07:27:59,315 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:27:59,354 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:00,703 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:01,379 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:02,399 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:28:03,401 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:05,437 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:05,913 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:07,193 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:28:07,194 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:28:07,479 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:09,535 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:11,587 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:11,632 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:13,620 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:28:13,642 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:15,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:16,667 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:17,767 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:19,826 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:21,881 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:21,928 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:22,193 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:28:22,194 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:28:23,925 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:24,919 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:28:25,966 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:27,561 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:28,037 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:30,154 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:32,174 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:32,413 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:28:33,428 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:34,201 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:36,249 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:37,216 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:28:37,217 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:28:37,249 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:28:38,305 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:38,463 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:40,359 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:42,412 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:43,479 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:44,456 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:46,498 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:48,532 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:48,546 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:50,583 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:51,585 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:28:52,220 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:28:52,221 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:28:52,619 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:54,482 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:28:54,670 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:56,716 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:58,779 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:28:59,515 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:00,904 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:02,423 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:29:02,916 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
07:29:03,892 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:29:04,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:05,466 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:07,004 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:07,219 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:29:07,220 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:29:09,052 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:10,963 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:11,087 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:13,145 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:15,198 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:16,247 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:17,233 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:29:17,241 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:19,289 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:21,336 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:21,983 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:22,232 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:29:22,232 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:29:23,375 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:25,431 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:27,485 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:27,526 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:29,537 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:29:29,549 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:31,672 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:32,433 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:29:33,441 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:33,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:35,703 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 07:29:37,256 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:29:37,257 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:29:37,747 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:38,516 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:39,796 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:41,843 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:43,886 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:29:43,892 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:44,114 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:45,943 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:48,015 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:49,158 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:50,074 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:52,130 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:52,268 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:29:52,268 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:29:54,110 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:29:54,111 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:29:54,111 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:29:54,112 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:29:54,170 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:29:54,171 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:29:54,181 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:55,194 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:29:56,217 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:29:56,225 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:58,295 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:29:59,385 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:00,342 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:02,448 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:30:02,454 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:04,456 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:04,475 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:06,499 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:07,284 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:30:07,284 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:30:08,555 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:09,545 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:30:09,686 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:10,602 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:12,663 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:14,711 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:14,747 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:16,758 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:18,804 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:19,806 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:20,852 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:21,850 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:30:22,287 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:30:22,288 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:30:22,904 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:24,947 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:25,554 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:27,011 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:29,061 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:30,610 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:31,111 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:32,450 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:30:33,225 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
07:30:35,248 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:36,014 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:36,221 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:30:37,278 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:37,293 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:30:37,293 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:30:39,346 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:41,421 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:41,570 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:43,471 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:45,519 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:47,259 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:47,555 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:30:47,565 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:49,610 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:51,656 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:52,309 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:30:52,310 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:30:52,552 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:53,701 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:55,754 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:57,603 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:30:57,808 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:30:59,852 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:30:59,865 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:01,915 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:02,455 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:31:03,459 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:04,056 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:06,075 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 07:31:07,325 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:31:07,326 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:31:08,098 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:08,563 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:10,144 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:12,209 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:14,008 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:14,250 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:31:14,255 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:16,312 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:18,358 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:19,053 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:20,399 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:22,349 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:31:22,350 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:31:22,446 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:24,496 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:24,605 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:26,535 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:31:26,543 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:28,608 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:29,937 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:30,657 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:32,471 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:31:32,723 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:34,831 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:35,500 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:36,853 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:37,343 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:31:37,344 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:31:38,880 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:39,866 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:31:40,907 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:40,954 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:42,951 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:45,012 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:46,010 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:47,058 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:49,105 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:51,041 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:51,154 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:52,163 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:31:52,349 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:31:52,350 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:31:53,197 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:55,265 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:56,625 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:31:57,311 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:31:59,346 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:01,393 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:01,676 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:02,473 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:32:03,453 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:03,887 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:32:03,889 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:32:03,889 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:32:03,890 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:32:04,445 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:32:05,538 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 
07:32:05,581 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:06,541 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:32:07,061 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:07,344 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:32:07,345 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:32:07,590 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:09,618 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:11,659 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:12,651 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:13,705 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:15,755 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:17,828 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:17,841 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:18,826 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:32:19,861 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:21,902 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:22,347 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:32:22,348 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:32:23,579 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:23,943 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:25,981 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:28,025 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:28,623 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:30,072 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:32,117 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:32:32,127 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:32,485 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:32:34,171 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:34,513 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:36,278 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 07:32:37,349 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:32:37,349 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:32:38,297 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:39,599 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:40,313 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:42,368 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:44,414 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:32:44,416 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:44,691 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:46,478 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:48,532 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:49,731 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:50,577 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:52,357 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:32:52,357 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:32:52,625 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:54,670 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:55,601 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:32:56,717 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:32:56,731 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:32:58,790 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:33:00,647 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:33:00,842 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:33:02,497 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:33:02,892 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:33:04,936 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:33:06,087 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:33:07,065 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:33:07,374 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:33:07,375 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:33:09,082 ERROR gpu 
+2023-04-21 07:44:02,688 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:44:03,381 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:03,695 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:05,432 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:07,480 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:07,717 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:44:07,718 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:44:09,520 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:44:09,535 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:09,710 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:11,581 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:13,631 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:14,755 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:15,675 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:17,748 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:19,795 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:19,797 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:21,835 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:44:21,843 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:22,717 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:44:22,717 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:44:23,963 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:24,975 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:25,982 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:28,002 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:30,039 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:30,048 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:32,089 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:32,695 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:44:34,133 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:35,476 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:36,191 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:44:36,200 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:37,717 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:44:37,717 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:44:38,243 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:40,292 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:40,984 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:42,338 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:44,390 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:46,039 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:46,453 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:47,447 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:44:48,505 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:50,563 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:51,388 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:52,616 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:52,725 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:44:52,725 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:44:54,729 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:56,752 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:56,997 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:44:58,777 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:44:59,320 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:44:59,322 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:44:59,322 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:44:59,323 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:44:59,769 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:45:00,810 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 
07:45:00,818 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:01,821 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:45:02,343 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:02,709 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:45:02,875 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:04,917 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:06,964 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:07,518 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:07,741 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:45:07,742 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:45:09,002 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:11,046 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:13,110 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:13,212 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:14,108 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:45:15,170 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:17,217 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:18,248 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:19,253 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:21,297 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:22,769 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:45:22,770 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:45:23,339 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:24,001 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:25,462 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:26,407 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:45:27,482 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:29,146 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:29,504 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:31,561 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 07:45:32,720 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:45:33,612 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:34,736 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:35,656 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:37,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:37,773 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:45:37,773 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:45:39,741 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:45:39,753 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:40,079 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:41,802 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:43,857 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:45,112 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:45,907 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:47,975 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:50,021 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:50,157 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:52,058 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:45:52,069 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:52,790 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:45:52,790 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:45:54,111 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:56,042 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:45:56,224 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:45:58,233 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:00,280 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:01,074 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:02,328 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:02,724 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:46:04,373 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:06,431 INFO Thread-16 :37392 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:46:06,436 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:06,552 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:07,816 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:46:07,816 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:46:08,487 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:10,538 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:12,159 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:12,582 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:14,647 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:16,685 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:17,680 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:46:17,801 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:18,725 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:20,770 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:22,813 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:22,831 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:46:22,831 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:46:23,063 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:24,858 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:26,969 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:28,099 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:28,983 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:31,005 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:32,004 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:46:32,730 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:46:33,069 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:33,735 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:35,118 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:37,163 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
07:46:37,843 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:46:37,844 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:46:39,079 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:39,208 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:41,262 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:43,320 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:44,320 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:46:44,758 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:45,372 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:47,433 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:49,479 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:49,789 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:51,528 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:52,849 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:46:52,849 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:46:53,574 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:55,093 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:46:55,629 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:56,629 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:46:57,754 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:46:59,774 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:00,753 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:01,795 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:02,743 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:47:03,841 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:05,772 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:05,898 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:07,854 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:47:07,855 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:47:07,947 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:08,577 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:47:08,578 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:47:08,579 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:47:08,579 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:47:08,949 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:47:09,992 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:47:10,007 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:11,615 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:12,056 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:14,104 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:16,152 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:16,650 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:18,219 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:20,266 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:22,305 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:47:22,316 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:22,594 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:22,877 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:47:22,878 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:47:24,365 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:26,410 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:28,172 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:28,514 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:30,531 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:32,557 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:32,750 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:47:33,755 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:34,612 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:36,666 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:47:36,675 ERROR gpu 
:37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:37,878 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:47:37,879 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:47:38,722 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:39,124 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:40,778 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:42,828 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:44,171 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:44,874 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:46,920 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:48,956 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:47:48,968 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:49,355 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:51,012 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:52,878 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:47:52,878 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:47:53,070 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:55,119 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:55,131 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:47:57,170 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:47:59,291 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:00,229 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:01,319 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:02,280 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:48:02,753 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:48:03,348 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:05,377 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:05,771 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:07,435 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:07,881 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:48:07,881 DEBUG SenderThread:37392 [sender.py:send_request():402] 
send_request: stop_status +2023-04-21 07:48:09,483 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:11,133 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:11,530 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:13,582 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:14,582 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:48:15,643 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:16,193 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:17,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:19,747 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:21,224 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:21,805 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:22,908 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:48:22,909 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:48:23,862 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:25,910 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:27,123 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:27,964 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:28,960 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:48:30,102 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:32,119 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:32,173 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:32,760 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:48:34,145 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:36,198 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:37,664 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:37,916 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:48:37,916 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:48:38,244 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:40,307 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:48:40,310 ERROR gpu :37392 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:42,371 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:43,047 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:44,416 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:46,471 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:48,076 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:48,520 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:50,562 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:52,604 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:48:52,615 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:52,939 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:48:52,939 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:48:53,172 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:54,671 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:56,722 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:48:58,223 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:48:58,770 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:00,875 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:02,774 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:49:02,895 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:03,778 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:04,924 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:06,979 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:49:06,990 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:07,951 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:49:07,952 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:49:09,034 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:09,206 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:11,086 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:13,128 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:14,255 DEBUG HandlerThread:37392 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 07:49:15,180 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:17,231 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:17,746 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:49:17,748 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:49:17,748 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:49:17,749 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:49:18,226 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:49:19,271 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:49:19,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:19,768 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:20,276 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:49:21,343 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:22,946 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:49:22,946 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:49:23,391 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:25,217 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:25,440 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:27,487 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:29,529 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:30,256 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:31,642 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:32,606 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:49:32,780 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:49:33,656 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:35,684 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:35,808 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:37,737 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:37,963 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:49:37,964 DEBUG SenderThread:37392 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 07:49:39,783 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:41,224 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:41,831 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:43,880 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:44,877 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:49:45,924 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:46,618 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:47,988 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:50,039 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:51,660 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:52,086 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:52,982 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:49:52,983 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:49:54,138 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:56,185 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:57,501 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:49:58,246 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:49:59,235 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:50:00,310 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:02,441 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:02,540 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:02,793 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:50:04,464 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:06,491 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:07,706 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:07,994 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:50:07,994 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:50:08,544 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:10,599 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:11,600 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:50:12,665 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:13,651 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:14,700 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:16,741 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:18,711 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:18,792 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:20,853 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:22,906 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:23,001 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:50:23,002 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:50:24,244 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:24,945 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:50:24,954 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:27,006 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:29,059 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:29,295 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:31,106 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:32,806 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:50:33,238 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:34,822 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:35,256 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:37,247 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:50:37,274 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:38,003 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:50:38,004 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:50:39,321 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:40,262 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:41,371 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:43,430 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:45,332 DEBUG HandlerThread:37392 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:45,478 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:47,527 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:49,572 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:50:49,574 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:51,310 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:51,629 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:53,004 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:50:53,005 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:50:53,686 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:55,733 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:57,265 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:50:57,778 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:50:59,831 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:01,875 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:02,810 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:51:02,811 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:03,963 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:51:04,003 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:06,027 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:08,048 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:51:08,048 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:51:08,058 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:08,281 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:10,108 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:12,170 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:13,337 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:14,221 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:15,215 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:51:16,279 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
07:51:18,314 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:19,310 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:20,352 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:22,398 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:23,029 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 07:51:23,029 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: stop_status +2023-04-21 07:51:24,448 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:25,280 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:26,497 ERROR gpu :37392 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 07:51:27,218 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:51:27,219 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:51:27,219 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:51:27,220 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:51:27,510 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:51:27,894 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 07:51:27,897 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 07:51:27,898 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 07:51:27,898 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 07:51:27,898 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 07:51:27,899 DEBUG SenderThread:37392 [sender.py:send():375] send: metric +2023-04-21 07:51:27,899 DEBUG SenderThread:37392 [sender.py:send():375] send: history +2023-04-21 07:51:27,899 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: summary_record +2023-04-21 07:51:27,901 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:51:27,945 DEBUG SenderThread:37392 [sender.py:send():375] send: exit +2023-04-21 07:51:27,945 INFO SenderThread:37392 [sender.py:send_exit():598] handling exit code: 0 +2023-04-21 07:51:27,945 INFO SenderThread:37392 [sender.py:send_exit():600] handling runtime: 13319 +2023-04-21 07:51:27,946 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:51:27,946 INFO SenderThread:37392 [sender.py:send_exit():606] send defer +2023-04-21 07:51:27,947 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:27,947 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 07:51:27,947 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:27,947 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 07:51:27,947 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 1 +2023-04-21 07:51:27,947 
DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:27,948 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 07:51:27,948 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:27,948 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 07:51:27,948 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 2 +2023-04-21 07:51:27,948 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:27,949 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 07:51:27,949 INFO HandlerThread:37392 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 07:51:27,949 DEBUG SystemMonitor:37392 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 07:51:27,949 INFO HandlerThread:37392 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 07:51:27,949 DEBUG SystemMonitor:37392 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 07:51:27,967 INFO HandlerThread:37392 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 07:51:28,038 INFO HandlerThread:37392 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 07:51:28,039 INFO HandlerThread:37392 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 07:51:28,039 INFO HandlerThread:37392 [interfaces.py:finish():202] Joined network monitor +2023-04-21 07:51:28,039 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:28,040 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 07:51:28,040 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 3 +2023-04-21 07:51:28,040 DEBUG SenderThread:37392 [sender.py:send():375] send: stats +2023-04-21 07:51:28,040 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:28,040 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 07:51:28,041 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:28,041 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 07:51:28,041 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 4 +2023-04-21 07:51:28,041 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:28,041 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 07:51:28,042 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:28,042 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 07:51:28,042 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 5 +2023-04-21 07:51:28,042 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:28,042 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 07:51:28,043 DEBUG SenderThread:37392 [sender.py:send():375] send: summary +2023-04-21 07:51:28,044 INFO SenderThread:37392 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 07:51:28,044 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:28,044 INFO 
SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 07:51:28,044 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 6 +2023-04-21 07:51:28,045 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:28,045 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 07:51:28,045 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:28,045 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 07:51:28,049 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:28,525 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:51:28,525 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:51:28,658 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 7 +2023-04-21 07:51:28,658 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:28,658 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 07:51:28,658 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:28,658 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 07:51:29,057 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 07:51:29,532 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:51:29,533 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\config.yaml +2023-04-21 07:51:30,287 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 8 +2023-04-21 07:51:30,288 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 07:51:30,288 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:30,288 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 07:51:30,288 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:30,288 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 07:51:30,301 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 9 +2023-04-21 07:51:30,301 DEBUG SenderThread:37392 [sender.py:send():375] send: artifact +2023-04-21 07:51:30,301 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:30,303 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 07:51:30,539 INFO Thread-16 :37392 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:51:31,662 INFO SenderThread:37392 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 
'QXJ0aWZhY3Q6NDI5MTMzMDE1', 'digest': '651bcc3a27fe0b9435b558b5f4a1dbed', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v1'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'v1'} +2023-04-21 07:51:31,662 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:31,662 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 07:51:31,662 INFO SenderThread:37392 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 07:51:32,545 INFO SenderThread:37392 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files +2023-04-21 07:51:32,545 INFO SenderThread:37392 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\config.yaml config.yaml +2023-04-21 07:51:32,546 INFO SenderThread:37392 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log output.log +2023-04-21 07:51:32,548 INFO SenderThread:37392 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\requirements.txt requirements.txt +2023-04-21 07:51:32,551 INFO SenderThread:37392 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-metadata.json wandb-metadata.json +2023-04-21 07:51:32,552 INFO SenderThread:37392 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json wandb-summary.json +2023-04-21 07:51:32,554 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 10 +2023-04-21 07:51:32,554 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:32,554 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 07:51:32,556 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:32,556 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 07:51:32,556 INFO SenderThread:37392 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 07:51:33,661 INFO wandb-upload_0:37392 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\config.yaml +2023-04-21 07:51:33,844 INFO wandb-upload_3:37392 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\wandb-summary.json +2023-04-21 07:51:33,844 INFO wandb-upload_2:37392 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\requirements.txt +2023-04-21 07:51:34,104 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 07:51:34,341 INFO wandb-upload_1:37392 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\files\output.log +2023-04-21 07:51:34,541 INFO Thread-15 :37392 [sender.py:transition_state():626] send 
defer: 11 +2023-04-21 07:51:34,541 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:34,541 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 07:51:34,542 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:34,542 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 07:51:34,542 INFO SenderThread:37392 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 07:51:34,542 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 12 +2023-04-21 07:51:34,542 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:34,542 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 07:51:34,542 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:34,542 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 07:51:39,174 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 07:51:43,929 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 13 +2023-04-21 07:51:43,929 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:43,929 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 07:51:43,929 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 07:51:43,929 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:43,929 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 07:51:43,929 INFO SenderThread:37392 [sender.py:transition_state():626] send defer: 14 +2023-04-21 07:51:43,930 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: defer +2023-04-21 07:51:43,930 DEBUG SenderThread:37392 [sender.py:send():375] send: final +2023-04-21 07:51:43,930 INFO HandlerThread:37392 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 07:51:43,930 DEBUG SenderThread:37392 [sender.py:send():375] send: footer +2023-04-21 07:51:43,930 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: defer +2023-04-21 07:51:43,930 INFO SenderThread:37392 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 07:51:43,930 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 07:51:43,930 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 07:51:43,931 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 07:51:43,931 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 07:51:43,931 DEBUG HandlerThread:37392 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 07:51:43,931 DEBUG SenderThread:37392 [sender.py:send_request():402] send_request: server_info +2023-04-21 07:51:44,182 INFO MainThread:37392 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 07:51:44,182 INFO MainThread:37392 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 07:51:44,186 INFO MainThread:37392 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 07:51:44,187 DEBUG 
HandlerThread:37392 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 07:51:44,187 INFO HandlerThread:37392 [handler.py:finish():845] shutting down handler +2023-04-21 07:51:44,942 INFO WriterThread:37392 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\run-o7wbvva1.wandb +2023-04-21 07:51:45,189 INFO SenderThread:37392 [sender.py:finish():1550] shutting down sender +2023-04-21 07:51:45,189 INFO SenderThread:37392 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 07:51:45,189 INFO SenderThread:37392 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/logs/debug.log b/ptuning/wandb/run-20230421_040927-o7wbvva1/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..3d4343c4f3a668d8726cb63a49bd515561444402 --- /dev/null +++ b/ptuning/wandb/run-20230421_040927-o7wbvva1/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_setup.py:_flush():76] Configure stats pid to 34384 +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 04:09:27,547 INFO MainThread:34384 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\logs\debug.log +2023-04-21 04:09:27,548 INFO MainThread:34384 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_040927-o7wbvva1\logs\debug-internal.log +2023-04-21 04:09:27,548 INFO MainThread:34384 [wandb_init.py:init():547] calling init triggers +2023-04-21 04:09:27,548 INFO MainThread:34384 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 04:09:27,548 INFO MainThread:34384 [wandb_init.py:init():595] starting backend +2023-04-21 04:09:27,548 INFO MainThread:34384 [wandb_init.py:init():599] setting up manager +2023-04-21 04:09:27,550 INFO MainThread:34384 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 04:09:27,554 INFO MainThread:34384 [wandb_init.py:init():605] backend started and connected +2023-04-21 04:09:27,556 INFO MainThread:34384 [wandb_init.py:init():695] updated telemetry +2023-04-21 04:09:27,623 INFO MainThread:34384 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 04:09:28,343 INFO MainThread:34384 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 04:09:28,928 INFO MainThread:34384 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! 
To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 04:09:28,928 INFO MainThread:34384 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 04:09:29,188 INFO MainThread:34384 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 04:09:29,188 INFO MainThread:34384 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 04:09:29,189 INFO MainThread:34384 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 04:09:29,189 INFO MainThread:34384 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-21 04:09:29,189 INFO MainThread:34384 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 04:09:29,192 INFO MainThread:34384 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 
'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_04-07-46_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 100, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 07:51:46,232 WARNING MsgRouterThr:34384 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_040927-o7wbvva1/run-o7wbvva1.wandb b/ptuning/wandb/run-20230421_040927-o7wbvva1/run-o7wbvva1.wandb new file mode 100644 index 0000000000000000000000000000000000000000..5dac7f5522cb9604970f5b2a6020dcce86bd7732 Binary files /dev/null and b/ptuning/wandb/run-20230421_040927-o7wbvva1/run-o7wbvva1.wandb differ diff --git a/ptuning/wandb/run-20230421_114724-wke6zfdm/files/config.yaml b/ptuning/wandb/run-20230421_114724-wke6zfdm/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..563a7d2d3bfb8ec307105604e5e94ba19185dd84 --- /dev/null +++ b/ptuning/wandb/run-20230421_114724-wke6zfdm/files/config.yaml @@ -0,0 +1,30 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682048844.12797 
+ t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 diff --git a/ptuning/wandb/run-20230421_114724-wke6zfdm/files/requirements.txt b/ptuning/wandb/run-20230421_114724-wke6zfdm/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/ptuning/wandb/run-20230421_114724-wke6zfdm/files/wandb-summary.json b/ptuning/wandb/run-20230421_114724-wke6zfdm/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..9e26dfeeb6e641a33dae4961196235bdb965b21b --- /dev/null +++ b/ptuning/wandb/run-20230421_114724-wke6zfdm/files/wandb-summary.json @@ -0,0 +1 @@ +{} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_114724-wke6zfdm/logs/debug-internal.log b/ptuning/wandb/run-20230421_114724-wke6zfdm/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..08deafa8da3101e9c9e50d2c32ada785df1a026d --- /dev/null +++ b/ptuning/wandb/run-20230421_114724-wke6zfdm/logs/debug-internal.log @@ -0,0 +1,51 @@ +2023-04-21 11:47:24,126 INFO StreamThr :33184 [internal.py:wandb_internal():86] W&B internal server running at pid: 33184, started at: 2023-04-21 11:47:24.126972 +2023-04-21 11:47:24,128 DEBUG HandlerThread:33184 [handler.py:handle_request():144] handle_request: status +2023-04-21 11:47:24,129 INFO WriterThread:33184 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\run-wke6zfdm.wandb +2023-04-21 11:47:24,132 DEBUG SenderThread:33184 [sender.py:send():375] send: header +2023-04-21 11:47:24,194 DEBUG SenderThread:33184 [sender.py:send():375] send: run +2023-04-21 11:47:24,922 INFO SenderThread:33184 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files +2023-04-21 11:47:24,922 INFO SenderThread:33184 [sender.py:_start_run_threads():1124] run started: wke6zfdm with start time 1682048844.12797 +2023-04-21 11:47:24,922 DEBUG SenderThread:33184 [sender.py:send_request():402] send_request: summary_record +2023-04-21 11:47:24,922 INFO SenderThread:33184 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 11:47:24,923 DEBUG HandlerThread:33184 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 11:47:24,924 DEBUG SenderThread:33184 [sender.py:send_request():402] send_request: check_version +2023-04-21 11:47:25,473 DEBUG HandlerThread:33184 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 11:47:25,522 DEBUG HandlerThread:33184 [system_info.py:__init__():31] System info init +2023-04-21 11:47:25,522 DEBUG HandlerThread:33184 [system_info.py:__init__():46] System info init done +2023-04-21 11:47:25,522 INFO HandlerThread:33184 [system_monitor.py:start():181] Starting system monitor +2023-04-21 11:47:25,523 INFO SystemMonitor:33184 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 11:47:25,523 INFO HandlerThread:33184 [system_monitor.py:probe():201] Collecting system info +2023-04-21 11:47:25,530 INFO SystemMonitor:33184 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 11:47:25,530 INFO SystemMonitor:33184 [interfaces.py:start():190] Started disk monitoring +2023-04-21 11:47:25,531 INFO SystemMonitor:33184 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 11:47:25,543 INFO SystemMonitor:33184 
[interfaces.py:start():190] Started memory monitoring +2023-04-21 11:47:25,562 INFO SystemMonitor:33184 [interfaces.py:start():190] Started network monitoring +2023-04-21 11:47:25,596 DEBUG HandlerThread:33184 [system_info.py:probe():195] Probing system +2023-04-21 11:47:25,600 DEBUG HandlerThread:33184 [system_info.py:_probe_git():180] Probing git +2023-04-21 11:47:25,603 ERROR gpu :33184 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:47:25,694 DEBUG HandlerThread:33184 [system_info.py:_probe_git():188] Probing git done +2023-04-21 11:47:25,695 DEBUG HandlerThread:33184 [system_info.py:probe():240] Probing system done +2023-04-21 11:47:25,695 DEBUG HandlerThread:33184 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T03:47:25.597930', 'startedAt': '2023-04-21T03:47:24.114971', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '..\\AdvertiseGen\\train.json', '--validation_file', '..\\AdvertiseGen\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 500.32030868530273}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 11:47:25,695 INFO HandlerThread:33184 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 11:47:25,695 INFO HandlerThread:33184 [system_monitor.py:probe():214] Publishing system info +2023-04-21 11:47:25,695 DEBUG HandlerThread:33184 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 11:47:25,696 ERROR HandlerThread:33184 [system_info.py:_save_pip():66] Error saving pip packages: [Errno 28] No space left on device +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\wandb\sdk\internal\system\system_info.py", line 64, in _save_pip + f.write("\n".join(installed_packages_list)) +OSError: [Errno 28] No space left on device +2023-04-21 11:47:25,697 DEBUG HandlerThread:33184 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 11:47:25,930 INFO WriterThread:33184 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\run-wke6zfdm.wandb +2023-04-21 11:47:25,932 INFO Thread-16 :33184 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\wandb-summary.json +2023-04-21 
11:47:25,932 INFO Thread-16 :33184 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\requirements.txt +2023-04-21 11:47:26,466 INFO SenderThread:33184 [sender.py:finish():1550] shutting down sender +2023-04-21 11:47:26,466 INFO SenderThread:33184 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 11:47:26,938 INFO SenderThread:33184 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files +2023-04-21 11:47:26,938 INFO SenderThread:33184 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\config.yaml config.yaml +2023-04-21 11:47:26,939 INFO SenderThread:33184 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\requirements.txt requirements.txt +2023-04-21 11:47:26,941 INFO SenderThread:33184 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\wandb-summary.json wandb-summary.json +2023-04-21 11:47:26,942 INFO SenderThread:33184 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 11:47:26,942 INFO SenderThread:33184 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 11:47:27,626 ERROR gpu :33184 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:47:27,939 INFO wandb-upload_0:33184 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\config.yaml +2023-04-21 11:47:28,068 INFO wandb-upload_1:33184 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\files\wandb-summary.json diff --git a/ptuning/wandb/run-20230421_114724-wke6zfdm/logs/debug.log b/ptuning/wandb/run-20230421_114724-wke6zfdm/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..f4a6479054f474ebda3c7d757b1dafd38389806a --- /dev/null +++ b/ptuning/wandb/run-20230421_114724-wke6zfdm/logs/debug.log @@ -0,0 +1,31 @@ +2023-04-21 11:47:24,118 INFO MainThread:30268 [wandb_setup.py:_flush():76] Configure stats pid to 30268 +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\logs\debug.log +2023-04-21 11:47:24,119 INFO MainThread:30268 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_114724-wke6zfdm\logs\debug-internal.log +2023-04-21 11:47:24,120 INFO MainThread:30268 [wandb_init.py:init():547] calling init triggers +2023-04-21 11:47:24,120 
INFO MainThread:30268 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 11:47:24,120 INFO MainThread:30268 [wandb_init.py:init():595] starting backend +2023-04-21 11:47:24,120 INFO MainThread:30268 [wandb_init.py:init():599] setting up manager +2023-04-21 11:47:24,122 INFO MainThread:30268 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 11:47:24,127 INFO MainThread:30268 [wandb_init.py:init():605] backend started and connected +2023-04-21 11:47:24,128 INFO MainThread:30268 [wandb_init.py:init():695] updated telemetry +2023-04-21 11:47:24,194 INFO MainThread:30268 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 11:47:24,923 INFO MainThread:30268 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 11:47:25,465 INFO MainThread:30268 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 11:47:25,465 INFO MainThread:30268 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 11:47:29,809 WARNING MsgRouterThr:30268 [router.py:message_loop():77] message_loop has been closed +2023-04-21 11:47:31,187 ERROR MainThread:30268 [wandb_init.py:init():1163] transport failed +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\wandb\sdk\wandb_init.py", line 1145, in init + run = wi.init() + File "D:\Program\Python38\lib\site-packages\wandb\sdk\wandb_init.py", line 794, in init + run_start_result = run_start_handle.wait(timeout=30) + File "D:\Program\Python38\lib\site-packages\wandb\sdk\lib\mailbox.py", line 281, in wait + raise MailboxError("transport failed") +wandb.sdk.lib.mailbox.MailboxError: transport failed diff --git a/ptuning/wandb/run-20230421_114724-wke6zfdm/run-wke6zfdm.wandb b/ptuning/wandb/run-20230421_114724-wke6zfdm/run-wke6zfdm.wandb new file mode 100644 index 0000000000000000000000000000000000000000..c31702bdf2d7bfed68b92c253394e8a4820e5ba7 Binary files /dev/null and b/ptuning/wandb/run-20230421_114724-wke6zfdm/run-wke6zfdm.wandb differ diff --git a/ptuning/wandb/run-20230421_115513-a578muah/files/config.yaml b/ptuning/wandb/run-20230421_115513-a578muah/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..8c443edd8f828b96fd6c6db9adea6cc086d920c7 --- /dev/null +++ b/ptuning/wandb/run-20230421_115513-a578muah/files/config.yaml @@ -0,0 +1,604 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682049313.738031 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 
+gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4-1\models--THUDM--chatglm-6b-int4\snapshots\e02ba894cf18f3fd9b2526c795f983683c4ec732 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None 
+per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_11-53-35_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false 
+resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_115513-a578muah/files/output.log b/ptuning/wandb/run-20230421_115513-a578muah/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..b043f8510065a23c44663b12fc531be3c2ab6687 --- /dev/null +++ b/ptuning/wandb/run-20230421_115513-a578muah/files/output.log @@ -0,0 +1,90 @@ + + 0%| | 0/1000 [00:00 + main() + File "main.py", line 379, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", 
line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 633, in get_tokens_unprocessed + if m: +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 440, in + main() + File "main.py", line 379, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_115513-a578muah/files/requirements.txt b/ptuning/wandb/run-20230421_115513-a578muah/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..52967c83d0025866df64e32b3fc9aac41769cc26 --- /dev/null +++ b/ptuning/wandb/run-20230421_115513-a578muah/files/requirements.txt @@ -0,0 +1,445 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 
+argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 
+nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 
+tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_115513-a578muah/files/wandb-metadata.json b/ptuning/wandb/run-20230421_115513-a578muah/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..40217993d9cf573763ca0d5b0900a40de8735333 --- /dev/null +++ b/ptuning/wandb/run-20230421_115513-a578muah/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T03:55:15.240287", + "startedAt": "2023-04-21T03:55:13.717029", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + "..\\AdvertiseGen\\train.json", + "--validation_file", + "..\\AdvertiseGen\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 486.8331985473633 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_115513-a578muah/files/wandb-summary.json b/ptuning/wandb/run-20230421_115513-a578muah/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..7a227768ba41e92277e4c6755a2286dadf3c56b7 --- /dev/null +++ 
b/ptuning/wandb/run-20230421_115513-a578muah/files/wandb-summary.json @@ -0,0 +1 @@ +{"_wandb": {"runtime": 123}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_115513-a578muah/logs/debug-internal.log b/ptuning/wandb/run-20230421_115513-a578muah/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..98cb9bfb152ac192e8c5512dbda50444d14852e5 --- /dev/null +++ b/ptuning/wandb/run-20230421_115513-a578muah/logs/debug-internal.log @@ -0,0 +1,290 @@ +2023-04-21 11:55:13,737 INFO StreamThr :43372 [internal.py:wandb_internal():86] W&B internal server running at pid: 43372, started at: 2023-04-21 11:55:13.737032 +2023-04-21 11:55:13,739 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status +2023-04-21 11:55:13,740 INFO WriterThread:43372 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\run-a578muah.wandb +2023-04-21 11:55:13,744 DEBUG SenderThread:43372 [sender.py:send():375] send: header +2023-04-21 11:55:13,809 DEBUG SenderThread:43372 [sender.py:send():375] send: run +2023-04-21 11:55:14,453 INFO SenderThread:43372 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files +2023-04-21 11:55:14,453 INFO SenderThread:43372 [sender.py:_start_run_threads():1124] run started: a578muah with start time 1682049313.738031 +2023-04-21 11:55:14,454 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: summary_record +2023-04-21 11:55:14,454 INFO SenderThread:43372 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 11:55:14,455 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 11:55:14,456 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: check_version +2023-04-21 11:55:15,099 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 11:55:15,151 DEBUG HandlerThread:43372 [system_info.py:__init__():31] System info init +2023-04-21 11:55:15,151 DEBUG HandlerThread:43372 [system_info.py:__init__():46] System info init done +2023-04-21 11:55:15,151 INFO HandlerThread:43372 [system_monitor.py:start():181] Starting system monitor +2023-04-21 11:55:15,151 INFO SystemMonitor:43372 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 11:55:15,151 INFO HandlerThread:43372 [system_monitor.py:probe():201] Collecting system info +2023-04-21 11:55:15,161 INFO SystemMonitor:43372 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 11:55:15,161 INFO SystemMonitor:43372 [interfaces.py:start():190] Started disk monitoring +2023-04-21 11:55:15,162 INFO SystemMonitor:43372 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 11:55:15,169 INFO SystemMonitor:43372 [interfaces.py:start():190] Started memory monitoring +2023-04-21 11:55:15,202 INFO SystemMonitor:43372 [interfaces.py:start():190] Started network monitoring +2023-04-21 11:55:15,228 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:15,240 DEBUG HandlerThread:43372 [system_info.py:probe():195] Probing system +2023-04-21 11:55:15,242 DEBUG HandlerThread:43372 [system_info.py:_probe_git():180] Probing git +2023-04-21 11:55:15,349 DEBUG HandlerThread:43372 [system_info.py:_probe_git():188] Probing git done +2023-04-21 11:55:15,349 DEBUG HandlerThread:43372 
[system_info.py:probe():240] Probing system done +2023-04-21 11:55:15,349 DEBUG HandlerThread:43372 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T03:55:15.240287', 'startedAt': '2023-04-21T03:55:13.717029', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '..\\AdvertiseGen\\train.json', '--validation_file', '..\\AdvertiseGen\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 486.8331985473633}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 11:55:15,349 INFO HandlerThread:43372 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 11:55:15,349 INFO HandlerThread:43372 [system_monitor.py:probe():214] Publishing system info +2023-04-21 11:55:15,349 DEBUG HandlerThread:43372 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 11:55:15,350 DEBUG HandlerThread:43372 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 11:55:15,351 INFO HandlerThread:43372 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 11:55:15,363 DEBUG SenderThread:43372 [sender.py:send():375] send: files +2023-04-21 11:55:15,363 INFO SenderThread:43372 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 11:55:15,377 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:55:15,378 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:55:15,467 INFO Thread-16 :43372 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\requirements.txt +2023-04-21 11:55:15,468 INFO Thread-16 :43372 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-metadata.json +2023-04-21 11:55:15,468 INFO Thread-16 :43372 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-summary.json +2023-04-21 11:55:15,882 DEBUG SenderThread:43372 [sender.py:send():375] send: telemetry +2023-04-21 11:55:15,882 DEBUG SenderThread:43372 [sender.py:send():375] send: config +2023-04-21 11:55:15,883 DEBUG SenderThread:43372 
[sender.py:send():375] send: metric +2023-04-21 11:55:15,883 DEBUG SenderThread:43372 [sender.py:send():375] send: telemetry +2023-04-21 11:55:15,883 DEBUG SenderThread:43372 [sender.py:send():375] send: metric +2023-04-21 11:55:15,883 WARNING SenderThread:43372 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 11:55:16,394 INFO wandb-upload_0:43372 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpjdq92a_kwandb\m45opcy6-wandb-metadata.json +2023-04-21 11:55:16,481 INFO Thread-16 :43372 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:55:17,278 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:18,501 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:55:18,877 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:19,334 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:19,514 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:55:21,386 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:23,432 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:23,931 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:25,471 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:27,514 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:28,982 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:29,561 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:30,385 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:55:30,386 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:55:31,624 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:33,666 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:34,689 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:35,707 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:37,749 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:39,784 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:39,795 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:41,852 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:43,894 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:44,857 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: 
status_report +2023-04-21 11:55:45,402 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:55:45,403 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:55:45,962 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\config.yaml +2023-04-21 11:55:46,011 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:48,040 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:49,022 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:55:50,070 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:50,644 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:52,121 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:54,180 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:55,690 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:55:56,220 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:55:58,271 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:00,327 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:00,415 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:56:00,415 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:56:01,679 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:02,379 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:04,423 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:06,474 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:06,719 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:08,529 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:10,569 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:56:10,580 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:12,609 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:12,622 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:14,663 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:15,209 DEBUG SystemMonitor:43372 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 11:56:15,209 DEBUG SenderThread:43372 [sender.py:send():375] send: stats +2023-04-21 11:56:15,427 DEBUG HandlerThread:43372 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:56:15,428 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:56:16,774 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:17,721 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:18,792 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:20,815 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:22,763 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:22,863 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:24,905 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:56:24,919 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:26,975 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:28,232 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:29,042 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:30,429 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:56:30,430 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:56:31,087 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:33,142 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:33,716 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:35,176 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:37,234 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:38,756 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:39,011 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:56:39,296 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:41,346 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:43,384 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:43,804 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:45,224 DEBUG SenderThread:43372 [sender.py:send():375] send: stats +2023-04-21 11:56:45,431 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:45,444 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:56:45,444 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:56:47,527 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 11:56:49,550 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:49,712 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:51,150 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:56:51,579 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:53,632 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:54,937 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:56:55,744 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:57,775 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:59,824 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:56:59,977 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:57:00,442 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:57:00,443 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:57:01,894 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:03,945 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:05,263 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:57:05,639 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:57:05,996 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:08,058 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:10,124 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:10,701 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:57:12,199 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:14,247 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:15,228 DEBUG SenderThread:43372 [sender.py:send():375] send: stats +2023-04-21 11:57:15,432 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 11:57:15,433 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: stop_status +2023-04-21 11:57:16,309 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:16,690 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:57:18,385 DEBUG SenderThread:43372 [sender.py:send():375] send: exit +2023-04-21 11:57:18,398 INFO SenderThread:43372 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 11:57:18,398 INFO SenderThread:43372 [sender.py:send_exit():600] handling runtime: 123 +2023-04-21 11:57:18,411 INFO SenderThread:43372 [sender.py:_save_file():1378] saving file 
wandb-summary.json with policy end +2023-04-21 11:57:18,423 INFO SenderThread:43372 [sender.py:send_exit():606] send defer +2023-04-21 11:57:18,434 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,434 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 11:57:18,434 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-summary.json +2023-04-21 11:57:18,435 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,435 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 11:57:18,435 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 1 +2023-04-21 11:57:18,435 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,435 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 11:57:18,435 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,435 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 11:57:18,436 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 2 +2023-04-21 11:57:18,436 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,436 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 11:57:18,436 INFO HandlerThread:43372 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 11:57:18,436 DEBUG SystemMonitor:43372 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 11:57:18,436 DEBUG SystemMonitor:43372 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 11:57:18,436 INFO HandlerThread:43372 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 11:57:18,437 INFO HandlerThread:43372 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 11:57:18,444 ERROR gpu :43372 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 11:57:18,504 INFO HandlerThread:43372 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 11:57:18,504 INFO HandlerThread:43372 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 11:57:18,504 INFO HandlerThread:43372 [interfaces.py:finish():202] Joined network monitor +2023-04-21 11:57:18,505 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,505 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 11:57:18,505 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 3 +2023-04-21 11:57:18,505 DEBUG SenderThread:43372 [sender.py:send():375] send: stats +2023-04-21 11:57:18,506 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,506 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 11:57:18,506 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,506 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 11:57:18,506 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 4 +2023-04-21 11:57:18,506 DEBUG HandlerThread:43372 [handler.py:handle_request():144] 
handle_request: defer +2023-04-21 11:57:18,506 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 11:57:18,507 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,507 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 11:57:18,507 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 5 +2023-04-21 11:57:18,507 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,507 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 11:57:18,507 DEBUG SenderThread:43372 [sender.py:send():375] send: summary +2023-04-21 11:57:18,508 INFO SenderThread:43372 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 11:57:18,508 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,508 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 11:57:18,509 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 6 +2023-04-21 11:57:18,509 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,509 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 11:57:18,509 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,509 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 11:57:18,509 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 7 +2023-04-21 11:57:18,509 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 11:57:18,509 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:18,510 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 11:57:18,510 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:18,510 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 11:57:19,447 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 11:57:19,448 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-summary.json +2023-04-21 11:57:19,449 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:57:21,418 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 8 +2023-04-21 11:57:21,418 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 11:57:21,418 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:21,419 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 11:57:21,419 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:21,419 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 11:57:21,445 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 9 +2023-04-21 11:57:21,445 DEBUG HandlerThread:43372 [handler.py:handle_request():144] 
handle_request: defer +2023-04-21 11:57:21,445 DEBUG SenderThread:43372 [sender.py:send():375] send: artifact +2023-04-21 11:57:21,445 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 11:57:21,465 INFO Thread-16 :43372 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:57:22,766 INFO SenderThread:43372 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'digest': '53476254d59d98151858729a5d45f45c', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v2'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'v2'} +2023-04-21 11:57:22,766 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:22,766 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 11:57:22,766 INFO SenderThread:43372 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 11:57:23,491 INFO SenderThread:43372 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files +2023-04-21 11:57:23,491 INFO SenderThread:43372 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\config.yaml config.yaml +2023-04-21 11:57:23,492 INFO SenderThread:43372 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log output.log +2023-04-21 11:57:23,496 INFO SenderThread:43372 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\requirements.txt requirements.txt +2023-04-21 11:57:23,500 INFO SenderThread:43372 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-metadata.json wandb-metadata.json +2023-04-21 11:57:23,500 INFO SenderThread:43372 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-summary.json wandb-summary.json +2023-04-21 11:57:23,503 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 10 +2023-04-21 11:57:23,503 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:23,503 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 11:57:23,508 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:23,508 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 11:57:23,508 INFO SenderThread:43372 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 11:57:24,155 INFO wandb-upload_1:43372 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\output.log +2023-04-21 11:57:24,504 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 11:57:24,541 INFO wandb-upload_0:43372 [upload_job.py:push():137] 
Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\config.yaml +2023-04-21 11:57:24,658 INFO wandb-upload_3:43372 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\wandb-summary.json +2023-04-21 11:57:24,681 INFO wandb-upload_2:43372 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\files\requirements.txt +2023-04-21 11:57:24,883 INFO Thread-15 :43372 [sender.py:transition_state():626] send defer: 11 +2023-04-21 11:57:24,883 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:24,883 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 11:57:24,884 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:24,884 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 11:57:24,884 INFO SenderThread:43372 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 11:57:24,884 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 12 +2023-04-21 11:57:24,884 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:24,884 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 11:57:24,884 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:24,884 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 11:57:26,496 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 13 +2023-04-21 11:57:26,496 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:26,496 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 11:57:26,497 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:26,497 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 11:57:26,497 INFO SenderThread:43372 [sender.py:transition_state():626] send defer: 14 +2023-04-21 11:57:26,497 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: defer +2023-04-21 11:57:26,497 DEBUG SenderThread:43372 [sender.py:send():375] send: final +2023-04-21 11:57:26,497 INFO HandlerThread:43372 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 11:57:26,497 DEBUG SenderThread:43372 [sender.py:send():375] send: footer +2023-04-21 11:57:26,498 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: defer +2023-04-21 11:57:26,498 INFO SenderThread:43372 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 11:57:26,498 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 11:57:26,498 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 11:57:26,499 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 11:57:26,499 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 11:57:26,499 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 11:57:26,499 DEBUG SenderThread:43372 [sender.py:send_request():402] send_request: server_info +2023-04-21 11:57:26,753 INFO 
MainThread:43372 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 11:57:26,753 INFO MainThread:43372 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 11:57:26,753 INFO MainThread:43372 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 11:57:26,754 DEBUG HandlerThread:43372 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 11:57:26,754 INFO HandlerThread:43372 [handler.py:finish():845] shutting down handler +2023-04-21 11:57:27,506 INFO WriterThread:43372 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\run-a578muah.wandb +2023-04-21 11:57:27,758 INFO SenderThread:43372 [sender.py:finish():1550] shutting down sender +2023-04-21 11:57:27,758 INFO SenderThread:43372 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 11:57:27,758 INFO SenderThread:43372 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_115513-a578muah/logs/debug.log b/ptuning/wandb/run-20230421_115513-a578muah/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..bdc1d6e3830a5240a5bfadace9a1d6dd945cd119 --- /dev/null +++ b/ptuning/wandb/run-20230421_115513-a578muah/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 11:55:13,721 INFO MainThread:19992 [wandb_setup.py:_flush():76] Configure stats pid to 19992 +2023-04-21 11:55:13,722 INFO MainThread:19992 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 11:55:13,722 INFO MainThread:19992 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 11:55:13,722 INFO MainThread:19992 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 11:55:13,722 INFO MainThread:19992 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 11:55:13,722 INFO MainThread:19992 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 11:55:13,731 INFO MainThread:19992 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\logs\debug.log +2023-04-21 11:55:13,731 INFO MainThread:19992 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_115513-a578muah\logs\debug-internal.log +2023-04-21 11:55:13,731 INFO MainThread:19992 [wandb_init.py:init():547] calling init triggers +2023-04-21 11:55:13,731 INFO MainThread:19992 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 11:55:13,731 INFO MainThread:19992 [wandb_init.py:init():595] starting backend +2023-04-21 11:55:13,731 INFO MainThread:19992 [wandb_init.py:init():599] setting up manager +2023-04-21 11:55:13,734 INFO MainThread:19992 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 11:55:13,738 INFO MainThread:19992 [wandb_init.py:init():605] backend started and connected +2023-04-21 11:55:13,740 INFO MainThread:19992 [wandb_init.py:init():695] updated telemetry +2023-04-21 11:55:13,808 INFO MainThread:19992 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 11:55:14,455 INFO MainThread:19992 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 
11:55:15,091 INFO MainThread:19992 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 11:55:15,091 INFO MainThread:19992 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 11:55:15,378 INFO MainThread:19992 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 11:55:15,378 INFO MainThread:19992 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 11:55:15,378 INFO MainThread:19992 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 11:55:15,378 INFO MainThread:19992 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-21 11:55:15,379 INFO MainThread:19992 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 11:55:15,381 INFO MainThread:19992 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 
'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_11-53-35_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 100, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 11:57:28,566 WARNING MsgRouterThr:19992 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_115513-a578muah/run-a578muah.wandb b/ptuning/wandb/run-20230421_115513-a578muah/run-a578muah.wandb new file mode 100644 index 0000000000000000000000000000000000000000..6c70abb9f3d1ee0f2f9179a2cec220079f6ed5a6 Binary files /dev/null and b/ptuning/wandb/run-20230421_115513-a578muah/run-a578muah.wandb differ diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/files/config.yaml b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..dab4c4f7c63a687492b20e8d3690485e91b2057b --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/config.yaml @@ -0,0 +1,604 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + 
framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682049647.163995 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: 
..\models\chatglm-6b-int4-1\models--THUDM--chatglm-6b-int4\snapshots\e02ba894cf18f3fd9b2526c795f983683c4ec732 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_11-59-08_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: 
+ desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/files/output.log b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..2c10a98f38b948eae3e0e5487100cb3461698351 --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/output.log @@ -0,0 +1,91 @@ + + 0%| | 0/1000 [00:00 + if __name__ == "__main__": + File "main.py", line 379, in main + model.enable_input_require_grads() + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, 
in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 631, in get_tokens_unprocessed + for rexmatch, action, new_state in statetokens: +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 440, in + if __name__ == "__main__": + File "main.py", line 379, in main + model.enable_input_require_grads() + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File 
"D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/files/requirements.txt b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..52967c83d0025866df64e32b3fc9aac41769cc26 --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/requirements.txt @@ -0,0 +1,445 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 
+jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 
+rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/files/wandb-metadata.json b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..aa9de2d616e39839d7df6c197faa42e35d6fa7cb --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T04:00:48.730598", + "startedAt": "2023-04-21T04:00:47.144996", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + "..\\AdvertiseGen\\train.json", + "--validation_file", + "..\\AdvertiseGen\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + 
"program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 486.5024871826172 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/files/wandb-summary.json b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..e44a9829709700b21dd4d79b9ac8df43a2aad1cb --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/files/wandb-summary.json @@ -0,0 +1 @@ +{"_wandb": {"runtime": 112}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/logs/debug-internal.log b/ptuning/wandb/run-20230421_120047-67c5qu7s/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..a5a6828f543c700074048ff8ea61ecc4307825ff --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/logs/debug-internal.log @@ -0,0 +1,279 @@ +2023-04-21 12:00:47,163 INFO StreamThr :4128 [internal.py:wandb_internal():86] W&B internal server running at pid: 4128, started at: 2023-04-21 12:00:47.162997 +2023-04-21 12:00:47,164 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status +2023-04-21 12:00:47,165 INFO WriterThread:4128 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\run-67c5qu7s.wandb +2023-04-21 12:00:47,169 DEBUG SenderThread:4128 [sender.py:send():375] send: header +2023-04-21 12:00:47,234 DEBUG SenderThread:4128 [sender.py:send():375] send: run +2023-04-21 12:00:47,952 INFO SenderThread:4128 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files +2023-04-21 12:00:47,952 INFO SenderThread:4128 [sender.py:_start_run_threads():1124] run started: 67c5qu7s with start time 1682049647.163995 +2023-04-21 12:00:47,952 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: summary_record +2023-04-21 12:00:47,953 INFO SenderThread:4128 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 12:00:47,955 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 12:00:47,956 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: check_version +2023-04-21 12:00:48,555 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 12:00:48,628 DEBUG HandlerThread:4128 [system_info.py:__init__():31] System info init +2023-04-21 12:00:48,628 DEBUG HandlerThread:4128 [system_info.py:__init__():46] System info init done +2023-04-21 12:00:48,628 INFO HandlerThread:4128 [system_monitor.py:start():181] Starting system monitor +2023-04-21 12:00:48,628 INFO SystemMonitor:4128 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 12:00:48,628 INFO HandlerThread:4128 [system_monitor.py:probe():201] Collecting system info +2023-04-21 12:00:48,637 INFO SystemMonitor:4128 
[interfaces.py:start():190] Started cpu monitoring +2023-04-21 12:00:48,637 INFO SystemMonitor:4128 [interfaces.py:start():190] Started disk monitoring +2023-04-21 12:00:48,638 INFO SystemMonitor:4128 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 12:00:48,654 INFO SystemMonitor:4128 [interfaces.py:start():190] Started memory monitoring +2023-04-21 12:00:48,687 INFO SystemMonitor:4128 [interfaces.py:start():190] Started network monitoring +2023-04-21 12:00:48,730 DEBUG HandlerThread:4128 [system_info.py:probe():195] Probing system +2023-04-21 12:00:48,733 DEBUG HandlerThread:4128 [system_info.py:_probe_git():180] Probing git +2023-04-21 12:00:48,737 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:00:48,843 DEBUG HandlerThread:4128 [system_info.py:_probe_git():188] Probing git done +2023-04-21 12:00:48,844 DEBUG HandlerThread:4128 [system_info.py:probe():240] Probing system done +2023-04-21 12:00:48,844 DEBUG HandlerThread:4128 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T04:00:48.730598', 'startedAt': '2023-04-21T04:00:47.144996', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '..\\AdvertiseGen\\train.json', '--validation_file', '..\\AdvertiseGen\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4-1\\models--THUDM--chatglm-6b-int4\\snapshots\\e02ba894cf18f3fd9b2526c795f983683c4ec732', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 486.5024871826172}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 12:00:48,844 INFO HandlerThread:4128 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 12:00:48,844 INFO HandlerThread:4128 [system_monitor.py:probe():214] Publishing system info +2023-04-21 12:00:48,844 DEBUG HandlerThread:4128 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 12:00:48,845 DEBUG HandlerThread:4128 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 12:00:48,846 INFO HandlerThread:4128 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 12:00:48,858 DEBUG SenderThread:4128 [sender.py:send():375] send: files +2023-04-21 12:00:48,859 INFO SenderThread:4128 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 12:00:48,872 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:00:48,873 DEBUG SenderThread:4128 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 12:00:48,972 INFO Thread-16 :4128 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\wandb-metadata.json +2023-04-21 12:00:48,973 INFO Thread-16 :4128 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\requirements.txt +2023-04-21 12:00:48,973 INFO Thread-16 :4128 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\wandb-summary.json +2023-04-21 12:00:49,258 DEBUG SenderThread:4128 [sender.py:send():375] send: telemetry +2023-04-21 12:00:49,259 DEBUG SenderThread:4128 [sender.py:send():375] send: config +2023-04-21 12:00:49,259 DEBUG SenderThread:4128 [sender.py:send():375] send: metric +2023-04-21 12:00:49,260 DEBUG SenderThread:4128 [sender.py:send():375] send: telemetry +2023-04-21 12:00:49,260 DEBUG SenderThread:4128 [sender.py:send():375] send: metric +2023-04-21 12:00:49,260 WARNING SenderThread:4128 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 12:00:49,843 INFO wandb-upload_0:4128 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpanqo2sfqwandb\n8g764s2-wandb-metadata.json +2023-04-21 12:00:49,980 INFO Thread-16 :4128 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:00:50,768 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:00:51,996 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:00:52,622 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:00:52,818 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:00:53,003 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:00:54,880 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:00:56,906 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:00:57,657 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:00:58,953 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:00,996 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:02,702 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:03,044 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:03,889 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:01:03,890 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:01:05,099 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:06,122 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:01:07,141 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:08,682 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:09,183 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:11,220 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:13,273 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:13,737 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:15,317 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:17,365 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:18,744 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:18,896 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:01:19,165 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:01:19,421 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\config.yaml +2023-04-21 12:01:19,431 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:01:19,477 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:21,493 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:23,518 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:24,459 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:25,555 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:27,600 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:29,496 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:29,658 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:31,708 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:33,846 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:01:33,848 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:33,908 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:01:33,909 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:01:35,173 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:35,917 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:38,051 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-21 12:01:40,113 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:40,239 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:42,175 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:44,219 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:45,299 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:46,269 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:48,306 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:01:48,318 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:48,725 DEBUG SystemMonitor:4128 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 12:01:48,726 DEBUG SenderThread:4128 [sender.py:send():375] send: stats +2023-04-21 12:01:48,913 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:01:48,914 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:01:50,456 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:51,173 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:52,473 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:54,491 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:56,206 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:01:56,534 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:01:58,584 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:00,622 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:01,887 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:02,679 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:02:02,690 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:03,917 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:02:03,918 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:02:04,758 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:06,816 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:07,205 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:08,864 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:10,908 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:12,250 DEBUG HandlerThread:4128 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:12,938 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:15,060 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:16,064 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:02:17,121 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:18,135 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:18,732 DEBUG SenderThread:4128 [sender.py:send():375] send: stats +2023-04-21 12:02:18,935 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:02:18,935 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:02:19,175 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:21,322 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:23,216 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:23,333 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:25,369 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:27,439 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:28,260 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:29,474 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:30,477 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:02:31,534 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:33,452 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:33,629 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:33,946 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 12:02:33,947 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: stop_status +2023-04-21 12:02:35,767 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:37,830 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:39,321 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:39,874 ERROR gpu :4128 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 12:02:41,026 DEBUG SenderThread:4128 [sender.py:send():375] send: exit +2023-04-21 12:02:41,026 INFO SenderThread:4128 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 12:02:41,026 INFO SenderThread:4128 [sender.py:send_exit():600] handling runtime: 112 +2023-04-21 12:02:41,027 INFO SenderThread:4128 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 12:02:41,027 INFO SenderThread:4128 
[sender.py:send_exit():606] send defer +2023-04-21 12:02:41,027 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,028 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 12:02:41,028 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,028 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 12:02:41,028 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 1 +2023-04-21 12:02:41,028 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,029 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 12:02:41,029 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,029 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 12:02:41,029 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 2 +2023-04-21 12:02:41,029 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,030 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 12:02:41,030 INFO HandlerThread:4128 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 12:02:41,030 DEBUG SystemMonitor:4128 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 12:02:41,030 INFO HandlerThread:4128 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 12:02:41,030 DEBUG SystemMonitor:4128 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 12:02:41,043 INFO HandlerThread:4128 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 12:02:41,103 INFO HandlerThread:4128 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 12:02:41,104 INFO HandlerThread:4128 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 12:02:41,104 INFO HandlerThread:4128 [interfaces.py:finish():202] Joined network monitor +2023-04-21 12:02:41,104 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,104 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 12:02:41,105 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 3 +2023-04-21 12:02:41,105 DEBUG SenderThread:4128 [sender.py:send():375] send: stats +2023-04-21 12:02:41,105 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,106 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 12:02:41,106 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,106 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 12:02:41,106 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 4 +2023-04-21 12:02:41,107 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,107 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 12:02:41,107 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,108 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 12:02:41,108 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 5 +2023-04-21 
12:02:41,108 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,108 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 12:02:41,109 DEBUG SenderThread:4128 [sender.py:send():375] send: summary +2023-04-21 12:02:41,110 INFO SenderThread:4128 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 12:02:41,110 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,110 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 12:02:41,110 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 6 +2023-04-21 12:02:41,111 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,111 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 12:02:41,111 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,111 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 12:02:41,111 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 7 +2023-04-21 12:02:41,111 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 12:02:41,112 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:41,112 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 12:02:41,112 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:41,112 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 12:02:41,656 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\wandb-summary.json +2023-04-21 12:02:42,109 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 12:02:42,659 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:02:43,229 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 8 +2023-04-21 12:02:43,229 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 12:02:43,229 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:43,229 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 12:02:43,229 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:43,230 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 12:02:43,255 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 9 +2023-04-21 12:02:43,255 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:43,255 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 12:02:43,255 DEBUG SenderThread:4128 [sender.py:send():375] send: artifact +2023-04-21 12:02:43,668 INFO Thread-16 :4128 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:02:44,590 INFO SenderThread:4128 
[sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'digest': '53476254d59d98151858729a5d45f45c', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v2'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'v2'} +2023-04-21 12:02:44,590 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:44,590 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 12:02:44,590 INFO SenderThread:4128 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 12:02:44,672 INFO SenderThread:4128 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files +2023-04-21 12:02:44,672 INFO SenderThread:4128 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\config.yaml config.yaml +2023-04-21 12:02:44,673 INFO SenderThread:4128 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log output.log +2023-04-21 12:02:44,675 INFO SenderThread:4128 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\requirements.txt requirements.txt +2023-04-21 12:02:44,677 INFO SenderThread:4128 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\wandb-metadata.json wandb-metadata.json +2023-04-21 12:02:44,678 INFO SenderThread:4128 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\wandb-summary.json wandb-summary.json +2023-04-21 12:02:44,681 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 10 +2023-04-21 12:02:44,681 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:44,681 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 12:02:44,686 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:44,686 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 12:02:44,686 INFO SenderThread:4128 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 12:02:45,331 INFO wandb-upload_1:4128 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\output.log +2023-04-21 12:02:45,742 INFO wandb-upload_0:4128 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\config.yaml +2023-04-21 12:02:45,763 INFO wandb-upload_3:4128 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\wandb-summary.json +2023-04-21 12:02:45,781 INFO wandb-upload_2:4128 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\files\requirements.txt +2023-04-21 12:02:45,981 INFO Thread-15 :4128 [sender.py:transition_state():626] send defer: 11 
+2023-04-21 12:02:45,981 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:45,981 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 12:02:45,981 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:45,981 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 12:02:45,981 INFO SenderThread:4128 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 12:02:45,982 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 12 +2023-04-21 12:02:45,982 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:45,982 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 12:02:45,982 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:45,982 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 12:02:47,138 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 12:02:47,578 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 13 +2023-04-21 12:02:47,578 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:47,578 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 12:02:47,578 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:47,578 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 12:02:47,578 INFO SenderThread:4128 [sender.py:transition_state():626] send defer: 14 +2023-04-21 12:02:47,579 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: defer +2023-04-21 12:02:47,579 DEBUG SenderThread:4128 [sender.py:send():375] send: final +2023-04-21 12:02:47,579 INFO HandlerThread:4128 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 12:02:47,579 DEBUG SenderThread:4128 [sender.py:send():375] send: footer +2023-04-21 12:02:47,579 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: defer +2023-04-21 12:02:47,579 INFO SenderThread:4128 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 12:02:47,580 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 12:02:47,580 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 12:02:47,580 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 12:02:47,580 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 12:02:47,580 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 12:02:47,580 DEBUG SenderThread:4128 [sender.py:send_request():402] send_request: server_info +2023-04-21 12:02:47,827 INFO MainThread:4128 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 12:02:47,827 INFO MainThread:4128 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 12:02:47,827 INFO MainThread:4128 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 12:02:47,829 DEBUG HandlerThread:4128 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 12:02:47,829 INFO HandlerThread:4128 [handler.py:finish():845] shutting down 
handler +2023-04-21 12:02:48,594 INFO WriterThread:4128 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\run-67c5qu7s.wandb +2023-04-21 12:02:48,829 INFO SenderThread:4128 [sender.py:finish():1550] shutting down sender +2023-04-21 12:02:48,829 INFO SenderThread:4128 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 12:02:48,829 INFO SenderThread:4128 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_120047-67c5qu7s/logs/debug.log b/ptuning/wandb/run-20230421_120047-67c5qu7s/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..1fc7d330b6f6c6a6282804df7aa6d79e48206501 --- /dev/null +++ b/ptuning/wandb/run-20230421_120047-67c5qu7s/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 12:00:47,147 INFO MainThread:21296 [wandb_setup.py:_flush():76] Configure stats pid to 21296 +2023-04-21 12:00:47,147 INFO MainThread:21296 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 12:00:47,147 INFO MainThread:21296 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 12:00:47,147 INFO MainThread:21296 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 12:00:47,147 INFO MainThread:21296 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 12:00:47,147 INFO MainThread:21296 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 12:00:47,157 INFO MainThread:21296 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\logs\debug.log +2023-04-21 12:00:47,157 INFO MainThread:21296 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_120047-67c5qu7s\logs\debug-internal.log +2023-04-21 12:00:47,157 INFO MainThread:21296 [wandb_init.py:init():547] calling init triggers +2023-04-21 12:00:47,157 INFO MainThread:21296 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 12:00:47,157 INFO MainThread:21296 [wandb_init.py:init():595] starting backend +2023-04-21 12:00:47,157 INFO MainThread:21296 [wandb_init.py:init():599] setting up manager +2023-04-21 12:00:47,160 INFO MainThread:21296 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 12:00:47,163 INFO MainThread:21296 [wandb_init.py:init():605] backend started and connected +2023-04-21 12:00:47,165 INFO MainThread:21296 [wandb_init.py:init():695] updated telemetry +2023-04-21 12:00:47,233 INFO MainThread:21296 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 12:00:47,955 INFO MainThread:21296 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 12:00:48,546 INFO MainThread:21296 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! 
null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_15-57-52_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + 
value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_155803-a11nraei/files/output.log b/ptuning/wandb/run-20230421_155803-a11nraei/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..98d888885c1e845c7b72c5c91205e30a2bb013dd --- /dev/null +++ b/ptuning/wandb/run-20230421_155803-a11nraei/files/output.log @@ -0,0 +1,207 @@ + + 0%| | 0/1000 [00:00> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\config.json +[INFO|configuration_utils.py:362] 2023-04-21 15:59:42,771 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 15:59:42,968 >> Model weights saved in 
output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 15:59:42,972 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 15:59:42,973 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-100\special_tokens_map.json + 10%|██████▊ | 101/1000 [01:38<15:49, 1.06s/it] +{'loss': 0.0, 'learning_rate': 0.018000000000000002, 'epoch': 100.0} + + + + + 11%|███████▍ | 110/1000 [01:46<12:41, 1.17it/s] + + + + + + 12%|████████▏ | 120/1000 [01:55<13:01, 1.13it/s] + + + 13%|████████▋ | 127/1000 [02:02<13:06, 1.11it/s]Traceback (most recent call last): + File "main.py", line 440, in + main() + File "main.py", line 379, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2647, in training_step + loss = self.compute_loss(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2679, in compute_loss + outputs = model(**inputs) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 1191, in forward + transformer_outputs = self.transformer( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 986, in forward + layer_ret = torch.utils.checkpoint.checkpoint( + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 249, in checkpoint + return CheckpointFunction.apply(function, preserve, *args) + File "D:\Program\Python38\lib\site-packages\torch\autograd\function.py", line 506, in apply + return super().apply(*args, **kwargs) # type: ignore[misc] + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 107, in forward + outputs = run_function(*args) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 627, in forward + attention_outputs = self.attention( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 460, in forward + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 201, in forward + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File 
"D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 632, in get_tokens_unprocessed + m = rexmatch(text, pos) +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 440, in + main() + File "main.py", line 379, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2647, in training_step + 
loss = self.compute_loss(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2679, in compute_loss + outputs = model(**inputs) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 1191, in forward + transformer_outputs = self.transformer( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 986, in forward + layer_ret = torch.utils.checkpoint.checkpoint( + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 249, in checkpoint + return CheckpointFunction.apply(function, preserve, *args) + File "D:\Program\Python38\lib\site-packages\torch\autograd\function.py", line 506, in apply + return super().apply(*args, **kwargs) # type: ignore[misc] + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 107, in forward + outputs = run_function(*args) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 627, in forward + attention_outputs = self.attention( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 460, in forward + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 201, in forward + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_155803-a11nraei/files/requirements.txt b/ptuning/wandb/run-20230421_155803-a11nraei/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..0fa97653da3ca48f8e6c3dbdcfe7798718834fa0 --- /dev/null +++ b/ptuning/wandb/run-20230421_155803-a11nraei/files/requirements.txt @@ -0,0 +1,449 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 
+cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 
+pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 
+visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_155803-a11nraei/files/wandb-metadata.json b/ptuning/wandb/run-20230421_155803-a11nraei/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..9e27ffbdd7f626e50bcbcdf0e23785c147429a45 --- /dev/null +++ b/ptuning/wandb/run-20230421_155803-a11nraei/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T07:58:05.121062", + "startedAt": "2023-04-21T07:58:03.589231", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\chat\\train.json", + "--validation_file", + ".\\datasets\\chat\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 251.9500846862793 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_155803-a11nraei/files/wandb-summary.json b/ptuning/wandb/run-20230421_155803-a11nraei/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..62515e640d35651aae16843957732abeffeeec08 --- /dev/null +++ b/ptuning/wandb/run-20230421_155803-a11nraei/files/wandb-summary.json @@ -0,0 +1 @@ +{"train/loss": 0.0, "train/learning_rate": 0.0176, "train/epoch": 120.0, "train/global_step": 120, "_timestamp": 1682064001.017566, "_runtime": 117.41733288764954, "_step": 11, "_wandb": {"runtime": 122}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_155803-a11nraei/logs/debug-internal.log b/ptuning/wandb/run-20230421_155803-a11nraei/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..680cff407e9638ab0939d2fff327d1881fe31d06 --- /dev/null +++ 
b/ptuning/wandb/run-20230421_155803-a11nraei/logs/debug-internal.log @@ -0,0 +1,419 @@ +2023-04-21 15:58:03,600 INFO StreamThr :48096 [internal.py:wandb_internal():86] W&B internal server running at pid: 48096, started at: 2023-04-21 15:58:03.599231 +2023-04-21 15:58:03,600 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status +2023-04-21 15:58:03,601 INFO WriterThread:48096 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\run-a11nraei.wandb +2023-04-21 15:58:03,605 DEBUG SenderThread:48096 [sender.py:send():375] send: header +2023-04-21 15:58:03,671 DEBUG SenderThread:48096 [sender.py:send():375] send: run +2023-04-21 15:58:04,394 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 15:58:04,395 INFO SenderThread:48096 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files +2023-04-21 15:58:04,395 INFO SenderThread:48096 [sender.py:_start_run_threads():1124] run started: a11nraei with start time 1682063883.600233 +2023-04-21 15:58:04,395 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:58:04,395 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:58:04,396 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: check_version +2023-04-21 15:58:04,992 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 15:58:05,044 DEBUG HandlerThread:48096 [system_info.py:__init__():31] System info init +2023-04-21 15:58:05,044 DEBUG HandlerThread:48096 [system_info.py:__init__():46] System info init done +2023-04-21 15:58:05,044 INFO HandlerThread:48096 [system_monitor.py:start():181] Starting system monitor +2023-04-21 15:58:05,045 INFO SystemMonitor:48096 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 15:58:05,045 INFO HandlerThread:48096 [system_monitor.py:probe():201] Collecting system info +2023-04-21 15:58:05,051 INFO SystemMonitor:48096 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 15:58:05,052 INFO SystemMonitor:48096 [interfaces.py:start():190] Started disk monitoring +2023-04-21 15:58:05,052 INFO SystemMonitor:48096 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 15:58:05,066 INFO SystemMonitor:48096 [interfaces.py:start():190] Started memory monitoring +2023-04-21 15:58:05,085 INFO SystemMonitor:48096 [interfaces.py:start():190] Started network monitoring +2023-04-21 15:58:05,121 DEBUG HandlerThread:48096 [system_info.py:probe():195] Probing system +2023-04-21 15:58:05,124 DEBUG HandlerThread:48096 [system_info.py:_probe_git():180] Probing git +2023-04-21 15:58:05,124 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:05,220 DEBUG HandlerThread:48096 [system_info.py:_probe_git():188] Probing git done +2023-04-21 15:58:05,221 DEBUG HandlerThread:48096 [system_info.py:probe():240] Probing system done +2023-04-21 15:58:05,221 DEBUG HandlerThread:48096 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T07:58:05.121062', 'startedAt': '2023-04-21T07:58:03.589231', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\chat\\train.json', '--validation_file', '.\\datasets\\chat\\dev.json', 
'--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 251.9500846862793}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 15:58:05,221 INFO HandlerThread:48096 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 15:58:05,221 INFO HandlerThread:48096 [system_monitor.py:probe():214] Publishing system info +2023-04-21 15:58:05,221 DEBUG HandlerThread:48096 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 15:58:05,222 DEBUG HandlerThread:48096 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 15:58:05,223 INFO HandlerThread:48096 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 15:58:05,235 DEBUG SenderThread:48096 [sender.py:send():375] send: files +2023-04-21 15:58:05,235 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 15:58:05,248 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:58:05,248 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:58:05,402 INFO Thread-16 :48096 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\requirements.txt +2023-04-21 15:58:05,402 INFO Thread-16 :48096 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-metadata.json +2023-04-21 15:58:05,403 INFO Thread-16 :48096 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:58:05,640 DEBUG SenderThread:48096 [sender.py:send():375] send: telemetry +2023-04-21 15:58:05,640 DEBUG SenderThread:48096 [sender.py:send():375] send: config +2023-04-21 15:58:05,641 DEBUG SenderThread:48096 [sender.py:send():375] send: metric +2023-04-21 15:58:05,641 DEBUG SenderThread:48096 [sender.py:send():375] send: telemetry +2023-04-21 15:58:05,641 DEBUG SenderThread:48096 [sender.py:send():375] send: metric +2023-04-21 15:58:05,641 WARNING SenderThread:48096 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 15:58:06,336 INFO wandb-upload_0:48096 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmp7htn438swandb\0yevl0ho-wandb-metadata.json +2023-04-21 
15:58:06,413 INFO Thread-16 :48096 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:07,155 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:08,428 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:08,709 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:09,227 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:09,442 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:11,249 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:12,478 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:13,305 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:14,023 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:14,499 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:15,341 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:16,522 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:17,385 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:17,704 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:58:17,706 DEBUG SenderThread:48096 [sender.py:send():375] send: metric +2023-04-21 15:58:17,707 DEBUG SenderThread:48096 [sender.py:send():375] send: metric +2023-04-21 15:58:17,707 DEBUG SenderThread:48096 [sender.py:send():375] send: metric +2023-04-21 15:58:17,707 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:58:17,707 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:58:17,707 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:58:18,543 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:58:18,543 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:19,438 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:19,458 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:19,553 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:20,259 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:58:20,260 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:58:20,555 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:21,490 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:22,580 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:23,535 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:24,597 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:24,843 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:25,582 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:26,614 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:26,923 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:58:26,924 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:58:26,924 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:58:26,925 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:58:27,623 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:58:27,632 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:28,634 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:29,678 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:29,692 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:30,217 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:30,696 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:31,739 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:32,742 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:33,777 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:33,786 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:35,266 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:58:35,267 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:58:35,510 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:35,846 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:35,890 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:36,857 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\config.yaml +2023-04-21 15:58:37,298 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:58:37,300 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:58:37,300 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:58:37,300 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:58:37,885 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:58:37,885 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:37,915 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:38,895 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:39,928 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:39,936 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:41,316 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:41,978 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:41,986 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:44,029 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:44,039 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:46,069 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:46,081 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 15:58:46,362 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:46,374 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:58:46,376 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:58:46,376 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:58:46,377 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:58:47,071 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:58:47,071 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:48,130 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:48,138 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:50,191 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:50,199 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:50,270 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:58:50,270 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:58:51,657 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:52,239 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:52,247 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:54,291 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:54,298 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:56,356 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:56,364 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:56,736 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:58:56,758 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:58:56,760 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:58:56,761 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:58:56,761 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:58:57,367 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:58:58,405 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:58:58,417 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:58:59,417 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:00,456 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:00,463 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:02,048 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:02,504 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:02,514 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:04,549 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:04,559 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:05,099 DEBUG SystemMonitor:48096 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 15:59:05,100 DEBUG SenderThread:48096 [sender.py:send():375] send: stats +2023-04-21 15:59:05,273 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:59:05,274 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:59:06,641 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:06,674 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:06,842 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:59:06,843 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:59:06,844 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:59:06,847 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:59:07,637 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:59:07,854 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:08,674 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:08,701 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:09,688 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] 
file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:10,723 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:10,731 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:12,777 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:12,786 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:13,429 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:14,823 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:14,833 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:16,002 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:59:16,004 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:59:16,004 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:59:16,005 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:59:16,883 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:59:16,883 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:16,891 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:17,890 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:18,546 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:18,929 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:18,939 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:20,293 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:59:20,294 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:59:20,970 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:20,984 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:21,986 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:23,040 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 15:59:24,044 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:24,321 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:24,366 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:59:24,367 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:59:24,367 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:59:24,368 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:59:25,084 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:59:25,084 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:25,100 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:26,099 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:27,156 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:28,154 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:29,196 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:29,832 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:30,197 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:31,235 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:32,231 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:33,278 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:33,782 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:59:33,783 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:59:33,783 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:59:33,784 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:59:34,271 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:59:34,271 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:35,105 DEBUG SenderThread:48096 
[sender.py:send():375] send: stats +2023-04-21 15:59:35,105 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:35,323 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:59:35,323 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:59:35,325 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:35,326 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:36,327 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:37,529 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:38,425 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:39,541 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:40,494 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:40,760 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:41,561 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:42,565 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:42,753 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:59:42,755 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:59:42,755 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:59:42,756 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:59:43,591 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:59:43,599 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:44,595 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:45,648 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:45,658 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:46,493 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:46,650 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:47,731 ERROR gpu :48096 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:48,725 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:49,784 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:50,309 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 15:59:50,310 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 15:59:50,786 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:51,741 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:51,753 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 15:59:51,753 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 15:59:51,754 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 15:59:51,755 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 15:59:51,825 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 15:59:51,835 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:52,832 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:53,885 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:53,887 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:54,898 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:55,949 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:56,948 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 15:59:57,327 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 15:59:57,995 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 15:59:59,004 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:00,046 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:00:01,017 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 16:00:01,018 DEBUG SenderThread:48096 [sender.py:send():375] send: history +2023-04-21 16:00:01,018 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: summary_record +2023-04-21 
16:00:01,019 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 16:00:01,056 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 16:00:01,056 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:02,105 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:00:02,928 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:00:03,104 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:04,156 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:04,161 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:00:05,121 DEBUG SenderThread:48096 [sender.py:send():375] send: stats +2023-04-21 16:00:05,169 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:05,309 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 16:00:05,310 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: stop_status +2023-04-21 16:00:06,216 ERROR gpu :48096 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:00:07,219 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:07,930 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:00:07,938 DEBUG SenderThread:48096 [sender.py:send():375] send: exit +2023-04-21 16:00:07,938 INFO SenderThread:48096 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 16:00:07,939 INFO SenderThread:48096 [sender.py:send_exit():600] handling runtime: 122 +2023-04-21 16:00:07,940 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 16:00:07,940 INFO SenderThread:48096 [sender.py:send_exit():606] send defer +2023-04-21 16:00:07,940 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:07,941 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 16:00:07,941 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:07,941 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 16:00:07,941 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 1 +2023-04-21 16:00:07,942 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:07,942 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 16:00:07,942 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:07,942 INFO 
SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 16:00:07,942 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 2 +2023-04-21 16:00:07,943 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:07,943 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 16:00:07,943 INFO HandlerThread:48096 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 16:00:07,943 DEBUG SystemMonitor:48096 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 16:00:07,943 INFO HandlerThread:48096 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 16:00:07,944 DEBUG SystemMonitor:48096 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 16:00:07,944 INFO HandlerThread:48096 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 16:00:08,023 INFO HandlerThread:48096 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 16:00:08,024 INFO HandlerThread:48096 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 16:00:08,024 INFO HandlerThread:48096 [interfaces.py:finish():202] Joined network monitor +2023-04-21 16:00:08,024 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:08,024 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 16:00:08,025 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 3 +2023-04-21 16:00:08,025 DEBUG SenderThread:48096 [sender.py:send():375] send: stats +2023-04-21 16:00:08,026 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:08,026 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 16:00:08,026 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:08,026 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 16:00:08,026 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 4 +2023-04-21 16:00:08,026 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:08,026 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 16:00:08,027 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:08,027 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 16:00:08,027 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 5 +2023-04-21 16:00:08,027 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:08,027 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 16:00:08,027 DEBUG SenderThread:48096 [sender.py:send():375] send: summary +2023-04-21 16:00:08,028 INFO SenderThread:48096 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 16:00:08,028 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:08,028 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 16:00:08,028 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 6 +2023-04-21 16:00:08,029 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:08,029 INFO 
HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 16:00:08,029 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:08,029 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 16:00:08,029 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 7 +2023-04-21 16:00:08,029 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:00:08,029 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:08,029 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 16:00:08,029 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:08,030 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 16:00:08,227 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 16:00:09,040 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 16:00:09,151 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 8 +2023-04-21 16:00:09,152 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 16:00:09,152 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:09,152 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 16:00:09,153 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:09,153 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 16:00:09,167 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 9 +2023-04-21 16:00:09,167 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:09,167 DEBUG SenderThread:48096 [sender.py:send():375] send: artifact +2023-04-21 16:00:09,167 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 16:00:09,243 INFO Thread-16 :48096 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:10,052 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 16:00:11,858 INFO wandb-upload_0:48096 [upload_job.py:push():95] Uploaded file C:\Users\Lenovo\AppData\Local\wandb\wandb\artifacts\staging\tmpead4zo98 +2023-04-21 16:00:12,049 INFO wandb-upload_1:48096 [upload_job.py:push():95] Uploaded file C:\Users\Lenovo\AppData\Local\wandb\wandb\artifacts\staging\tmpk3jzp2c_ +2023-04-21 16:00:13,752 INFO SenderThread:48096 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5NjczOTYx', 'digest': '4883a71159b2ed9cfabb55c0066de401', 'state': 'PENDING', 'aliases': [], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5MjY1Nzg0', 'versionIndex': 2}}, 'version': 'latest'} +2023-04-21 16:00:13,752 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:13,752 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 
16:00:13,752 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:00:13,752 INFO SenderThread:48096 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 16:00:14,284 INFO SenderThread:48096 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files +2023-04-21 16:00:14,284 INFO SenderThread:48096 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\config.yaml config.yaml +2023-04-21 16:00:14,285 INFO SenderThread:48096 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log output.log +2023-04-21 16:00:14,287 INFO SenderThread:48096 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\requirements.txt requirements.txt +2023-04-21 16:00:14,290 INFO SenderThread:48096 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-metadata.json wandb-metadata.json +2023-04-21 16:00:14,290 INFO SenderThread:48096 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json wandb-summary.json +2023-04-21 16:00:14,292 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 10 +2023-04-21 16:00:14,293 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 16:00:14,293 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:14,293 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 16:00:14,294 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:14,295 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 16:00:14,295 INFO SenderThread:48096 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 16:00:15,031 INFO wandb-upload_0:48096 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\config.yaml +2023-04-21 16:00:15,089 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 16:00:15,495 INFO wandb-upload_1:48096 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\output.log +2023-04-21 16:00:15,680 INFO wandb-upload_2:48096 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\requirements.txt +2023-04-21 16:00:15,770 INFO wandb-upload_3:48096 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\files\wandb-summary.json +2023-04-21 16:00:15,974 INFO Thread-15 :48096 [sender.py:transition_state():626] send defer: 11 +2023-04-21 16:00:15,974 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:15,974 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 16:00:15,975 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:15,975 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 16:00:15,975 INFO 
SenderThread:48096 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 16:00:15,975 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 12 +2023-04-21 16:00:15,975 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:15,975 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 16:00:15,975 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:15,975 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 16:00:16,395 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 13 +2023-04-21 16:00:16,395 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:16,395 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 16:00:16,395 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:16,395 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 16:00:16,395 INFO SenderThread:48096 [sender.py:transition_state():626] send defer: 14 +2023-04-21 16:00:16,395 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:00:16,396 DEBUG SenderThread:48096 [sender.py:send():375] send: final +2023-04-21 16:00:16,396 INFO HandlerThread:48096 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 16:00:16,396 DEBUG SenderThread:48096 [sender.py:send():375] send: footer +2023-04-21 16:00:16,396 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: defer +2023-04-21 16:00:16,396 INFO SenderThread:48096 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 16:00:16,397 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 16:00:16,397 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 16:00:16,397 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 16:00:16,397 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 16:00:16,397 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 16:00:16,397 DEBUG SenderThread:48096 [sender.py:send_request():402] send_request: server_info +2023-04-21 16:00:16,641 INFO MainThread:48096 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 16:00:16,641 INFO MainThread:48096 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 16:00:16,643 INFO MainThread:48096 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 16:00:16,645 DEBUG HandlerThread:48096 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 16:00:16,645 INFO HandlerThread:48096 [handler.py:finish():845] shutting down handler +2023-04-21 16:00:17,410 INFO WriterThread:48096 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\run-a11nraei.wandb +2023-04-21 16:00:17,646 INFO SenderThread:48096 [sender.py:finish():1550] shutting down sender +2023-04-21 16:00:17,646 INFO SenderThread:48096 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 16:00:17,646 INFO SenderThread:48096 [file_pusher.py:join():173] waiting for file pusher diff --git 
a/ptuning/wandb/run-20230421_155803-a11nraei/logs/debug.log b/ptuning/wandb/run-20230421_155803-a11nraei/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..c3083e225722f87c723d06688516889d4704ef3b --- /dev/null +++ b/ptuning/wandb/run-20230421_155803-a11nraei/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 15:58:03,593 INFO MainThread:41476 [wandb_setup.py:_flush():76] Configure stats pid to 41476 +2023-04-21 15:58:03,593 INFO MainThread:41476 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 15:58:03,593 INFO MainThread:41476 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 15:58:03,593 INFO MainThread:41476 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 15:58:03,593 INFO MainThread:41476 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 15:58:03,593 INFO MainThread:41476 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 15:58:03,594 INFO MainThread:41476 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\logs\debug.log +2023-04-21 15:58:03,594 INFO MainThread:41476 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_155803-a11nraei\logs\debug-internal.log +2023-04-21 15:58:03,594 INFO MainThread:41476 [wandb_init.py:init():547] calling init triggers +2023-04-21 15:58:03,594 INFO MainThread:41476 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 15:58:03,594 INFO MainThread:41476 [wandb_init.py:init():595] starting backend +2023-04-21 15:58:03,594 INFO MainThread:41476 [wandb_init.py:init():599] setting up manager +2023-04-21 15:58:03,597 INFO MainThread:41476 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 15:58:03,600 INFO MainThread:41476 [wandb_init.py:init():605] backend started and connected +2023-04-21 15:58:03,601 INFO MainThread:41476 [wandb_init.py:init():695] updated telemetry +2023-04-21 15:58:03,670 INFO MainThread:41476 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 15:58:04,393 INFO MainThread:41476 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 15:58:04,984 INFO MainThread:41476 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 15:58:04,985 INFO MainThread:41476 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 15:58:05,248 INFO MainThread:41476 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 15:58:05,248 INFO MainThread:41476 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 15:58:05,249 INFO MainThread:41476 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 15:58:05,249 INFO MainThread:41476 [wandb_run.py:_redirect():2102] Redirects installed. 
+2023-04-21 15:58:05,249 INFO MainThread:41476 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 15:58:05,251 INFO MainThread:41476 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_15-57-52_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 100, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 
'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 16:00:18,523 WARNING MsgRouterThr:41476 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_155803-a11nraei/run-a11nraei.wandb b/ptuning/wandb/run-20230421_155803-a11nraei/run-a11nraei.wandb new file mode 100644 index 0000000000000000000000000000000000000000..b71d519bd65d1a383d958dc690c92f6eb975ca56 Binary files /dev/null and b/ptuning/wandb/run-20230421_155803-a11nraei/run-a11nraei.wandb differ diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/files/config.yaml b/ptuning/wandb/run-20230421_165208-onaovmt4/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..f811453bd98115f7f633b0ce843919550a4fb49f --- /dev/null +++ b/ptuning/wandb/run-20230421_165208-onaovmt4/files/config.yaml @@ -0,0 +1,604 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682067128.226776 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 
130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None 
+per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_16-51-52_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false 
+resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/files/output.log b/ptuning/wandb/run-20230421_165208-onaovmt4/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..144b3d48ceccf53c8d30774cb1de6866dec72be8 --- /dev/null +++ b/ptuning/wandb/run-20230421_165208-onaovmt4/files/output.log @@ -0,0 +1,90 @@ + + 0%| | 0/1000 [00:00<?, ?it/s]Traceback (most recent call last): + File "main.py", line 440, in <module> + main() + File "main.py", line 379, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py",
line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 633, in get_tokens_unprocessed + if m: +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 440, in <module> + main() + File "main.py", line 379, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/files/requirements.txt b/ptuning/wandb/run-20230421_165208-onaovmt4/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..b1c5887dbde19aeef8b7d993f1ad21a385d07e57 --- /dev/null +++ b/ptuning/wandb/run-20230421_165208-onaovmt4/files/requirements.txt @@ -0,0 +1,451 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0
+argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 
+nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pypiwin32==223 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywin32==306 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 
+tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/files/wandb-metadata.json b/ptuning/wandb/run-20230421_165208-onaovmt4/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..1f664014a728e5b386baf2584ba5b43a106deda7 --- /dev/null +++ b/ptuning/wandb/run-20230421_165208-onaovmt4/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T08:52:10.989425", + "startedAt": "2023-04-21T08:52:08.204662", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\chat\\train.json", + "--validation_file", + ".\\datasets\\chat\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 236.01997756958008 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/files/wandb-summary.json b/ptuning/wandb/run-20230421_165208-onaovmt4/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..4653804a67243ad211b33a12310a35c7fbc98960 --- /dev/null +++ 
b/ptuning/wandb/run-20230421_165208-onaovmt4/files/wandb-summary.json @@ -0,0 +1 @@ +{"_wandb": {"runtime": 50}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/logs/debug-internal.log b/ptuning/wandb/run-20230421_165208-onaovmt4/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..01ca2327329d68ae92945849125a84dfb7e40fb8 --- /dev/null +++ b/ptuning/wandb/run-20230421_165208-onaovmt4/logs/debug-internal.log @@ -0,0 +1,230 @@ +2023-04-21 16:52:08,224 INFO StreamThr :24428 [internal.py:wandb_internal():86] W&B internal server running at pid: 24428, started at: 2023-04-21 16:52:08.224660 +2023-04-21 16:52:08,226 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status +2023-04-21 16:52:08,228 INFO WriterThread:24428 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\run-onaovmt4.wandb +2023-04-21 16:52:08,232 DEBUG SenderThread:24428 [sender.py:send():375] send: header +2023-04-21 16:52:08,320 DEBUG SenderThread:24428 [sender.py:send():375] send: run +2023-04-21 16:52:09,776 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 16:52:09,776 INFO SenderThread:24428 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files +2023-04-21 16:52:09,777 INFO SenderThread:24428 [sender.py:_start_run_threads():1124] run started: onaovmt4 with start time 1682067128.226776 +2023-04-21 16:52:09,777 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: summary_record +2023-04-21 16:52:09,778 INFO SenderThread:24428 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 16:52:09,778 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: check_version +2023-04-21 16:52:10,793 INFO Thread-16 :24428 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\wandb-summary.json +2023-04-21 16:52:10,858 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 16:52:10,916 DEBUG HandlerThread:24428 [system_info.py:__init__():31] System info init +2023-04-21 16:52:10,916 DEBUG HandlerThread:24428 [system_info.py:__init__():46] System info init done +2023-04-21 16:52:10,916 INFO HandlerThread:24428 [system_monitor.py:start():181] Starting system monitor +2023-04-21 16:52:10,916 INFO SystemMonitor:24428 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 16:52:10,916 INFO HandlerThread:24428 [system_monitor.py:probe():201] Collecting system info +2023-04-21 16:52:10,924 INFO SystemMonitor:24428 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 16:52:10,925 INFO SystemMonitor:24428 [interfaces.py:start():190] Started disk monitoring +2023-04-21 16:52:10,926 INFO SystemMonitor:24428 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 16:52:10,939 INFO SystemMonitor:24428 [interfaces.py:start():190] Started memory monitoring +2023-04-21 16:52:10,957 INFO SystemMonitor:24428 [interfaces.py:start():190] Started network monitoring +2023-04-21 16:52:10,981 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:10,989 DEBUG HandlerThread:24428 [system_info.py:probe():195] Probing system +2023-04-21 16:52:10,992 DEBUG HandlerThread:24428 
[system_info.py:_probe_git():180] Probing git +2023-04-21 16:52:11,095 DEBUG HandlerThread:24428 [system_info.py:_probe_git():188] Probing git done +2023-04-21 16:52:11,095 DEBUG HandlerThread:24428 [system_info.py:probe():240] Probing system done +2023-04-21 16:52:11,095 DEBUG HandlerThread:24428 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T08:52:10.989425', 'startedAt': '2023-04-21T08:52:08.204662', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\chat\\train.json', '--validation_file', '.\\datasets\\chat\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 236.01997756958008}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 16:52:11,095 INFO HandlerThread:24428 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 16:52:11,095 INFO HandlerThread:24428 [system_monitor.py:probe():214] Publishing system info +2023-04-21 16:52:11,095 DEBUG HandlerThread:24428 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 16:52:11,096 DEBUG HandlerThread:24428 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 16:52:11,099 INFO HandlerThread:24428 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 16:52:11,143 DEBUG SenderThread:24428 [sender.py:send():375] send: files +2023-04-21 16:52:11,143 INFO SenderThread:24428 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 16:52:11,156 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 16:52:11,157 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: stop_status +2023-04-21 16:52:11,577 DEBUG SenderThread:24428 [sender.py:send():375] send: telemetry +2023-04-21 16:52:11,577 DEBUG SenderThread:24428 [sender.py:send():375] send: config +2023-04-21 16:52:11,578 DEBUG SenderThread:24428 [sender.py:send():375] send: metric +2023-04-21 16:52:11,578 DEBUG SenderThread:24428 [sender.py:send():375] send: telemetry +2023-04-21 16:52:11,578 DEBUG SenderThread:24428 [sender.py:send():375] send: metric +2023-04-21 16:52:11,579 WARNING SenderThread:24428 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 16:52:11,800 INFO Thread-16 :24428 [dir_watcher.py:_on_file_created():278] file/dir created: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:11,800 INFO Thread-16 :24428 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\wandb-metadata.json +2023-04-21 16:52:11,800 INFO Thread-16 :24428 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\requirements.txt +2023-04-21 16:52:12,178 INFO wandb-upload_0:24428 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpedo5nrgdwandb\1xtewqbc-wandb-metadata.json +2023-04-21 16:52:13,036 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:13,600 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:13,818 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:15,078 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:15,829 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:17,130 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:18,669 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:19,182 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:21,226 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:23,265 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:23,721 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:25,323 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:26,169 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 16:52:26,169 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: stop_status +2023-04-21 16:52:27,371 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:28,978 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:29,406 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:29,958 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:31,451 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:33,485 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:34,631 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:35,541 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:36,010 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:37,580 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:39,617 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:39,647 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:41,072 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\config.yaml +2023-04-21 16:52:41,179 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 16:52:41,180 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: stop_status +2023-04-21 16:52:41,735 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:43,767 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:44,091 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:45,799 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:46,152 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:47,856 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:49,876 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:50,188 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:51,402 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:51,892 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:53,916 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:54,266 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:52:55,929 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:56,195 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 16:52:56,196 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: stop_status +2023-04-21 16:52:56,454 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:52:57,959 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:52:59,979 ERROR gpu :24428 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 16:53:00,438 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:53:01,207 DEBUG SenderThread:24428 [sender.py:send():375] send: exit +2023-04-21 16:53:01,207 INFO SenderThread:24428 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 16:53:01,207 INFO SenderThread:24428 
[sender.py:send_exit():600] handling runtime: 50 +2023-04-21 16:53:01,207 INFO SenderThread:24428 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 16:53:01,208 INFO SenderThread:24428 [sender.py:send_exit():606] send defer +2023-04-21 16:53:01,208 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,208 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 16:53:01,208 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,208 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 16:53:01,209 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 1 +2023-04-21 16:53:01,209 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,209 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 16:53:01,209 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,209 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 16:53:01,209 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 2 +2023-04-21 16:53:01,210 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,210 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 16:53:01,210 INFO HandlerThread:24428 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 16:53:01,210 INFO HandlerThread:24428 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 16:53:01,210 DEBUG SystemMonitor:24428 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 16:53:01,225 INFO HandlerThread:24428 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 16:53:01,225 DEBUG SystemMonitor:24428 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 16:53:01,226 DEBUG SystemMonitor:24428 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 16:53:01,274 INFO HandlerThread:24428 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 16:53:01,274 INFO HandlerThread:24428 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 16:53:01,275 INFO HandlerThread:24428 [interfaces.py:finish():202] Joined network monitor +2023-04-21 16:53:01,276 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,277 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 16:53:01,277 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 3 +2023-04-21 16:53:01,277 DEBUG SenderThread:24428 [sender.py:send():375] send: stats +2023-04-21 16:53:01,277 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,278 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 16:53:01,278 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,278 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 16:53:01,278 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 4 +2023-04-21 16:53:01,278 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,278 INFO 
HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 16:53:01,279 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,279 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 16:53:01,279 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 5 +2023-04-21 16:53:01,279 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,279 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 16:53:01,279 DEBUG SenderThread:24428 [sender.py:send():375] send: summary +2023-04-21 16:53:01,280 INFO SenderThread:24428 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 16:53:01,280 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,280 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 16:53:01,280 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 6 +2023-04-21 16:53:01,280 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,281 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 16:53:01,281 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,281 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 16:53:01,281 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 7 +2023-04-21 16:53:01,281 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:53:01,281 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:01,281 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 16:53:01,282 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:01,282 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 16:53:01,442 INFO Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\wandb-summary.json +2023-04-21 16:53:02,283 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 16:53:02,395 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 8 +2023-04-21 16:53:02,395 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 16:53:02,395 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:02,395 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 16:53:02,396 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:02,396 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 16:53:02,429 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 9 +2023-04-21 16:53:02,429 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:02,429 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 16:53:02,429 DEBUG SenderThread:24428 [sender.py:send():375] send: artifact +2023-04-21 16:53:02,456 INFO 
Thread-16 :24428 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:53:03,292 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 16:53:04,709 INFO wandb-upload_0:24428 [upload_job.py:push():95] Uploaded file C:\Users\Lenovo\AppData\Local\wandb\wandb\artifacts\staging\tmppu32xnxj +2023-04-21 16:53:04,789 INFO wandb-upload_1:24428 [upload_job.py:push():95] Uploaded file C:\Users\Lenovo\AppData\Local\wandb\wandb\artifacts\staging\tmpztqfj66q +2023-04-21 16:53:06,544 INFO SenderThread:24428 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5NzAxNTEw', 'digest': '1077e319ad39e537c1ccc8f9a5c233bc', 'state': 'PENDING', 'aliases': [], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5NjczOTYx', 'versionIndex': 3}}, 'version': 'latest'} +2023-04-21 16:53:06,544 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:06,544 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 16:53:06,544 INFO SenderThread:24428 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 16:53:07,513 INFO SenderThread:24428 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files +2023-04-21 16:53:07,514 INFO SenderThread:24428 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\config.yaml config.yaml +2023-04-21 16:53:07,514 INFO SenderThread:24428 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log output.log +2023-04-21 16:53:07,517 INFO SenderThread:24428 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\requirements.txt requirements.txt +2023-04-21 16:53:07,519 INFO SenderThread:24428 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\wandb-metadata.json wandb-metadata.json +2023-04-21 16:53:07,519 INFO SenderThread:24428 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\wandb-summary.json wandb-summary.json +2023-04-21 16:53:07,522 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 10 +2023-04-21 16:53:07,522 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 16:53:07,522 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:07,522 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 16:53:07,523 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 16:53:07,525 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:07,525 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 16:53:07,525 INFO SenderThread:24428 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 16:53:08,278 INFO wandb-upload_0:24428 [upload_job.py:push():137] Uploaded file 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\config.yaml +2023-04-21 16:53:08,332 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 16:53:08,566 INFO wandb-upload_1:24428 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\output.log +2023-04-21 16:53:08,756 INFO wandb-upload_3:24428 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\wandb-summary.json +2023-04-21 16:53:08,813 INFO wandb-upload_2:24428 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\files\requirements.txt +2023-04-21 16:53:09,028 INFO Thread-15 :24428 [sender.py:transition_state():626] send defer: 11 +2023-04-21 16:53:09,028 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:09,028 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 16:53:09,028 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:09,029 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 16:53:09,029 INFO SenderThread:24428 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 16:53:09,029 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 12 +2023-04-21 16:53:09,029 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:09,029 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 16:53:09,029 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:09,029 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 16:53:09,905 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 13 +2023-04-21 16:53:09,905 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:09,905 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 16:53:09,905 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:09,905 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 16:53:09,905 INFO SenderThread:24428 [sender.py:transition_state():626] send defer: 14 +2023-04-21 16:53:09,906 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: defer +2023-04-21 16:53:09,906 INFO HandlerThread:24428 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 16:53:09,906 DEBUG SenderThread:24428 [sender.py:send():375] send: final +2023-04-21 16:53:09,906 DEBUG SenderThread:24428 [sender.py:send():375] send: footer +2023-04-21 16:53:09,906 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: defer +2023-04-21 16:53:09,906 INFO SenderThread:24428 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 16:53:09,907 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 16:53:09,907 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 16:53:09,907 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 16:53:09,907 DEBUG SenderThread:24428 [sender.py:send_request():402] 
send_request: poll_exit +2023-04-21 16:53:09,907 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 16:53:09,908 DEBUG SenderThread:24428 [sender.py:send_request():402] send_request: server_info +2023-04-21 16:53:10,163 INFO MainThread:24428 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 16:53:10,163 INFO MainThread:24428 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 16:53:10,163 INFO MainThread:24428 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 16:53:10,164 DEBUG HandlerThread:24428 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 16:53:10,165 INFO HandlerThread:24428 [handler.py:finish():845] shutting down handler +2023-04-21 16:53:10,910 INFO WriterThread:24428 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\run-onaovmt4.wandb +2023-04-21 16:53:11,171 INFO SenderThread:24428 [sender.py:finish():1550] shutting down sender +2023-04-21 16:53:11,171 INFO SenderThread:24428 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 16:53:11,171 INFO SenderThread:24428 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/logs/debug.log b/ptuning/wandb/run-20230421_165208-onaovmt4/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..25f86a62231e0235a095378fa88b732a218bd87c --- /dev/null +++ b/ptuning/wandb/run-20230421_165208-onaovmt4/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 16:52:08,208 INFO MainThread:14196 [wandb_setup.py:_flush():76] Configure stats pid to 14196 +2023-04-21 16:52:08,208 INFO MainThread:14196 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 16:52:08,208 INFO MainThread:14196 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 16:52:08,208 INFO MainThread:14196 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 16:52:08,208 INFO MainThread:14196 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 16:52:08,208 INFO MainThread:14196 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 16:52:08,217 INFO MainThread:14196 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\logs\debug.log +2023-04-21 16:52:08,217 INFO MainThread:14196 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_165208-onaovmt4\logs\debug-internal.log +2023-04-21 16:52:08,218 INFO MainThread:14196 [wandb_init.py:init():547] calling init triggers +2023-04-21 16:52:08,218 INFO MainThread:14196 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 16:52:08,218 INFO MainThread:14196 [wandb_init.py:init():595] starting backend +2023-04-21 16:52:08,218 INFO MainThread:14196 [wandb_init.py:init():599] setting up manager +2023-04-21 16:52:08,221 INFO MainThread:14196 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 16:52:08,226 INFO MainThread:14196 [wandb_init.py:init():605] backend started and connected +2023-04-21 16:52:08,227 INFO MainThread:14196 
[wandb_init.py:init():695] updated telemetry +2023-04-21 16:52:08,319 INFO MainThread:14196 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 16:52:09,775 INFO MainThread:14196 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 16:52:10,848 INFO MainThread:14196 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 16:52:10,848 INFO MainThread:14196 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 16:52:11,156 INFO MainThread:14196 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 16:52:11,157 INFO MainThread:14196 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 16:52:11,157 INFO MainThread:14196 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 16:52:11,157 INFO MainThread:14196 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-21 16:52:11,157 INFO MainThread:14196 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 16:52:11,159 INFO MainThread:14196 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 
'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_16-51-52_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 100, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 16:53:12,071 WARNING MsgRouterThr:14196 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_165208-onaovmt4/run-onaovmt4.wandb b/ptuning/wandb/run-20230421_165208-onaovmt4/run-onaovmt4.wandb new file mode 100644 index 0000000000000000000000000000000000000000..173659ff1a9904ba62e63ec502fe9c03a41a3f1f Binary files /dev/null and b/ptuning/wandb/run-20230421_165208-onaovmt4/run-onaovmt4.wandb differ diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/files/config.yaml b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/config.yaml new file mode 100644 index 
0000000000000000000000000000000000000000..f6121d6dee72e23e31e9172ac528d86821f9c987 --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/config.yaml @@ -0,0 +1,606 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682088940.325418 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + - 37 + - 42 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: 
null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_22-55-28_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false 
+metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/files/output.log b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..0b01eb83f35e2347e6ca68562ce86da2589bdaa5 --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/output.log @@ -0,0 +1,84 @@ + +04/21/2023 22:55:43 - WARNING - transformers_modules.chatglm-6b-int4.modeling_chatglm - `use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`... 
+ 0%| | 2/1000 [00:26<3:38:31, 13.14s/it]Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + 
File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 632, in get_tokens_unprocessed + m = rexmatch(text, pos) +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/files/requirements.txt b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..b1c5887dbde19aeef8b7d993f1ad21a385d07e57 --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/requirements.txt @@ -0,0 +1,451 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 
+gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 
+pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pypiwin32==223 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywin32==306 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/files/wandb-metadata.json b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/wandb-metadata.json new file mode 100644 index 
0000000000000000000000000000000000000000..6ff1728fadd61d3801c379ba87b60499fa5635ba --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T14:55:42.258743", + "startedAt": "2023-04-21T14:55:40.285418", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\chat\\train.json", + "--validation_file", + ".\\datasets\\chat\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 236.0998077392578 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/files/wandb-summary.json b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..3892ef5cf05335a4d6498d97a0207f7255ec759b --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/files/wandb-summary.json @@ -0,0 +1 @@ +{"_wandb": {"runtime": 31}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/logs/debug-internal.log b/ptuning/wandb/run-20230421_225540-uxlxnflu/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..4db39ea8060b1fdddac5beb4252ed24b05135832 --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/logs/debug-internal.log @@ -0,0 +1,209 @@ +2023-04-21 22:55:40,325 INFO StreamThr :13736 [internal.py:wandb_internal():86] W&B internal server running at pid: 13736, started at: 2023-04-21 22:55:40.324419 +2023-04-21 22:55:40,326 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status +2023-04-21 22:55:40,328 INFO WriterThread:13736 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\run-uxlxnflu.wandb +2023-04-21 22:55:40,332 DEBUG SenderThread:13736 [sender.py:send():375] send: header +2023-04-21 22:55:40,395 DEBUG SenderThread:13736 [sender.py:send():375] send: run +2023-04-21 22:55:41,218 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 22:55:41,220 INFO SenderThread:13736 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files +2023-04-21 
22:55:41,220 INFO SenderThread:13736 [sender.py:_start_run_threads():1124] run started: uxlxnflu with start time 1682088940.325418 +2023-04-21 22:55:41,220 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: summary_record +2023-04-21 22:55:41,220 INFO SenderThread:13736 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 22:55:41,222 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: check_version +2023-04-21 22:55:42,138 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 22:55:42,189 DEBUG HandlerThread:13736 [system_info.py:__init__():31] System info init +2023-04-21 22:55:42,189 DEBUG HandlerThread:13736 [system_info.py:__init__():46] System info init done +2023-04-21 22:55:42,189 INFO HandlerThread:13736 [system_monitor.py:start():181] Starting system monitor +2023-04-21 22:55:42,190 INFO SystemMonitor:13736 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 22:55:42,190 INFO HandlerThread:13736 [system_monitor.py:probe():201] Collecting system info +2023-04-21 22:55:42,196 INFO SystemMonitor:13736 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 22:55:42,197 INFO SystemMonitor:13736 [interfaces.py:start():190] Started disk monitoring +2023-04-21 22:55:42,197 INFO SystemMonitor:13736 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 22:55:42,209 INFO SystemMonitor:13736 [interfaces.py:start():190] Started memory monitoring +2023-04-21 22:55:42,243 INFO SystemMonitor:13736 [interfaces.py:start():190] Started network monitoring +2023-04-21 22:55:42,244 INFO Thread-16 :13736 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\wandb-summary.json +2023-04-21 22:55:42,258 DEBUG HandlerThread:13736 [system_info.py:probe():195] Probing system +2023-04-21 22:55:42,261 DEBUG HandlerThread:13736 [system_info.py:_probe_git():180] Probing git +2023-04-21 22:55:42,263 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:42,374 DEBUG HandlerThread:13736 [system_info.py:_probe_git():188] Probing git done +2023-04-21 22:55:42,374 DEBUG HandlerThread:13736 [system_info.py:probe():240] Probing system done +2023-04-21 22:55:42,374 DEBUG HandlerThread:13736 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T14:55:42.258743', 'startedAt': '2023-04-21T14:55:40.285418', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\chat\\train.json', '--validation_file', '.\\datasets\\chat\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 
2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 236.0998077392578}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 22:55:42,374 INFO HandlerThread:13736 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 22:55:42,374 INFO HandlerThread:13736 [system_monitor.py:probe():214] Publishing system info +2023-04-21 22:55:42,374 DEBUG HandlerThread:13736 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 22:55:42,375 DEBUG HandlerThread:13736 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 22:55:42,377 INFO HandlerThread:13736 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 22:55:42,390 DEBUG SenderThread:13736 [sender.py:send():375] send: files +2023-04-21 22:55:42,390 INFO SenderThread:13736 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 22:55:42,402 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:55:42,402 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:55:42,798 DEBUG SenderThread:13736 [sender.py:send():375] send: telemetry +2023-04-21 22:55:42,798 DEBUG SenderThread:13736 [sender.py:send():375] send: config +2023-04-21 22:55:42,799 DEBUG SenderThread:13736 [sender.py:send():375] send: metric +2023-04-21 22:55:42,799 DEBUG SenderThread:13736 [sender.py:send():375] send: telemetry +2023-04-21 22:55:42,799 DEBUG SenderThread:13736 [sender.py:send():375] send: metric +2023-04-21 22:55:42,799 WARNING SenderThread:13736 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 22:55:43,256 INFO Thread-16 :13736 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\wandb-metadata.json +2023-04-21 22:55:43,257 INFO Thread-16 :13736 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\requirements.txt +2023-04-21 22:55:43,257 INFO Thread-16 :13736 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:55:43,498 INFO wandb-upload_0:13736 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmp2okrjzy_wandb\julsuqja-wandb-metadata.json +2023-04-21 22:55:44,292 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:45,288 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:55:45,859 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:55:46,330 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:55:46,341 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:48,394 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-21 22:55:50,428 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:50,902 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:55:52,473 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:54,507 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:55,945 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:55:56,551 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:57,421 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:55:57,422 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:55:58,582 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:55:59,592 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:56:00,629 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:01,838 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:56:02,675 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:04,710 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:06,754 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:06,865 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:56:08,794 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:10,845 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:11,852 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:56:12,389 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:56:12,430 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:56:13,027 ERROR gpu :13736 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:56:13,553 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:56:13,873 DEBUG SenderThread:13736 [sender.py:send():375] send: exit +2023-04-21 22:56:13,873 INFO SenderThread:13736 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 22:56:13,873 INFO SenderThread:13736 [sender.py:send_exit():600] handling runtime: 31 +2023-04-21 22:56:13,875 INFO SenderThread:13736 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 22:56:13,876 INFO SenderThread:13736 [sender.py:send_exit():606] send defer +2023-04-21 22:56:13,877 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,877 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 22:56:13,878 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: 
defer +2023-04-21 22:56:13,878 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 22:56:13,878 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 1 +2023-04-21 22:56:13,878 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,878 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 22:56:13,878 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,878 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 22:56:13,878 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 2 +2023-04-21 22:56:13,879 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,879 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 22:56:13,879 INFO HandlerThread:13736 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 22:56:13,879 DEBUG SystemMonitor:13736 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 22:56:13,879 INFO HandlerThread:13736 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 22:56:13,879 DEBUG SystemMonitor:13736 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 22:56:13,891 INFO HandlerThread:13736 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 22:56:13,892 DEBUG SystemMonitor:13736 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 22:56:13,946 INFO HandlerThread:13736 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 22:56:13,946 INFO HandlerThread:13736 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 22:56:13,946 INFO HandlerThread:13736 [interfaces.py:finish():202] Joined network monitor +2023-04-21 22:56:13,947 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,947 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 22:56:13,947 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 3 +2023-04-21 22:56:13,947 DEBUG SenderThread:13736 [sender.py:send():375] send: stats +2023-04-21 22:56:13,947 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,948 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 22:56:13,948 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,948 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 22:56:13,948 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 4 +2023-04-21 22:56:13,948 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,949 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 22:56:13,949 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,949 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 22:56:13,949 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 5 +2023-04-21 22:56:13,949 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,949 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] 
handle defer: 5 +2023-04-21 22:56:13,950 DEBUG SenderThread:13736 [sender.py:send():375] send: summary +2023-04-21 22:56:13,950 INFO SenderThread:13736 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 22:56:13,951 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,951 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 22:56:13,951 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 6 +2023-04-21 22:56:13,951 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,951 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 22:56:13,951 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,951 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 22:56:13,952 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 7 +2023-04-21 22:56:13,952 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:56:13,952 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:13,952 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 22:56:13,952 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:13,952 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 22:56:13,955 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\config.yaml +2023-04-21 22:56:13,955 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\wandb-summary.json +2023-04-21 22:56:14,157 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 22:56:14,969 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:56:16,025 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 8 +2023-04-21 22:56:16,025 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 22:56:16,025 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:16,026 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 22:56:16,026 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:16,026 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 22:56:16,051 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 9 +2023-04-21 22:56:16,051 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:16,051 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 22:56:16,051 DEBUG SenderThread:13736 [sender.py:send():375] send: artifact +2023-04-21 22:56:16,982 INFO Thread-16 :13736 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 
22:56:17,671 INFO SenderThread:13736 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5NzAxNTEw', 'digest': '1077e319ad39e537c1ccc8f9a5c233bc', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v4'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5NzAxNTEw', 'versionIndex': 4}}, 'version': 'v4'} +2023-04-21 22:56:17,671 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:17,671 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 22:56:17,671 INFO SenderThread:13736 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 22:56:17,994 INFO SenderThread:13736 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files +2023-04-21 22:56:17,994 INFO SenderThread:13736 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\config.yaml config.yaml +2023-04-21 22:56:17,995 INFO SenderThread:13736 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log output.log +2023-04-21 22:56:17,997 INFO SenderThread:13736 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\requirements.txt requirements.txt +2023-04-21 22:56:17,999 INFO SenderThread:13736 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\wandb-metadata.json wandb-metadata.json +2023-04-21 22:56:18,001 INFO SenderThread:13736 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\wandb-summary.json wandb-summary.json +2023-04-21 22:56:18,002 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 10 +2023-04-21 22:56:18,003 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:18,003 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 22:56:18,005 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:18,005 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 22:56:18,005 INFO SenderThread:13736 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 22:56:19,006 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:56:19,186 INFO wandb-upload_0:13736 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\config.yaml +2023-04-21 22:56:19,195 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 22:56:19,224 INFO wandb-upload_3:13736 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\wandb-summary.json +2023-04-21 22:56:19,294 INFO wandb-upload_2:13736 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\requirements.txt 
+2023-04-21 22:56:20,158 INFO wandb-upload_1:13736 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\files\output.log +2023-04-21 22:56:20,366 INFO Thread-15 :13736 [sender.py:transition_state():626] send defer: 11 +2023-04-21 22:56:20,366 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:20,366 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 22:56:20,367 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:20,367 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 22:56:20,367 INFO SenderThread:13736 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 22:56:20,367 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 12 +2023-04-21 22:56:20,367 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:20,367 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 22:56:20,367 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:20,367 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 22:56:20,941 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 13 +2023-04-21 22:56:20,942 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:20,942 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 22:56:20,942 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:20,942 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 22:56:20,942 INFO SenderThread:13736 [sender.py:transition_state():626] send defer: 14 +2023-04-21 22:56:20,942 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: defer +2023-04-21 22:56:20,942 INFO HandlerThread:13736 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 22:56:20,943 DEBUG SenderThread:13736 [sender.py:send():375] send: final +2023-04-21 22:56:20,943 DEBUG SenderThread:13736 [sender.py:send():375] send: footer +2023-04-21 22:56:20,943 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: defer +2023-04-21 22:56:20,943 INFO SenderThread:13736 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 22:56:20,944 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 22:56:20,945 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 22:56:20,945 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 22:56:20,945 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 22:56:20,945 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 22:56:20,945 DEBUG SenderThread:13736 [sender.py:send_request():402] send_request: server_info +2023-04-21 22:56:21,184 INFO MainThread:13736 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 22:56:21,184 INFO MainThread:13736 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 22:56:21,184 INFO MainThread:13736 [wandb_run.py:_footer_sync_info():3434] logging synced 
files +2023-04-21 22:56:21,185 DEBUG HandlerThread:13736 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 22:56:21,186 INFO HandlerThread:13736 [handler.py:finish():845] shutting down handler +2023-04-21 22:56:21,953 INFO WriterThread:13736 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\run-uxlxnflu.wandb +2023-04-21 22:56:22,187 INFO SenderThread:13736 [sender.py:finish():1550] shutting down sender +2023-04-21 22:56:22,187 INFO SenderThread:13736 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 22:56:22,187 INFO SenderThread:13736 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/logs/debug.log b/ptuning/wandb/run-20230421_225540-uxlxnflu/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..890436c428c0b334e0355cfe378b9846449750d3 --- /dev/null +++ b/ptuning/wandb/run-20230421_225540-uxlxnflu/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 22:55:40,316 INFO MainThread:33836 [wandb_setup.py:_flush():76] Configure stats pid to 33836 +2023-04-21 22:55:40,316 INFO MainThread:33836 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\logs\debug.log +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225540-uxlxnflu\logs\debug-internal.log +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_init.py:init():547] calling init triggers +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_init.py:init():595] starting backend +2023-04-21 22:55:40,317 INFO MainThread:33836 [wandb_init.py:init():599] setting up manager +2023-04-21 22:55:40,321 INFO MainThread:33836 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 22:55:40,325 INFO MainThread:33836 [wandb_init.py:init():605] backend started and connected +2023-04-21 22:55:40,326 INFO MainThread:33836 [wandb_init.py:init():695] updated telemetry +2023-04-21 22:55:40,394 INFO MainThread:33836 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 22:55:41,218 INFO MainThread:33836 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 22:55:42,129 INFO MainThread:33836 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! 
To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 22:55:42,129 INFO MainThread:33836 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 22:55:42,402 INFO MainThread:33836 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 22:55:42,403 INFO MainThread:33836 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 22:55:42,403 INFO MainThread:33836 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 22:55:42,403 INFO MainThread:33836 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-21 22:55:42,403 INFO MainThread:33836 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 22:55:42,406 INFO MainThread:33836 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 
'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_22-55-28_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 100, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 22:56:23,071 WARNING MsgRouterThr:33836 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/run-uxlxnflu.wandb b/ptuning/wandb/run-20230421_225540-uxlxnflu/run-uxlxnflu.wandb new file mode 100644 index 0000000000000000000000000000000000000000..1a569f3da87e589f6a6c35b4cbeec0875d5bbbbe Binary files /dev/null and b/ptuning/wandb/run-20230421_225540-uxlxnflu/run-uxlxnflu.wandb differ diff --git a/ptuning/wandb/run-20230421_225540-uxlxnflu/run-uxlxnflu.wandb.synced b/ptuning/wandb/run-20230421_225540-uxlxnflu/run-uxlxnflu.wandb.synced new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/files/config.yaml b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..ebe0a040539cd2fd4d9342690d7045902dc13985 --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/config.yaml @@ -0,0 +1,616 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + 
python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682089053.995174 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 + - 1: train/loss + 5: 1 + 6: + - 1 + - 1: train/learning_rate + 5: 1 + 6: + - 1 + - 1: train/epoch + 5: 1 + 6: + - 1 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: 
null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_22-57-22_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 100 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: 
false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/files/output.log b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..5c93d317f55683c83d6fa3adf8435942d06b67ab --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/output.log @@ -0,0 +1,166 @@ + + 0%| | 0/1000 [00:00 + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2647, in training_step + loss = self.compute_loss(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2679, in compute_loss + outputs = model(**inputs) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 1191, in forward + transformer_outputs = self.transformer( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", 
line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 986, in forward + layer_ret = torch.utils.checkpoint.checkpoint( + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 249, in checkpoint + return CheckpointFunction.apply(function, preserve, *args) + File "D:\Program\Python38\lib\site-packages\torch\autograd\function.py", line 506, in apply + return super().apply(*args, **kwargs) # type: ignore[misc] + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 107, in forward + outputs = run_function(*args) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 627, in forward + attention_outputs = self.attention( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 460, in forward + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 201, in forward + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File 
"D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 631, in get_tokens_unprocessed + for rexmatch, action, new_state in statetokens: +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2647, in training_step + loss = self.compute_loss(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2679, in compute_loss + outputs = model(**inputs) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 1191, in forward + transformer_outputs = self.transformer( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 986, in forward + layer_ret = torch.utils.checkpoint.checkpoint( + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 249, in checkpoint + return CheckpointFunction.apply(function, preserve, *args) + File "D:\Program\Python38\lib\site-packages\torch\autograd\function.py", line 506, in apply + return super().apply(*args, **kwargs) # type: ignore[misc] + File "D:\Program\Python38\lib\site-packages\torch\utils\checkpoint.py", line 107, in forward + outputs = run_function(*args) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 627, in forward + attention_outputs = self.attention( + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File 
"C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 460, in forward + cos, sin = self.rotary_emb(q1, seq_len=position_ids.max() + 1) + File "D:\Program\Python38\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl + return forward_call(*args, **kwargs) + File "C:\Users\Lenovo/.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py", line 201, in forward + if self.max_seq_len_cached is None or (seq_len > self.max_seq_len_cached): +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/files/requirements.txt b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..b1c5887dbde19aeef8b7d993f1ad21a385d07e57 --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/requirements.txt @@ -0,0 +1,451 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 
+jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pypiwin32==223 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 
+pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywin32==306 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/files/wandb-metadata.json b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..478b93d6ecd90ec71c7593e375d720c44d46f2fa --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T14:57:36.871881", + "startedAt": "2023-04-21T14:57:33.956697", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\chat\\train.json", + "--validation_file", + ".\\datasets\\chat\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + 
"--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "100", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 236.0998878479004 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/files/wandb-summary.json b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..0eb1c7a710429037f621b9f6dc42caea914a9e9e --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/files/wandb-summary.json @@ -0,0 +1 @@ +{"train/loss": 0.3263, "train/learning_rate": 0.0194, "train/epoch": 4.75, "train/global_step": 30, "_timestamp": 1682089537.022641, "_runtime": 483.0274670124054, "_step": 2, "_wandb": {"runtime": 520}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/logs/debug-internal.log b/ptuning/wandb/run-20230421_225733-tqce6lr7/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..fe0a4df0cc826757157a208cdfa402c58b4b2aee --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/logs/debug-internal.log @@ -0,0 +1,679 @@ +2023-04-21 22:57:33,994 INFO StreamThr :12140 [internal.py:wandb_internal():86] W&B internal server running at pid: 12140, started at: 2023-04-21 22:57:33.994174 +2023-04-21 22:57:33,996 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status +2023-04-21 22:57:33,996 INFO WriterThread:12140 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\run-tqce6lr7.wandb +2023-04-21 22:57:34,000 DEBUG SenderThread:12140 [sender.py:send():375] send: header +2023-04-21 22:57:34,060 DEBUG SenderThread:12140 [sender.py:send():375] send: run +2023-04-21 22:57:34,877 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 22:57:34,879 INFO SenderThread:12140 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files +2023-04-21 22:57:34,879 INFO SenderThread:12140 [sender.py:_start_run_threads():1124] run started: tqce6lr7 with start time 1682089053.995174 +2023-04-21 22:57:34,880 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: summary_record +2023-04-21 22:57:34,880 INFO SenderThread:12140 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 22:57:34,880 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: check_version +2023-04-21 22:57:35,884 INFO Thread-16 :12140 [dir_watcher.py:_on_file_created():278] file/dir created: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-summary.json +2023-04-21 22:57:36,746 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 22:57:36,796 DEBUG HandlerThread:12140 [system_info.py:__init__():31] System info init +2023-04-21 22:57:36,796 DEBUG HandlerThread:12140 [system_info.py:__init__():46] System info init done +2023-04-21 22:57:36,797 INFO HandlerThread:12140 [system_monitor.py:start():181] Starting system monitor +2023-04-21 22:57:36,797 INFO SystemMonitor:12140 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 22:57:36,797 INFO HandlerThread:12140 [system_monitor.py:probe():201] Collecting system info +2023-04-21 22:57:36,804 INFO SystemMonitor:12140 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 22:57:36,805 INFO SystemMonitor:12140 [interfaces.py:start():190] Started disk monitoring +2023-04-21 22:57:36,805 INFO SystemMonitor:12140 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 22:57:36,819 INFO SystemMonitor:12140 [interfaces.py:start():190] Started memory monitoring +2023-04-21 22:57:36,839 INFO SystemMonitor:12140 [interfaces.py:start():190] Started network monitoring +2023-04-21 22:57:36,870 DEBUG HandlerThread:12140 [system_info.py:probe():195] Probing system +2023-04-21 22:57:36,873 DEBUG HandlerThread:12140 [system_info.py:_probe_git():180] Probing git +2023-04-21 22:57:36,875 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:36,974 DEBUG HandlerThread:12140 [system_info.py:_probe_git():188] Probing git done +2023-04-21 22:57:36,974 DEBUG HandlerThread:12140 [system_info.py:probe():240] Probing system done +2023-04-21 22:57:36,974 DEBUG HandlerThread:12140 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T14:57:36.871881', 'startedAt': '2023-04-21T14:57:33.956697', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\chat\\train.json', '--validation_file', '.\\datasets\\chat\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '100', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 236.0998878479004}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 22:57:36,974 INFO HandlerThread:12140 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 22:57:36,975 INFO HandlerThread:12140 [system_monitor.py:probe():214] Publishing system info +2023-04-21 
22:57:36,975 DEBUG HandlerThread:12140 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 22:57:36,975 DEBUG HandlerThread:12140 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 22:57:36,978 INFO HandlerThread:12140 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 22:57:36,990 DEBUG SenderThread:12140 [sender.py:send():375] send: files +2023-04-21 22:57:36,990 INFO SenderThread:12140 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 22:57:37,006 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:57:37,006 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:57:37,423 DEBUG SenderThread:12140 [sender.py:send():375] send: telemetry +2023-04-21 22:57:37,423 DEBUG SenderThread:12140 [sender.py:send():375] send: config +2023-04-21 22:57:37,423 DEBUG SenderThread:12140 [sender.py:send():375] send: metric +2023-04-21 22:57:37,423 DEBUG SenderThread:12140 [sender.py:send():375] send: telemetry +2023-04-21 22:57:37,423 DEBUG SenderThread:12140 [sender.py:send():375] send: metric +2023-04-21 22:57:37,423 WARNING SenderThread:12140 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 22:57:37,902 INFO Thread-16 :12140 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-metadata.json +2023-04-21 22:57:37,902 INFO Thread-16 :12140 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:57:37,902 INFO Thread-16 :12140 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\requirements.txt +2023-04-21 22:57:38,905 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:39,352 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:57:39,717 INFO wandb-upload_0:12140 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpmhk2g0pgwandb\cv9h9qgb-wandb-metadata.json +2023-04-21 22:57:39,910 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:57:40,942 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:57:40,985 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:43,007 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:44,379 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:57:45,047 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:47,086 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:49,136 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:49,414 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 
22:57:51,171 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:52,007 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:57:52,008 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:57:53,210 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:54,211 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:57:55,251 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:55,326 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:57:57,289 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:57:59,330 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:00,374 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:01,381 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:03,428 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:05,443 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:05,471 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:06,466 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\config.yaml +2023-04-21 22:58:07,025 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:58:07,025 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:58:07,561 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:09,586 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:10,558 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:58:11,599 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:12,225 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:13,654 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:15,693 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:17,275 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:17,743 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:19,786 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:21,831 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:22,036 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:58:22,036 DEBUG SenderThread:12140 
[sender.py:send_request():402] send_request: stop_status +2023-04-21 22:58:22,280 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:23,866 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:25,912 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:27,331 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:27,949 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:30,003 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:32,035 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:58:32,043 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:32,718 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:34,087 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:36,132 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:36,870 DEBUG SystemMonitor:12140 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 22:58:36,871 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 22:58:37,039 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:58:37,040 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:58:38,229 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:38,311 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:40,237 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:42,282 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:43,913 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:44,312 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:58:44,319 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:46,361 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:48,396 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:48,962 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:58:50,442 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:52,055 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:58:52,055 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:58:52,484 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:54,336 DEBUG HandlerThread:12140 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-21 22:58:54,533 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:56,690 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:58:58,710 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:58:58,719 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:00,221 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:00,763 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:02,799 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:04,839 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:05,265 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:06,884 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 22:59:06,894 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:07,080 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:59:07,080 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:59:08,996 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:09,957 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:59:10,461 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:11,020 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:13,051 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:15,082 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:15,502 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:17,113 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:19,164 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:20,552 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:21,208 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:22,084 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:59:22,084 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:59:23,243 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:24,246 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:59:25,291 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:26,075 DEBUG HandlerThread:12140 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:27,341 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:29,370 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:31,121 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:31,437 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:33,475 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:35,522 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:36,152 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:36,886 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 22:59:37,089 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:59:37,090 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:59:37,566 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:38,569 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:59:39,672 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:41,395 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:41,692 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:43,717 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:45,770 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:46,433 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:47,811 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:49,852 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:51,570 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:51,898 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:52,100 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 22:59:52,100 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 22:59:52,900 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 22:59:53,943 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:55,993 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 22:59:57,403 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 22:59:58,028 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:00,072 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-21 23:00:02,109 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:02,437 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:04,157 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:04,659 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: partial_history +2023-04-21 23:00:04,660 DEBUG SenderThread:12140 [sender.py:send():375] send: metric +2023-04-21 23:00:04,660 DEBUG SenderThread:12140 [sender.py:send():375] send: metric +2023-04-21 23:00:04,661 DEBUG SenderThread:12140 [sender.py:send():375] send: metric +2023-04-21 23:00:04,661 DEBUG SenderThread:12140 [sender.py:send():375] send: history +2023-04-21 23:00:04,661 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: summary_record +2023-04-21 23:00:04,661 INFO SenderThread:12140 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 23:00:05,159 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-summary.json +2023-04-21 23:00:06,196 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:00:06,208 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:06,901 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 23:00:07,122 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:00:07,123 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:00:07,201 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:00:08,259 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:08,380 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:10,363 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:12,380 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:13,426 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:14,414 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\config.yaml +2023-04-21 23:00:14,422 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:16,483 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:18,530 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:19,563 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:20,572 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:00:20,582 ERROR gpu :12140 [interfaces.py:monitor():144] Failed 
to sample metric: Not Supported +2023-04-21 23:00:22,129 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:00:22,129 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:00:22,631 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:24,666 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:25,411 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:26,720 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:28,764 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:30,459 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:30,817 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:32,880 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:34,903 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:00:34,912 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:35,571 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:36,941 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 23:00:36,954 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:37,139 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:00:37,140 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:00:38,997 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:41,088 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:41,433 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:43,112 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:45,137 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:46,484 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:47,169 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:49,246 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:49,247 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:00:51,308 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:51,730 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:52,153 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:00:52,153 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:00:53,348 
ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:55,395 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:57,442 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:00:57,462 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:00:59,474 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:01,520 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:02,523 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:01:03,331 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:01:03,569 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:05,604 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:06,935 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 23:01:07,168 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:01:07,169 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:01:07,646 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:08,452 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:01:09,697 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:11,787 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:13,498 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:01:13,801 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:15,820 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:16,449 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:01:17,864 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:19,447 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:01:19,897 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:21,941 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:22,182 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:01:22,182 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:01:23,990 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:24,471 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:01:26,028 ERROR gpu :12140 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:01:28,082 ERROR gpu :12140 [interfaces.py:monitor():144] Failed 
to sample metric: Not Supported +2023-04-21 23:06:16,821 DEBUG HandlerThread:12140 
[handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:06:17,259 DEBUG SenderThread:12140 [sender.py:send():375] send: exit +2023-04-21 23:06:17,259 INFO SenderThread:12140 [sender.py:send_exit():598] handling exit code: 255 +2023-04-21 23:06:17,259 INFO SenderThread:12140 [sender.py:send_exit():600] handling runtime: 520 +2023-04-21 23:06:17,260 INFO SenderThread:12140 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 23:06:17,260 INFO SenderThread:12140 [sender.py:send_exit():606] send defer +2023-04-21 23:06:17,261 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,261 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-21 23:06:17,261 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,261 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-21 23:06:17,261 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 1 +2023-04-21 23:06:17,262 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,262 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-21 23:06:17,262 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,262 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-21 23:06:17,262 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 2 +2023-04-21 23:06:17,262 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,262 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-21 23:06:17,262 INFO HandlerThread:12140 [system_monitor.py:finish():190] Stopping system monitor +2023-04-21 23:06:17,262 DEBUG SystemMonitor:12140 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-21 23:06:17,263 INFO HandlerThread:12140 [interfaces.py:finish():202] Joined cpu monitor +2023-04-21 23:06:17,273 DEBUG SystemMonitor:12140 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-21 23:06:17,274 INFO HandlerThread:12140 [interfaces.py:finish():202] Joined disk monitor +2023-04-21 23:06:17,322 INFO HandlerThread:12140 [interfaces.py:finish():202] Joined gpu monitor +2023-04-21 23:06:17,323 INFO HandlerThread:12140 [interfaces.py:finish():202] Joined memory monitor +2023-04-21 23:06:17,323 INFO HandlerThread:12140 [interfaces.py:finish():202] Joined network monitor +2023-04-21 23:06:17,323 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,323 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-21 23:06:17,323 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 3 +2023-04-21 23:06:17,324 DEBUG SenderThread:12140 [sender.py:send():375] send: stats +2023-04-21 23:06:17,324 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,324 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-21 23:06:17,324 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,324 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-21 23:06:17,324 INFO SenderThread:12140 
[sender.py:transition_state():626] send defer: 4 +2023-04-21 23:06:17,325 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,325 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-21 23:06:17,325 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,325 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-21 23:06:17,326 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 5 +2023-04-21 23:06:17,326 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,326 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-21 23:06:17,327 DEBUG SenderThread:12140 [sender.py:send():375] send: summary +2023-04-21 23:06:17,327 INFO SenderThread:12140 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 23:06:17,328 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,328 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-21 23:06:17,328 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 6 +2023-04-21 23:06:17,328 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,328 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-21 23:06:17,328 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,328 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-21 23:06:17,328 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 7 +2023-04-21 23:06:17,328 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:06:17,329 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:17,329 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-21 23:06:17,329 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:17,329 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-21 23:06:17,744 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-summary.json +2023-04-21 23:06:18,332 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 23:06:18,690 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 8 +2023-04-21 23:06:18,691 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 23:06:18,691 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:18,691 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 23:06:18,691 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:18,691 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 23:06:18,720 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 9 +2023-04-21 23:06:18,720 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:18,720 DEBUG 
SenderThread:12140 [sender.py:send():375] send: artifact +2023-04-21 23:06:18,720 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 23:06:18,753 INFO Thread-16 :12140 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:06:19,343 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 23:06:20,484 INFO wandb-upload_0:12140 [upload_job.py:push():92] Skipped uploading C:\Users\Lenovo\AppData\Local\wandb\wandb\artifacts\staging\tmp6sf25d7k +2023-04-21 23:06:22,904 INFO wandb-upload_1:12140 [upload_job.py:push():95] Uploaded file C:\Users\Lenovo\AppData\Local\wandb\wandb\artifacts\staging\tmptrtt3vnx +2023-04-21 23:06:24,395 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 23:06:24,982 INFO SenderThread:12140 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'digest': 'ab8fa958b17d8260d4017e4943372340', 'state': 'PENDING', 'aliases': [], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5NzAxNTEw', 'versionIndex': 4}}, 'version': 'latest'} +2023-04-21 23:06:24,982 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:24,982 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:06:24,982 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 23:06:24,982 INFO SenderThread:12140 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 23:06:25,824 INFO SenderThread:12140 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files +2023-04-21 23:06:25,824 INFO SenderThread:12140 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\config.yaml config.yaml +2023-04-21 23:06:25,825 INFO SenderThread:12140 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log output.log +2023-04-21 23:06:25,827 INFO SenderThread:12140 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\requirements.txt requirements.txt +2023-04-21 23:06:25,830 INFO SenderThread:12140 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-metadata.json wandb-metadata.json +2023-04-21 23:06:25,830 INFO SenderThread:12140 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-summary.json wandb-summary.json +2023-04-21 23:06:25,833 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 10 +2023-04-21 23:06:25,833 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 23:06:25,833 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:25,834 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 23:06:25,836 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:25,836 INFO SenderThread:12140 
[sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 23:06:25,836 INFO SenderThread:12140 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 23:06:26,846 INFO wandb-upload_0:12140 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\config.yaml +2023-04-21 23:06:27,222 INFO wandb-upload_1:12140 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\output.log +2023-04-21 23:06:28,238 INFO wandb-upload_3:12140 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\wandb-summary.json +2023-04-21 23:06:29,430 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 23:06:29,688 ERROR wandb-upload_2:12140 [internal_api.py:upload_file():2099] upload_file exception https://storage.googleapis.com/wandb-production.appspot.com/anony-mouse-536056/huggingface/tqce6lr7/requirements.txt?Expires=1682175987&GoogleAccessId=wandb-production%40appspot.gserviceaccount.com&Signature=iQhCG8ukz0tz8P1C3WsC%2F8T3HdbkGB8BVz342%2FBjcmthpKZikj0Q3wEJrL6BcPjbq68djFgI5bvCGx%2B5a0RfwuwnK0D9UcnpI4fl15kyNZyECnVRlSVndH7M5dMK1nNgPaMra0DVWUa2vNLTkReeq3zqV43aFdFqDhRhi2aeAuX7ZYYdCcky%2BYe5UcltoWQ%2Bkb5%2BjTC1ZemSgAzeLza3EBXR2b0D2%2B5aFk%2BJB0HDFN3Iz0KxjDa4EBLwtyCL2CqtUslHp4kZuBQtHAA8W%2BR%2FWtJoymjWu%2FnXMvVKAyNb%2BbpAuXa2vGHC%2FMKhS%2FS9IFOj0YjwE9YOkjq%2FVdt90DqAvQ%3D%3D: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Max retries exceeded with url: /wandb-production.appspot.com/anony-mouse-536056/huggingface/tqce6lr7/requirements.txt?Expires=1682175987&GoogleAccessId=wandb-production%40appspot.gserviceaccount.com&Signature=iQhCG8ukz0tz8P1C3WsC%2F8T3HdbkGB8BVz342%2FBjcmthpKZikj0Q3wEJrL6BcPjbq68djFgI5bvCGx%2B5a0RfwuwnK0D9UcnpI4fl15kyNZyECnVRlSVndH7M5dMK1nNgPaMra0DVWUa2vNLTkReeq3zqV43aFdFqDhRhi2aeAuX7ZYYdCcky%2BYe5UcltoWQ%2Bkb5%2BjTC1ZemSgAzeLza3EBXR2b0D2%2B5aFk%2BJB0HDFN3Iz0KxjDa4EBLwtyCL2CqtUslHp4kZuBQtHAA8W%2BR%2FWtJoymjWu%2FnXMvVKAyNb%2BbpAuXa2vGHC%2FMKhS%2FS9IFOj0YjwE9YOkjq%2FVdt90DqAvQ%3D%3D (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1131)'))) +2023-04-21 23:06:29,689 ERROR wandb-upload_2:12140 [internal_api.py:upload_file():2101] upload_file request headers: {'User-Agent': 'python-requests/2.27.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '8664'} +2023-04-21 23:06:29,689 ERROR wandb-upload_2:12140 [internal_api.py:upload_file():2103] upload_file response body: +2023-04-21 23:06:30,884 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:06:31,690 INFO wandb-upload_2:12140 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\files\requirements.txt +2023-04-21 23:06:31,897 INFO Thread-15 :12140 [sender.py:transition_state():626] send defer: 11 +2023-04-21 23:06:31,897 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:31,898 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 23:06:31,898 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:31,898 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 23:06:31,898 INFO SenderThread:12140 
[file_pusher.py:join():173] waiting for file pusher +2023-04-21 23:06:31,898 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 12 +2023-04-21 23:06:31,898 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:31,899 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 23:06:31,899 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:31,899 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 23:06:33,534 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 13 +2023-04-21 23:06:33,534 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:33,534 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 23:06:33,534 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:33,534 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 23:06:33,534 INFO SenderThread:12140 [sender.py:transition_state():626] send defer: 14 +2023-04-21 23:06:33,535 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:06:33,535 INFO HandlerThread:12140 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 23:06:33,535 DEBUG SenderThread:12140 [sender.py:send():375] send: final +2023-04-21 23:06:33,535 DEBUG SenderThread:12140 [sender.py:send():375] send: footer +2023-04-21 23:06:33,535 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: defer +2023-04-21 23:06:33,535 INFO SenderThread:12140 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 23:06:33,536 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 23:06:33,536 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 23:06:33,536 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 23:06:33,536 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 23:06:33,536 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 23:06:33,536 DEBUG SenderThread:12140 [sender.py:send_request():402] send_request: server_info +2023-04-21 23:06:33,788 INFO MainThread:12140 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 23:06:33,803 INFO MainThread:12140 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 23:06:33,806 INFO MainThread:12140 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 23:06:33,808 DEBUG HandlerThread:12140 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 23:06:33,808 INFO HandlerThread:12140 [handler.py:finish():845] shutting down handler +2023-04-21 23:06:34,551 INFO WriterThread:12140 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\run-tqce6lr7.wandb +2023-04-21 23:06:34,799 INFO SenderThread:12140 [sender.py:finish():1550] shutting down sender +2023-04-21 23:06:34,799 INFO SenderThread:12140 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 23:06:34,799 INFO SenderThread:12140 [file_pusher.py:join():173] waiting for file pusher diff --git 
a/ptuning/wandb/run-20230421_225733-tqce6lr7/logs/debug.log b/ptuning/wandb/run-20230421_225733-tqce6lr7/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..62aa6b0e9fbe7bec31a9318b1f0d5d3dcb52f13d --- /dev/null +++ b/ptuning/wandb/run-20230421_225733-tqce6lr7/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_setup.py:_flush():76] Configure stats pid to 13408 +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\logs\debug.log +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_225733-tqce6lr7\logs\debug-internal.log +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_init.py:init():547] calling init triggers +2023-04-21 22:57:33,987 INFO MainThread:13408 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 22:57:33,988 INFO MainThread:13408 [wandb_init.py:init():595] starting backend +2023-04-21 22:57:33,988 INFO MainThread:13408 [wandb_init.py:init():599] setting up manager +2023-04-21 22:57:33,990 INFO MainThread:13408 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 22:57:33,995 INFO MainThread:13408 [wandb_init.py:init():605] backend started and connected +2023-04-21 22:57:33,996 INFO MainThread:13408 [wandb_init.py:init():695] updated telemetry +2023-04-21 22:57:34,060 INFO MainThread:13408 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 22:57:34,876 INFO MainThread:13408 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 22:57:36,737 INFO MainThread:13408 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 22:57:36,737 INFO MainThread:13408 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 22:57:37,006 INFO MainThread:13408 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 22:57:37,006 INFO MainThread:13408 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 22:57:37,006 INFO MainThread:13408 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 22:57:37,007 INFO MainThread:13408 [wandb_run.py:_redirect():2102] Redirects installed. 
+2023-04-21 22:57:37,007 INFO MainThread:13408 [wandb_init.py:init():824] run started, returning control to user process +2023-04-21 22:57:37,010 INFO MainThread:13408 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr21_22-57-22_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 100, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 
'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-21 23:06:35,692 WARNING MsgRouterThr:13408 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230421_225733-tqce6lr7/run-tqce6lr7.wandb b/ptuning/wandb/run-20230421_225733-tqce6lr7/run-tqce6lr7.wandb new file mode 100644 index 0000000000000000000000000000000000000000..15d911bc96452cf8138bc5d7e0a47d5f59e03ed7 Binary files /dev/null and b/ptuning/wandb/run-20230421_225733-tqce6lr7/run-tqce6lr7.wandb differ diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/config.yaml b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..ae2cc0bc0f73b10538817b64b1db24c15de43afa --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/config.yaml @@ -0,0 +1,616 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682089626.089182 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 + - 1: train/loss + 5: 1 + 6: + - 1 + - 1: train/learning_rate + 5: 1 + 6: + - 1 + - 1: train/epoch + 5: 1 + 6: + - 1 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 
+use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: 
null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr21_23-06-54_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 10 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + 
desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/output.log b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..35529a3c8917f4d10ea3f251a288511209dc5da8 --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/output.log @@ -0,0 +1,145 @@ + + 0%| | 0/1000 [00:00> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\config.json +[INFO|configuration_utils.py:362] 2023-04-21 23:11:14,364 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 23:11:14,598 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 23:11:14,603 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 23:11:14,604 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\special_tokens_map.json +{'loss': 2.5933, 'learning_rate': 0.0198, 'epoch': 1.58} +Saving PrefixEncoder + + + + + + + + + 2%|▋ | 19/1000 [07:37<6:22:46, 23.41s/it] +{'loss': 0.6242, 'learning_rate': 0.0196, 'epoch': 3.17} + 2%|▊ | 20/1000 [08:00<6:21:36, 23.36s/it][INFO|configuration_utils.py:457] 2023-04-21 23:15:08,506 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\config.json +[INFO|configuration_utils.py:362] 2023-04-21 23:15:08,510 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 23:15:08,727 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 23:15:08,732 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 23:15:08,734 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\special_tokens_map.json + + + + + + + + + 3%|█ | 29/1000 [10:51<4:24:54, 16.37s/it] +{'loss': 0.3263, 
'learning_rate': 0.0194, 'epoch': 4.75} + 3%|█▏ | 30/1000 [11:06<4:13:45, 15.70s/it][INFO|configuration_utils.py:457] 2023-04-21 23:18:13,918 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\config.json +[INFO|configuration_utils.py:362] 2023-04-21 23:18:13,920 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 23:18:14,122 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 23:18:14,127 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 23:18:14,128 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\special_tokens_map.json + + + + + + + + + 4%|█▌ | 40/1000 [13:28<3:57:02, 14.81s/it][INFO|configuration_utils.py:457] 2023-04-21 23:20:36,822 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\config.json +[INFO|configuration_utils.py:362] 2023-04-21 23:20:36,826 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-21 23:20:37,064 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-21 23:20:37,069 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-21 23:20:37,071 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\special_tokens_map.json +{'loss': 0.2162, 'learning_rate': 0.0192, 'epoch': 6.34} +Saving PrefixEncoder + 4%|█▌ | 41/1000 [13:55<4:52:14, 18.28s/it]Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = 
console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer + for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 632, in get_tokens_unprocessed + m = rexmatch(text, pos) +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/requirements.txt b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..b1c5887dbde19aeef8b7d993f1ad21a385d07e57 --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/requirements.txt @@ -0,0 +1,451 @@ +-pencv-python==4.5.5.62 
+-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 
+mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pypiwin32==223 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywin32==306 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 
+streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/wandb-metadata.json b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..6dc3918344eb151fa019b6c2034b7fc0d3403c1a --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-21T15:07:07.763382", + "startedAt": "2023-04-21T15:07:06.078182", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\chat\\train.json", + "--validation_file", + ".\\datasets\\chat\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "10", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 235.76929092407227 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + 
"memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/wandb-summary.json b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..6ddb2f01f58b8fe583591c36944b6e7760e84722 --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/files/wandb-summary.json @@ -0,0 +1 @@ +{"train/loss": 0.2162, "train/learning_rate": 0.0192, "train/epoch": 6.34, "train/global_step": 40, "_timestamp": 1682090436.776966, "_runtime": 810.6877841949463, "_step": 3, "_wandb": {"runtime": 854}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/logs/debug-internal.log b/ptuning/wandb/run-20230421_230706-1i2p1xzz/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..5166483325e1f695e6e19903aab643792840e8ee --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/logs/debug-internal.log @@ -0,0 +1,970 @@ +2023-04-21 23:07:06,088 INFO StreamThr :22944 [internal.py:wandb_internal():86] W&B internal server running at pid: 22944, started at: 2023-04-21 23:07:06.088184 +2023-04-21 23:07:06,090 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status +2023-04-21 23:07:06,090 INFO WriterThread:22944 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\run-1i2p1xzz.wandb +2023-04-21 23:07:06,094 DEBUG SenderThread:22944 [sender.py:send():375] send: header +2023-04-21 23:07:06,157 DEBUG SenderThread:22944 [sender.py:send():375] send: run +2023-04-21 23:07:06,955 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: check_version +2023-04-21 23:07:06,956 INFO SenderThread:22944 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files +2023-04-21 23:07:06,956 INFO SenderThread:22944 [sender.py:_start_run_threads():1124] run started: 1i2p1xzz with start time 1682089626.089182 +2023-04-21 23:07:06,956 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: summary_record +2023-04-21 23:07:06,957 INFO SenderThread:22944 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-21 23:07:06,957 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: check_version +2023-04-21 23:07:07,646 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: run_start +2023-04-21 23:07:07,693 DEBUG HandlerThread:22944 [system_info.py:__init__():31] System info init +2023-04-21 23:07:07,693 DEBUG HandlerThread:22944 [system_info.py:__init__():46] System info init done +2023-04-21 23:07:07,693 INFO HandlerThread:22944 [system_monitor.py:start():181] Starting system monitor +2023-04-21 23:07:07,694 INFO SystemMonitor:22944 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-21 23:07:07,694 INFO HandlerThread:22944 [system_monitor.py:probe():201] Collecting system info +2023-04-21 23:07:07,700 INFO SystemMonitor:22944 [interfaces.py:start():190] Started cpu monitoring +2023-04-21 23:07:07,701 INFO SystemMonitor:22944 [interfaces.py:start():190] Started disk monitoring +2023-04-21 23:07:07,701 INFO SystemMonitor:22944 [interfaces.py:start():190] Started gpu monitoring +2023-04-21 23:07:07,715 INFO SystemMonitor:22944 [interfaces.py:start():190] Started memory monitoring +2023-04-21 23:07:07,748 INFO SystemMonitor:22944 [interfaces.py:start():190] 
Started network monitoring +2023-04-21 23:07:07,763 DEBUG HandlerThread:22944 [system_info.py:probe():195] Probing system +2023-04-21 23:07:07,765 DEBUG HandlerThread:22944 [system_info.py:_probe_git():180] Probing git +2023-04-21 23:07:07,768 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:07,862 DEBUG HandlerThread:22944 [system_info.py:_probe_git():188] Probing git done +2023-04-21 23:07:07,862 DEBUG HandlerThread:22944 [system_info.py:probe():240] Probing system done +2023-04-21 23:07:07,862 DEBUG HandlerThread:22944 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-21T15:07:07.763382', 'startedAt': '2023-04-21T15:07:06.078182', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\chat\\train.json', '--validation_file', '.\\datasets\\chat\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '10', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 235.76929092407227}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-21 23:07:07,862 INFO HandlerThread:22944 [system_monitor.py:probe():211] Finished collecting system info +2023-04-21 23:07:07,862 INFO HandlerThread:22944 [system_monitor.py:probe():214] Publishing system info +2023-04-21 23:07:07,862 DEBUG HandlerThread:22944 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-21 23:07:07,863 DEBUG HandlerThread:22944 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-21 23:07:07,864 INFO HandlerThread:22944 [system_monitor.py:probe():216] Finished publishing system info +2023-04-21 23:07:07,876 DEBUG SenderThread:22944 [sender.py:send():375] send: files +2023-04-21 23:07:07,876 INFO SenderThread:22944 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-21 23:07:07,889 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:07:07,890 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:07:07,968 INFO Thread-16 :22944 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\wandb-summary.json +2023-04-21 23:07:07,968 INFO Thread-16 :22944 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\wandb-metadata.json +2023-04-21 23:07:07,969 INFO Thread-16 
:22944 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\requirements.txt +2023-04-21 23:07:08,283 DEBUG SenderThread:22944 [sender.py:send():375] send: telemetry +2023-04-21 23:07:08,283 DEBUG SenderThread:22944 [sender.py:send():375] send: config +2023-04-21 23:07:08,284 DEBUG SenderThread:22944 [sender.py:send():375] send: metric +2023-04-21 23:07:08,284 DEBUG SenderThread:22944 [sender.py:send():375] send: telemetry +2023-04-21 23:07:08,285 DEBUG SenderThread:22944 [sender.py:send():375] send: metric +2023-04-21 23:07:08,285 WARNING SenderThread:22944 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-21 23:07:08,973 INFO Thread-16 :22944 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:07:09,801 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:10,978 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:07:11,033 INFO wandb-upload_0:22944 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmp75xykfhvwandb\u6pe2ft2-wandb-metadata.json +2023-04-21 23:07:11,502 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:11,845 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:11,988 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:07:13,897 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:15,933 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:16,543 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:17,966 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:20,016 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:21,595 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:22,059 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:22,903 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:07:22,904 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:07:24,099 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:26,146 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:27,182 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:28,194 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:30,238 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:32,238 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:32,283 ERROR gpu :22944 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:34,336 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:35,333 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:07:36,373 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:37,915 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:07:37,918 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:38,466 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:38,765 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:07:39,434 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\config.yaml +2023-04-21 23:07:40,488 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:42,518 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:44,055 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:44,543 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:46,579 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:48,620 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:49,108 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:50,680 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:52,722 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:52,926 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:07:52,926 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:07:54,416 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:07:54,759 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:07:54,770 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:56,808 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:58,842 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:07:59,455 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:00,896 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:02,931 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:04,505 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:04,980 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-21 23:08:07,008 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:07,761 DEBUG SystemMonitor:22944 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-21 23:08:07,762 DEBUG SenderThread:22944 [sender.py:send():375] send: stats +2023-04-21 23:08:07,935 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:08:07,935 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:08:09,129 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:10,193 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:11,145 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:13,163 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:15,189 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:08:15,201 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:15,926 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:17,242 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:19,283 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:20,957 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:21,338 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:22,941 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:08:22,941 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:08:23,382 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:25,430 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:26,226 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:27,477 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:29,527 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:31,272 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:31,566 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:33,615 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:35,654 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:36,324 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:37,713 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:37,764 DEBUG SenderThread:22944 [sender.py:send():375] send: stats +2023-04-21 23:08:37,952 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 
23:08:37,953 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:08:39,840 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:41,854 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:42,244 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:43,869 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:45,894 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:47,301 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:47,938 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:48,941 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:08:49,988 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:52,025 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:52,958 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:08:52,959 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:08:53,218 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:08:54,058 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:56,101 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:58,148 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:08:58,254 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:00,185 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:02,240 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:03,300 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:04,279 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:06,339 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:07,780 DEBUG SenderThread:22944 [sender.py:send():375] send: stats +2023-04-21 23:09:07,966 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:09:07,967 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:09:08,380 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:09,236 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:10,514 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:12,532 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:14,274 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:14,548 ERROR gpu :22944 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:16,577 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:18,607 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:19,326 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:20,671 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:22,712 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:22,971 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:09:22,972 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:09:23,703 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:09:24,769 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:25,229 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:26,838 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:28,863 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:30,281 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:30,912 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:32,958 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:35,001 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:35,334 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:37,035 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:37,784 DEBUG SenderThread:22944 [sender.py:send():375] send: stats +2023-04-21 23:09:37,971 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:09:37,972 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:09:39,073 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:41,187 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:41,263 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:43,213 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:45,208 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:09:45,231 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:46,350 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:47,261 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:49,301 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-21 23:09:51,356 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:51,405 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:52,976 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:09:52,977 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:09:53,419 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:55,457 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:57,263 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:09:57,507 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:09:59,551 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:01,595 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:02,303 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:03,642 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:05,687 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:07,336 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:07,723 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:07,788 DEBUG SenderThread:22944 [sender.py:send():375] send: stats +2023-04-21 23:10:07,988 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:10:07,988 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:10:09,761 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:10:09,787 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:11,878 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:13,283 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:13,897 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:15,922 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:17,939 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:18,316 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:19,986 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:22,039 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:22,997 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:10:22,998 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:10:24,086 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 
23:10:24,270 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:26,123 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:28,164 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:29,322 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:30,225 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:31,213 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:10:32,253 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:34,287 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:34,840 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:36,334 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:37,793 DEBUG SenderThread:22944 [sender.py:send():375] send: stats +2023-04-21 23:10:38,012 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:10:38,012 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:10:38,372 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:40,304 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:40,418 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:42,525 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:44,542 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:45,338 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:46,566 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:48,605 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:50,397 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:50,642 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:52,684 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:53,024 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: stop_status +2023-04-21 23:10:53,024 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: stop_status +2023-04-21 23:10:53,688 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:10:54,725 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:56,303 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:10:56,767 ERROR gpu :22944 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-21 23:10:58,806 ERROR gpu :22944 
+[wandb debug-internal.log for run-20230421_230706-1i2p1xzz, trimmed: the GPU monitor repeatedly logs "Failed to sample metric: Not Supported" alongside routine status_report/stop_status handling and updates to output.log and wandb-summary.json under E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files; the run exits with code 255 after a runtime of 854 seconds, after which wandb saves the final summary, stops the system monitors (cpu, disk, gpu, memory, network), and works through its deferred shutdown steps.]
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\wandb-summary.json +2023-04-21 23:21:22,850 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 23:21:24,718 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:21:25,028 INFO SenderThread:22944 [sender.py:transition_state():626] send defer: 8 +2023-04-21 23:21:25,028 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 23:21:25,028 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:25,028 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-21 23:21:25,029 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:25,029 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-21 23:21:25,061 INFO SenderThread:22944 [sender.py:transition_state():626] send defer: 9 +2023-04-21 23:21:25,063 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:25,063 DEBUG SenderThread:22944 [sender.py:send():375] send: artifact +2023-04-21 23:21:25,063 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-21 23:21:25,733 INFO Thread-16 :22944 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:21:26,441 INFO SenderThread:22944 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'digest': 'ab8fa958b17d8260d4017e4943372340', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v5'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'versionIndex': 5}}, 'version': 'v5'} +2023-04-21 23:21:26,442 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:26,442 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-21 23:21:26,442 INFO SenderThread:22944 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-21 23:21:26,748 INFO SenderThread:22944 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files +2023-04-21 23:21:26,748 INFO SenderThread:22944 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\config.yaml config.yaml +2023-04-21 23:21:26,749 INFO SenderThread:22944 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log output.log +2023-04-21 23:21:26,751 INFO SenderThread:22944 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\requirements.txt requirements.txt +2023-04-21 23:21:26,753 INFO SenderThread:22944 [dir_watcher.py:finish():409] scan save: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\wandb-metadata.json wandb-metadata.json +2023-04-21 23:21:26,753 INFO SenderThread:22944 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\wandb-summary.json wandb-summary.json +2023-04-21 23:21:26,756 INFO SenderThread:22944 [sender.py:transition_state():626] send defer: 10 +2023-04-21 23:21:26,758 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:26,758 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-21 23:21:26,760 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:26,760 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-21 23:21:26,760 INFO SenderThread:22944 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 23:21:27,904 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 23:21:30,784 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:21:32,157 INFO wandb-upload_3:22944 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\wandb-summary.json +2023-04-21 23:21:32,827 INFO wandb-upload_0:22944 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\config.yaml +2023-04-21 23:21:32,933 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: keepalive +2023-04-21 23:21:33,023 INFO wandb-upload_2:22944 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\requirements.txt +2023-04-21 23:21:34,949 INFO wandb-upload_1:22944 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\files\output.log +2023-04-21 23:21:35,157 INFO Thread-15 :22944 [sender.py:transition_state():626] send defer: 11 +2023-04-21 23:21:35,157 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:35,157 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-21 23:21:35,158 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:35,158 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-21 23:21:35,158 INFO SenderThread:22944 [file_pusher.py:join():173] waiting for file pusher +2023-04-21 23:21:35,158 INFO SenderThread:22944 [sender.py:transition_state():626] send defer: 12 +2023-04-21 23:21:35,158 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:35,158 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-21 23:21:35,159 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:35,159 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-21 23:21:36,260 INFO SenderThread:22944 [sender.py:transition_state():626] send defer: 13 +2023-04-21 23:21:36,260 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:36,260 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-21 23:21:36,260 DEBUG 
HandlerThread:22944 [handler.py:handle_request():144] handle_request: status_report +2023-04-21 23:21:36,260 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:36,260 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-21 23:21:36,260 INFO SenderThread:22944 [sender.py:transition_state():626] send defer: 14 +2023-04-21 23:21:36,260 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: defer +2023-04-21 23:21:36,261 DEBUG SenderThread:22944 [sender.py:send():375] send: final +2023-04-21 23:21:36,261 INFO HandlerThread:22944 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-21 23:21:36,261 DEBUG SenderThread:22944 [sender.py:send():375] send: footer +2023-04-21 23:21:36,261 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: defer +2023-04-21 23:21:36,261 INFO SenderThread:22944 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-21 23:21:36,261 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-21 23:21:36,262 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: server_info +2023-04-21 23:21:36,262 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: get_summary +2023-04-21 23:21:36,262 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: poll_exit +2023-04-21 23:21:36,262 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-21 23:21:36,262 DEBUG SenderThread:22944 [sender.py:send_request():402] send_request: server_info +2023-04-21 23:21:36,509 INFO MainThread:22944 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-21 23:21:36,509 INFO MainThread:22944 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-21 23:21:36,512 INFO MainThread:22944 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-21 23:21:36,513 DEBUG HandlerThread:22944 [handler.py:handle_request():144] handle_request: shutdown +2023-04-21 23:21:36,513 INFO HandlerThread:22944 [handler.py:finish():845] shutting down handler +2023-04-21 23:21:37,278 INFO WriterThread:22944 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\run-1i2p1xzz.wandb +2023-04-21 23:21:37,522 INFO SenderThread:22944 [sender.py:finish():1550] shutting down sender +2023-04-21 23:21:37,522 INFO SenderThread:22944 [file_pusher.py:finish():168] shutting down file pusher +2023-04-21 23:21:37,522 INFO SenderThread:22944 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230421_230706-1i2p1xzz/logs/debug.log b/ptuning/wandb/run-20230421_230706-1i2p1xzz/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..7e53afc6adaa0c83043c9d596264e8351ee0326a --- /dev/null +++ b/ptuning/wandb/run-20230421_230706-1i2p1xzz/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-21 23:07:06,081 INFO MainThread:8532 [wandb_setup.py:_flush():76] Configure stats pid to 8532 +2023-04-21 23:07:06,081 INFO MainThread:8532 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-21 23:07:06,081 INFO MainThread:8532 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-21 23:07:06,081 INFO MainThread:8532 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-21 
23:07:06,081 INFO MainThread:8532 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-21 23:07:06,081 INFO MainThread:8532 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-21 23:07:06,082 INFO MainThread:8532 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\logs\debug.log +2023-04-21 23:07:06,082 INFO MainThread:8532 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230421_230706-1i2p1xzz\logs\debug-internal.log +2023-04-21 23:07:06,082 INFO MainThread:8532 [wandb_init.py:init():547] calling init triggers +2023-04-21 23:07:06,082 INFO MainThread:8532 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-21 23:07:06,082 INFO MainThread:8532 [wandb_init.py:init():595] starting backend +2023-04-21 23:07:06,082 INFO MainThread:8532 [wandb_init.py:init():599] setting up manager +2023-04-21 23:07:06,085 INFO MainThread:8532 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-21 23:07:06,088 INFO MainThread:8532 [wandb_init.py:init():605] backend started and connected +2023-04-21 23:07:06,090 INFO MainThread:8532 [wandb_init.py:init():695] updated telemetry +2023-04-21 23:07:06,157 INFO MainThread:8532 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-21 23:07:06,955 INFO MainThread:8532 [wandb_run.py:_on_init():2176] communicating current version +2023-04-21 23:07:07,639 INFO MainThread:8532 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-21 23:07:07,639 INFO MainThread:8532 [wandb_init.py:init():782] starting run threads in backend +2023-04-21 23:07:07,890 INFO MainThread:8532 [wandb_run.py:_console_start():2157] atexit reg +2023-04-21 23:07:07,890 INFO MainThread:8532 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-21 23:07:07,891 INFO MainThread:8532 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-21 23:07:07,891 INFO MainThread:8532 [wandb_run.py:_redirect():2102] Redirects installed. 
15:53:12,992 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 1 +2023-04-22 15:53:12,992 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:12,992 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-22 15:53:12,992 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:12,992 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-22 15:53:12,992 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 2 +2023-04-22 15:53:12,992 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:12,993 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-22 15:53:12,993 INFO HandlerThread:33684 [system_monitor.py:finish():190] Stopping system monitor +2023-04-22 15:53:12,993 DEBUG SystemMonitor:33684 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-22 15:53:12,993 INFO HandlerThread:33684 [interfaces.py:finish():202] Joined cpu monitor +2023-04-22 15:53:12,993 DEBUG SystemMonitor:33684 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-22 15:53:12,993 INFO HandlerThread:33684 [interfaces.py:finish():202] Joined disk monitor +2023-04-22 15:53:12,993 DEBUG SystemMonitor:33684 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-22 15:53:13,070 INFO HandlerThread:33684 [interfaces.py:finish():202] Joined gpu monitor +2023-04-22 15:53:13,070 INFO HandlerThread:33684 [interfaces.py:finish():202] Joined memory monitor +2023-04-22 15:53:13,070 INFO HandlerThread:33684 [interfaces.py:finish():202] Joined network monitor +2023-04-22 15:53:13,071 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:13,071 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-22 15:53:13,071 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 3 +2023-04-22 15:53:13,071 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:13,071 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-22 15:53:13,073 DEBUG SenderThread:33684 [sender.py:send():375] send: stats +2023-04-22 15:53:13,074 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:13,074 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-22 15:53:13,074 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 4 +2023-04-22 15:53:13,074 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:13,074 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-22 15:53:13,074 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:13,074 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-22 15:53:13,074 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 5 +2023-04-22 15:53:13,074 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:13,074 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-22 15:53:13,075 DEBUG SenderThread:33684 [sender.py:send():375] send: summary +2023-04-22 15:53:13,075 
INFO SenderThread:33684 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 15:53:13,076 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:13,076 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-22 15:53:13,076 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 6 +2023-04-22 15:53:13,076 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:13,076 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-22 15:53:13,076 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:13,076 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-22 15:53:13,080 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 15:53:13,427 INFO Thread-16 :33684 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\wandb-summary.json +2023-04-22 15:53:13,706 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 7 +2023-04-22 15:53:13,706 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:13,706 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-22 15:53:13,706 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:13,706 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-22 15:53:14,056 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 15:53:14,433 INFO Thread-16 :33684 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\config.yaml +2023-04-22 15:53:14,433 INFO Thread-16 :33684 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\output.log +2023-04-22 15:53:16,166 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 8 +2023-04-22 15:53:16,166 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 15:53:16,166 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:16,166 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-22 15:53:16,167 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:16,167 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-22 15:53:16,180 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 9 +2023-04-22 15:53:16,181 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:16,181 DEBUG SenderThread:33684 [sender.py:send():375] send: artifact +2023-04-22 15:53:16,181 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-22 15:53:16,461 INFO Thread-16 :33684 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\output.log +2023-04-22 15:53:17,602 INFO SenderThread:33684 [sender.py:send_artifact():1474] sent artifact 
job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5NzAxNTEw', 'digest': '1077e319ad39e537c1ccc8f9a5c233bc', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v4'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'versionIndex': 5}}, 'version': 'v4'} +2023-04-22 15:53:17,602 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:17,602 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-22 15:53:17,603 INFO SenderThread:33684 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-22 15:53:18,483 INFO SenderThread:33684 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files +2023-04-22 15:53:18,483 INFO SenderThread:33684 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\config.yaml config.yaml +2023-04-22 15:53:18,484 INFO SenderThread:33684 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\output.log output.log +2023-04-22 15:53:18,486 INFO SenderThread:33684 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\requirements.txt requirements.txt +2023-04-22 15:53:18,489 INFO SenderThread:33684 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\wandb-metadata.json wandb-metadata.json +2023-04-22 15:53:18,489 INFO SenderThread:33684 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\wandb-summary.json wandb-summary.json +2023-04-22 15:53:18,492 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 10 +2023-04-22 15:53:18,492 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:18,492 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-22 15:53:18,494 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:18,494 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-22 15:53:18,495 INFO SenderThread:33684 [file_pusher.py:finish():168] shutting down file pusher +2023-04-22 15:53:19,107 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: keepalive +2023-04-22 15:53:19,301 INFO wandb-upload_1:33684 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\output.log +2023-04-22 15:53:19,632 INFO wandb-upload_0:33684 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\config.yaml +2023-04-22 15:53:19,821 INFO wandb-upload_2:33684 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\requirements.txt +2023-04-22 15:53:20,243 INFO wandb-upload_3:33684 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\files\wandb-summary.json +2023-04-22 
15:53:20,445 INFO Thread-15 :33684 [sender.py:transition_state():626] send defer: 11 +2023-04-22 15:53:20,445 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:20,445 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-22 15:53:20,446 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:20,446 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-22 15:53:20,446 INFO SenderThread:33684 [file_pusher.py:join():173] waiting for file pusher +2023-04-22 15:53:20,446 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 12 +2023-04-22 15:53:20,446 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:20,446 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-22 15:53:20,446 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:20,446 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-22 15:53:20,872 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 13 +2023-04-22 15:53:20,872 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:20,872 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-22 15:53:20,872 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:20,872 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-22 15:53:20,872 INFO SenderThread:33684 [sender.py:transition_state():626] send defer: 14 +2023-04-22 15:53:20,873 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: defer +2023-04-22 15:53:20,873 DEBUG SenderThread:33684 [sender.py:send():375] send: final +2023-04-22 15:53:20,873 INFO HandlerThread:33684 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-22 15:53:20,873 DEBUG SenderThread:33684 [sender.py:send():375] send: footer +2023-04-22 15:53:20,873 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: defer +2023-04-22 15:53:20,873 INFO SenderThread:33684 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-22 15:53:20,874 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 15:53:20,874 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: server_info +2023-04-22 15:53:20,874 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: get_summary +2023-04-22 15:53:20,875 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 15:53:20,875 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-22 15:53:20,875 DEBUG SenderThread:33684 [sender.py:send_request():402] send_request: server_info +2023-04-22 15:53:21,122 INFO MainThread:33684 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-22 15:53:21,123 INFO MainThread:33684 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-22 15:53:21,123 INFO MainThread:33684 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-22 15:53:21,125 DEBUG HandlerThread:33684 [handler.py:handle_request():144] handle_request: shutdown +2023-04-22 15:53:21,125 INFO HandlerThread:33684 [handler.py:finish():845] shutting 
down handler +2023-04-22 15:53:21,881 INFO WriterThread:33684 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\run-30hvteby.wandb +2023-04-22 15:53:22,129 INFO SenderThread:33684 [sender.py:finish():1550] shutting down sender +2023-04-22 15:53:22,129 INFO SenderThread:33684 [file_pusher.py:finish():168] shutting down file pusher +2023-04-22 15:53:22,129 INFO SenderThread:33684 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230422_155255-30hvteby/logs/debug.log b/ptuning/wandb/run-20230422_155255-30hvteby/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..9fe6081c16c072e4b5ca480008cb50f17edee777 --- /dev/null +++ b/ptuning/wandb/run-20230422_155255-30hvteby/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-22 15:52:55,525 INFO MainThread:18056 [wandb_setup.py:_flush():76] Configure stats pid to 18056 +2023-04-22 15:52:55,525 INFO MainThread:18056 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\logs\debug.log +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155255-30hvteby\logs\debug-internal.log +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_init.py:init():547] calling init triggers +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_init.py:init():595] starting backend +2023-04-22 15:52:55,526 INFO MainThread:18056 [wandb_init.py:init():599] setting up manager +2023-04-22 15:52:55,529 INFO MainThread:18056 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-22 15:52:55,535 INFO MainThread:18056 [wandb_init.py:init():605] backend started and connected +2023-04-22 15:52:55,536 INFO MainThread:18056 [wandb_init.py:init():695] updated telemetry +2023-04-22 15:52:55,606 INFO MainThread:18056 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-22 15:52:56,298 INFO MainThread:18056 [wandb_run.py:_on_init():2176] communicating current version +2023-04-22 15:52:56,951 INFO MainThread:18056 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! 
To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-22 15:52:56,951 INFO MainThread:18056 [wandb_init.py:init():782] starting run threads in backend +2023-04-22 15:52:57,231 INFO MainThread:18056 [wandb_run.py:_console_start():2157] atexit reg +2023-04-22 15:52:57,231 INFO MainThread:18056 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-22 15:52:57,231 INFO MainThread:18056 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-22 15:52:57,231 INFO MainThread:18056 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-22 15:52:57,232 INFO MainThread:18056 [wandb_init.py:init():824] run started, returning control to user process +2023-04-22 15:52:57,235 INFO MainThread:18056 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 
'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr22_15-52-40_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 10, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-22 15:53:23,042 WARNING MsgRouterThr:18056 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230422_155255-30hvteby/run-30hvteby.wandb b/ptuning/wandb/run-20230422_155255-30hvteby/run-30hvteby.wandb new file mode 100644 index 0000000000000000000000000000000000000000..a698ea47265d6a8876971f4c0d27f27ba03bfa77 Binary files /dev/null and b/ptuning/wandb/run-20230422_155255-30hvteby/run-30hvteby.wandb differ diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/config.yaml b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..ba7ef1142be6a2920bb405c86e74493fd9a42cc1 --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/config.yaml @@ -0,0 +1,616 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682150047.057658 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 
+ - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 + - 1: train/loss + 5: 1 + 6: + - 1 + - 1: train/learning_rate + 5: 1 + 6: + - 1 + - 1: train/epoch + 5: 1 + 6: + - 1 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 +encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: 
configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr22_15-53-54_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 10 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false +fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' 
+fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/output.log b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..a50e1f7d8706115aee099b5dd0880a11f798d03f --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/output.log @@ -0,0 +1,160 @@ + + 0%| | 0/1000 [00:00> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\config.json +[INFO|configuration_utils.py:362] 2023-04-22 15:58:29,416 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-22 15:58:29,657 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-22 15:58:29,662 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-22 15:58:29,664 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\special_tokens_map.json + + + + + + + + + 2%|█▉ | 19/1000 [08:55<8:52:26, 32.56s/it] +{'loss': 1.6609, 'learning_rate': 0.0196, 'epoch': 16.84} + 2%|██ | 20/1000 [09:09<7:20:43, 26.98s/it][INFO|configuration_utils.py:457] 2023-04-22 16:03:18,188 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\config.json +[INFO|configuration_utils.py:362] 2023-04-22 16:03:18,191 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-22 
16:03:18,399 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-22 16:03:18,403 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-22 16:03:18,405 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\special_tokens_map.json + + + + + + + + + 3%|██▉ | 29/1000 [12:01<4:03:58, 15.08s/it] +{'loss': 0.38, 'learning_rate': 0.0194, 'epoch': 25.26} + 3%|███ | 30/1000 [12:14<3:54:04, 14.48s/it][INFO|configuration_utils.py:457] 2023-04-22 16:06:23,084 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\config.json +[INFO|configuration_utils.py:362] 2023-04-22 16:06:23,086 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-22 16:06:23,292 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-22 16:06:23,296 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-22 16:06:23,296 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\special_tokens_map.json + + + + + + + + + 4%|████ | 39/1000 [14:51<4:40:29, 17.51s/it] +{'loss': 0.0535, 'learning_rate': 0.0192, 'epoch': 33.68} + 4%|████ | 40/1000 [15:11<4:49:19, 18.08s/it][INFO|configuration_utils.py:457] 2023-04-22 16:09:20,023 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\config.json +[INFO|configuration_utils.py:362] 2023-04-22 16:09:20,027 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-22 16:09:20,233 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-22 16:09:20,237 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-22 16:09:20,238 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\special_tokens_map.json + + + + + + + + + 5%|█████▏ | 50/1000 [18:38<5:22:50, 20.39s/it][INFO|configuration_utils.py:457] 2023-04-22 16:12:47,553 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\config.json +[INFO|configuration_utils.py:362] 2023-04-22 16:12:47,556 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-22 16:12:47,773 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-22 16:12:47,780 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-22 16:12:47,781 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\special_tokens_map.json +{'loss': 0.0295, 'learning_rate': 0.019, 'epoch': 42.11} +Saving PrefixEncoder + 5%|█████▎ | 51/1000 [18:59<5:26:08, 20.62s/it]Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File 
"E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt +Error in sys.excepthook: +Traceback (most recent call last): + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1694, in print + extend(render(renderable, render_options)) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\constrain.py", line 29, in __rich_console__ + yield from console.render(self.renderable, child_options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\panel.py", line 220, in __rich_console__ + lines = console.render_lines(renderable, child_options, style=style) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\padding.py", line 97, in __rich_console__ + lines = console.render_lines( + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1366, in render_lines + lines = list( + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 292, in split_and_crop_lines + for segment in segments: + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1330, in render + yield from self.render(render_output, _options) + File "D:\Program\Python38\lib\site-packages\rich\console.py", line 1326, in render + for render_output in iter_render: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 609, in __rich_console__ + segments = Segments(self._get_syntax(console, options)) + File "D:\Program\Python38\lib\site-packages\rich\segment.py", line 668, in __init__ + self.segments = list(segments) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 637, in _get_syntax + text = self.highlight(processed_code, self.line_range) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 509, in highlight + text.append_tokens(tokens_to_spans()) + File "D:\Program\Python38\lib\site-packages\rich\text.py", line 995, in append_tokens + for content, style in tokens: + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 497, in tokens_to_spans + _token_type, token = next(tokens) + File "D:\Program\Python38\lib\site-packages\rich\syntax.py", line 484, in line_tokenize + for token_type, token in lexer.get_tokens(code): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 190, in streamer 
+ for _, t, v in self.get_tokens_unprocessed(text): + File "D:\Program\Python38\lib\site-packages\pygments\lexer.py", line 632, in get_tokens_unprocessed + m = rexmatch(text, pos) +KeyboardInterrupt +Original exception was: +Traceback (most recent call last): + File "main.py", line 444, in + main() + File "main.py", line 383, in main + train_result = trainer.train(resume_from_checkpoint=checkpoint) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1635, in train + return inner_training_loop( + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 1904, in _inner_training_loop + tr_loss_step = self.training_step(model, inputs) + File "E:\Documents\Desktop\ChatGLM-6B\ptuning\trainer.py", line 2665, in training_step + loss.backward() + File "D:\Program\Python38\lib\site-packages\torch\_tensor.py", line 487, in backward + torch.autograd.backward( + File "D:\Program\Python38\lib\site-packages\torch\autograd\__init__.py", line 200, in backward + Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass +KeyboardInterrupt \ No newline at end of file diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/requirements.txt b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..b1c5887dbde19aeef8b7d993f1ad21a385d07e57 --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/requirements.txt @@ -0,0 +1,451 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 +cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 
+h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 +paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 
+pypiwin32==223 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywin32==306 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 +ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/wandb-metadata.json b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..b1b884e168385bc198a5600933a3049f56bdcd5e --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-22T07:54:08.595276", + 
"startedAt": "2023-04-22T07:54:07.044659", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\Zettels\\train.json", + "--validation_file", + ".\\datasets\\Zettels\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "10", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 239.2072868347168 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/wandb-summary.json b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..69cdd0ec98e84b725760bb81e9e5c4c71115ee45 --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/files/wandb-summary.json @@ -0,0 +1 @@ +{"train/loss": 0.0295, "train/learning_rate": 0.019, "train/epoch": 42.11, "train/global_step": 50, "_timestamp": 1682151167.5252383, "_runtime": 1120.467580318451, "_step": 4, "_wandb": {"runtime": 1155}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/logs/debug-internal.log b/ptuning/wandb/run-20230422_155407-eyf9ltgy/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..c26dbd11db9fd57dd28f68b6acb38a91bc63e38b --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/logs/debug-internal.log @@ -0,0 +1,1262 @@ +2023-04-22 15:54:07,056 INFO StreamThr :11636 [internal.py:wandb_internal():86] W&B internal server running at pid: 11636, started at: 2023-04-22 15:54:07.056656 +2023-04-22 15:54:07,058 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status +2023-04-22 15:54:07,059 INFO WriterThread:11636 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\run-eyf9ltgy.wandb +2023-04-22 15:54:07,063 DEBUG SenderThread:11636 [sender.py:send():375] send: header +2023-04-22 15:54:07,138 DEBUG SenderThread:11636 [sender.py:send():375] send: run +2023-04-22 15:54:07,899 INFO SenderThread:11636 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files +2023-04-22 15:54:07,899 INFO SenderThread:11636 [sender.py:_start_run_threads():1124] run started: eyf9ltgy with start time 1682150047.057658 +2023-04-22 15:54:07,899 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: 
summary_record +2023-04-22 15:54:07,900 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 15:54:07,901 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: check_version +2023-04-22 15:54:07,901 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: check_version +2023-04-22 15:54:08,478 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: run_start +2023-04-22 15:54:08,526 DEBUG HandlerThread:11636 [system_info.py:__init__():31] System info init +2023-04-22 15:54:08,526 DEBUG HandlerThread:11636 [system_info.py:__init__():46] System info init done +2023-04-22 15:54:08,526 INFO HandlerThread:11636 [system_monitor.py:start():181] Starting system monitor +2023-04-22 15:54:08,526 INFO SystemMonitor:11636 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-22 15:54:08,527 INFO HandlerThread:11636 [system_monitor.py:probe():201] Collecting system info +2023-04-22 15:54:08,533 INFO SystemMonitor:11636 [interfaces.py:start():190] Started cpu monitoring +2023-04-22 15:54:08,534 INFO SystemMonitor:11636 [interfaces.py:start():190] Started disk monitoring +2023-04-22 15:54:08,535 INFO SystemMonitor:11636 [interfaces.py:start():190] Started gpu monitoring +2023-04-22 15:54:08,550 INFO SystemMonitor:11636 [interfaces.py:start():190] Started memory monitoring +2023-04-22 15:54:08,571 INFO SystemMonitor:11636 [interfaces.py:start():190] Started network monitoring +2023-04-22 15:54:08,595 DEBUG HandlerThread:11636 [system_info.py:probe():195] Probing system +2023-04-22 15:54:08,597 DEBUG HandlerThread:11636 [system_info.py:_probe_git():180] Probing git +2023-04-22 15:54:08,599 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:54:08,699 DEBUG HandlerThread:11636 [system_info.py:_probe_git():188] Probing git done +2023-04-22 15:54:08,699 DEBUG HandlerThread:11636 [system_info.py:probe():240] Probing system done +2023-04-22 15:54:08,699 DEBUG HandlerThread:11636 [system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-22T07:54:08.595276', 'startedAt': '2023-04-22T07:54:07.044659', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\Zettels\\train.json', '--validation_file', '.\\datasets\\Zettels\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '10', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 239.2072868347168}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': 
{'total': 63.87089538574219}} +2023-04-22 15:54:08,699 INFO HandlerThread:11636 [system_monitor.py:probe():211] Finished collecting system info +2023-04-22 15:54:08,699 INFO HandlerThread:11636 [system_monitor.py:probe():214] Publishing system info +2023-04-22 15:54:08,699 DEBUG HandlerThread:11636 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-22 15:54:08,700 DEBUG HandlerThread:11636 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-22 15:54:08,701 INFO HandlerThread:11636 [system_monitor.py:probe():216] Finished publishing system info +2023-04-22 15:54:08,714 DEBUG SenderThread:11636 [sender.py:send():375] send: files +2023-04-22 15:54:08,715 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-22 15:54:08,727 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 15:54:08,728 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 15:54:08,910 INFO Thread-16 :11636 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 15:54:08,910 INFO Thread-16 :11636 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-metadata.json +2023-04-22 15:54:08,910 INFO Thread-16 :11636 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\requirements.txt +2023-04-22 15:54:09,119 DEBUG SenderThread:11636 [sender.py:send():375] send: telemetry +2023-04-22 15:54:09,119 DEBUG SenderThread:11636 [sender.py:send():375] send: config +2023-04-22 15:54:09,119 DEBUG SenderThread:11636 [sender.py:send():375] send: metric +2023-04-22 15:54:09,120 DEBUG SenderThread:11636 [sender.py:send():375] send: telemetry +2023-04-22 15:54:09,120 DEBUG SenderThread:11636 [sender.py:send():375] send: metric +2023-04-22 15:54:09,120 WARNING SenderThread:11636 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-22 15:54:09,823 INFO wandb-upload_0:11636 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpx9u2_pv0wandb\ww8fxgse-wandb-metadata.json +2023-04-22 15:54:09,921 INFO Thread-16 :11636 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 15:54:10,654 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:54:11,940 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 15:54:12,411 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 15:54:12,711 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:54:12,956 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 15:54:14,746 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:54:16,794 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-22 15:54:40,302 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\config.yaml
+2023-04-22 15:55:08,586 DEBUG SystemMonitor:11636 [system_monitor.py:_start():159] Starting system metrics aggregation loop
+2023-04-22 15:58:22,792 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request:
status_report +2023-04-22 15:58:22,847 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:24,003 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 15:58:24,003 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 15:58:24,905 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:26,952 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:28,285 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 15:58:28,996 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:29,399 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: partial_history +2023-04-22 15:58:29,400 DEBUG SenderThread:11636 [sender.py:send():375] send: metric +2023-04-22 15:58:29,401 DEBUG SenderThread:11636 [sender.py:send():375] send: metric +2023-04-22 15:58:29,401 DEBUG SenderThread:11636 [sender.py:send():375] send: metric +2023-04-22 15:58:29,401 DEBUG SenderThread:11636 [sender.py:send():375] send: history +2023-04-22 15:58:29,401 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: summary_record +2023-04-22 15:58:29,402 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 15:58:30,029 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 15:58:30,035 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 15:58:31,261 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 15:58:31,282 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:33,321 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:33,708 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 15:58:35,367 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:37,404 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:38,651 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 15:58:39,006 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 15:58:39,007 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 15:58:39,248 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 15:58:39,449 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:41,496 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:43,545 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 15:58:45,102 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 15:58:45,645 ERROR 
gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported
+2023-04-22 15:58:56,924 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\config.yaml
+2023-04-22 16:02:54,205 DEBUG
HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:02:54,205 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:02:55,927 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:02:56,457 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:02:57,981 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:00,027 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:01,498 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:02,074 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:04,124 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:05,740 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:03:06,164 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:07,200 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:08,219 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:08,714 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:03:09,210 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:03:09,211 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:03:10,291 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:12,329 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:12,494 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:14,382 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:16,423 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:17,544 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:18,173 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: partial_history +2023-04-22 16:03:18,174 DEBUG SenderThread:11636 [sender.py:send():375] send: history +2023-04-22 16:03:18,174 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: summary_record +2023-04-22 16:03:18,175 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 16:03:18,517 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:18,882 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:03:18,883 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 16:03:19,885 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:03:20,600 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:22,713 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:23,448 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:24,229 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:03:24,229 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:03:24,735 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:26,753 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:28,482 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:28,809 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:30,902 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:32,979 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:33,537 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:35,009 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:37,053 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:38,586 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:38,726 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:03:39,108 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:39,236 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:03:39,237 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:03:41,169 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:43,216 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:44,525 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:45,261 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:46,259 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:03:47,313 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:49,358 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:50,472 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:51,412 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:53,517 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:54,231 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:03:54,231 DEBUG 
SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:03:55,493 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:03:55,546 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:57,572 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:03:59,592 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:00,582 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:01,633 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:03,687 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:05,632 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:05,737 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:07,782 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:08,746 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:04:09,247 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:04:09,248 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:04:09,847 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:11,499 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:11,893 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:13,939 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:15,988 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:16,540 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:18,036 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:20,100 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:21,570 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:22,149 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:24,210 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:04:24,243 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:24,252 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:04:24,252 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:04:26,271 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:27,523 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:28,291 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:30,341 ERROR gpu :11636 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:32,381 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:32,573 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:34,434 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:36,478 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:37,613 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:38,547 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:38,750 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:04:39,257 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:04:39,258 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:04:40,625 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:42,679 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:43,605 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:44,724 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:46,758 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:04:46,766 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:48,817 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:49,033 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:50,877 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:52,940 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:54,088 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:04:54,275 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:04:54,276 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:04:55,061 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:57,091 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:59,110 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:04:59,575 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:01,170 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:03,216 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:04,628 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:05,253 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:06,249 INFO Thread-16 :11636 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:05:07,300 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:08,753 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:05:09,289 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:05:09,289 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:05:09,349 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:10,547 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:11,389 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:13,441 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:15,496 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:15,601 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:17,551 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:19,607 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:20,598 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:05:21,182 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:21,653 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:23,704 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:24,300 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:05:24,300 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:05:25,800 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:26,567 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:27,824 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:29,846 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:31,893 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:32,115 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:32,891 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:05:33,943 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:36,002 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:37,162 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:38,044 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 
16:05:38,759 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:05:39,312 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:05:39,312 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:05:40,081 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:42,124 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:42,568 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:44,200 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:45,206 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:05:46,241 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:48,069 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:48,283 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:50,330 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:52,385 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:53,111 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:05:54,304 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:05:54,305 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:05:54,427 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:56,539 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:58,532 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:05:58,560 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:05:58,966 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:00,609 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:02,658 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:04,005 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:04,700 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:06,741 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:08,772 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:06:08,781 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:09,319 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:06:09,319 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:06:09,568 DEBUG HandlerThread:11636 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-22 16:06:10,825 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:06:10,834 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:12,873 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:14,920 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:15,012 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:16,952 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:18,996 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:20,042 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:21,043 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:23,069 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: partial_history +2023-04-22 16:06:23,071 DEBUG SenderThread:11636 [sender.py:send():375] send: history +2023-04-22 16:06:23,071 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: summary_record +2023-04-22 16:06:23,072 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 16:06:23,073 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 16:06:23,086 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:24,084 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:06:24,325 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:06:24,325 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:06:25,131 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:06:25,142 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:25,568 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:27,249 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:29,279 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:30,620 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:31,295 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:33,329 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:35,385 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:35,675 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:37,437 ERROR gpu :11636 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:38,778 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:06:39,337 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:06:39,338 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:06:39,476 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:41,532 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:41,603 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:43,590 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:45,619 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:06:45,628 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:47,555 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:47,679 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:49,735 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:51,777 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:52,586 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:53,819 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:54,345 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:06:54,346 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:06:55,856 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:57,631 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:06:57,951 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:06:59,974 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:02,017 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:02,671 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:04,064 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:06,101 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:07,102 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:07:08,170 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:08,210 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:08,780 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:07:09,347 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status 
+2023-04-22 16:07:09,347 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:07:10,225 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:12,288 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:13,632 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:14,325 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:16,384 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:18,436 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:18,648 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:20,472 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:22,523 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:23,520 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:07:24,174 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:24,347 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:07:24,347 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:07:24,573 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:26,612 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:28,718 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:29,637 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:30,742 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:32,774 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:34,671 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:34,808 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:36,843 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:38,791 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:07:38,879 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:39,356 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:07:39,356 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:07:39,876 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:07:40,609 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:40,933 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:42,978 ERROR gpu :11636 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:45,019 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:45,664 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:47,059 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:49,121 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:50,697 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:51,172 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:53,212 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:07:53,224 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:54,375 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:07:54,375 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:07:55,277 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:56,621 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:07:57,329 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:07:59,421 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:01,444 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:01,646 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:03,467 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:05,509 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:07,393 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:07,549 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:08:07,564 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:08,793 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:08:09,393 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:08:09,394 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:08:09,596 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:11,662 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:12,660 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:13,713 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:15,770 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:17,690 DEBUG HandlerThread:11636 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-22 16:08:17,824 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:19,877 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:21,919 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:23,489 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:23,960 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:08:23,968 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:24,383 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:08:24,383 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:08:26,042 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:28,105 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:28,656 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:30,202 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:32,221 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:33,696 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:34,275 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:36,324 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:38,381 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:38,737 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:38,800 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:08:39,389 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:08:39,390 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:08:40,427 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:42,479 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:43,485 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:08:44,502 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:44,533 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:46,583 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:48,621 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:49,613 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:50,663 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:52,711 
ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:54,394 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:08:54,394 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:08:54,644 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:08:54,763 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:56,798 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:58,841 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:08:59,699 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:00,974 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:01,934 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:09:02,991 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:05,018 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:05,509 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:07,063 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:08,805 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:09:09,114 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:09,395 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:09:09,395 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:09:10,638 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:11,184 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:13,216 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:15,265 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:15,676 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:17,309 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:19,367 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:19,994 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: partial_history +2023-04-22 16:09:19,996 DEBUG SenderThread:11636 [sender.py:send():375] send: history +2023-04-22 16:09:19,997 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: summary_record +2023-04-22 16:09:19,998 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 16:09:20,359 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 16:09:21,248 DEBUG HandlerThread:11636 
[handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:21,415 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:09:21,431 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:22,424 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:09:23,473 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:24,399 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:09:24,399 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:09:25,520 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:26,671 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:27,569 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:29,608 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:31,705 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:31,718 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:33,739 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:35,759 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:36,762 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:37,813 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:38,806 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:09:39,420 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:09:39,421 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:09:39,863 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:41,900 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:09:41,909 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:41,950 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:43,965 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:46,008 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:47,000 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:48,045 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:50,098 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:52,131 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 
16:09:52,141 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:54,182 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:54,429 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:09:54,430 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:09:56,223 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:09:57,694 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:09:58,275 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:00,331 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:02,419 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:03,597 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:04,396 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:10:04,431 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:06,452 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:08,489 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:08,648 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:08,820 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:10:09,442 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:10:09,442 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:10:10,537 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:12,590 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:13,727 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:14,648 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:16,705 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:18,756 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:18,766 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:20,806 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:22,851 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:23,790 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:24,440 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:10:24,440 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:10:24,908 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:25,911 INFO Thread-16 
:11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:10:26,985 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:29,021 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:29,748 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:31,080 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:33,175 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:34,775 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:35,192 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:37,211 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:38,823 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:10:39,249 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:39,427 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:10:39,428 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:10:40,693 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:41,282 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:43,332 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:45,379 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:45,738 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:47,428 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:48,429 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:10:49,462 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:51,065 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:51,506 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:53,556 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:54,441 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: stop_status +2023-04-22 16:10:54,441 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: stop_status +2023-04-22 16:10:55,613 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:56,706 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:10:57,668 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:10:59,711 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:11:01,748 DEBUG HandlerThread:11636 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-22 16:11:01,762 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:10,843 DEBUG HandlerThread:11636 [handler.py:handle_request():144]
handle_request: status_report +2023-04-22 16:13:10,914 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:13:10,927 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:12,990 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:15,038 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:15,882 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:13:17,086 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:19,120 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:20,927 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:13:21,209 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:23,246 ERROR gpu :11636 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-22 16:13:24,213 DEBUG SenderThread:11636 [sender.py:send():375] send: exit +2023-04-22 16:13:24,213 INFO SenderThread:11636 [sender.py:send_exit():598] handling exit code: 255 +2023-04-22 16:13:24,213 INFO SenderThread:11636 [sender.py:send_exit():600] handling runtime: 1155 +2023-04-22 16:13:24,215 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 16:13:24,216 INFO SenderThread:11636 [sender.py:send_exit():606] send defer +2023-04-22 16:13:24,216 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,216 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 0 +2023-04-22 16:13:24,217 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,217 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 0 +2023-04-22 16:13:24,217 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 1 +2023-04-22 16:13:24,217 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,218 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 1 +2023-04-22 16:13:24,218 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,218 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 1 +2023-04-22 16:13:24,218 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 2 +2023-04-22 16:13:24,219 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,219 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 2 +2023-04-22 16:13:24,219 INFO HandlerThread:11636 [system_monitor.py:finish():190] Stopping system monitor +2023-04-22 16:13:24,219 DEBUG SystemMonitor:11636 [system_monitor.py:_start():166] Finished system metrics aggregation loop +2023-04-22 16:13:24,219 INFO HandlerThread:11636 [interfaces.py:finish():202] Joined cpu monitor +2023-04-22 16:13:24,253 DEBUG SystemMonitor:11636 [system_monitor.py:_start():170] Publishing last batch of metrics +2023-04-22 16:13:24,254 INFO HandlerThread:11636 [interfaces.py:finish():202] Joined disk monitor +2023-04-22 
16:13:24,270 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 16:13:24,329 INFO HandlerThread:11636 [interfaces.py:finish():202] Joined gpu monitor +2023-04-22 16:13:24,329 INFO HandlerThread:11636 [interfaces.py:finish():202] Joined memory monitor +2023-04-22 16:13:24,329 INFO HandlerThread:11636 [interfaces.py:finish():202] Joined network monitor +2023-04-22 16:13:24,330 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,330 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 2 +2023-04-22 16:13:24,330 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 3 +2023-04-22 16:13:24,330 DEBUG SenderThread:11636 [sender.py:send():375] send: stats +2023-04-22 16:13:24,331 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,332 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 3 +2023-04-22 16:13:24,332 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,333 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 3 +2023-04-22 16:13:24,333 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 4 +2023-04-22 16:13:24,333 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,333 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 4 +2023-04-22 16:13:24,333 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,334 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 4 +2023-04-22 16:13:24,334 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 5 +2023-04-22 16:13:24,334 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,334 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 5 +2023-04-22 16:13:24,335 DEBUG SenderThread:11636 [sender.py:send():375] send: summary +2023-04-22 16:13:24,336 INFO SenderThread:11636 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-22 16:13:24,336 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,337 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 5 +2023-04-22 16:13:24,337 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 6 +2023-04-22 16:13:24,337 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,337 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 6 +2023-04-22 16:13:24,337 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,338 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 6 +2023-04-22 16:13:24,338 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 7 +2023-04-22 16:13:24,338 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:13:24,338 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:24,338 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 7 +2023-04-22 16:13:24,339 DEBUG 
SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:24,339 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 7 +2023-04-22 16:13:25,285 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:13:25,286 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 16:13:25,345 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:25,424 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 8 +2023-04-22 16:13:25,424 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:25,424 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:25,424 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 8 +2023-04-22 16:13:25,425 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:25,425 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 8 +2023-04-22 16:13:25,437 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 9 +2023-04-22 16:13:25,438 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:25,438 DEBUG SenderThread:11636 [sender.py:send():375] send: artifact +2023-04-22 16:13:25,438 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 9 +2023-04-22 16:13:26,297 INFO Thread-16 :11636 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:13:26,359 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:26,716 INFO SenderThread:11636 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'digest': 'ab8fa958b17d8260d4017e4943372340', 'state': 'COMMITTED', 'aliases': [{'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'latest'}, {'artifactCollectionName': 'job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py', 'alias': 'v5'}], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'versionIndex': 5}}, 'version': 'v5'} +2023-04-22 16:13:26,716 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:26,716 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 9 +2023-04-22 16:13:26,716 INFO SenderThread:11636 [dir_watcher.py:finish():365] shutting down directory watcher +2023-04-22 16:13:27,302 INFO SenderThread:11636 [dir_watcher.py:finish():395] scan: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files +2023-04-22 16:13:27,303 INFO SenderThread:11636 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\config.yaml config.yaml +2023-04-22 16:13:27,303 INFO SenderThread:11636 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log output.log +2023-04-22 
16:13:27,306 INFO SenderThread:11636 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\requirements.txt requirements.txt +2023-04-22 16:13:27,310 INFO SenderThread:11636 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-metadata.json wandb-metadata.json +2023-04-22 16:13:27,310 INFO SenderThread:11636 [dir_watcher.py:finish():409] scan save: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json wandb-summary.json +2023-04-22 16:13:27,315 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 10 +2023-04-22 16:13:27,316 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:27,316 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:27,316 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 10 +2023-04-22 16:13:27,317 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:27,317 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 10 +2023-04-22 16:13:27,319 INFO SenderThread:11636 [file_pusher.py:finish():168] shutting down file pusher +2023-04-22 16:13:27,363 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:27,364 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:28,370 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:28,371 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:28,799 INFO wandb-upload_2:11636 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\requirements.txt +2023-04-22 16:13:29,384 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:29,385 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:29,922 INFO wandb-upload_1:11636 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\output.log +2023-04-22 16:13:30,397 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:30,398 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:31,404 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:31,404 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:32,416 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:32,416 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:13:32,417 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:33,430 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:33,430 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:34,374 INFO wandb-upload_0:11636 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\config.yaml +2023-04-22 
16:13:34,444 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:34,444 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:35,446 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:35,447 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:36,458 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:36,459 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:36,843 INFO wandb-upload_3:11636 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\files\wandb-summary.json +2023-04-22 16:13:37,054 INFO Thread-15 :11636 [sender.py:transition_state():626] send defer: 11 +2023-04-22 16:13:37,054 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:37,054 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 11 +2023-04-22 16:13:37,055 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:37,055 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 11 +2023-04-22 16:13:37,055 INFO SenderThread:11636 [file_pusher.py:join():173] waiting for file pusher +2023-04-22 16:13:37,055 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 12 +2023-04-22 16:13:37,055 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:37,055 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 12 +2023-04-22 16:13:37,056 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:37,056 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 12 +2023-04-22 16:13:37,463 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:38,062 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 13 +2023-04-22 16:13:38,062 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:38,062 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:38,062 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-22 16:13:38,062 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: status_report +2023-04-22 16:13:38,063 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:38,063 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-22 16:13:38,063 INFO SenderThread:11636 [sender.py:transition_state():626] send defer: 14 +2023-04-22 16:13:38,063 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: defer +2023-04-22 16:13:38,063 DEBUG SenderThread:11636 [sender.py:send():375] send: final +2023-04-22 16:13:38,063 INFO HandlerThread:11636 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-22 16:13:38,063 DEBUG SenderThread:11636 [sender.py:send():375] send: footer +2023-04-22 16:13:38,064 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: defer +2023-04-22 16:13:38,064 INFO SenderThread:11636 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-22 16:13:38,064 
DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:38,064 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:38,065 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-22 16:13:38,065 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: server_info +2023-04-22 16:13:38,065 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: get_summary +2023-04-22 16:13:38,065 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: poll_exit +2023-04-22 16:13:38,065 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-22 16:13:38,065 DEBUG SenderThread:11636 [sender.py:send_request():402] send_request: server_info +2023-04-22 16:13:38,304 INFO MainThread:11636 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-22 16:13:38,304 INFO MainThread:11636 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-22 16:13:38,308 INFO MainThread:11636 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-22 16:13:38,309 DEBUG HandlerThread:11636 [handler.py:handle_request():144] handle_request: shutdown +2023-04-22 16:13:38,309 INFO HandlerThread:11636 [handler.py:finish():845] shutting down handler +2023-04-22 16:13:39,072 INFO WriterThread:11636 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\run-eyf9ltgy.wandb +2023-04-22 16:13:39,308 INFO SenderThread:11636 [sender.py:finish():1550] shutting down sender +2023-04-22 16:13:39,308 INFO SenderThread:11636 [file_pusher.py:finish():168] shutting down file pusher +2023-04-22 16:13:39,308 INFO SenderThread:11636 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/logs/debug.log b/ptuning/wandb/run-20230422_155407-eyf9ltgy/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..82150719f65f7645baa1ad6e448ae57e3f57f78a --- /dev/null +++ b/ptuning/wandb/run-20230422_155407-eyf9ltgy/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-22 15:54:07,048 INFO MainThread:2616 [wandb_setup.py:_flush():76] Configure stats pid to 2616 +2023-04-22 15:54:07,048 INFO MainThread:2616 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-22 15:54:07,048 INFO MainThread:2616 [wandb_setup.py:_flush():76] Loading settings from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-22 15:54:07,048 INFO MainThread:2616 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-22 15:54:07,048 INFO MainThread:2616 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-22 15:54:07,048 INFO MainThread:2616 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-22 15:54:07,049 INFO MainThread:2616 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\logs\debug.log +2023-04-22 15:54:07,049 INFO MainThread:2616 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230422_155407-eyf9ltgy\logs\debug-internal.log +2023-04-22 15:54:07,049 INFO MainThread:2616 [wandb_init.py:init():547] calling init triggers +2023-04-22 
15:54:07,049 INFO MainThread:2616 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-22 15:54:07,049 INFO MainThread:2616 [wandb_init.py:init():595] starting backend +2023-04-22 15:54:07,049 INFO MainThread:2616 [wandb_init.py:init():599] setting up manager +2023-04-22 15:54:07,052 INFO MainThread:2616 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-22 15:54:07,056 INFO MainThread:2616 [wandb_init.py:init():605] backend started and connected +2023-04-22 15:54:07,058 INFO MainThread:2616 [wandb_init.py:init():695] updated telemetry +2023-04-22 15:54:07,137 INFO MainThread:2616 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-22 15:54:07,901 INFO MainThread:2616 [wandb_run.py:_on_init():2176] communicating current version +2023-04-22 15:54:08,469 INFO MainThread:2616 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-22 15:54:08,469 INFO MainThread:2616 [wandb_init.py:init():782] starting run threads in backend +2023-04-22 15:54:08,727 INFO MainThread:2616 [wandb_run.py:_console_start():2157] atexit reg +2023-04-22 15:54:08,728 INFO MainThread:2616 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-22 15:54:08,728 INFO MainThread:2616 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-22 15:54:08,728 INFO MainThread:2616 [wandb_run.py:_redirect():2102] Redirects installed. +2023-04-22 15:54:08,729 INFO MainThread:2616 [wandb_init.py:init():824] run started, returning control to user process +2023-04-22 15:54:08,732 INFO MainThread:2616 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 
'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr22_15-53-54_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 10, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-22 16:13:40,181 WARNING MsgRouterThr:2616 [router.py:message_loop():77] message_loop has been closed diff 
--git a/ptuning/wandb/run-20230422_155407-eyf9ltgy/run-eyf9ltgy.wandb b/ptuning/wandb/run-20230422_155407-eyf9ltgy/run-eyf9ltgy.wandb new file mode 100644 index 0000000000000000000000000000000000000000..a3fe37de96d0e3672144f1160eceb595820cb95c Binary files /dev/null and b/ptuning/wandb/run-20230422_155407-eyf9ltgy/run-eyf9ltgy.wandb differ diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/files/config.yaml b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/config.yaml new file mode 100644 index 0000000000000000000000000000000000000000..1287a3cc335b33b1e9e08b3bf93ec23a24c5b813 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/config.yaml @@ -0,0 +1,636 @@ +wandb_version: 1 + +_wandb: + desc: null + value: + python_version: 3.8.10 + cli_version: 0.14.2 + framework: huggingface + huggingface_version: 4.27.1 + is_jupyter_run: false + is_kaggle_kernel: false + start_time: 1682187903.150408 + t: + 1: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 2: + - 1 + - 5 + - 11 + - 49 + - 51 + - 53 + - 55 + - 71 + 3: + - 7 + - 23 + 4: 3.8.10 + 5: 0.14.2 + 6: 4.27.1 + 8: + - 3 + - 5 + m: + - 1: train/global_step + 6: + - 3 + - 1: train/loss + 5: 1 + 6: + - 1 + - 1: train/learning_rate + 5: 1 + 6: + - 1 + - 1: train/epoch + 5: 1 + 6: + - 1 + - 1: train/train_runtime + 5: 1 + 6: + - 1 + - 1: train/train_samples_per_second + 5: 1 + 6: + - 1 + - 1: train/train_steps_per_second + 5: 1 + 6: + - 1 + - 1: train/total_flos + 5: 1 + 6: + - 1 + - 1: train/train_loss + 5: 1 + 6: + - 1 +num_layers: + desc: null + value: 28 +vocab_size: + desc: null + value: 130528 +hidden_size: + desc: null + value: 4096 +num_attention_heads: + desc: null + value: 32 +max_sequence_length: + desc: null + value: 2048 +layernorm_epsilon: + desc: null + value: 1.0e-05 +inner_hidden_size: + desc: null + value: 16384 +use_cache: + desc: null + value: true +bos_token_id: + desc: null + value: 130004 +eos_token_id: + desc: null + value: 130005 +pad_token_id: + desc: null + value: 3 +mask_token_id: + desc: null + value: 130000 +gmask_token_id: + desc: null + value: 130001 +position_encoding_2d: + desc: null + value: true +quantization_bit: + desc: null + value: 4 +quantization_embeddings: + desc: null + value: false +pre_seq_len: + desc: null + value: 128 +prefix_projection: + desc: null + value: false +return_dict: + desc: null + value: true +output_hidden_states: + desc: null + value: false +output_attentions: + desc: null + value: false +torchscript: + desc: null + value: false +torch_dtype: + desc: null + value: float16 +use_bfloat16: + desc: null + value: false +tf_legacy_loss: + desc: null + value: false +pruned_heads: + desc: null + value: {} +tie_word_embeddings: + desc: null + value: true +is_encoder_decoder: + desc: null + value: false +is_decoder: + desc: null + value: false +cross_attention_hidden_size: + desc: null + value: null +add_cross_attention: + desc: null + value: false +tie_encoder_decoder: + desc: null + value: false +max_length: + desc: null + value: 20 +min_length: + desc: null + value: 0 +do_sample: + desc: null + value: false +early_stopping: + desc: null + value: false +num_beams: + desc: null + value: 1 +num_beam_groups: + desc: null + value: 1 +diversity_penalty: + desc: null + value: 0.0 +temperature: + desc: null + value: 1.0 +top_k: + desc: null + value: 50 +top_p: + desc: null + value: 1.0 +typical_p: + desc: null + value: 1.0 +repetition_penalty: + desc: null + value: 1.0 +length_penalty: + desc: null + value: 1.0 +no_repeat_ngram_size: + desc: null + value: 0 
+encoder_no_repeat_ngram_size: + desc: null + value: 0 +bad_words_ids: + desc: null + value: null +num_return_sequences: + desc: null + value: 1 +chunk_size_feed_forward: + desc: null + value: 0 +output_scores: + desc: null + value: false +return_dict_in_generate: + desc: null + value: false +forced_bos_token_id: + desc: null + value: null +forced_eos_token_id: + desc: null + value: null +remove_invalid_values: + desc: null + value: false +exponential_decay_length_penalty: + desc: null + value: null +suppress_tokens: + desc: null + value: null +begin_suppress_tokens: + desc: null + value: null +architectures: + desc: null + value: + - ChatGLMModel +finetuning_task: + desc: null + value: null +id2label: + desc: null + value: + '0': LABEL_0 + '1': LABEL_1 +label2id: + desc: null + value: + LABEL_0: 0 + LABEL_1: 1 +tokenizer_class: + desc: null + value: null +prefix: + desc: null + value: null +sep_token_id: + desc: null + value: null +decoder_start_token_id: + desc: null + value: null +task_specific_params: + desc: null + value: null +problem_type: + desc: null + value: null +_name_or_path: + desc: null + value: ..\models\chatglm-6b-int4 +transformers_version: + desc: null + value: 4.27.1 +auto_map: + desc: null + value: + AutoConfig: configuration_chatglm.ChatGLMConfig + AutoModel: modeling_chatglm.ChatGLMForConditionalGeneration + AutoModelForSeq2SeqLM: modeling_chatglm.ChatGLMForConditionalGeneration +model_type: + desc: null + value: chatglm +output_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +overwrite_output_dir: + desc: null + value: true +do_train: + desc: null + value: true +do_eval: + desc: null + value: false +do_predict: + desc: null + value: false +evaluation_strategy: + desc: null + value: 'no' +prediction_loss_only: + desc: null + value: false +per_device_train_batch_size: + desc: null + value: 1 +per_device_eval_batch_size: + desc: null + value: 1 +per_gpu_train_batch_size: + desc: null + value: None +per_gpu_eval_batch_size: + desc: null + value: None +gradient_accumulation_steps: + desc: null + value: 16 +eval_accumulation_steps: + desc: null + value: None +eval_delay: + desc: null + value: 0 +learning_rate: + desc: null + value: 0.02 +weight_decay: + desc: null + value: 0.0 +adam_beta1: + desc: null + value: 0.9 +adam_beta2: + desc: null + value: 0.999 +adam_epsilon: + desc: null + value: 1.0e-08 +max_grad_norm: + desc: null + value: 1.0 +num_train_epochs: + desc: null + value: 3.0 +max_steps: + desc: null + value: 1000 +lr_scheduler_type: + desc: null + value: linear +warmup_ratio: + desc: null + value: 0.0 +warmup_steps: + desc: null + value: 0 +log_level: + desc: null + value: passive +log_level_replica: + desc: null + value: warning +log_on_each_node: + desc: null + value: true +logging_dir: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2\runs\Apr23_02-24-50_LAPTOP-U8KCJD82 +logging_strategy: + desc: null + value: steps +logging_first_step: + desc: null + value: false +logging_steps: + desc: null + value: 10 +logging_nan_inf_filter: + desc: null + value: true +save_strategy: + desc: null + value: steps +save_steps: + desc: null + value: 10 +save_total_limit: + desc: null + value: None +save_on_each_node: + desc: null + value: false +no_cuda: + desc: null + value: false +use_mps_device: + desc: null + value: false +seed: + desc: null + value: 42 +data_seed: + desc: null + value: None +jit_mode_eval: + desc: null + value: false +use_ipex: + desc: null + value: false +bf16: + desc: null + value: false +fp16: + desc: null + value: false 
+fp16_opt_level: + desc: null + value: O1 +half_precision_backend: + desc: null + value: auto +bf16_full_eval: + desc: null + value: false +fp16_full_eval: + desc: null + value: false +tf32: + desc: null + value: None +local_rank: + desc: null + value: -1 +xpu_backend: + desc: null + value: None +tpu_num_cores: + desc: null + value: None +tpu_metrics_debug: + desc: null + value: false +debug: + desc: null + value: '[]' +dataloader_drop_last: + desc: null + value: false +eval_steps: + desc: null + value: None +dataloader_num_workers: + desc: null + value: 0 +past_index: + desc: null + value: -1 +run_name: + desc: null + value: output\adgen-chatglm-6b-pt-128-2e-2 +disable_tqdm: + desc: null + value: false +remove_unused_columns: + desc: null + value: true +label_names: + desc: null + value: None +load_best_model_at_end: + desc: null + value: false +metric_for_best_model: + desc: null + value: None +greater_is_better: + desc: null + value: None +ignore_data_skip: + desc: null + value: false +sharded_ddp: + desc: null + value: '[]' +fsdp: + desc: null + value: '[]' +fsdp_min_num_params: + desc: null + value: 0 +fsdp_config: + desc: null + value: '{''fsdp_min_num_params'': 0, ''xla'': False, ''xla_fsdp_grad_ckpt'': False}' +fsdp_transformer_layer_cls_to_wrap: + desc: null + value: None +deepspeed: + desc: null + value: None +label_smoothing_factor: + desc: null + value: 0.0 +optim: + desc: null + value: adamw_hf +optim_args: + desc: null + value: None +adafactor: + desc: null + value: false +group_by_length: + desc: null + value: false +length_column_name: + desc: null + value: length +report_to: + desc: null + value: '[''tensorboard'', ''wandb'']' +ddp_find_unused_parameters: + desc: null + value: None +ddp_bucket_cap_mb: + desc: null + value: None +dataloader_pin_memory: + desc: null + value: true +skip_memory_metrics: + desc: null + value: true +use_legacy_prediction_loop: + desc: null + value: false +push_to_hub: + desc: null + value: false +resume_from_checkpoint: + desc: null + value: None +hub_model_id: + desc: null + value: None +hub_strategy: + desc: null + value: every_save +hub_token: + desc: null + value: +hub_private_repo: + desc: null + value: false +gradient_checkpointing: + desc: null + value: false +include_inputs_for_metrics: + desc: null + value: false +fp16_backend: + desc: null + value: auto +push_to_hub_model_id: + desc: null + value: None +push_to_hub_organization: + desc: null + value: None +push_to_hub_token: + desc: null + value: +mp_parameters: + desc: null + value: '' +auto_find_batch_size: + desc: null + value: false +full_determinism: + desc: null + value: false +torchdynamo: + desc: null + value: None +ray_scope: + desc: null + value: last +ddp_timeout: + desc: null + value: 1800 +torch_compile: + desc: null + value: false +torch_compile_backend: + desc: null + value: None +torch_compile_mode: + desc: null + value: None +sortish_sampler: + desc: null + value: false +predict_with_generate: + desc: null + value: true +generation_max_length: + desc: null + value: 64 +generation_num_beams: + desc: null + value: None +train_batch_size: + desc: null + value: 1 +eval_batch_size: + desc: null + value: 1 diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/files/output.log b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/output.log new file mode 100644 index 0000000000000000000000000000000000000000..797a33ae008ea7820a2f0967e531c581338541f2 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/output.log @@ -0,0 +1,1279 @@ + + 0%| | 0/1000 [00:00> 
Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\config.json +[INFO|configuration_utils.py:362] 2023-04-23 02:27:55,493 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-23 02:27:55,714 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-23 02:27:55,717 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 02:27:55,719 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-10\special_tokens_map.json +{'loss': 4.7949, 'learning_rate': 0.0198, 'epoch': 1.14} +Saving PrefixEncoder + + + + + + + + + 2%|██ | 20/1000 [05:19<3:40:50, 13.52s/it][INFO|configuration_utils.py:457] 2023-04-23 02:30:24,491 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\config.json +[INFO|configuration_utils.py:362] 2023-04-23 02:30:24,494 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\generation_config.json +{'loss': 3.7519, 'learning_rate': 0.0196, 'epoch': 2.29} +Saving PrefixEncoder +[INFO|modeling_utils.py:1762] 2023-04-23 02:30:24,705 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-23 02:30:24,709 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 02:30:24,710 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-20\special_tokens_map.json + + + + + + + + + 3%|██▉ | 29/1000 [07:15<3:27:26, 12.82s/it] +{'loss': 3.3049, 'learning_rate': 0.0194, 'epoch': 3.43} + 3%|███ | 30/1000 [07:28<3:28:05, 12.87s/it][INFO|configuration_utils.py:457] 2023-04-23 02:32:32,982 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\config.json +[INFO|configuration_utils.py:362] 2023-04-23 02:32:32,985 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-23 02:32:33,191 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-23 02:32:33,194 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 02:32:33,195 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-30\special_tokens_map.json + + + + + + + + + 4%|████ | 39/1000 [09:24<3:26:58, 12.92s/it] +{'loss': 2.8868, 'learning_rate': 0.0192, 'epoch': 4.57} + 4%|████ | 40/1000 [09:37<3:26:24, 12.90s/it][INFO|configuration_utils.py:457] 2023-04-23 02:34:42,494 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\config.json +[INFO|configuration_utils.py:362] 2023-04-23 02:34:42,497 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\generation_config.json +[INFO|modeling_utils.py:1762] 2023-04-23 02:34:42,687 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\pytorch_model.bin +[INFO|tokenization_utils_base.py:2163] 2023-04-23 02:34:42,691 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\tokenizer_config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 
02:34:42,692 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-40\special_tokens_map.json
+  5%|█████▏ | 50/1000 [11:46<3:23:48, 12.87s/it]
+[INFO|configuration_utils.py:457] 2023-04-23 02:36:51,630 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\config.json
+[INFO|configuration_utils.py:362] 2023-04-23 02:36:51,633 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\generation_config.json
+[INFO|modeling_utils.py:1762] 2023-04-23 02:36:51,862 >> Model weights saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\pytorch_model.bin
+[INFO|tokenization_utils_base.py:2163] 2023-04-23 02:36:51,865 >> tokenizer config file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\tokenizer_config.json
+[INFO|tokenization_utils_base.py:2170] 2023-04-23 02:36:51,866 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-50\special_tokens_map.json
+{'loss': 2.4806, 'learning_rate': 0.019, 'epoch': 5.71}
+Saving PrefixEncoder
+[checkpoint-60 through checkpoint-420 are written every 10 steps with the same five files under output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-<step>: config.json, generation_config.json, pytorch_model.bin, tokenizer_config.json, special_tokens_map.json]
+{'loss': 1.865, 'learning_rate': 0.0188, 'epoch': 6.86}
+{'loss': 1.4186, 'learning_rate': 0.018600000000000002, 'epoch': 8.0}
+Saving PrefixEncoder
+{'loss': 0.9316, 'learning_rate': 0.0184, 'epoch': 9.14}
+{'loss': 0.5643, 'learning_rate': 0.0182, 'epoch': 10.29}
+{'loss': 0.3509, 'learning_rate': 0.018000000000000002, 'epoch': 11.43}
+Saving PrefixEncoder
+{'loss': 0.2172, 'learning_rate': 0.0178, 'epoch': 12.57}
+Saving PrefixEncoder
+{'loss': 0.1486, 'learning_rate': 0.0176, 'epoch': 13.71}
+{'loss': 0.1196, 'learning_rate': 0.0174, 'epoch': 14.86}
+{'loss': 0.0802, 'learning_rate': 0.0172, 'epoch': 16.0}
+Saving PrefixEncoder
+{'loss': 0.0739, 'learning_rate': 0.017, 'epoch': 17.14}
+Saving PrefixEncoder
+{'loss': 0.0619, 'learning_rate': 0.0168, 'epoch': 18.29}
+Saving PrefixEncoder
+{'loss': 0.062, 'learning_rate': 0.0166, 'epoch': 19.43}
+{'loss': 0.0595, 'learning_rate': 0.016399999999999998, 'epoch': 20.57}
+Saving PrefixEncoder
+{'loss': 0.0569, 'learning_rate': 0.016200000000000003, 'epoch': 21.71}
+Saving PrefixEncoder
+{'loss': 0.0533, 'learning_rate': 0.016, 'epoch': 22.86}
+Saving PrefixEncoder
+{'loss': 0.052, 'learning_rate': 0.0158, 'epoch': 24.0}
+{'loss': 0.0471, 'learning_rate': 0.015600000000000001, 'epoch': 25.14}
+{'loss': 0.0478, 'learning_rate': 0.0154, 'epoch': 26.29}
+{'loss': 0.0415, 'learning_rate': 0.0152, 'epoch': 27.43}
+{'loss': 0.0462, 'learning_rate': 0.015, 'epoch': 28.57}
+{'loss': 0.0437, 'learning_rate': 0.0148, 'epoch': 29.71}
+{'loss': 0.0452, 'learning_rate': 0.0146, 'epoch': 30.86}
+{'loss': 0.041, 'learning_rate': 0.0144, 'epoch': 32.0}
+{'loss': 0.0418, 'learning_rate': 0.014199999999999999, 'epoch': 33.14}
+{'loss': 0.0469, 'learning_rate': 0.013999999999999999, 'epoch': 34.29}
+{'loss': 0.0394, 'learning_rate': 0.0138, 'epoch': 35.43}
+{'loss': 0.0444, 'learning_rate': 0.013600000000000001, 'epoch': 36.57}
+{'loss': 0.0393, 'learning_rate': 0.0134, 'epoch': 37.71}
+{'loss': 0.045, 'learning_rate': 0.013200000000000002, 'epoch': 38.86}
+{'loss': 0.0392, 'learning_rate': 0.013000000000000001, 'epoch': 40.0}
+{'loss': 0.0351, 'learning_rate': 0.0128, 'epoch': 41.14}
+{'loss': 0.0389, 'learning_rate': 0.0126, 'epoch': 42.29}
+{'loss': 0.0374, 'learning_rate': 0.0124, 'epoch': 43.43}
+{'loss': 0.035, 'learning_rate': 0.0122, 'epoch': 44.57}
+{'loss': 0.0349, 'learning_rate': 0.012, 'epoch': 45.71}
+{'loss': 0.0361, 'learning_rate': 0.0118, 'epoch': 46.86}
+{'loss': 0.0368, 'learning_rate': 0.0116, 'epoch': 48.0}
+ 43%|██████████████████████████████████████████▊ | 428/1000 [1:32:11<2:03:32,
12.96s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:55:31,661 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-420\config.json + 43%|██████████████████████████████████████████▉ | 429/1000 [1:32:24<2:03:07, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:55:31,661 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-420\config.json + 43%|██████████████████████████████████████████▉ | 429/1000 [1:32:24<2:03:07, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:55:31,661 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-420\config.json +{'loss': 0.0352, 'learning_rate': 0.011399999999999999, 'epoch': 49.14} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 03:57:42,327 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 43%|███████████████████████████████████████████ | 431/1000 [1:32:50<2:04:54, 13.17s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 43%|███████████████████████████████████████████▏ | 432/1000 [1:33:03<2:04:07, 13.11s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 43%|███████████████████████████████████████████▎ | 433/1000 [1:33:16<2:03:29, 13.07s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 43%|███████████████████████████████████████████▍ | 434/1000 [1:33:29<2:03:06, 13.05s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 44%|███████████████████████████████████████████▌ | 435/1000 [1:33:42<2:02:39, 13.03s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 44%|███████████████████████████████████████████▌ | 436/1000 [1:33:55<2:02:19, 13.01s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 44%|███████████████████████████████████████████▋ | 437/1000 [1:34:08<2:02:01, 13.00s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 44%|███████████████████████████████████████████▊ | 438/1000 [1:34:21<2:01:43, 13.00s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json + 44%|███████████████████████████████████████████▉ | 439/1000 [1:34:34<2:01:24, 12.99s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:57:42,049 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-430\config.json +[INFO|configuration_utils.py:362] 2023-04-23 03:59:52,540 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\generation_config.jsonration_utils.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json +[INFO|configuration_utils.py:362] 2023-04-23 03:59:52,540 >> Configuration saved in 
output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\generation_config.jsonration_utils.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json +{'loss': 0.0353, 'learning_rate': 0.011200000000000002, 'epoch': 50.29} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 03:59:52,793 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 44%|████████████████████████████████████████████ | 441/1000 [1:35:01<2:02:57, 13.20s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 44%|████████████████████████████████████████████▏ | 442/1000 [1:35:14<2:02:02, 13.12s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 44%|████████████████████████████████████████████▎ | 443/1000 [1:35:27<2:01:22, 13.08s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 44%|████████████████████████████████████████████▍ | 444/1000 [1:35:40<2:00:53, 13.05s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 44%|████████████████████████████████████████████▌ | 445/1000 [1:35:53<2:00:28, 13.02s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 45%|████████████████████████████████████████████▌ | 446/1000 [1:36:06<2:00:07, 13.01s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 45%|████████████████████████████████████████████▋ | 447/1000 [1:36:19<1:59:59, 13.02s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 45%|████████████████████████████████████████████▊ | 448/1000 [1:36:32<2:00:14, 13.07s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 45%|████████████████████████████████████████████▉ | 449/1000 [1:36:45<2:00:22, 13.11s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json + 45%|████████████████████████████████████████████▉ | 449/1000 [1:36:45<2:00:22, 13.11s/it]0\special_tokens_map.json.py:457] 2023-04-23 03:59:52,536 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-440\config.json +{'loss': 0.0337, 'learning_rate': 0.011000000000000001, 'epoch': 51.43} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:02:04,354 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 45%|█████████████████████████████████████████████ | 451/1000 [1:37:13<2:04:23, 13.60s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json 
+ 45%|█████████████████████████████████████████████▏ | 452/1000 [1:37:26<2:03:05, 13.48s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 45%|█████████████████████████████████████████████▎ | 453/1000 [1:37:39<2:01:28, 13.32s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 45%|█████████████████████████████████████████████▍ | 454/1000 [1:37:52<1:59:24, 13.12s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 46%|█████████████████████████████████████████████▌ | 455/1000 [1:38:05<1:58:33, 13.05s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 46%|█████████████████████████████████████████████▌ | 456/1000 [1:38:18<1:57:55, 13.01s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 46%|█████████████████████████████████████████████▋ | 457/1000 [1:38:31<1:57:17, 12.96s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 46%|█████████████████████████████████████████████▊ | 458/1000 [1:38:44<1:56:57, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 46%|█████████████████████████████████████████████▉ | 459/1000 [1:38:56<1:56:26, 12.91s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json + 46%|█████████████████████████████████████████████▉ | 459/1000 [1:38:56<1:56:26, 12.91s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:02:04,093 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-450\config.json +{'loss': 0.032, 'learning_rate': 0.0108, 'epoch': 52.57} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:04:14,811 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 46%|██████████████████████████████████████████████ | 461/1000 [1:39:23<1:57:27, 13.08s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 46%|██████████████████████████████████████████████▏ | 462/1000 [1:39:35<1:56:19, 12.97s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 46%|██████████████████████████████████████████████▎ | 463/1000 [1:39:48<1:55:59, 12.96s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 46%|██████████████████████████████████████████████▍ | 464/1000 [1:40:01<1:55:19, 12.91s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 46%|██████████████████████████████████████████████▌ | 465/1000 [1:40:14<1:54:55, 
12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 47%|██████████████████████████████████████████████▌ | 466/1000 [1:40:27<1:54:26, 12.86s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 47%|██████████████████████████████████████████████▋ | 467/1000 [1:40:40<1:54:06, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 47%|██████████████████████████████████████████████▊ | 468/1000 [1:40:52<1:53:49, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 47%|██████████████████████████████████████████████▉ | 469/1000 [1:41:05<1:53:34, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json + 47%|██████████████████████████████████████████████▉ | 469/1000 [1:41:05<1:53:34, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:04:14,603 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-460\config.json +{'loss': 0.0366, 'learning_rate': 0.0106, 'epoch': 53.71} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:06:23,682 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 47%|███████████████████████████████████████████████ | 471/1000 [1:41:32<1:54:45, 13.02s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 47%|███████████████████████████████████████████████▏ | 472/1000 [1:41:44<1:53:59, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 47%|███████████████████████████████████████████████▎ | 473/1000 [1:41:57<1:53:18, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 47%|███████████████████████████████████████████████▍ | 474/1000 [1:42:10<1:52:50, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 48%|███████████████████████████████████████████████▌ | 475/1000 [1:42:23<1:52:51, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 48%|███████████████████████████████████████████████▌ | 476/1000 [1:42:36<1:52:52, 12.93s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 48%|███████████████████████████████████████████████▋ | 477/1000 [1:42:49<1:52:24, 12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 48%|███████████████████████████████████████████████▊ | 478/1000 [1:43:02<1:52:29, 12.93s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> 
Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json + 48%|███████████████████████████████████████████████▉ | 479/1000 [1:43:15<1:52:18, 12.93s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:06:23,419 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-470\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:08:32,962 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:08:32,962 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json +{'loss': 0.0317, 'learning_rate': 0.010400000000000001, 'epoch': 54.86} + 48%|████████████████████████████████████████████████ | 481/1000 [1:43:41<1:52:57, 13.06s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 48%|████████████████████████████████████████████████▏ | 482/1000 [1:43:54<1:52:20, 13.01s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 48%|████████████████████████████████████████████████▎ | 483/1000 [1:44:07<1:51:38, 12.96s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 48%|████████████████████████████████████████████████▍ | 484/1000 [1:44:19<1:50:58, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 48%|████████████████████████████████████████████████▌ | 485/1000 [1:44:32<1:50:31, 12.88s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 49%|████████████████████████████████████████████████▌ | 486/1000 [1:44:45<1:49:59, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 49%|████████████████████████████████████████████████▋ | 487/1000 [1:44:58<1:49:49, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 49%|████████████████████████████████████████████████▊ | 488/1000 [1:45:11<1:49:31, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 49%|████████████████████████████████████████████████▉ | 489/1000 [1:45:23<1:49:15, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json + 49%|████████████████████████████████████████████████▉ | 489/1000 [1:45:23<1:49:15, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:08:32,746 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-480\config.json +{'loss': 0.0332, 'learning_rate': 0.0102, 'epoch': 56.0} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:10:41,847 >> Special tokens file saved 
in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 49%|█████████████████████████████████████████████████ | 491/1000 [1:45:50<1:50:51, 13.07s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 49%|█████████████████████████████████████████████████▏ | 492/1000 [1:46:03<1:50:05, 13.00s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 49%|█████████████████████████████████████████████████▎ | 493/1000 [1:46:16<1:49:27, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 49%|█████████████████████████████████████████████████▍ | 494/1000 [1:46:28<1:48:48, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 50%|█████████████████████████████████████████████████▌ | 495/1000 [1:46:41<1:48:16, 12.86s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 50%|█████████████████████████████████████████████████▌ | 496/1000 [1:46:54<1:47:56, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 50%|█████████████████████████████████████████████████▋ | 497/1000 [1:47:07<1:47:47, 12.86s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 50%|█████████████████████████████████████████████████▊ | 498/1000 [1:47:20<1:47:27, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 50%|█████████████████████████████████████████████████▉ | 499/1000 [1:47:33<1:47:28, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json + 50%|█████████████████████████████████████████████████▉ | 499/1000 [1:47:33<1:47:28, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:10:41,610 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-490\config.json +{'loss': 0.0328, 'learning_rate': 0.01, 'epoch': 57.14} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:12:50,919 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 50%|██████████████████████████████████████████████████ | 501/1000 [1:47:59<1:48:19, 13.02s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 50%|██████████████████████████████████████████████████▏ | 502/1000 [1:48:12<1:47:31, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 50%|██████████████████████████████████████████████████▎ | 503/1000 [1:48:24<1:47:05, 
12.93s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 50%|██████████████████████████████████████████████████▍ | 504/1000 [1:48:37<1:46:38, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 50%|██████████████████████████████████████████████████▌ | 505/1000 [1:48:50<1:46:12, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 51%|██████████████████████████████████████████████████▌ | 506/1000 [1:49:03<1:45:56, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 51%|██████████████████████████████████████████████████▋ | 507/1000 [1:49:16<1:45:33, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 51%|██████████████████████████████████████████████████▊ | 508/1000 [1:49:29<1:45:19, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 51%|██████████████████████████████████████████████████▉ | 509/1000 [1:49:41<1:45:01, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json + 51%|██████████████████████████████████████████████████▉ | 509/1000 [1:49:41<1:45:01, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:12:50,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-500\config.json +{'loss': 0.0325, 'learning_rate': 0.0098, 'epoch': 58.29} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:14:59,747 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 51%|███████████████████████████████████████████████████ | 511/1000 [1:50:08<1:46:35, 13.08s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 51%|███████████████████████████████████████████████████▏ | 512/1000 [1:50:21<1:46:40, 13.12s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 51%|███████████████████████████████████████████████████▎ | 513/1000 [1:50:34<1:46:35, 13.13s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 51%|███████████████████████████████████████████████████▍ | 514/1000 [1:50:47<1:45:58, 13.08s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 52%|███████████████████████████████████████████████████▌ | 515/1000 [1:51:00<1:45:36, 13.06s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 52%|███████████████████████████████████████████████████▌ | 516/1000 [1:51:13<1:45:32, 
13.08s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 52%|███████████████████████████████████████████████████▋ | 517/1000 [1:51:26<1:44:50, 13.02s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 52%|███████████████████████████████████████████████████▊ | 518/1000 [1:51:39<1:44:13, 12.97s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 52%|███████████████████████████████████████████████████▉ | 519/1000 [1:51:52<1:43:44, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json + 52%|███████████████████████████████████████████████████▉ | 519/1000 [1:51:52<1:43:44, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:14:59,511 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-510\config.json +{'loss': 0.0327, 'learning_rate': 0.0096, 'epoch': 59.43} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:17:10,454 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 52%|████████████████████████████████████████████████████ | 521/1000 [1:52:18<1:44:35, 13.10s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 52%|████████████████████████████████████████████████████▏ | 522/1000 [1:52:31<1:43:48, 13.03s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 52%|████████████████████████████████████████████████████▎ | 523/1000 [1:52:44<1:42:56, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 52%|████████████████████████████████████████████████████▍ | 524/1000 [1:52:57<1:42:14, 12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 52%|████████████████████████████████████████████████████▌ | 525/1000 [1:53:09<1:41:43, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 53%|████████████████████████████████████████████████████▌ | 526/1000 [1:53:22<1:41:20, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 53%|████████████████████████████████████████████████████▋ | 527/1000 [1:53:35<1:41:00, 12.81s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 53%|████████████████████████████████████████████████████▊ | 528/1000 [1:53:48<1:40:52, 12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json + 53%|████████████████████████████████████████████████████▉ | 529/1000 [1:54:01<1:40:35, 
12.81s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:17:10,195 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-520\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:19:19,126 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:19:19,126 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json +{'loss': 0.0344, 'learning_rate': 0.0094, 'epoch': 60.57} + 53%|█████████████████████████████████████████████████████ | 531/1000 [1:54:27<1:42:05, 13.06s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 53%|█████████████████████████████████████████████████████▏ | 532/1000 [1:54:40<1:41:26, 13.01s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 53%|█████████████████████████████████████████████████████▎ | 533/1000 [1:54:53<1:40:48, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 53%|█████████████████████████████████████████████████████▍ | 534/1000 [1:55:06<1:40:44, 12.97s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 54%|█████████████████████████████████████████████████████▌ | 535/1000 [1:55:19<1:40:16, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 54%|█████████████████████████████████████████████████████▌ | 536/1000 [1:55:32<1:39:58, 12.93s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 54%|█████████████████████████████████████████████████████▋ | 537/1000 [1:55:44<1:39:34, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 54%|█████████████████████████████████████████████████████▊ | 538/1000 [1:55:57<1:39:22, 12.91s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 54%|█████████████████████████████████████████████████████▉ | 539/1000 [1:56:10<1:39:00, 12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json + 54%|█████████████████████████████████████████████████████▉ | 539/1000 [1:56:10<1:39:00, 12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:19:18,883 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-530\config.json +{'loss': 0.0351, 'learning_rate': 0.0092, 'epoch': 61.71} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:21:28,703 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved 
in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 54%|██████████████████████████████████████████████████████ | 541/1000 [1:56:37<1:40:05, 13.08s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 54%|██████████████████████████████████████████████████████▏ | 542/1000 [1:56:49<1:39:14, 13.00s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 54%|██████████████████████████████████████████████████████▎ | 543/1000 [1:57:02<1:38:32, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 54%|██████████████████████████████████████████████████████▍ | 544/1000 [1:57:15<1:38:07, 12.91s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 55%|██████████████████████████████████████████████████████▌ | 545/1000 [1:57:28<1:37:37, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 55%|██████████████████████████████████████████████████████▌ | 546/1000 [1:57:41<1:37:05, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 55%|██████████████████████████████████████████████████████▋ | 547/1000 [1:57:53<1:36:34, 12.79s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 55%|██████████████████████████████████████████████████████▊ | 548/1000 [1:58:06<1:36:32, 12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 55%|██████████████████████████████████████████████████████▉ | 549/1000 [1:58:19<1:36:32, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json + 55%|██████████████████████████████████████████████████████▉ | 549/1000 [1:58:19<1:36:32, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:21:28,492 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-540\config.json +{'loss': 0.0325, 'learning_rate': 0.009000000000000001, 'epoch': 62.86} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:23:37,487 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 55%|███████████████████████████████████████████████████████ | 551/1000 [1:58:46<1:37:50, 13.07s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 55%|███████████████████████████████████████████████████████▏ | 552/1000 [1:58:58<1:36:55, 12.98s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 55%|███████████████████████████████████████████████████████▎ | 553/1000 [1:59:11<1:36:39, 12.97s/it]0\special_tokens_map.json.py:457] 2023-04-23 
04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 55%|███████████████████████████████████████████████████████▍ | 554/1000 [1:59:24<1:35:50, 12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 56%|███████████████████████████████████████████████████████▌ | 555/1000 [1:59:37<1:35:45, 12.91s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 56%|███████████████████████████████████████████████████████▌ | 556/1000 [1:59:50<1:35:26, 12.90s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 56%|███████████████████████████████████████████████████████▋ | 557/1000 [2:00:03<1:34:55, 12.86s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 56%|███████████████████████████████████████████████████████▊ | 558/1000 [2:00:15<1:34:39, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 56%|███████████████████████████████████████████████████████▉ | 559/1000 [2:00:28<1:34:19, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json + 56%|███████████████████████████████████████████████████████▉ | 559/1000 [2:00:28<1:34:19, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:23:37,231 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-550\config.json +{'loss': 0.0329, 'learning_rate': 0.0088, 'epoch': 64.0} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:25:46,473 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 56%|████████████████████████████████████████████████████████ | 561/1000 [2:00:55<1:35:27, 13.05s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 56%|████████████████████████████████████████████████████████▏ | 562/1000 [2:01:07<1:34:54, 13.00s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 56%|████████████████████████████████████████████████████████▎ | 563/1000 [2:01:20<1:34:42, 13.00s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 56%|████████████████████████████████████████████████████████▍ | 564/1000 [2:01:33<1:34:15, 12.97s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 56%|████████████████████████████████████████████████████████▍ | 565/1000 [2:01:46<1:33:41, 12.92s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 57%|████████████████████████████████████████████████████████▌ | 566/1000 [2:01:59<1:33:13, 
12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 57%|████████████████████████████████████████████████████████▋ | 567/1000 [2:02:12<1:33:12, 12.92s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 57%|████████████████████████████████████████████████████████▊ | 568/1000 [2:02:25<1:32:41, 12.87s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json + 57%|████████████████████████████████████████████████████████▉ | 569/1000 [2:02:37<1:32:19, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:25:46,235 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-560\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:27:55,795 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:27:55,795 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json +{'loss': 0.031, 'learning_rate': 0.0086, 'epoch': 65.14} + 57%|█████████████████████████████████████████████████████████ | 571/1000 [2:03:04<1:33:10, 13.03s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 57%|█████████████████████████████████████████████████████████▏ | 572/1000 [2:03:17<1:32:52, 13.02s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 57%|█████████████████████████████████████████████████████████▎ | 573/1000 [2:03:30<1:32:11, 12.96s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 57%|█████████████████████████████████████████████████████████▍ | 574/1000 [2:03:42<1:31:55, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 57%|█████████████████████████████████████████████████████████▍ | 575/1000 [2:03:55<1:31:40, 12.94s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 58%|█████████████████████████████████████████████████████████▌ | 576/1000 [2:04:08<1:31:00, 12.88s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 58%|█████████████████████████████████████████████████████████▋ | 577/1000 [2:04:21<1:30:33, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 58%|█████████████████████████████████████████████████████████▊ | 578/1000 [2:04:34<1:30:10, 12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in 
output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json + 58%|█████████████████████████████████████████████████████████▉ | 579/1000 [2:04:46<1:29:55, 12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:27:55,597 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-570\config.json +[INFO|configuration_utils.py:362] 2023-04-23 04:30:04,658 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\generation_config.jsonration_utils.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json +[INFO|configuration_utils.py:362] 2023-04-23 04:30:04,658 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\generation_config.jsonration_utils.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json +{'loss': 0.0326, 'learning_rate': 0.0084, 'epoch': 66.29} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 04:30:04,895 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 58%|██████████████████████████████████████████████████████████ | 581/1000 [2:05:13<1:30:59, 13.03s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 58%|██████████████████████████████████████████████████████████▏ | 582/1000 [2:05:26<1:30:14, 12.95s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 58%|██████████████████████████████████████████████████████████▎ | 583/1000 [2:05:38<1:29:34, 12.89s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 58%|██████████████████████████████████████████████████████████▍ | 584/1000 [2:05:51<1:29:05, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 58%|██████████████████████████████████████████████████████████▌ | 585/1000 [2:06:04<1:28:40, 12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 59%|██████████████████████████████████████████████████████████▌ | 586/1000 [2:06:17<1:28:31, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 59%|██████████████████████████████████████████████████████████▋ | 587/1000 [2:06:29<1:28:09, 12.81s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 59%|██████████████████████████████████████████████████████████▊ | 588/1000 [2:06:42<1:28:08, 12.84s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json + 59%|██████████████████████████████████████████████████████████▉ | 589/1000 [2:06:55<1:28:02, 12.85s/it]0\special_tokens_map.json.py:457] 2023-04-23 04:30:04,656 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-580\config.json +[INFO|configuration_utils.py:362] 
[Training log (steps 590–830, condensed): every 10 steps the Trainer saves a checkpoint under output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-<step> (config.json, generation_config.json, special_tokens_map.json) and reports the metrics below; throughput holds at roughly 12.7–13.2 s/it out of 1000 total steps.]

{'loss': 0.0311, 'learning_rate': 0.0082, 'epoch': 67.43}
{'loss': 0.033, 'learning_rate': 0.008, 'epoch': 68.57}
{'loss': 0.0297, 'learning_rate': 0.0078, 'epoch': 69.71}
{'loss': 0.0331, 'learning_rate': 0.0076, 'epoch': 70.86}
{'loss': 0.0318, 'learning_rate': 0.0074, 'epoch': 72.0}
{'loss': 0.0307, 'learning_rate': 0.0072, 'epoch': 73.14}
{'loss': 0.03, 'learning_rate': 0.007, 'epoch': 74.29}
{'loss': 0.0307, 'learning_rate': 0.0068, 'epoch': 75.43}
{'loss': 0.0334, 'learning_rate': 0.0066, 'epoch': 76.57}
{'loss': 0.0321, 'learning_rate': 0.0064, 'epoch': 77.71}
{'loss': 0.029, 'learning_rate': 0.0062, 'epoch': 78.86}
{'loss': 0.0315, 'learning_rate': 0.006, 'epoch': 80.0}
{'loss': 0.0294, 'learning_rate': 0.0058, 'epoch': 81.14}
{'loss': 0.0323, 'learning_rate': 0.0056, 'epoch': 82.29}
{'loss': 0.0274, 'learning_rate': 0.0054, 'epoch': 83.43}
{'loss': 0.0305, 'learning_rate': 0.0052, 'epoch': 84.57}
{'loss': 0.0316, 'learning_rate': 0.005, 'epoch': 85.71}
{'loss': 0.0262, 'learning_rate': 0.0048, 'epoch': 86.86}
{'loss': 0.0305, 'learning_rate': 0.0046, 'epoch': 88.0}
{'loss': 0.0294, 'learning_rate': 0.0044, 'epoch': 89.14}
{'loss': 0.0291, 'learning_rate': 0.0042, 'epoch': 90.29}
{'loss': 0.0274, 'learning_rate': 0.004, 'epoch': 91.43}
{'loss': 0.032, 'learning_rate': 0.0038, 'epoch': 92.57}
{'loss': 0.0262, 'learning_rate': 0.0036, 'epoch': 93.71}
{'loss': 0.0325, 'learning_rate': 0.0034, 'epoch': 94.86}
83%|████████████████████████████████████████████████████████████████████████████████████▊ | 832/1000 [2:59:04<35:49, 12.80s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 83%|████████████████████████████████████████████████████████████████████████████████████▉ | 833/1000 [2:59:17<35:24, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 83%|█████████████████████████████████████████████████████████████████████████████████████ | 834/1000 [2:59:29<35:05, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▏ | 835/1000 [2:59:42<34:46, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▎ | 836/1000 [2:59:55<34:31, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▎ | 837/1000 [3:00:07<34:17, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▍ | 838/1000 [3:00:20<34:02, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▌ | 839/1000 [3:00:32<33:50, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▌ | 839/1000 [3:00:32<33:50, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:23:43,643 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-830\config.json +{'loss': 0.0276, 'learning_rate': 0.0032, 'epoch': 96.0} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:25:50,561 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▊ | 841/1000 [3:00:58<33:56, 12.81s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▉ | 842/1000 [3:01:11<33:31, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 84%|█████████████████████████████████████████████████████████████████████████████████████▉ | 843/1000 [3:01:23<33:14, 
12.70s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 84%|██████████████████████████████████████████████████████████████████████████████████████ | 844/1000 [3:01:36<32:59, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 84%|██████████████████████████████████████████████████████████████████████████████████████▏ | 845/1000 [3:01:49<32:46, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▎ | 846/1000 [3:02:01<32:31, 12.67s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▍ | 847/1000 [3:02:14<32:15, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▍ | 848/1000 [3:02:27<31:56, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▌ | 849/1000 [3:02:39<31:44, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▌ | 849/1000 [3:02:39<31:44, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:25:50,317 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-840\config.json +{'loss': 0.029, 'learning_rate': 0.003, 'epoch': 97.14} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:27:57,249 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▊ | 851/1000 [3:03:05<31:46, 12.80s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 85%|██████████████████████████████████████████████████████████████████████████████████████▉ | 852/1000 [3:03:18<31:27, 12.75s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 85%|███████████████████████████████████████████████████████████████████████████████████████ | 853/1000 [3:03:30<31:08, 12.71s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 85%|███████████████████████████████████████████████████████████████████████████████████████ | 854/1000 [3:03:43<30:48, 12.66s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in 
output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▏ | 855/1000 [3:03:55<30:33, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▎ | 856/1000 [3:04:08<30:19, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▍ | 857/1000 [3:04:21<30:03, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▌ | 858/1000 [3:04:33<29:48, 12.60s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▌ | 859/1000 [3:04:46<29:32, 12.57s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▌ | 859/1000 [3:04:46<29:32, 12.57s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:27:57,011 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-850\config.json +{'loss': 0.0256, 'learning_rate': 0.0028000000000000004, 'epoch': 98.29} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:30:03,777 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▊ | 861/1000 [3:05:11<29:33, 12.76s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 86%|███████████████████████████████████████████████████████████████████████████████████████▉ | 862/1000 [3:05:24<29:13, 12.70s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 86%|████████████████████████████████████████████████████████████████████████████████████████ | 863/1000 [3:05:37<28:54, 12.66s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 86%|████████████████████████████████████████████████████████████████████████████████████████▏ | 864/1000 [3:05:49<28:38, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 86%|████████████████████████████████████████████████████████████████████████████████████████▏ | 865/1000 [3:06:02<28:22, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 
87%|████████████████████████████████████████████████████████████████████████████████████████▎ | 866/1000 [3:06:14<28:14, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 87%|████████████████████████████████████████████████████████████████████████████████████████▍ | 867/1000 [3:06:27<27:58, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 87%|████████████████████████████████████████████████████████████████████████████████████████▌ | 868/1000 [3:06:40<27:44, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 87%|████████████████████████████████████████████████████████████████████████████████████████▋ | 869/1000 [3:06:52<27:32, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json + 87%|████████████████████████████████████████████████████████████████████████████████████████▋ | 869/1000 [3:06:52<27:32, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:30:03,554 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-860\config.json +{'loss': 0.0305, 'learning_rate': 0.0026000000000000003, 'epoch': 99.43} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:32:10,252 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 87%|████████████████████████████████████████████████████████████████████████████████████████▊ | 871/1000 [3:07:18<27:30, 12.79s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 87%|████████████████████████████████████████████████████████████████████████████████████████▉ | 872/1000 [3:07:31<27:13, 12.76s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 87%|█████████████████████████████████████████████████████████████████████████████████████████ | 873/1000 [3:07:43<27:01, 12.77s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 87%|█████████████████████████████████████████████████████████████████████████████████████████▏ | 874/1000 [3:07:56<26:43, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▎ | 875/1000 [3:08:09<26:28, 12.70s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▎ | 876/1000 [3:08:21<26:12, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▍ | 877/1000 
[3:08:34<26:00, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▌ | 878/1000 [3:08:47<25:40, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▋ | 879/1000 [3:08:59<25:28, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▋ | 879/1000 [3:08:59<25:28, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:32:10,051 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-870\config.json +{'loss': 0.0271, 'learning_rate': 0.0024, 'epoch': 100.57} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:34:17,348 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▊ | 881/1000 [3:09:25<25:24, 12.81s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 88%|█████████████████████████████████████████████████████████████████████████████████████████▉ | 882/1000 [3:09:38<25:05, 12.76s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 88%|██████████████████████████████████████████████████████████████████████████████████████████ | 883/1000 [3:09:50<24:48, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 88%|██████████████████████████████████████████████████████████████████████████████████████████▏ | 884/1000 [3:10:03<24:32, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 88%|██████████████████████████████████████████████████████████████████████████████████████████▎ | 885/1000 [3:10:16<24:17, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▎ | 886/1000 [3:10:28<24:05, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▍ | 887/1000 [3:10:41<23:50, 12.66s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▌ | 888/1000 [3:10:53<23:35, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved 
in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▋ | 889/1000 [3:11:06<23:24, 12.66s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▋ | 889/1000 [3:11:06<23:24, 12.66s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:34:17,141 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-880\config.json +{'loss': 0.0302, 'learning_rate': 0.0022, 'epoch': 101.71} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:36:24,366 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▉ | 891/1000 [3:11:32<23:17, 12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 89%|██████████████████████████████████████████████████████████████████████████████████████████▉ | 892/1000 [3:11:45<22:57, 12.76s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 89%|███████████████████████████████████████████████████████████████████████████████████████████ | 893/1000 [3:11:57<22:40, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 89%|███████████████████████████████████████████████████████████████████████████████████████████▏ | 894/1000 [3:12:10<22:24, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 90%|███████████████████████████████████████████████████████████████████████████████████████████▎ | 895/1000 [3:12:22<22:08, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 90%|███████████████████████████████████████████████████████████████████████████████████████████▍ | 896/1000 [3:12:35<21:54, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 90%|███████████████████████████████████████████████████████████████████████████████████████████▍ | 897/1000 [3:12:48<21:40, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 90%|███████████████████████████████████████████████████████████████████████████████████████████▌ | 898/1000 [3:13:00<21:26, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 90%|███████████████████████████████████████████████████████████████████████████████████████████▋ | 899/1000 [3:13:13<21:13, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json + 
90%|███████████████████████████████████████████████████████████████████████████████████████████▋ | 899/1000 [3:13:13<21:13, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:36:24,172 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-890\config.json +{'loss': 0.0288, 'learning_rate': 0.002, 'epoch': 102.86} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:38:31,007 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 90%|███████████████████████████████████████████████████████████████████████████████████████████▉ | 901/1000 [3:13:39<21:07, 12.81s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 90%|████████████████████████████████████████████████████████████████████████████████████████████ | 902/1000 [3:13:51<20:47, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 90%|████████████████████████████████████████████████████████████████████████████████████████████ | 903/1000 [3:14:04<20:31, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 90%|████████████████████████████████████████████████████████████████████████████████████████████▏ | 904/1000 [3:14:16<20:15, 12.67s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 90%|████████████████████████████████████████████████████████████████████████████████████████████▎ | 905/1000 [3:14:29<19:59, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 91%|████████████████████████████████████████████████████████████████████████████████████████████▍ | 906/1000 [3:14:42<19:45, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 91%|████████████████████████████████████████████████████████████████████████████████████████████▌ | 907/1000 [3:14:54<19:32, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 91%|████████████████████████████████████████████████████████████████████████████████████████████▌ | 908/1000 [3:15:07<19:20, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json + 91%|████████████████████████████████████████████████████████████████████████████████████████████▋ | 909/1000 [3:15:19<19:07, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:38:30,752 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-900\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:40:37,529 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:40:37,529 >> Special 
tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json +{'loss': 0.0261, 'learning_rate': 0.0018, 'epoch': 104.0} + 91%|████████████████████████████████████████████████████████████████████████████████████████████▉ | 911/1000 [3:15:45<18:59, 12.80s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 91%|█████████████████████████████████████████████████████████████████████████████████████████████ | 912/1000 [3:15:58<18:39, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 91%|█████████████████████████████████████████████████████████████████████████████████████████████▏ | 913/1000 [3:16:10<18:23, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 91%|█████████████████████████████████████████████████████████████████████████████████████████████▏ | 914/1000 [3:16:23<18:07, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▎ | 915/1000 [3:16:36<17:54, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▍ | 916/1000 [3:16:48<17:41, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▌ | 917/1000 [3:17:01<17:27, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▋ | 918/1000 [3:17:13<17:15, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▋ | 919/1000 [3:17:26<17:03, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▋ | 919/1000 [3:17:26<17:03, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:40:37,284 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-910\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:42:44,213 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:42:44,213 >> Special tokens file saved in 
output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 92%|█████████████████████████████████████████████████████████████████████████████████████████████▉ | 921/1000 [3:17:52<16:53, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 92%|██████████████████████████████████████████████████████████████████████████████████████████████ | 922/1000 [3:18:04<16:32, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 92%|██████████████████████████████████████████████████████████████████████████████████████████████▏ | 923/1000 [3:18:17<16:20, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 92%|██████████████████████████████████████████████████████████████████████████████████████████████▏ | 924/1000 [3:18:30<16:03, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 92%|██████████████████████████████████████████████████████████████████████████████████████████████▎ | 925/1000 [3:18:42<15:51, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 93%|██████████████████████████████████████████████████████████████████████████████████████████████▍ | 926/1000 [3:18:55<15:35, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 93%|██████████████████████████████████████████████████████████████████████████████████████████████▌ | 927/1000 [3:19:08<15:23, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 93%|██████████████████████████████████████████████████████████████████████████████████████████████▋ | 928/1000 [3:19:20<15:09, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json + 93%|██████████████████████████████████████████████████████████████████████████████████████████████▊ | 929/1000 [3:19:33<14:56, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:42:43,953 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-920\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:44:51,003 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:44:51,003 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json +{'loss': 0.0285, 'learning_rate': 0.0014000000000000002, 'epoch': 106.29} + 93%|██████████████████████████████████████████████████████████████████████████████████████████████▉ | 931/1000 
[3:19:59<14:45, 12.83s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 93%|███████████████████████████████████████████████████████████████████████████████████████████████ | 932/1000 [3:20:11<14:28, 12.77s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 93%|███████████████████████████████████████████████████████████████████████████████████████████████▏ | 933/1000 [3:20:24<14:13, 12.74s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 93%|███████████████████████████████████████████████████████████████████████████████████████████████▎ | 934/1000 [3:20:37<13:59, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 94%|███████████████████████████████████████████████████████████████████████████████████████████████▎ | 935/1000 [3:20:49<13:43, 12.67s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 94%|███████████████████████████████████████████████████████████████████████████████████████████████▍ | 936/1000 [3:21:02<13:29, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 94%|███████████████████████████████████████████████████████████████████████████████████████████████▌ | 937/1000 [3:21:14<13:15, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 94%|███████████████████████████████████████████████████████████████████████████████████████████████▋ | 938/1000 [3:21:27<13:02, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json + 94%|███████████████████████████████████████████████████████████████████████████████████████████████▊ | 939/1000 [3:21:40<12:50, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:44:50,755 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-930\config.json +[INFO|configuration_utils.py:362] 2023-04-23 05:46:57,712 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\generation_config.jsonration_utils.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json +[INFO|configuration_utils.py:362] 2023-04-23 05:46:57,712 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\generation_config.jsonration_utils.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json +{'loss': 0.0271, 'learning_rate': 0.0012, 'epoch': 107.43} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:46:57,937 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 94%|███████████████████████████████████████████████████████████████████████████████████████████████▉ | 941/1000 [3:22:06<12:36, 
12.82s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 94%|████████████████████████████████████████████████████████████████████████████████████████████████ | 942/1000 [3:22:18<12:20, 12.77s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 94%|████████████████████████████████████████████████████████████████████████████████████████████████▏ | 943/1000 [3:22:31<12:05, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 94%|████████████████████████████████████████████████████████████████████████████████████████████████▎ | 944/1000 [3:22:44<11:51, 12.70s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 94%|████████████████████████████████████████████████████████████████████████████████████████████████▍ | 945/1000 [3:22:56<11:36, 12.67s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 95%|████████████████████████████████████████████████████████████████████████████████████████████████▍ | 946/1000 [3:23:09<11:21, 12.62s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 95%|████████████████████████████████████████████████████████████████████████████████████████████████▌ | 947/1000 [3:23:21<11:06, 12.58s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 95%|████████████████████████████████████████████████████████████████████████████████████████████████▋ | 948/1000 [3:23:34<10:54, 12.58s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json + 95%|████████████████████████████████████████████████████████████████████████████████████████████████▊ | 949/1000 [3:23:46<10:40, 12.55s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:46:57,709 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-940\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:49:04,269 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:49:04,269 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json +{'loss': 0.0289, 'learning_rate': 0.001, 'epoch': 108.57} + 95%|█████████████████████████████████████████████████████████████████████████████████████████████████ | 951/1000 [3:24:12<10:24, 12.75s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 95%|█████████████████████████████████████████████████████████████████████████████████████████████████ | 952/1000 [3:24:25<10:10, 
12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 95%|█████████████████████████████████████████████████████████████████████████████████████████████████▏ | 953/1000 [3:24:37<09:59, 12.76s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 95%|█████████████████████████████████████████████████████████████████████████████████████████████████▎ | 954/1000 [3:24:50<09:46, 12.75s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 96%|█████████████████████████████████████████████████████████████████████████████████████████████████▍ | 955/1000 [3:25:03<09:32, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 96%|█████████████████████████████████████████████████████████████████████████████████████████████████▌ | 956/1000 [3:25:15<09:17, 12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 96%|█████████████████████████████████████████████████████████████████████████████████████████████████▌ | 957/1000 [3:25:28<09:03, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 96%|█████████████████████████████████████████████████████████████████████████████████████████████████▋ | 958/1000 [3:25:40<08:49, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json + 96%|█████████████████████████████████████████████████████████████████████████████████████████████████▊ | 959/1000 [3:25:53<08:36, 12.60s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:49:04,074 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-950\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:51:11,210 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:51:11,210 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json +{'loss': 0.028, 'learning_rate': 0.0008, 'epoch': 109.71} + 96%|██████████████████████████████████████████████████████████████████████████████████████████████████ | 961/1000 [3:26:19<08:18, 12.79s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 96%|██████████████████████████████████████████████████████████████████████████████████████████████████ | 962/1000 [3:26:31<08:03, 12.73s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 96%|██████████████████████████████████████████████████████████████████████████████████████████████████▏ | 963/1000 [3:26:44<07:49, 
12.68s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 96%|██████████████████████████████████████████████████████████████████████████████████████████████████▎ | 964/1000 [3:26:57<07:36, 12.67s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 96%|██████████████████████████████████████████████████████████████████████████████████████████████████▍ | 965/1000 [3:27:09<07:22, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 97%|██████████████████████████████████████████████████████████████████████████████████████████████████▌ | 966/1000 [3:27:22<07:10, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 97%|██████████████████████████████████████████████████████████████████████████████████████████████████▋ | 967/1000 [3:27:35<06:57, 12.64s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 97%|██████████████████████████████████████████████████████████████████████████████████████████████████▋ | 968/1000 [3:27:47<06:44, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 97%|██████████████████████████████████████████████████████████████████████████████████████████████████▊ | 969/1000 [3:28:00<06:30, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json + 97%|██████████████████████████████████████████████████████████████████████████████████████████████████▊ | 969/1000 [3:28:00<06:30, 12.61s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:51:10,981 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-960\config.json +{'loss': 0.0263, 'learning_rate': 0.0006, 'epoch': 110.86} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:53:17,887 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 97%|███████████████████████████████████████████████████████████████████████████████████████████████████ | 971/1000 [3:28:26<06:11, 12.80s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 97%|███████████████████████████████████████████████████████████████████████████████████████████████████▏ | 972/1000 [3:28:38<05:56, 12.75s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 97%|███████████████████████████████████████████████████████████████████████████████████████████████████▏ | 973/1000 [3:28:51<05:43, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 97%|███████████████████████████████████████████████████████████████████████████████████████████████████▎ | 974/1000 [3:29:04<05:30, 
12.71s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 98%|███████████████████████████████████████████████████████████████████████████████████████████████████▍ | 975/1000 [3:29:16<05:17, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 98%|███████████████████████████████████████████████████████████████████████████████████████████████████▌ | 976/1000 [3:29:29<05:03, 12.66s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 98%|███████████████████████████████████████████████████████████████████████████████████████████████████▋ | 977/1000 [3:29:41<04:50, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 98%|███████████████████████████████████████████████████████████████████████████████████████████████████▊ | 978/1000 [3:29:54<04:37, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 98%|███████████████████████████████████████████████████████████████████████████████████████████████████▊ | 979/1000 [3:30:07<04:25, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json + 98%|███████████████████████████████████████████████████████████████████████████████████████████████████▊ | 979/1000 [3:30:07<04:25, 12.63s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:53:17,684 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-970\config.json +{'loss': 0.0279, 'learning_rate': 0.0004, 'epoch': 112.0} +[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:55:24,863 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\special_tokens_map.json.py:457] 2023-04-23 05:55:24,623 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\config.json + 98%|████████████████████████████████████████████████████████████████████████████████████████████████████ | 981/1000 [3:30:32<04:02, 12.78s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:55:24,623 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\config.json + 98%|████████████████████████████████████████████████████████████████████████████████████████████████████▏ | 982/1000 [3:30:45<03:48, 12.72s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:55:24,623 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\config.json + 98%|████████████████████████████████████████████████████████████████████████████████████████████████████▎ | 983/1000 [3:30:58<03:35, 12.69s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:55:24,623 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\config.json + 98%|████████████████████████████████████████████████████████████████████████████████████████████████████▎ | 984/1000 [3:31:10<03:22, 12.65s/it]0\special_tokens_map.json.py:457] 2023-04-23 05:55:24,623 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\config.json + 98%|████████████████████████████████████████████████████████████████████████████████████████████████████▍ | 985/1000 
[3:31:23<03:09, 12.63s/it]
+2023-04-23 05:55:24,623 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-980\config.json
+ 99%|█████████▊| 986/1000 [3:31:35<02:56, 12.61s/it]
+ 99%|█████████▉| 989/1000 [3:32:13<02:18, 12.58s/it]
+{'loss': 0.0272, 'learning_rate': 0.0002, 'epoch': 113.14}
+[INFO|tokenization_utils_base.py:2170] 2023-04-23 05:57:31,203 >> Special tokens file saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-990\special_tokens_map.json
+2023-04-23 05:57:30,939 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-990\config.json
+ 99%|█████████▉| 991/1000 [3:32:39<01:55, 12.78s/it]
+100%|█████████▉| 999/1000 [3:34:20<00:12, 12.63s/it]
+{'loss': 0.0285, 'learning_rate': 0.0, 'epoch': 114.29}
+100%|██████████| 1000/1000 [3:34:33<00:00, 12.87s/it]
+2023-04-23 05:59:37,562 >> Configuration saved in output\adgen-chatglm-6b-pt-128-2e-2\checkpoint-1000\config.json
+{'train_runtime': 12879.4173, 'train_samples_per_second': 1.242, 'train_steps_per_second': 0.078, 'train_loss': 0.2596614052057266, 'epoch': 114.29}
+***** train metrics *****
+ epoch = 114.29
+ train_loss = 0.2597
+ train_runtime = 3:34:39.41
+ train_samples = 140
+ train_samples_per_second = 1.242
+ train_steps_per_second = 0.078 \ No newline at end of file diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/files/requirements.txt b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/requirements.txt new file mode 100644 index 0000000000000000000000000000000000000000..b1c5887dbde19aeef8b7d993f1ad21a385d07e57 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/requirements.txt @@ -0,0 +1,451 @@ +-pencv-python==4.5.5.62 +-ywin32==302 +absl-py==1.2.0 +accelerate==0.18.0 +addict==2.4.0 +aiofiles==22.1.0 +aiohttp==3.8.4 +aiosignal==1.3.1 +alibabacloud-nls-java-sdk==2.0.0 +aliyun-python-sdk-core-v3==2.13.33 +aliyun-python-sdk-core==2.13.3 +aliyun-python-sdk-kms==2.15.0 +altair==4.2.2 +altgraph==0.17.3 +anyio==3.6.2 +appdirs==1.4.4 +apscheduler==3.9.1.post1 +argcomplete==2.0.0 +argon2-cffi-bindings==21.2.0 +argon2-cffi==21.3.0 +arrow==1.2.3 +arxiv==1.4.4 +astor==0.8.1 +asttokens==2.2.1 +async-timeout==4.0.2 +attrdict==2.0.1 +attrs==22.1.0 +babel==2.12.1 +backcall==0.2.0 +backoff==2.2.1 +backports.zoneinfo==0.2.1 +basicsr==1.4.2 +bce-python-sdk==0.8.83 +beautifulsoup4==4.11.1 +betterproto==1.2.5 +bitsandbytes==0.38.1 +black==23.3.0 +bleach==5.0.1 +blinker==1.5 +brotli==1.0.9 +cachetools==5.3.0 +certifi==2022.12.7 
+cffi==1.15.0 +cfgv==3.3.1 +charset-normalizer==2.0.12 +chatgpt-api==0.2.1 +click==8.1.3 +cohere==4.1.4 +colorama==0.4.4 +colorcet==3.0.1 +comm==0.1.2 +commonmark==0.9.1 +configparser==5.2.0 +contourpy==1.0.5 +cpm-kernels==1.0.11 +crcmod==1.7 +cryptography==38.0.4 +cssselect==1.2.0 +cssutils==2.6.0 +cycler==0.11.0 +cython==0.29.32 +dataclasses-json==0.5.7 +datasets==2.11.0 +datetime==4.4 +debugpy==1.6.5 +decorator==5.1.1 +defusedxml==0.7.1 +deprecated==1.2.13 +dicttoxml==1.7.4 +dill==0.3.6 +distlib==0.3.6 +distro==1.8.0 +docker-pycreds==0.4.0 +entrypoints==0.4 +et-xmlfile==1.1.0 +exceptiongroup==1.1.0 +executing==1.2.0 +facexlib==0.2.5 +fastapi==0.95.0 +fastjsonschema==2.16.2 +feedparser==6.0.10 +ffmpeg-python==0.2.0 +ffmpy==0.3.0 +filelock==3.10.7 +filterpy==1.4.5 +fire==0.5.0 +flask-babel==3.0.1 +flask==2.2.3 +flatbuffers==22.12.6 +fonttools==4.37.4 +fqdn==1.5.1 +frozenlist==1.3.3 +fschat==0.2.1 +fsspec==2023.3.0 +future==0.18.2 +gevent==22.10.2 +geventhttpclient==2.0.2 +gfpgan==1.3.8 +gitdb==4.0.10 +gitpython==3.1.31 +glfw==2.5.5 +google-auth-oauthlib==1.0.0 +google-auth==2.16.0 +googlebard==0.0.7 +gptcache==0.1.11 +gradio-client==0.1.3 +gradio==3.23.0 +greenlet==2.0.1 +grpcio==1.51.1 +grpclib==0.4.3 +h11==0.14.0 +h2==4.1.0 +heartrate==0.2.2 +hpack==4.0.0 +httpcore==0.16.3 +httpx==0.23.1 +huggingface-hub==0.13.3 +hypercorn==0.14.3 +hyperframe==6.0.1 +identify==2.5.22 +idna==3.3 +imageio==2.26.1 +importlib-metadata==6.0.0 +importlib-resources==5.10.2 +infi==0.0.1 +iniconfig==2.0.0 +ipydatawidgets==4.3.2 +ipykernel==6.19.4 +ipympl==0.9.3 +ipython-genutils==0.2.0 +ipython==8.7.0 +ipywidgets==8.0.6 +isoduration==20.11.0 +itk-core==5.3.0 +itk-filtering==5.3.0 +itk-meshtopolydata==0.10.0 +itk-numerics==5.3.0 +itkwidgets==0.32.6 +itsdangerous==2.1.2 +jedi==0.18.2 +jieba==0.42.1 +jinja2==3.1.2 +jmespath==0.10.0 +joblib==1.2.0 +jsonlines==3.1.0 +jsonpointer==2.3 +jsonschema==4.17.3 +jupyter-client==7.4.8 +jupyter-console==6.4.4 +jupyter-contrib-core==0.4.2 +jupyter-contrib-nbextensions==0.7.0 +jupyter-core==5.1.2 +jupyter-events==0.5.0 +jupyter-highlight-selected-word==0.2.0 +jupyter-nbextensions-configurator==0.6.1 +jupyter-server-terminals==0.4.3 +jupyter-server==2.0.6 +jupyter==1.0.0 +jupyterlab-pygments==0.2.2 +jupyterlab-widgets==3.0.7 +keyboard==0.13.5 +kiwisolver==1.4.4 +klembord==0.3.0 +kociemba==1.2.1 +labelimg==1.8.6 +langchain==0.0.139 +latex2mathml==3.75.2 +lazy-loader==0.1 +linkify-it-py==2.0.0 +llama-index==0.5.15 +llvmlite==0.39.1 +lmdb==1.4.0 +loguru==0.7.0 +loralib==0.1.1 +lxml==4.9.0 +markdown-it-py==2.2.0 +markdown2==2.4.8 +markdown==3.4.1 +markupsafe==2.1.1 +marshmallow-enum==1.5.1 +marshmallow==3.19.0 +matplotlib-inline==0.1.6 +matplotlib==3.6.0 +matrix-webcam==0.4.2 +mdit-py-plugins==0.3.3 +mdtex2html==1.2.0 +mdurl==0.1.2 +mediapipe==0.8.11 +medpy==0.4.0 +mistune==2.0.4 +mne==1.3.1 +more-itertools==9.1.0 +mouseinfo==0.1.3 +mpmath==1.3.0 +multidict==6.0.3 +multiprocess==0.70.14 +mypy-extensions==1.0.0 +natsort==8.2.0 +nbclassic==0.4.8 +nbclient==0.7.2 +nbconvert==7.2.7 +nbformat==5.7.1 +nest-asyncio==1.5.6 +networkx==3.0 +nibabel==5.0.1 +nls==1.0.0 +nltk==3.8.1 +nodeenv==1.7.0 +nomic==1.1.6 +notebook-shim==0.2.2 +notebook==6.4.12 +nptyping==2.5.0 +nuitka==0.6.19.3 +numba==0.56.4 +numpy==1.23.3 +oauthlib==3.2.2 +onnx==1.12.0 +onnxruntime==1.11.1 +openai-whisper==20230314 +openai==0.27.4 +openapi-schema-pydantic==1.2.4 +opencv-contrib-python==4.5.5.64 +opencv-python==3.4.9.31 +opengraph-py3==0.71 +openpyxl==3.0.10 +opt-einsum==3.3.0 +orjson==3.8.8 +packaging==23.1 
+paddle-bfloat==0.1.7 +paddlepaddle-gpu==2.4.2 +paddlepaddle==2.3.2 +pandas-stubs==1.5.2.221213 +pandas==1.5.2 +pandocfilters==1.5.0 +param==1.13.0 +parso==0.8.3 +pascal-voc-writer==0.1.4 +pathspec==0.11.1 +pathtools==0.1.2 +pdfkit==1.0.0 +pefile==2022.5.30 +peft==0.3.0.dev0 +pickleshare==0.7.5 +pillow==9.1.0 +ping3==4.0.4 +pip==23.0.1 +pipx==1.1.0 +pkgutil-resolve-name==1.3.10 +platformdirs==2.6.2 +pluggy==1.0.0 +pooch==1.7.0 +pre-commit==3.2.1 +premailer==3.10.0 +priority==2.0.0 +prometheus-client==0.15.0 +prompt-toolkit==3.0.36 +protobuf==3.20.0 +psutil==5.9.4 +pure-eval==0.2.2 +py-cpuinfo==9.0.0 +pyarrow==11.0.0 +pyasn1-modules==0.2.8 +pyasn1==0.4.8 +pyaudio==0.2.11 +pyautogui==0.9.53 +pyclipper==1.3.0.post3 +pycparser==2.21 +pycryptodome==3.14.1 +pyct==0.5.0 +pydantic==1.10.7 +pydeck==0.8.1b0 +pydicom==2.3.1 +pydub==0.25.1 +pyee==9.0.4 +pyexecjs==1.5.1 +pygame==2.1.2 +pygameshader==1.0.8 +pygetwindow==0.0.9 +pygithub==1.57 +pyglm==2.6.0 +pygments==2.13.0 +pyinstaller-hooks-contrib==2022.14 +pyinstaller==5.7.0 +pyjwt==2.6.0 +pymediainfo==5.1.0 +pympler==1.0.1 +pymsgbox==1.0.9 +pymupdf==1.19.6 +pynacl==1.5.0 +pynput==1.7.6 +pynrrd==1.0.0 +pyopengl==3.1.6 +pyopenssl==22.1.0 +pyparsing==3.0.9 +pyperclip==1.8.2 +pypinyin==0.46.0 +pypiwin32==223 +pyqt5-plugins==5.15.4.2.2 +pyqt5-qt5==5.15.2 +pyqt5-sip==12.9.1 +pyqt5-tools==5.15.4.3.2 +pyqt5==5.15.4 +pyrect==0.2.0 +pyrsistent==0.19.3 +pyscreeze==0.1.28 +pyshader==0.7.0 +pyside2==5.15.2.1 +pysocks==1.7.1 +pytest==7.2.1 +python-dateutil==2.8.2 +python-docx==0.8.11 +python-dotenv==0.21.1 +python-json-logger==2.0.4 +python-magic-bin==0.4.14 +python-markdown-math==0.8 +python-multipart==0.0.6 +python-rapidjson==1.10 +pytweening==1.0.4 +pytz-deprecation-shim==0.1.0.post0 +pytz==2022.7.1 +pywavelets==1.4.1 +pywifi==1.1.12 +pywin32-ctypes==0.2.0 +pywin32==306 +pywinpty==2.0.10 +pyyaml==6.0 +pyzmq==24.0.1 +qrcode==7.3.1 +qt5-applications==5.15.2.2.2 +qt5-tools==5.15.2.1.2 +qtconsole==5.4.0 +qtpy==2.3.0 +quart-cors==0.5.0 +quart==0.18.3 +rarfile==4.0 +regex==2023.3.23 +requests-oauthlib==1.3.1 +requests==2.27.1 +responses==0.18.0 +rfc3339-validator==0.1.4 +rfc3986-validator==0.1.1 +rfc3986==1.5.0 +rich==12.6.0 +rouge-chinese==1.0.3 +rsa==4.9 +rwkv==0.7.3 +scikit-build==0.16.4 +scikit-image==0.20.0 +scikit-learn==1.2.1 +scipy==1.8.0 +seaborn==0.12.2 +semantic-version==2.10.0 +send2trash==1.8.0 +sentencepiece==0.1.98 +sentry-sdk==1.18.0 +setproctitle==1.3.2 +setuptools-rust==1.5.2 +setuptools==66.0.0 +sgmllib3k==1.0.0 +shellingham==1.5.0.post1 +shiboken2==5.15.2.1 +simpleitk==2.2.1 +six==1.16.0 +sklearn==0.0.post1 +smmap==5.0.0 +sniffio==1.3.0 +sounddevice==0.4.3 +soupsieve==2.3.2.post1 +sqlalchemy==1.4.46 +srt==3.5.1 +stack-data==0.6.2 +starlette==0.26.1 +streamlit==1.21.0 +stringcase==1.2.0 +svgwrite==1.4.3 +sympy==1.11.1 +tb-nightly==2.13.0a20230319 +tenacity==8.2.2 +tencentcloud-sdk-python==3.0.592 +tensorboard-data-server==0.7.0 +tensorboard-plugin-wit==1.8.1 +tensorboard==2.12.0 +tensorboardx==2.5 +termcolor==2.2.0 +terminado==0.17.1 +thop==0.1.1.post2209072238 +threadpoolctl==3.1.0 +tifffile==2023.3.15 +tiktoken==0.3.1 +tinycss2==1.2.1 +tokenize-rt==5.0.0 +tokenizers==0.13.3 +toml==0.10.2 +tomli==2.0.1 +toolz==0.12.0 +torch==2.0.0+cu117 +torchaudio==2.0.1+cu117 +torchvision==0.15.1+cu117 +tornado==6.2 +tqdm==4.64.1 +traitlets==5.9.0 +traittypes==0.2.1 +transformers==4.27.1 +tritonclient==2.31.0 +typer==0.7.0 +types-pytz==2022.7.0.0 +typing-extensions==4.4.0 +typing-inspect==0.8.0 +tzdata==2022.7 +tzlocal==4.2 +uc-micro-py==1.0.1 
+ultralytics==8.0.59 +uri-template==1.2.0 +urllib3==1.23 +userpath==1.8.0 +uvicorn==0.21.1 +validators==0.20.0 +virtualenv==20.21.0 +visualdl==2.5.1 +vtk==9.2.6 +wandb==0.14.2 +watchdog==2.1.9 +wavedrom==2.0.3.post3 +wcwidth==0.2.5 +webcolors==1.12 +webencodings==0.5.1 +websocket-client==1.3.1 +websockets==10.4 +wechaty-grpc==0.20.19 +wechaty-puppet-service==0.8.10 +wechaty-puppet==0.4.23 +wechaty==0.10.7 +werkzeug==2.2.2 +wheel==0.38.4 +widgetsnbextension==4.0.7 +win32-setctime==1.1.0 +windows-curses==2.3.0 +wonderwords==2.2.0 +wrapt==1.14.1 +wsproto==1.2.0 +x2paddle==1.4.0 +xxhash==3.2.0 +yagmail==0.15.293 +yapf==0.32.0 +yarl==1.8.2 +you-get==0.4.1555 +youtube-dl==2021.12.17 +zipp==3.11.0 +zope.event==4.6 +zope.interface==5.4.0 +zstandard==0.20.0 \ No newline at end of file diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/files/wandb-metadata.json b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/wandb-metadata.json new file mode 100644 index 0000000000000000000000000000000000000000..05d6012770a2881754b8d4a79bd8704f83fd80f7 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/wandb-metadata.json @@ -0,0 +1,84 @@ +{ + "os": "Windows-10-10.0.19041-SP0", + "python": "3.8.10", + "heartbeatAt": "2023-04-22T18:25:04.689821", + "startedAt": "2023-04-22T18:25:03.115711", + "docker": null, + "cuda": null, + "args": [ + "--do_train", + "--train_file", + ".\\datasets\\Zettels\\train.json", + "--validation_file", + ".\\datasets\\Zettels\\dev.json", + "--prompt_column", + "content", + "--response_column", + "summary", + "--overwrite_cache", + "--model_name_or_path", + "..\\models\\chatglm-6b-int4", + "--output_dir", + "output\\adgen-chatglm-6b-pt-128-2e-2", + "--overwrite_output_dir", + "--max_source_length", + "64", + "--max_target_length", + "64", + "--per_device_train_batch_size", + "1", + "--per_device_eval_batch_size", + "1", + "--gradient_accumulation_steps", + "16", + "--predict_with_generate", + "--max_steps", + "1000", + "--logging_steps", + "10", + "--save_steps", + "10", + "--learning_rate", + "2e-2", + "--pre_seq_len", + "128", + "--quantization_bit", + "4" + ], + "state": "running", + "program": "main.py", + "codePath": "ptuning\\main.py", + "git": { + "remote": "https://github.com/THUDM/ChatGLM-6B", + "commit": "01e6313abf4122d789d6e68128856af52847b355" + }, + "cpu_count": 6, + "cpu_count_logical": 12, + "cpu_freq": { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + }, + "cpu_freq_per_core": [ + { + "current": 2592.0, + "min": 0.0, + "max": 2592.0 + } + ], + "disk": { + "total": 500.32030868530273, + "used": 240.75128936767578 + }, + "gpu": "NVIDIA GeForce RTX 2060", + "gpu_count": 1, + "gpu_devices": [ + { + "name": "NVIDIA GeForce RTX 2060", + "memory_total": 6442450944 + } + ], + "memory": { + "total": 63.87089538574219 + } +} diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/files/wandb-summary.json b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/wandb-summary.json new file mode 100644 index 0000000000000000000000000000000000000000..efa0d15883065aa1c6a3647a157f97c6f66fc599 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/files/wandb-summary.json @@ -0,0 +1 @@ +{"train/loss": 0.0285, "train/learning_rate": 0.0, "train/epoch": 114.29, "train/global_step": 1000, "_timestamp": 1682200778.1567664, "_runtime": 12875.006358385086, "_step": 100, "train/train_runtime": 12879.4173, "train/train_samples_per_second": 1.242, "train/train_steps_per_second": 0.078, "train/total_flos": 3.4665721233408e+16, "train/train_loss": 0.2596614052057266, 
"_wandb": {"runtime": 12873}} \ No newline at end of file diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/logs/debug-internal.log b/ptuning/wandb/run-20230423_022503-g3y4djvd/logs/debug-internal.log new file mode 100644 index 0000000000000000000000000000000000000000..cab1c62d19ac6e182142a9b865264992f789c777 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/logs/debug-internal.log @@ -0,0 +1,12600 @@ +2023-04-23 02:25:03,135 INFO StreamThr :34348 [internal.py:wandb_internal():86] W&B internal server running at pid: 34348, started at: 2023-04-23 02:25:03.134408 +2023-04-23 02:25:03,137 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status +2023-04-23 02:25:03,152 INFO WriterThread:34348 [datastore.py:open_for_write():85] open: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\run-g3y4djvd.wandb +2023-04-23 02:25:03,152 DEBUG SenderThread:34348 [sender.py:send():375] send: header +2023-04-23 02:25:03,216 DEBUG SenderThread:34348 [sender.py:send():375] send: run +2023-04-23 02:25:03,919 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: check_version +2023-04-23 02:25:03,920 INFO SenderThread:34348 [dir_watcher.py:__init__():219] watching files in: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files +2023-04-23 02:25:03,920 INFO SenderThread:34348 [sender.py:_start_run_threads():1124] run started: g3y4djvd with start time 1682187903.150408 +2023-04-23 02:25:03,920 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 02:25:03,921 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 02:25:03,921 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: check_version +2023-04-23 02:25:04,568 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: run_start +2023-04-23 02:25:04,621 DEBUG HandlerThread:34348 [system_info.py:__init__():31] System info init +2023-04-23 02:25:04,621 DEBUG HandlerThread:34348 [system_info.py:__init__():46] System info init done +2023-04-23 02:25:04,621 INFO HandlerThread:34348 [system_monitor.py:start():181] Starting system monitor +2023-04-23 02:25:04,622 INFO SystemMonitor:34348 [system_monitor.py:_start():145] Starting system asset monitoring threads +2023-04-23 02:25:04,622 INFO HandlerThread:34348 [system_monitor.py:probe():201] Collecting system info +2023-04-23 02:25:04,631 INFO SystemMonitor:34348 [interfaces.py:start():190] Started cpu monitoring +2023-04-23 02:25:04,631 INFO SystemMonitor:34348 [interfaces.py:start():190] Started disk monitoring +2023-04-23 02:25:04,632 INFO SystemMonitor:34348 [interfaces.py:start():190] Started gpu monitoring +2023-04-23 02:25:04,645 INFO SystemMonitor:34348 [interfaces.py:start():190] Started memory monitoring +2023-04-23 02:25:04,664 INFO SystemMonitor:34348 [interfaces.py:start():190] Started network monitoring +2023-04-23 02:25:04,689 DEBUG HandlerThread:34348 [system_info.py:probe():195] Probing system +2023-04-23 02:25:04,692 DEBUG HandlerThread:34348 [system_info.py:_probe_git():180] Probing git +2023-04-23 02:25:04,693 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:04,790 DEBUG HandlerThread:34348 [system_info.py:_probe_git():188] Probing git done +2023-04-23 02:25:04,790 DEBUG HandlerThread:34348 [system_info.py:probe():240] Probing system done +2023-04-23 02:25:04,790 DEBUG HandlerThread:34348 
[system_monitor.py:probe():210] {'os': 'Windows-10-10.0.19041-SP0', 'python': '3.8.10', 'heartbeatAt': '2023-04-22T18:25:04.689821', 'startedAt': '2023-04-22T18:25:03.115711', 'docker': None, 'cuda': None, 'args': ('--do_train', '--train_file', '.\\datasets\\Zettels\\train.json', '--validation_file', '.\\datasets\\Zettels\\dev.json', '--prompt_column', 'content', '--response_column', 'summary', '--overwrite_cache', '--model_name_or_path', '..\\models\\chatglm-6b-int4', '--output_dir', 'output\\adgen-chatglm-6b-pt-128-2e-2', '--overwrite_output_dir', '--max_source_length', '64', '--max_target_length', '64', '--per_device_train_batch_size', '1', '--per_device_eval_batch_size', '1', '--gradient_accumulation_steps', '16', '--predict_with_generate', '--max_steps', '1000', '--logging_steps', '10', '--save_steps', '10', '--learning_rate', '2e-2', '--pre_seq_len', '128', '--quantization_bit', '4'), 'state': 'running', 'program': 'main.py', 'codePath': 'ptuning\\main.py', 'git': {'remote': 'https://github.com/THUDM/ChatGLM-6B', 'commit': '01e6313abf4122d789d6e68128856af52847b355'}, 'cpu_count': 6, 'cpu_count_logical': 12, 'cpu_freq': {'current': 2592.0, 'min': 0.0, 'max': 2592.0}, 'cpu_freq_per_core': [{'current': 2592.0, 'min': 0.0, 'max': 2592.0}], 'disk': {'total': 500.32030868530273, 'used': 240.75128936767578}, 'gpu': 'NVIDIA GeForce RTX 2060', 'gpu_count': 1, 'gpu_devices': [{'name': 'NVIDIA GeForce RTX 2060', 'memory_total': 6442450944}], 'memory': {'total': 63.87089538574219}} +2023-04-23 02:25:04,791 INFO HandlerThread:34348 [system_monitor.py:probe():211] Finished collecting system info +2023-04-23 02:25:04,791 INFO HandlerThread:34348 [system_monitor.py:probe():214] Publishing system info +2023-04-23 02:25:04,791 DEBUG HandlerThread:34348 [system_info.py:_save_pip():51] Saving list of pip packages installed into the current environment +2023-04-23 02:25:04,792 DEBUG HandlerThread:34348 [system_info.py:_save_pip():67] Saving pip packages done +2023-04-23 02:25:04,793 INFO HandlerThread:34348 [system_monitor.py:probe():216] Finished publishing system info +2023-04-23 02:25:04,807 DEBUG SenderThread:34348 [sender.py:send():375] send: files +2023-04-23 02:25:04,807 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-metadata.json with policy now +2023-04-23 02:25:04,820 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:25:04,820 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:25:04,936 INFO Thread-16 :34348 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\requirements.txt +2023-04-23 02:25:04,936 INFO Thread-16 :34348 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-metadata.json +2023-04-23 02:25:04,936 INFO Thread-16 :34348 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 02:25:05,207 DEBUG SenderThread:34348 [sender.py:send():375] send: telemetry +2023-04-23 02:25:05,207 DEBUG SenderThread:34348 [sender.py:send():375] send: config +2023-04-23 02:25:05,208 DEBUG SenderThread:34348 [sender.py:send():375] send: metric +2023-04-23 02:25:05,208 DEBUG SenderThread:34348 [sender.py:send():375] send: telemetry +2023-04-23 02:25:05,208 DEBUG SenderThread:34348 
[sender.py:send():375] send: metric +2023-04-23 02:25:05,208 WARNING SenderThread:34348 [sender.py:send_metric():1329] Seen metric with glob (shouldn't happen) +2023-04-23 02:25:05,938 INFO Thread-16 :34348 [dir_watcher.py:_on_file_created():278] file/dir created: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:25:06,740 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:07,959 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:25:08,301 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:08,798 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:08,973 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:25:10,847 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:11,682 INFO wandb-upload_0:34348 [upload_job.py:push():137] Uploaded file C:\Users\Lenovo\AppData\Local\Temp\tmpsvd1s5c8wandb\jrb6svnl-wandb-metadata.json +2023-04-23 02:25:12,896 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:13,327 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:14,928 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:16,976 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:18,371 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:19,027 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:19,834 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:25:19,835 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:25:21,071 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:23,123 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:23,709 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:25,157 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:26,160 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:25:27,213 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:28,746 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:29,252 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:31,309 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:33,350 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:33,791 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:34,852 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:25:34,852 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:25:35,403 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\config.yaml +2023-04-23 02:25:35,480 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:37,501 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:39,523 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:40,147 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:41,522 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:25:41,547 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:43,569 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:45,614 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:45,661 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:47,659 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:49,714 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:49,859 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:25:49,859 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:25:51,102 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:51,748 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:53,790 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:55,847 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:56,154 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:25:57,870 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:25:57,878 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:25:59,924 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:01,968 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:02,058 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:04,017 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:04,691 DEBUG SystemMonitor:34348 [system_monitor.py:_start():159] Starting system metrics aggregation loop +2023-04-23 02:26:04,692 DEBUG SenderThread:34348 [sender.py:send():375] send: stats 
+2023-04-23 02:26:04,863 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:26:04,864 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:26:06,230 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:07,151 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:08,252 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:10,281 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:12,244 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:12,321 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:14,368 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:16,406 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:26:16,415 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:17,775 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:18,451 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:19,866 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:26:19,866 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:26:20,498 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:22,544 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:23,120 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:24,587 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:26,635 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:28,160 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:28,677 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:30,750 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:31,752 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:26:32,800 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:33,935 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:34,700 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:26:34,840 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:34,885 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:26:34,886 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:26:36,958 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:38,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:39,178 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:40,994 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:43,017 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:44,238 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:45,081 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:47,158 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:48,154 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:26:49,213 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:49,856 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:49,902 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:26:49,902 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:26:51,266 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:53,311 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:55,175 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:26:55,345 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:57,393 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:26:59,446 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:00,225 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:01,492 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:03,540 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:04,706 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:27:04,911 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:27:04,912 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:27:05,584 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:07,264 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:07,719 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:09,749 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:10,728 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:27:11,804 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-23 02:27:12,917 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:13,833 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:15,871 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:17,923 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:17,964 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:19,960 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:27:19,960 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:27:19,973 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:22,004 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:23,717 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:24,038 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:27:24,047 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:26,104 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:28,154 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:28,745 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:30,198 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:32,250 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:33,772 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:34,290 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:34,721 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:27:34,955 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:27:34,955 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:27:36,334 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:38,454 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:39,230 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:40,421 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:27:40,472 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:42,484 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:44,256 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:44,535 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:46,568 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:48,619 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:49,314 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:49,971 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:27:49,971 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:27:50,667 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:52,705 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:54,762 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:55,266 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:27:55,475 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 02:27:55,477 DEBUG SenderThread:34348 [sender.py:send():375] send: metric +2023-04-23 02:27:55,477 DEBUG SenderThread:34348 [sender.py:send():375] send: metric +2023-04-23 02:27:55,477 DEBUG SenderThread:34348 [sender.py:send():375] send: metric +2023-04-23 02:27:55,478 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 02:27:55,478 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 02:27:55,479 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 02:27:55,766 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 02:27:56,799 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:27:56,802 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:27:57,815 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:27:58,848 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:00,750 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:00,905 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:02,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:04,728 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:28:05,012 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:28:05,013 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:28:05,024 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:06,267 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:07,083 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:09,213 ERROR gpu :34348 [interfaces.py:monitor():144] Failed 
to sample metric: Not Supported +2023-04-23 02:28:11,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:11,306 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:13,248 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:14,237 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:28:15,281 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:17,233 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:17,324 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:18,320 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\config.yaml +2023-04-23 02:28:19,378 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:19,997 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:28:19,997 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:28:21,420 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:23,278 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:23,475 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:25,511 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:27,553 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:28,327 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:29,598 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:31,654 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:32,650 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:28:33,694 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:34,100 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:34,736 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:28:35,008 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:28:35,009 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:28:35,733 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:37,780 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:39,291 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:39,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:41,910 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:43,930 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:44,336 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:45,976 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:48,021 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:49,008 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:28:49,919 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:50,058 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:28:50,058 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:28:50,078 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:52,143 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:54,182 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:55,396 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:28:56,469 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:28:58,523 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:00,470 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:00,580 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:02,651 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:04,711 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:04,748 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:29:05,014 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:29:05,015 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:29:06,239 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:06,811 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:08,566 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:29:08,848 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:10,967 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:11,296 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:12,987 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:15,008 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:16,345 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-23 02:29:17,032 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:19,065 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:20,021 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:29:20,022 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:29:20,730 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:29:21,113 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:22,283 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:23,160 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:25,199 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:27,249 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:27,343 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:29,293 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:31,351 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:32,874 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:29:33,257 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:33,405 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:34,755 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:29:35,033 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:29:35,034 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:29:35,451 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:37,499 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:38,305 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:39,545 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:41,672 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:43,350 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:43,693 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:45,714 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:47,034 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:29:47,743 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:49,276 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:49,793 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:50,044 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:29:50,045 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:29:51,844 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:53,888 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:54,319 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:29:55,930 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:57,968 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:29:59,865 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:30:00,010 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:01,219 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:30:02,061 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:04,099 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:04,758 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:30:04,965 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:30:05,040 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:30:05,040 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:30:06,141 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:08,195 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:10,240 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:10,350 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:30:12,403 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:13,375 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:30:14,416 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:15,780 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:30:16,451 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:18,493 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:30:20,041 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:30:20,041 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:30:20,539 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported
+[2023-04-23 02:30–02:40, wandb internal log for run-20230423_022503-g3y4djvd: routine HandlerThread/SenderThread status_report and stop_status requests, periodic updates to output.log and wandb-summary.json under E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files, and a recurring "ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" roughly every two seconds.]
+2023-04-23 02:40:34,976 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:40:35,417 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:40:35,418 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:40:36,604 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:38,645 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:39,707 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:40:40,682 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:42,743 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:44,764 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:40:44,773 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:46,817 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:47,819 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:40:48,882 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:50,287 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:40:50,427 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:40:50,428 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:40:50,911 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:52,946 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:54,987 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:55,733 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:40:57,108 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:40:59,125 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:00,098 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:01,149 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:01,252 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:03,177 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:04,978 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:41:05,214 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:05,437 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:41:05,437 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:41:06,692 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-23 02:41:07,254 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:09,303 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:11,137 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 02:41:11,138 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 02:41:11,138 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 02:41:11,139 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 02:41:11,360 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 02:41:11,370 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:12,364 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:12,409 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:13,400 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:13,408 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:14,409 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:15,444 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:17,459 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:17,485 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:19,532 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:20,451 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:41:20,452 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:41:21,582 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:22,738 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:23,626 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:25,658 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:25,671 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:27,779 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:28,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:29,804 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:31,822 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:33,664 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:33,849 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:34,986 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:41:35,469 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:41:35,470 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:41:35,890 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:37,930 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:39,506 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:39,957 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:39,966 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:42,014 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:44,054 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:44,550 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:46,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:48,165 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:49,610 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:50,204 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:50,462 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:41:50,463 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:41:52,252 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:41:52,259 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:54,315 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:54,774 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:41:56,369 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:58,481 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:41:59,835 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:00,498 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:02,521 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:04,564 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log 
+2023-04-23 02:42:04,569 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:04,986 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:42:04,987 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:05,483 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:42:05,483 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:42:06,611 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:08,646 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:10,684 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:10,794 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:12,728 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:14,776 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:15,833 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:16,837 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:17,838 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:42:18,883 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:20,493 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:42:20,494 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:42:20,929 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:21,753 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:22,972 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:25,002 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:26,799 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:27,047 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:29,153 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:30,117 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:42:31,180 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:31,871 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:33,193 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:34,992 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:42:35,221 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:35,505 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:42:35,506 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:42:37,283 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:37,762 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:39,328 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:41,381 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:43,444 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:43,678 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:44,436 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:42:45,472 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:47,519 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:48,718 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:49,570 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:50,518 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:42:50,518 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:42:51,612 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:53,664 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:53,798 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:55,710 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:56,707 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:42:57,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:42:59,607 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:42:59,863 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:01,889 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:03,903 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:04,653 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:05,000 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:43:05,515 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:43:05,516 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:43:05,943 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:07,984 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-23 02:43:08,977 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:43:10,031 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:10,482 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:12,080 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:14,131 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:15,533 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:16,169 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:18,203 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:20,244 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:20,313 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 02:43:20,314 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 02:43:20,314 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 02:43:20,316 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 02:43:20,553 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:20,566 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:43:20,566 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:43:21,244 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:43:21,244 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 02:43:22,286 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:43:22,303 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:24,347 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:25,861 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:26,387 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:28,432 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:30,552 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:30,872 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:32,565 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:34,549 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:43:34,588 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:35,012 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:43:35,565 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:43:35,565 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:43:36,612 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:36,814 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:38,643 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:40,685 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:41,848 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:42,736 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:44,780 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:46,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:47,555 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:48,863 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:43:48,872 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:50,560 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:43:50,560 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:43:50,929 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:52,829 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:52,964 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:54,997 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:57,045 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:43:57,877 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:43:59,086 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:01,166 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:44:01,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:03,200 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:03,413 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:05,016 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:44:05,225 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-23 02:44:05,570 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:44:05,571 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:44:07,267 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:08,845 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:09,299 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:11,341 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:13,382 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:14,207 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:14,382 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:44:15,428 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:17,462 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:19,247 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:19,518 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:20,580 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:44:20,580 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:44:21,548 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:23,594 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:24,858 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:25,639 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:26,632 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:44:27,680 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:29,735 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:30,024 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:31,857 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:33,876 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:35,020 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:44:35,422 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:35,578 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:44:35,578 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:44:35,904 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:37,936 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:38,933 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:44:39,977 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:40,840 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:42,021 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:44,069 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:45,886 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:46,104 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:48,152 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:50,189 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:50,584 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:44:50,585 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:44:51,859 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:52,249 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:53,243 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:44:54,293 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:56,337 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:44:56,912 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:44:58,373 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:00,415 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:01,971 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:02,517 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:04,545 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:05,033 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:45:05,508 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:45:05,601 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:45:05,601 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:45:06,567 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:07,875 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:08,580 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 02:45:10,628 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:12,664 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:12,901 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:14,709 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:16,749 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:45:16,750 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:18,285 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:18,811 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:20,615 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:45:20,615 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:45:20,850 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:22,892 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:23,899 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:24,950 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:26,990 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:28,939 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:28,951 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 02:45:28,952 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 02:45:28,953 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 02:45:28,954 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 02:45:29,025 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 02:45:29,038 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:30,037 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:45:31,085 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:45:31,102 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:33,211 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:34,240 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:35,047 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:45:35,229 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-23 02:45:35,625 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:45:35,626 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:45:37,257 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:39,306 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:39,922 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:41,351 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:43,380 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:45:43,396 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:45,208 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:45,442 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:47,495 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:49,552 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:50,245 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:50,641 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:45:50,641 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:45:51,602 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:53,648 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:55,696 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:55,827 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:45:57,738 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:45:57,746 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:45:59,786 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:00,884 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:01,834 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:03,947 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:05,058 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:46:05,654 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:46:05,654 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:46:05,915 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:05,965 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:07,985 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:08,968 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:46:10,016 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:11,511 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:12,063 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:14,106 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:16,146 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:16,555 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:18,192 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:20,237 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:20,660 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:46:20,661 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:46:21,228 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:46:21,920 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:22,270 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:24,316 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:26,353 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:26,955 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:28,406 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:30,447 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:31,998 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:32,503 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:33,498 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:46:34,610 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:35,072 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:46:35,667 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:46:35,668 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:46:36,633 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:37,934 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:38,656 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 02:46:40,676 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:42,729 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:42,979 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:44,771 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:46,810 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:47,818 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:46:48,426 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:48,850 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:50,673 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:46:50,674 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:46:50,890 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:52,942 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:53,939 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:46:54,988 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:57,036 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:59,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:46:59,095 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:47:00,076 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:47:01,126 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:47:03,172 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:47:04,151 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:47:05,086 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:47:05,292 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:47:05,682 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:47:05,683 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:47:07,305 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:47:09,328 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:47:09,966 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:47:11,355 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:47:11,364 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported
+2023-04-23 02:47:13,399 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported
+2023-04-23 02:47:15,767 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report
+2023-04-23 02:47:20,693 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status
+2023-04-23 02:47:20,694 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status
+2023-04-23 02:47:25,640 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log
+2023-04-23 02:47:35,093 DEBUG SenderThread:34348 [sender.py:send():375] send: stats
+2023-04-23 02:47:36,176 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history
+2023-04-23 02:47:36,177 DEBUG SenderThread:34348 [sender.py:send():375] send: history
+2023-04-23 02:47:36,177 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record
+2023-04-23 02:47:36,179 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end
+2023-04-23 02:47:36,923 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json
+[The same messages repeat every few seconds until 2023-04-23 02:58:02: the GPU monitor logs "Failed to sample metric: Not Supported" roughly every two seconds, while the handler/sender threads continue to emit status_report, stop_status, stats, history/summary_record, and dir_watcher file-modified entries for output.log and wandb-summary.json.]
+2023-04-23 02:58:02,257 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295]
file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:58:02,265 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:02,759 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:04,310 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:05,255 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:58:06,150 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:58:06,150 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:58:06,366 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:08,421 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:08,427 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:10,478 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:12,523 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:13,363 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 02:58:13,364 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 02:58:13,365 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 02:58:13,366 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 02:58:13,524 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 02:58:13,619 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:14,548 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:58:14,557 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:15,556 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:58:16,631 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:18,665 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:18,668 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:20,785 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:21,157 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:58:21,157 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:58:22,810 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:24,427 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:24,832 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 02:58:26,862 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:28,898 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:58:28,906 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:29,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:30,971 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:33,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:34,726 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:35,063 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:35,257 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:58:36,152 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:58:36,153 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:58:37,121 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:39,160 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:40,353 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:41,195 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:58:41,204 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:43,246 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:45,306 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:45,394 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:47,357 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:49,393 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:50,430 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:51,165 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:58:51,165 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:58:51,496 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:53,523 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:54,479 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:58:55,546 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:56,010 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:58:57,577 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:58:59,613 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:01,046 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:01,657 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:03,703 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:05,271 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:59:05,741 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:06,188 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:59:06,189 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:59:06,450 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:06,739 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:59:07,808 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:09,852 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:11,499 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:11,903 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:13,945 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:15,992 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:16,545 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:18,041 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:19,041 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:59:20,087 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:21,194 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:59:21,194 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:59:22,194 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:22,450 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:24,211 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:26,238 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:27,498 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:28,283 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:30,333 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:31,335 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] 
file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:59:32,398 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:32,881 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:34,431 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:35,282 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 02:59:36,224 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:59:36,225 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:59:36,476 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:38,518 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:38,531 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:40,565 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:42,613 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:44,522 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:44,655 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:59:44,665 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:46,715 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:48,763 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:49,581 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:50,810 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:51,239 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 02:59:51,239 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 02:59:52,930 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:54,954 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:55,135 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 02:59:56,947 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 02:59:56,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 02:59:59,005 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:00,156 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:01,056 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:03,109 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:05,149 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:05,211 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:05,290 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:00:06,257 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:00:06,257 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:00:07,193 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:09,229 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:00:09,237 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:10,829 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:11,284 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:13,336 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:15,380 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:15,861 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:17,420 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:19,461 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:20,453 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:00:20,454 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:00:20,455 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:00:20,456 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:00:20,463 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:00:21,043 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:21,268 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:00:21,268 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:00:21,497 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:00:21,504 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:22,511 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:00:23,574 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:00:23,607 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:25,617 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:26,571 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:27,640 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:29,684 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:31,623 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:31,739 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:33,797 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:35,292 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:00:35,830 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:00:35,840 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:36,277 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:00:36,277 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:00:37,524 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:37,885 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:39,938 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:41,984 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:42,568 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:44,022 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:46,063 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:47,062 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:00:48,114 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:48,342 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:50,163 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:51,280 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:00:51,280 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:00:52,222 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:53,547 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:00:54,346 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:56,376 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:58,392 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:00:58,589 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-23 03:01:00,425 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:01,428 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:01:02,474 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:04,020 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:04,526 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:05,301 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:01:06,291 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:01:06,292 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:01:06,573 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:08,625 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:09,563 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:10,664 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:12,712 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:13,717 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:01:14,609 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:14,756 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:16,806 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:18,855 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:19,650 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:20,895 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:21,309 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:01:21,310 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:01:22,939 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:25,046 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:25,313 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:26,008 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:01:27,072 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:29,096 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:30,353 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:31,139 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:33,182 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:35,235 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:35,312 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:01:36,085 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:36,322 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:01:36,322 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:01:37,264 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:01:37,273 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:39,327 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:41,377 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:42,004 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:43,420 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:45,455 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:47,050 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:47,498 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:49,544 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:51,344 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:01:51,344 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:01:51,586 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:01:51,594 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:52,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:53,644 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:55,744 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:57,663 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:01:57,766 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:01:59,782 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:01,825 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:03,218 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:03,870 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log 
+2023-04-23 03:02:03,881 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:05,316 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:02:05,922 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:06,342 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:02:06,343 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:02:07,973 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:08,596 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:10,020 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:12,070 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:13,626 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:14,131 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:16,162 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:02:16,172 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:18,217 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:18,955 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:20,263 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:21,337 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:02:21,337 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:02:22,307 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:24,376 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:24,614 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:26,518 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:27,664 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:02:27,665 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:02:27,665 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:02:27,666 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:02:28,491 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:02:28,491 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:02:28,538 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:29,505 INFO Thread-16 :34348 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:02:29,926 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:30,591 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:32,632 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:34,665 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:34,946 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:35,324 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:02:36,346 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:02:36,346 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:02:36,717 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:38,766 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:40,608 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:40,825 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:41,826 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:02:42,870 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:44,917 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:45,899 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:46,964 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:49,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:50,927 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:51,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:51,349 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:02:51,349 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:02:53,112 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:55,170 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:56,163 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:02:56,534 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:02:57,278 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:02:59,296 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:01,328 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
03:03:01,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:03,368 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:05,324 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:03:05,400 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:06,373 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:03:06,373 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:03:06,616 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:07,447 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:08,451 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:03:09,491 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:11,535 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:11,665 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:13,592 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:15,641 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:16,682 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:17,691 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:19,737 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:03:19,749 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:21,381 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:03:21,382 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:03:21,789 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:22,639 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:23,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:25,876 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:27,674 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:27,993 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:30,002 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:32,031 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:03:32,043 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:33,300 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-23 03:03:34,078 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:35,334 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:03:36,128 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:36,396 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:03:36,396 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:03:38,169 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:38,654 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:40,231 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:42,282 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:43,707 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:44,321 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:46,370 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:03:46,371 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:48,433 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:48,953 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:50,471 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:51,399 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:03:51,400 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:03:52,518 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:54,551 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:54,667 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:03:56,595 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:03:58,674 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:03:58,713 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:00,536 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:00,739 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:02,755 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:04,787 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:05,347 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:04:06,138 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:06,405 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:04:06,406 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:04:06,848 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:08,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:09,883 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:04:10,940 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:11,138 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:12,976 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:15,033 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:16,180 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:17,081 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:19,125 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:21,172 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:21,413 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:04:21,413 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:04:21,667 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:23,222 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:24,221 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:04:25,279 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:26,812 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:27,325 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:29,446 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:31,458 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:31,842 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:04:33,471 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:04:34,414 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:04:34,415 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:04:34,415 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:04:34,416 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:04:34,468 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:19,852 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:21,817 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:15:21,817 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:15:21,900 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:23,931 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:24,937 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:15:25,849 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:25,970 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:28,016 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:30,050 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:30,910 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:32,129 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:34,171 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:35,529 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:15:36,235 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:36,532 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:36,829 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:15:36,829 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:15:37,225 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:15:38,288 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:40,351 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:42,129 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:42,394 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:44,505 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:46,529 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:47,174 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:48,553 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:49,532 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:15:50,587 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
03:15:51,834 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:15:51,834 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:15:52,633 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:53,087 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:54,684 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:56,725 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:15:58,134 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:15:58,776 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:00,855 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:01,848 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:16:02,885 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:03,661 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:04,925 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:05,532 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:16:06,843 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:16:06,843 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:16:06,967 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:09,017 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:09,114 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:11,073 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:13,106 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:14,264 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:15,174 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:16:15,212 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:17,239 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:19,248 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:19,303 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:21,287 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:21,844 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:16:21,844 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:16:23,327 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:25,116 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:25,368 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:27,407 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:16:27,416 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:29,455 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:30,871 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:31,496 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:33,545 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:35,537 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:16:35,593 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:36,544 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:36,843 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:16:36,844 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:16:37,627 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:39,670 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:16:39,672 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:41,729 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:42,440 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:43,774 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:45,881 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:47,492 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:47,895 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:49,915 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:51,861 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:16:51,861 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:16:51,962 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:53,123 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:16:53,993 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:16:54,001 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 03:16:56,039 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:58,087 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:16:58,166 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:00,131 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:02,169 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:03,212 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:04,217 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:05,230 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:17:05,534 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:17:06,262 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:06,870 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:17:06,871 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:17:08,306 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:09,157 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:10,346 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:12,410 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:14,193 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:14,448 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:16,373 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:17:16,374 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:17:16,375 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:17:16,377 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:17:16,530 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:17:16,563 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:17,537 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:17:18,553 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:17:18,577 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:19,673 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:20,591 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: 
Not Supported +2023-04-23 03:17:21,887 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:17:21,887 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:17:22,639 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:24,685 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:25,171 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:26,734 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:28,773 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:30,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:30,841 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:31,838 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:17:32,872 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:34,922 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:35,565 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:17:36,592 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:36,891 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:17:36,891 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:17:36,960 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:39,002 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:41,045 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:42,165 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:43,087 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:44,082 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:17:45,116 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:47,226 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:47,416 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:49,244 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:51,266 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:51,896 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:17:51,896 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:17:53,155 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:53,293 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:55,338 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:17:55,339 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:57,380 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:17:59,116 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:17:59,426 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:01,479 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:03,540 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:04,157 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:05,572 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:18:05,582 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:06,917 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:18:06,918 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:18:07,664 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:09,692 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:18:09,704 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:09,830 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:11,744 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:13,787 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:14,888 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:15,826 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:17,938 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:19,928 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:19,960 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:21,961 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:18:21,961 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:18:21,963 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:18:21,973 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:24,016 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:25,243 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report 
+2023-04-23 03:18:26,060 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:28,106 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:30,161 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:30,272 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:32,224 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:34,260 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:18:34,266 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:35,582 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:18:35,583 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:36,316 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:36,936 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:18:36,937 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:18:38,353 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:40,414 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:41,214 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:42,451 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:44,500 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:46,548 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:46,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:47,553 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:18:48,666 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:50,685 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:51,610 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:51,941 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:18:51,942 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:18:52,709 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:54,751 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:56,794 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:57,228 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:18:58,835 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:18:59,835 INFO Thread-16 :34348 
[dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:19:00,896 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:02,229 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:02,933 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:04,974 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:05,596 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:19:06,950 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:19:06,951 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:19:07,008 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:08,219 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:09,048 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:11,104 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:12,100 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:19:13,145 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:13,781 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:15,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:17,225 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:18,845 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:19,332 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:21,340 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:21,965 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:19:21,965 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:19:23,357 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:23,403 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:19:23,404 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:19:23,404 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:19:23,406 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:19:24,348 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:19:24,348 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:19:24,680 DEBUG 
HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:25,400 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:26,395 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:19:27,443 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:29,487 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:29,710 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:31,518 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:33,579 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:34,764 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:35,612 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:19:35,624 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:36,988 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:19:36,989 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:19:37,688 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:38,681 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:19:39,735 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:40,258 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:41,766 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:43,815 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:45,305 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:45,862 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:47,925 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:50,000 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:19:50,021 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:51,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:52,017 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:19:52,017 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:19:52,038 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:54,067 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:19:56,098 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-23 03:19:56,308 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:19:58,155 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:00,199 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:01,341 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:02,231 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:04,277 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:20:04,287 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:05,618 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:20:06,321 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:06,629 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:07,006 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:20:07,006 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:20:08,358 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:10,396 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:12,284 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:12,450 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:14,493 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:16,533 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:20:16,540 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:17,643 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:18,588 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:20,704 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:22,025 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:20:22,026 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:20:22,729 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:23,285 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:24,748 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:26,772 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:28,789 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:20:28,798 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:29,276 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:30,858 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:32,910 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:34,316 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:34,948 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:35,645 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:20:36,998 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:37,028 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:20:37,029 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:20:39,040 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:39,819 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:40,038 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:20:41,080 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:43,125 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:44,868 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:45,167 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:47,202 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:49,243 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:49,926 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:51,351 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:52,030 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:20:52,031 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:20:53,361 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:54,344 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:20:55,387 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:55,451 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:20:57,438 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:20:59,487 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:00,496 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:01,532 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-23 03:21:03,582 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:05,631 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:05,646 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:21:05,647 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:06,627 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:21:07,036 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:21:07,036 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:21:07,680 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:09,727 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:11,327 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:11,778 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:13,828 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:15,878 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:16,364 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:17,929 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:18,919 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:21:19,960 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:21,677 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:22,059 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:21:22,059 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:21:22,069 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:24,086 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:26,110 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:27,354 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:28,153 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:30,188 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:30,337 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:21:30,338 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:21:30,338 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:21:30,339 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:21:31,186 INFO Thread-16 
:34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:21:32,238 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:21:32,246 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:32,573 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:33,251 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:21:34,285 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:35,649 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:21:36,330 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:37,058 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:21:37,059 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:21:38,319 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:38,377 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:40,439 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:42,500 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:43,371 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:44,547 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:21:44,562 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:46,591 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:48,639 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:48,650 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:50,691 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:52,063 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:21:52,064 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:21:52,801 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:54,340 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:21:54,825 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:56,846 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:21:58,861 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:21:58,872 ERROR gpu :34348 
[... wandb debug-internal log output elided (run-20230423_022503-g3y4djvd, process 34348, 2023-04-23 03:22–03:32): the GPU monitor repeatedly logs "ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" roughly every two seconds, interleaved with routine HandlerThread status_report/stop_status handling, periodic SenderThread "send: stats" messages, occasional history/summary_record saves of wandb-summary.json, and Thread-16 dir_watcher notifications for output.log under E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\ ...]
[handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:32:52,454 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:32:53,467 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:32:54,739 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:32:55,515 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:32:55,526 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:32:57,561 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:32:59,610 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:00,287 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:01,665 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:03,713 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:05,339 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:05,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:05,859 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:33:07,445 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:33:07,446 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:33:07,866 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:09,852 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:33:09,876 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:11,014 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:11,901 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:13,925 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:15,959 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:16,056 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:18,007 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:20,041 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:21,686 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:22,084 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:33:22,092 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:22,471 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 
03:33:22,472 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:33:24,142 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:26,189 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:26,767 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:28,243 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:30,287 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:31,794 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:32,340 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:34,378 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:33:34,386 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:35,858 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:33:36,435 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:36,865 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:37,476 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:33:37,477 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:33:38,552 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:40,563 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:42,576 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:42,752 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:44,621 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:46,667 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:47,661 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:33:47,911 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:48,711 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:50,758 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:52,490 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:33:52,490 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:33:52,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:53,748 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:54,858 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:56,905 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:58,960 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:33:59,604 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:33:59,954 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:34:01,030 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:03,083 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:04,630 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:05,136 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:05,859 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:34:07,207 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:07,498 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:34:07,498 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:34:09,318 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:09,771 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:11,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:34:11,289 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:34:11,289 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:34:11,290 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:34:11,291 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:34:11,341 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:12,298 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:34:13,364 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:14,335 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:34:15,389 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:15,584 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:17,415 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:19,448 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:20,637 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:21,490 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:22,511 
DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:34:22,512 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:34:23,525 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:25,563 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:26,468 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:26,565 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:34:27,621 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:29,669 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:31,523 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:31,708 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:33,742 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:35,786 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:35,867 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:34:36,881 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:37,527 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:34:37,528 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:34:37,843 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:38,834 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:34:39,955 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:41,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:42,812 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:43,993 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:46,010 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:47,842 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:48,060 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:50,094 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:34:50,102 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:52,165 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:52,523 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:34:52,524 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: 
stop_status +2023-04-23 03:34:53,781 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:34:54,218 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:56,258 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:58,300 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:34:58,802 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:00,333 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:02,386 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:04,428 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:35:04,437 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:04,525 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:05,877 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:35:06,476 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:07,541 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:35:07,541 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:35:08,523 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:09,805 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:10,631 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:12,640 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:14,656 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:14,846 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:16,685 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:35:16,696 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:18,734 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:20,245 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:20,777 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:22,541 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:35:22,541 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:35:22,821 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:24,864 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:25,863 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:26,913 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:28,961 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:35:28,973 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:30,888 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:31,026 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:33,057 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:35,101 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:35,891 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:35:35,892 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:37,138 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:37,555 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:35:37,556 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:35:39,181 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:41,303 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:41,429 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:42,280 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:35:43,322 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:45,339 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:46,489 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:47,378 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:49,420 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:51,463 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:51,533 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:52,577 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:35:52,577 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:35:53,506 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:54,505 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:35:55,554 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:35:57,004 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:35:57,584 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 03:35:59,622 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:01,661 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:02,056 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:03,707 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:05,752 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:05,903 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:36:06,756 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:36:07,151 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:07,581 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:36:07,582 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:36:07,793 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:09,858 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:11,985 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:12,919 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:14,009 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:16,020 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:17,970 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:18,044 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:18,262 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:36:18,263 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:36:18,264 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:36:18,265 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:36:19,042 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:36:20,098 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:21,094 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:36:22,144 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:22,591 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:36:22,591 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:36:23,851 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:24,201 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 03:36:26,253 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:28,300 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:28,907 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:30,348 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:32,397 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:36:32,405 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:34,452 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:34,644 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:35,906 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:36:36,496 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:37,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:36:37,603 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:36:38,546 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:39,863 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:40,587 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:42,698 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:44,692 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:36:44,711 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:45,284 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:46,750 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:48,801 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:50,320 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:50,834 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:52,625 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:36:52,626 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:36:52,875 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:54,910 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:55,893 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:36:56,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:36:58,978 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:36:58,986 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:01,011 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:01,048 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:03,086 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:05,142 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:05,909 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:37:06,919 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:07,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:07,631 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:37:07,631 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:37:09,238 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:11,288 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:37:11,299 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:12,584 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:13,424 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:15,436 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:17,469 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:17,623 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:19,508 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:21,549 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:22,649 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:37:22,649 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:37:22,892 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:23,587 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:37:23,594 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:25,641 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:27,686 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:27,927 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:29,725 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:31,772 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 03:37:32,976 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:33,817 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:35,865 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:35,918 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:37:36,861 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:37:37,669 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:37:37,670 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:37:37,907 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:38,939 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:39,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:41,992 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:44,030 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:44,088 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:46,107 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:48,129 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:49,128 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:37:49,522 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:50,164 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:52,223 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:52,681 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:37:52,682 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:37:54,271 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:54,951 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:37:56,302 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:58,356 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:37:59,997 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:00,385 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:01,384 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:38:02,432 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:04,473 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:05,230 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:05,923 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:38:06,516 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:07,691 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:38:07,691 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:38:08,570 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:10,613 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:10,942 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:12,672 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:14,794 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:15,749 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:38:16,819 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:16,883 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:18,831 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:20,857 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:21,936 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:22,693 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:38:22,693 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:38:22,883 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:24,932 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:25,536 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:38:25,537 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:38:25,537 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:38:25,538 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:38:25,933 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:38:26,969 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:38:26,982 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:27,842 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:29,021 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 03:38:31,086 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:32,873 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:33,123 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:35,157 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:35,936 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:38:37,208 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:37,700 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:38:37,700 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:38:37,951 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:39,244 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:41,295 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:38:41,306 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:42,982 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:43,348 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:45,450 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:47,461 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:48,018 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:49,484 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:51,525 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:52,704 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:38:52,704 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:38:53,559 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:38:53,569 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:53,948 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:55,620 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:57,667 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:38:58,995 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:38:59,705 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:01,747 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:03,794 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:04,045 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:39:05,821 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:39:05,836 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:05,940 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:39:07,711 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:39:07,711 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:39:07,865 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:09,902 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:09,989 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:39:11,965 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:14,000 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:15,050 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:39:16,110 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:18,125 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:19,095 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:39:20,148 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:20,171 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:39:22,173 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:22,729 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:39:22,729 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:39:24,194 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:25,997 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:39:26,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:28,270 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:30,313 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:31,315 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:39:31,832 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:39:32,364 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:34,400 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:39:35,945 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:39:36,445 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported
+[repeated wandb debug-internal.log entries, run-20230423_022503-g3y4djvd, 2023-04-23 03:39–03:49: "ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" recurring roughly every 2 s, interleaved with periodic HandlerThread status_report / stop_status requests, SenderThread stats / history / summary_record sends, and dir_watcher notifications for E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log and wandb-summary.json]
+2023-04-23 03:49:56,147 ERROR
gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:49:56,387 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:49:58,177 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:49:58,188 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:00,294 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:01,479 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:02,308 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:04,330 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:06,113 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:50:06,345 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:07,122 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:08,119 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:50:08,120 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:50:08,396 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:10,430 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:50:10,441 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:12,317 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:12,487 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:14,528 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:16,569 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:17,342 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:18,616 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:20,654 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:22,632 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:22,694 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:23,134 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:50:23,134 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:50:24,725 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:50:24,732 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:26,789 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-23 03:50:28,437 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:28,829 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:30,947 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:32,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:33,494 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:34,988 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:36,127 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:50:37,004 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:50:37,013 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:38,150 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:50:38,151 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:50:39,064 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:39,413 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:41,100 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:43,149 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:44,463 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:45,183 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:47,225 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:48,226 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:50:49,275 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:49,808 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:51,327 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:53,166 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:50:53,166 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:50:53,364 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:55,400 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:55,438 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:50:57,442 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:50:59,489 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:00,497 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:01,588 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:02,566 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:51:03,611 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:05,595 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:05,634 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:06,131 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:51:07,652 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:08,182 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:51:08,182 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:51:09,678 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:11,473 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:11,725 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:13,295 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:51:13,297 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:51:13,297 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:51:13,298 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:51:13,785 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:51:13,788 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:14,791 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:51:15,853 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:16,562 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:17,891 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:19,942 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:21,611 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:21,992 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:23,193 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:51:23,193 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:51:24,023 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:26,062 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:26,628 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report 
+2023-04-23 03:51:28,107 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:29,105 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:51:30,141 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:31,667 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:32,266 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:34,285 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:36,146 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:51:36,309 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:37,156 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:38,201 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:51:38,202 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:51:38,336 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:40,372 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:51:40,379 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:42,423 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:42,466 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:44,474 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:46,516 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:47,516 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:48,559 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:50,592 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:52,641 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:52,701 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:53,207 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:51:53,208 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:51:54,681 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:51:54,693 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:56,737 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:51:58,515 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:51:58,785 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 03:52:00,837 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:02,957 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:03,561 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:04,980 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:06,159 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:52:06,985 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:52:06,993 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:08,216 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:52:08,217 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:52:09,032 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:09,482 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:11,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:13,112 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:14,523 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:15,161 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:17,193 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:19,228 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:52:19,240 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:19,931 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:21,286 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:23,234 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:52:23,234 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:52:23,332 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:25,378 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:25,508 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:27,429 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:29,469 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:30,537 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:31,508 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:33,580 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:52:33,614 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:35,635 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:35,681 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:36,174 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:52:37,661 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:38,241 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:52:38,241 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:52:39,694 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:41,512 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:41,742 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:43,781 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:44,783 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:52:45,811 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:47,466 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:47,864 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:49,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:51,948 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:52,510 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:52:53,253 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:52:53,253 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:52:53,985 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:56,025 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:57,023 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:52:58,076 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:52:58,186 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:00,137 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:02,171 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:03,230 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:04,311 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:06,189 DEBUG SenderThread:34348 [sender.py:send():375] 
send: stats +2023-04-23 03:53:06,323 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:08,266 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:53:08,266 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:53:08,372 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:08,527 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:10,421 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:11,430 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:53:12,474 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:13,954 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:14,516 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:16,556 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:18,603 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:18,992 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:20,635 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:21,693 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:53:21,695 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:53:21,695 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:53:21,696 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:53:22,675 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:53:22,676 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:53:22,689 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:23,272 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:53:23,273 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:53:23,680 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:53:24,519 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:24,720 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:26,767 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:28,804 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:29,572 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:30,873 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:32,902 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:34,615 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:35,013 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:36,199 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:53:37,003 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:53:37,022 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:38,292 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:53:38,292 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:53:39,054 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:40,569 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:41,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:43,140 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:45,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:45,621 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:47,240 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:49,273 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:53:49,282 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:50,777 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:51,323 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:53,303 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:53:53,304 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:53:53,373 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:55,418 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:56,577 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:53:57,462 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:53:59,507 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:01,544 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:54:01,551 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
03:54:02,504 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:03,611 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:05,736 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:06,201 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:54:07,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:07,794 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:08,314 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:54:08,314 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:54:09,774 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:11,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:13,256 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:13,857 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:15,892 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:54:15,903 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:17,948 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:18,294 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:19,992 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:22,030 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:23,318 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:54:23,318 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:54:23,569 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:24,073 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:26,117 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:28,152 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:54:28,161 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:29,027 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:30,203 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:32,265 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:34,069 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:34,304 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:36,216 DEBUG SenderThread:34348 
[sender.py:send():375] send: stats +2023-04-23 03:54:36,420 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:38,334 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:54:38,334 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:54:38,437 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:39,406 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:54:39,831 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:40,458 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:42,496 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:44,583 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:44,877 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:46,653 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:48,695 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:49,934 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:50,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:52,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:53,355 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:54:53,356 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:54:53,814 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:54:54,873 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:55,637 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:54:56,918 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:54:58,971 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:00,688 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:01,038 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:03,099 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:05,148 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:06,229 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:06,230 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:55:07,291 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:08,241 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:55:08,365 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:55:08,365 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:55:09,309 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:11,332 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:11,648 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:13,374 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:15,421 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:16,694 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:17,487 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:19,561 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:55:19,572 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:21,628 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:22,521 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:23,391 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:55:23,391 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:55:23,675 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:25,716 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:27,745 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:27,759 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:29,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:31,641 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 03:55:31,643 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 03:55:31,644 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 03:55:31,645 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 03:55:31,853 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 03:55:31,866 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:32,939 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:33,899 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:55:33,907 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:35,968 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:36,231 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:55:38,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:38,397 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:55:38,397 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:55:38,653 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:40,090 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:42,114 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:43,695 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:44,159 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:46,218 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:48,244 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:55:48,253 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:49,674 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:50,340 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:52,402 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:53,415 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:55:53,416 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:55:54,444 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:55,673 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:55:56,499 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:58,540 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:55:59,533 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:56:00,619 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:01,679 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:02,676 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:04,725 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:06,230 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:56:06,772 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:07,238 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 
03:56:08,438 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:56:08,438 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:56:08,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:10,904 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:12,660 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:12,922 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:13,909 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:56:14,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:17,003 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:17,713 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:19,048 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:21,089 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:22,768 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:23,138 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:23,448 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:56:23,449 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:56:25,191 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:26,191 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:56:27,247 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:28,580 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:29,301 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:31,351 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:33,398 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:33,619 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:35,446 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:36,237 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:56:37,493 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:38,465 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:56:38,466 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:56:38,497 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:56:38,723 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:39,614 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:41,640 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:43,663 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:43,774 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:45,688 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:47,730 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:48,810 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:49,780 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:51,804 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:56:51,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:53,488 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:56:53,488 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:56:53,865 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:54,743 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:56:55,919 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:57,966 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:56:59,804 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:57:00,011 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:57:02,062 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:57:04,115 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 03:57:04,117 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:57:05,300 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 03:57:06,177 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:57:06,252 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 03:57:08,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:57:08,494 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 03:57:08,494 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 03:57:10,342 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 03:57:10,773 DEBUG HandlerThread:34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:03,227 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:03,827 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:05,871 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:06,453 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:08:07,916 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:08,315 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:08,915 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:08:08,915 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:08:09,944 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:08:09,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:12,019 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:14,054 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:14,217 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:16,100 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:18,145 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:19,252 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:20,194 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:21,197 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:08:22,236 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:23,936 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:08:23,936 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:08:24,281 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:25,192 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:26,397 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:28,415 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:30,241 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:30,443 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:32,488 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:32,731 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:08:32,733 DEBUG SenderThread:34348 [sender.py:send():375] send: history 
+2023-04-23 04:08:32,733 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:08:32,734 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:08:33,487 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:08:33,487 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:08:34,534 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:35,537 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:08:35,990 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:36,453 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:08:36,579 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:38,622 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:38,941 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:08:38,941 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:08:40,658 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:41,216 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:42,706 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:44,751 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:46,788 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:47,198 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:47,793 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:08:48,828 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:50,889 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:52,240 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:52,922 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:53,930 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:08:53,931 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:08:54,958 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:57,068 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:08:58,220 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:08:59,076 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to 
sample metric: Not Supported +2023-04-23 04:09:01,109 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:02,114 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:09:03,153 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:04,155 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:05,190 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:06,460 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:09:07,239 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:08,938 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:09:08,938 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:09:09,200 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:09,290 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:11,333 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:13,362 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:09:13,370 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:14,943 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:15,408 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:17,460 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:19,503 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:19,980 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:21,543 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:23,585 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:23,936 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:09:23,937 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:09:25,615 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:09:25,623 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:25,707 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:27,735 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:29,757 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:30,746 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:31,781 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:33,824 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:35,806 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:35,880 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:36,464 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:09:37,923 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:38,947 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:09:38,947 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:09:39,962 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:09:39,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:41,225 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:42,010 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:44,056 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:46,101 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:46,260 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:48,145 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:50,191 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:51,280 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:52,237 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:09:52,251 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:53,955 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:09:53,955 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:09:54,280 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:56,341 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:09:57,221 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:09:58,457 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:00,470 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:02,276 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:02,519 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:03,523 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log 
+2023-04-23 04:10:04,559 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:06,473 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:10:06,593 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:07,489 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:08,649 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:08,959 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:10:08,959 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:10:10,704 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:12,765 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:13,222 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:14,796 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:16,838 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:17,836 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:10:18,894 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:18,969 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:20,935 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:22,975 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:23,983 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:10:23,984 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:10:24,255 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:25,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:27,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:29,184 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:29,781 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:30,146 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:10:31,200 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:33,216 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:34,832 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:35,241 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:36,473 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:10:37,288 ERROR gpu :34348 [interfaces.py:monitor():144] Failed 
to sample metric: Not Supported +2023-04-23 04:10:38,995 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:10:38,996 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:10:39,342 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:40,264 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:41,386 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:41,593 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:10:41,594 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:10:41,595 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:10:41,596 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:10:42,393 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:10:43,424 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:10:43,432 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:44,436 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:10:45,483 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:45,889 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:47,528 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:49,570 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:50,938 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:51,617 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:53,661 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:54,009 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:10:54,010 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:10:55,699 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:56,221 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:10:56,694 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:10:57,739 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:10:59,857 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:01,270 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:01,883 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:03,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:05,930 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:06,315 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:06,476 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:11:07,973 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:09,019 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:11:09,020 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:11:10,011 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:11:10,019 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:12,083 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:12,316 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:14,123 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:16,173 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:17,354 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:18,211 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:20,264 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:22,304 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:11:22,315 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:22,928 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:24,028 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:11:24,028 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:11:24,358 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:26,402 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:28,314 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:28,435 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:30,556 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:32,580 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:33,365 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:34,555 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log 
+2023-04-23 04:11:34,593 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:36,483 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:11:36,616 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:38,433 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:38,654 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:39,042 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:11:39,043 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:11:40,689 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:42,733 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:44,335 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:44,793 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:46,837 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:48,880 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:11:48,887 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:49,495 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:50,933 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:52,985 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:54,046 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:11:54,047 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:11:55,021 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:55,298 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:11:57,074 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:11:59,118 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:00,116 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:12:01,216 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:01,300 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:03,234 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:05,256 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:06,349 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:06,494 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:12:07,277 ERROR gpu :34348 [interfaces.py:monitor():144] Failed 
to sample metric: Not Supported +2023-04-23 04:12:09,059 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:12:09,060 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:12:09,326 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:11,369 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:12,151 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:13,402 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:14,398 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:12:15,447 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:17,195 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:17,488 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:19,538 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:21,576 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:22,249 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:23,624 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:24,087 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:12:24,087 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:12:25,673 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:26,675 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:12:27,714 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:27,990 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:29,771 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:31,915 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:33,045 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:33,931 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:35,947 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:36,497 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:12:37,973 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:38,459 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:39,086 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:12:39,087 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:12:40,006 
ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:41,015 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:12:42,058 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:44,108 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:44,395 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:46,156 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:48,209 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:49,442 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:50,253 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:50,692 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:12:50,694 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:12:50,694 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:12:50,695 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:12:51,247 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:12:52,307 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:12:52,322 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:54,108 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:12:54,108 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:12:54,357 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:55,374 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:12:56,399 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:12:58,443 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:00,431 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:00,492 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:02,614 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:04,619 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:13:04,634 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:06,131 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:06,506 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 
04:13:06,660 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:08,707 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:09,119 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:13:09,119 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:13:10,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:11,414 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:12,800 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:14,844 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:16,465 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:16,882 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:18,921 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:13:18,930 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:20,966 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:21,968 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:23,001 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:24,118 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:13:24,119 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:13:25,052 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:27,097 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:27,385 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:29,156 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:31,185 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:13:31,195 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:32,802 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:33,306 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:35,320 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:36,520 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:13:37,343 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:38,488 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:39,117 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:13:39,118 DEBUG SenderThread:34348 
[sender.py:send_request():402] send_request: stop_status +2023-04-23 04:13:39,369 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:41,410 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:43,466 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:43,618 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:45,504 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:13:45,506 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:47,564 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:48,657 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:49,610 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:51,653 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:53,686 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:54,120 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:13:54,121 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:13:54,372 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:55,730 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:56,732 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:13:57,783 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:13:59,449 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:13:59,845 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:01,899 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:04,005 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:04,510 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:06,022 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:06,534 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:14:08,032 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:09,134 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:14:09,135 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:14:10,073 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:10,389 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:11,068 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:14:12,130 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:14,164 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:15,439 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:16,207 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:18,265 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:20,297 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:20,489 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:22,347 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:23,349 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:14:24,151 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:14:24,151 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:14:24,391 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:26,426 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:26,428 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:28,479 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:30,547 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:31,482 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:32,601 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:34,707 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:35,697 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:14:36,540 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:14:36,541 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:36,722 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:38,737 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:39,166 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:14:39,166 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:14:40,790 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:42,449 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:14:42,830 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:14:44,872 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported
[... wandb debug-internal log condensed: from 2023-04-23 04:14:46 to 04:25:33 the run g3y4djvd log repeats "ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" roughly every two seconds, interleaved with periodic HandlerThread/SenderThread status_report and stop_status requests, stats sends, summary_record saves of wandb-summary.json, and dir_watcher notices that output.log was modified ...]
+2023-04-23 04:25:33,907 ERROR gpu :34348 [interfaces.py:monitor():144]
Failed to sample metric: Not Supported +2023-04-23 04:25:35,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:36,755 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:25:37,761 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:25:38,007 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:39,670 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:25:39,671 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:25:40,043 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:42,085 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:43,210 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:25:44,124 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:46,169 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:46,217 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:25:46,219 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:25:46,219 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:25:46,220 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:25:47,175 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:25:47,175 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:25:48,207 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:25:48,215 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:48,509 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:25:50,321 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:52,342 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:53,558 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:25:54,368 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:54,684 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:25:54,685 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:25:56,430 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:58,477 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:25:58,985 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:00,519 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:02,558 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:26:02,567 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:04,609 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:04,876 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:06,659 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:06,766 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:26:08,704 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:09,678 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:26:09,679 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:26:09,930 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:10,746 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:12,796 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:14,825 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:26:14,832 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:15,759 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:16,882 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:18,938 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:20,805 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:21,070 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:23,093 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:24,694 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:26:24,694 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:26:25,114 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:26,106 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:26:26,750 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:27,150 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:29,210 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:31,251 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:31,818 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report 
+2023-04-23 04:26:33,300 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:35,344 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:36,780 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:26:37,391 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:37,794 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:39,436 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:39,718 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:26:39,719 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:26:40,431 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:26:41,482 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:42,974 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:43,528 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:45,573 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:47,615 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:48,024 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:49,655 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:51,774 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:52,732 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:26:53,472 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:53,797 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:54,721 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:26:54,722 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:26:55,820 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:57,861 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:26:59,004 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:26:59,921 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:01,957 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:04,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:04,041 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:06,060 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:06,784 DEBUG SenderThread:34348 
[sender.py:send():375] send: stats +2023-04-23 04:27:07,054 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:27:08,111 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:09,725 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:27:09,726 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:27:09,984 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:10,173 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:12,238 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:14,281 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:15,027 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:16,335 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:18,363 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:27:18,372 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:20,270 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:20,421 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:22,550 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:24,570 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:24,743 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:27:24,743 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:27:26,009 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:26,580 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:28,603 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:30,676 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:31,014 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:32,727 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:27:32,729 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:34,802 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:36,067 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:36,789 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:27:36,850 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:38,894 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:39,752 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:27:39,753 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:27:40,938 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:42,018 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:42,993 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:45,025 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:27:45,035 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:47,074 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:47,861 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:49,108 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:51,151 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:52,904 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:53,285 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:54,765 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:27:54,765 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:27:55,310 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:55,582 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:27:55,585 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:27:55,585 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:27:55,586 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:27:56,271 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:27:57,313 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:27:57,332 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:27:58,823 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:27:59,360 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:01,409 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:03,465 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:03,865 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:05,511 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-23 04:28:06,792 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:28:07,572 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:09,067 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:09,622 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:09,762 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:28:09,762 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:28:10,625 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:28:11,674 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:13,731 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:15,051 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:15,775 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:17,822 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:19,866 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:20,098 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:21,913 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:22,910 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:28:24,011 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:24,775 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:28:24,776 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:28:26,028 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:26,035 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:28,057 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:30,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:31,082 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:32,143 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:34,194 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:36,237 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:36,801 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:28:36,802 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:37,229 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:28:38,267 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:39,781 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:28:39,782 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:28:40,318 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:42,053 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:42,359 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:44,414 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:46,478 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:47,112 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:48,548 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:49,543 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:28:50,592 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:52,631 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:52,856 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:54,736 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:54,782 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:28:54,783 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:28:56,755 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:28:58,050 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:28:58,784 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:00,850 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:02,882 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:29:02,890 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:03,772 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:04,927 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:06,804 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:29:06,981 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:08,834 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:09,028 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:09,785 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:29:09,786 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:29:11,071 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:13,115 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:14,481 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:15,150 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:29:15,159 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:17,203 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:19,246 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:19,512 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:21,292 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:23,339 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:24,783 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:29:24,784 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:29:25,034 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:25,458 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:27,426 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:29:27,486 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:29,504 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:30,281 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:31,518 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:33,562 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:35,332 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:35,597 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:36,815 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:29:37,632 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:39,673 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:39,787 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:29:39,787 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:29:41,050 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:41,703 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:29:41,711 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:43,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:45,808 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:46,084 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:47,856 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:49,897 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:51,140 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:51,964 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:52,956 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:29:54,009 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:54,801 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:29:54,801 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:29:56,114 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:29:57,071 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:29:58,137 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:00,161 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:02,132 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:02,180 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:04,230 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:04,634 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:30:04,636 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:30:04,637 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:30:04,638 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:30:05,224 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:05,224 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:30:06,262 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:06,272 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:06,830 DEBUG SenderThread:34348 [sender.py:send():375] send: stats 
+2023-04-23 04:30:07,273 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:07,844 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:08,316 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:09,802 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:30:09,802 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:30:10,374 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:12,439 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:13,053 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:14,490 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:16,532 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:18,082 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:18,584 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:19,587 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:20,624 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:22,685 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:23,190 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:24,735 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:24,815 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:30:24,816 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:30:26,870 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:28,891 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:29,104 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:30,940 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:32,997 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:34,000 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:34,994 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:35,060 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:36,837 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:30:37,106 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:39,157 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 04:30:39,808 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:30:39,808 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:30:40,061 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:41,198 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:43,236 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:45,276 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:45,290 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:45,683 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:47,322 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:49,367 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:50,731 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:51,406 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:53,454 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:54,825 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:30:54,826 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:30:55,498 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:56,083 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:30:57,566 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:30:57,602 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:30:59,622 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:01,479 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:01,644 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:03,662 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:05,697 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:06,526 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:06,843 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:31:07,740 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:09,788 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:09,839 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:31:09,840 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 
04:31:11,831 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:31:11,839 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:12,102 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:13,879 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:15,923 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:17,144 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:17,964 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:19,996 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:22,042 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:23,040 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:24,070 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:31:24,078 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:24,839 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:31:24,839 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:31:26,127 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:28,171 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:28,231 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:30,256 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:32,280 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:33,179 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:34,330 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:36,381 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:31:36,384 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:36,846 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:31:38,431 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:38,863 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:39,849 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:31:39,849 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:31:40,493 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:42,527 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-23 04:31:44,138 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:44,587 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:46,624 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:48,671 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:49,673 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:31:49,704 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:50,712 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:52,762 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:54,817 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:54,865 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:31:54,866 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:31:55,119 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:31:56,852 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:31:58,965 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:00,163 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:00,989 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:01,979 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:32:03,017 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:05,062 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:05,618 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:06,850 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:32:07,104 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:09,150 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:09,877 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:32:09,877 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:32:11,183 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:11,196 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:13,243 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:13,454 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:32:13,456 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:32:13,456 DEBUG SenderThread:34348 
[sender.py:send_request():402] send_request: summary_record +2023-04-23 04:32:13,456 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:32:14,248 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:32:14,248 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:32:15,277 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:32:15,285 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:16,291 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:32:16,696 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:17,334 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:19,389 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:21,422 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:21,746 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:23,468 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:24,894 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:32:24,894 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:32:25,523 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:26,984 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:27,576 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:28,570 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:32:29,705 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:31,724 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:32,024 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:33,743 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:35,774 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:36,855 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:32:37,817 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:37,867 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:32:39,881 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:32:39,901 DEBUG 
HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status
+2023-04-23 04:32:39,901 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status
+2023-04-23 04:32:41,908 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log
+2023-04-23 04:32:41,916 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported
+[... wandb debug-internal.log entries from 2023-04-23 04:32:43 to 04:42:59 elided: the same pattern repeats, with periodic HandlerThread status_report/stop_status requests, SenderThread stats and stop_status sends, dir_watcher file-modified events for output.log and wandb-summary.json, occasional partial_history/summary_record saves, and a recurring "ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" error ...]
+2023-04-23 04:42:59,777 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history
+2023-04-23 04:42:59,778 DEBUG SenderThread:34348 [sender.py:send():375] send: history
+2023-04-23 04:42:59,779 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record
+2023-04-23 04:42:59,780 INFO 
SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:42:59,895 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:42:59,904 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:00,908 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:43:01,933 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:43:01,939 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:03,013 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:03,981 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:06,024 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:07,027 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:43:08,065 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:08,067 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:10,135 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:10,211 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:43:10,212 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:43:12,199 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:13,324 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:14,328 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:15,293 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:43:16,350 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:18,359 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:18,369 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:20,386 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:22,406 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:23,412 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:24,446 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:25,226 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:43:25,226 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:43:26,487 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 04:43:27,487 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:43:28,534 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:29,235 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:30,591 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:32,641 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:34,266 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:34,691 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:36,730 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:37,028 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:43:38,774 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:39,632 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:40,243 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:43:40,244 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:43:40,813 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:43:40,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:42,872 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:44,980 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:45,552 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:47,000 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:49,020 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:50,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:51,049 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:53,079 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:43:53,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:55,135 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:55,251 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:43:55,251 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:43:56,509 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:43:57,188 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:43:59,224 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 04:44:01,266 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:01,556 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:03,325 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:05,381 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:07,033 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:44:07,035 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:07,415 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:44:07,423 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:09,479 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:10,265 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:44:10,266 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:44:11,527 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:12,544 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:13,573 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:15,687 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:17,633 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:17,703 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:19,690 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:44:19,728 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:21,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:23,093 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:23,783 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:25,259 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:44:25,259 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:44:25,827 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:27,862 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:28,527 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:29,921 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:31,979 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:33,950 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 
04:44:34,019 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:44:34,032 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:36,067 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:37,048 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:44:38,129 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:39,064 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:40,168 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:40,270 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:44:40,271 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:44:42,205 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:44,240 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:44,629 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:45,244 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:44:46,350 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:48,360 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:49,681 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:50,383 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:52,419 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:54,459 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:55,283 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:44:55,284 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:44:55,540 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:44:56,502 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:44:57,506 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:44:58,536 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:00,565 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:00,596 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:02,643 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:04,695 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:05,603 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: status_report +2023-04-23 04:45:06,733 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:07,059 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:45:08,784 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:09,231 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:45:09,232 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:45:09,233 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:45:09,233 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:45:09,782 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:45:09,783 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:45:10,275 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:45:10,276 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:45:10,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:11,543 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:11,824 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:45:12,884 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:14,946 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:16,579 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:17,045 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:19,064 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:21,083 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:21,605 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:23,135 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:24,132 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:45:25,176 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:25,285 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:45:25,286 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:45:27,205 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:27,561 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:29,246 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 04:45:31,284 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:32,612 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:33,326 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:35,373 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:37,063 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:45:37,400 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:45:37,407 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:38,075 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:39,446 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:40,276 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:45:40,277 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:45:41,483 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:43,524 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:43,604 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:45,564 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:47,702 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:49,593 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:49,689 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:45:49,712 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:51,734 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:53,781 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:54,638 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:45:55,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:45:55,288 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:45:55,830 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:57,874 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:45:59,920 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:00,590 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:01,961 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:03,997 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:46:04,005 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:06,053 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:06,531 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:07,065 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:46:08,100 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:10,147 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:10,308 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:46:10,309 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:46:11,575 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:12,210 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:14,243 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:16,272 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:46:16,280 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:17,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:18,385 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:20,404 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:22,400 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:22,419 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:24,449 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:25,321 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:46:25,321 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:46:26,496 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:27,954 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:28,531 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:46:28,540 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:30,599 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:32,629 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:32,995 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:34,675 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:36,747 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 04:46:37,079 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:46:38,084 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:38,818 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:40,335 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:46:40,336 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:46:40,849 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:41,845 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:46:42,905 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:43,611 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:44,934 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:46,977 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:48,650 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:49,079 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:51,103 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:53,113 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:53,814 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:46:54,099 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:46:55,136 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:55,347 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:46:55,348 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:46:57,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:59,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:46:59,632 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:01,270 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:03,299 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:04,686 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:05,342 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:07,084 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:47:07,387 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:08,389 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:47:09,422 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:09,794 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:10,342 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:47:10,343 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:47:11,466 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:13,524 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:15,566 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:15,633 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:17,620 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:18,404 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:47:18,406 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:47:18,406 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:47:18,407 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:47:18,626 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:47:19,728 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:20,697 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:47:20,712 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:21,755 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:23,809 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:25,342 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:47:25,343 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:47:25,858 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:26,600 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:27,902 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:29,943 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:31,634 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:32,003 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:34,032 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:47:34,039 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:36,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:37,070 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:37,086 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:47:38,125 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:40,162 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:40,355 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:47:40,356 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:47:42,206 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:42,657 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:44,236 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:46,273 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:47:46,281 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:47,974 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:48,319 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:50,419 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:52,432 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:53,031 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:47:54,452 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:55,370 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:47:55,371 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:47:56,502 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:58,549 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:47:58,853 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:00,604 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:48:00,616 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:02,665 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:03,896 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:04,698 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:06,750 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:07,086 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 
04:48:08,806 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:09,095 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:10,387 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:48:10,387 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:48:10,854 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:12,901 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:48:12,910 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:14,830 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:14,956 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:16,990 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:19,042 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:19,862 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:21,150 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:23,168 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:24,135 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:48:25,185 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:25,395 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:48:25,395 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:48:25,653 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:27,217 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:29,253 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:30,693 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:31,302 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:33,340 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:35,377 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:35,729 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:37,089 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:48:37,412 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:38,420 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:48:39,454 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-23 04:48:40,391 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:48:40,392 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:48:41,499 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:41,654 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:43,563 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:45,613 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:46,697 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:47,675 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:49,722 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:50,717 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:48:51,831 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:52,502 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:53,849 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:55,406 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:48:55,406 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:48:55,870 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:57,666 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:48:57,915 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:48:59,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:01,996 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:03,338 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:04,043 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:05,043 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:49:06,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:07,102 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:49:08,128 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:09,125 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:10,196 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:10,410 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:49:10,410 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:49:12,244 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:14,299 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:14,704 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:16,331 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:49:16,338 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:18,384 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:20,189 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:20,431 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:22,542 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:24,562 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:25,406 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:49:25,406 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:49:25,652 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:26,601 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:28,055 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 04:49:28,056 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 04:49:28,056 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 04:49:28,057 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 04:49:28,650 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 04:49:28,663 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:29,654 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:49:30,711 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:49:30,730 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:31,321 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:32,757 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:34,803 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:36,364 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:36,848 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:37,117 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 
04:49:38,909 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:40,403 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:49:40,404 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:49:40,959 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:41,555 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:42,980 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:49:42,988 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:45,031 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:46,597 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:47,066 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:49,124 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:51,163 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:51,654 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:53,274 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:55,285 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:55,419 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 04:49:55,420 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 04:49:56,680 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:49:57,286 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:49:57,304 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:49:59,335 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:50:01,376 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:50:01,727 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:50:03,420 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:50:05,462 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:50:06,779 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 04:50:07,126 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 04:50:07,504 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 04:50:08,501 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 04:50:09,549 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
05:00:29,525 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:00:29,607 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:00:29,615 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:31,649 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:33,687 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:34,575 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:00:35,732 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:37,282 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:00:37,824 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:39,833 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:40,251 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:00:40,925 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:00:40,925 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:00:41,836 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:00:41,846 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:43,894 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:45,928 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:46,216 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:00:47,968 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:50,010 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:51,270 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:00:52,060 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:54,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:55,940 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:00:55,941 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:00:56,133 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:00:56,135 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:00:57,200 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:00:58,179 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:00,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-23 05:01:02,254 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:02,277 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:04,319 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:06,387 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:07,289 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:01:07,290 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:08,476 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:01:08,510 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:10,532 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:10,934 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:01:10,935 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:01:12,544 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:13,198 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:14,585 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:16,634 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:18,240 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:18,678 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:20,711 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:01:20,720 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:22,755 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:24,041 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:24,792 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:25,946 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:01:25,946 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:01:26,832 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:28,879 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:29,231 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:30,928 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:32,968 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:33,966 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:01:34,864 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:35,016 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:37,062 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:37,302 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:01:39,184 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:40,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:40,964 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:01:40,965 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:01:41,209 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:43,230 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:45,253 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:45,695 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:46,243 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:01:47,301 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:49,342 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:50,756 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:51,398 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:53,431 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:55,471 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:55,975 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:01:55,975 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:01:56,228 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:01:57,515 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:01:58,514 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:01:59,546 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:01,591 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:01,592 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:03,637 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:05,688 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:06,636 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:07,316 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:02:07,721 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:09,825 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:10,984 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:02:10,984 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:02:11,842 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:12,231 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:12,812 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:02:13,862 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:15,885 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:17,295 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:17,903 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:19,941 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:21,984 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:22,348 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:23,233 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:02:23,235 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:02:23,236 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:02:23,237 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:02:24,020 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:02:24,035 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:25,029 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:02:25,983 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:02:25,983 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:02:26,125 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:28,183 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:28,238 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:30,223 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:32,277 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
05:02:33,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:34,324 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:36,357 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:37,321 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:02:38,332 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:38,396 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:02:38,404 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:40,517 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:40,989 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:02:40,989 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:02:42,533 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:44,280 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:44,556 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:46,581 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:48,621 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:49,312 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:50,654 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:02:50,662 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:52,720 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:54,763 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:54,907 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:02:55,999 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:02:55,999 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:02:56,818 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:02:58,862 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:00,266 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:00,927 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:02,976 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:05,010 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:03:05,025 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample 
metric: Not Supported +2023-04-23 05:03:05,747 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:07,059 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:07,336 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:03:09,105 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:10,996 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:03:10,997 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:03:11,219 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:11,237 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:13,247 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:15,273 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:16,671 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:17,284 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:03:17,298 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:19,329 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:21,370 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:21,713 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:23,402 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:25,451 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:26,004 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:03:26,005 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:03:27,266 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:27,494 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:29,548 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:30,543 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:03:31,594 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:32,518 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:33,632 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:35,667 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:37,341 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:03:37,713 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:38,345 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:39,759 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:41,018 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:03:41,019 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:03:41,868 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:42,850 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:03:43,381 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:43,891 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:45,913 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:47,952 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:48,417 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:49,996 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:52,048 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:53,477 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:03:54,091 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:56,017 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:03:56,017 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:03:56,150 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:57,147 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:03:58,185 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:03:59,296 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:00,223 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:02,277 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:04,319 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:04,336 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:06,366 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:07,352 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:04:08,406 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:09,376 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:09,409 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 
05:04:10,454 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:11,024 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:04:11,025 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:04:12,616 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:14,635 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:15,322 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:16,656 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:18,709 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:20,740 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:04:20,751 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:20,941 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:22,788 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:24,828 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:26,039 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:04:26,040 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:04:26,298 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:26,868 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:28,916 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:30,978 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:31,357 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:32,762 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:04:32,763 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:04:32,763 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:04:32,764 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:04:33,064 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:04:33,066 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:35,099 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:04:35,108 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:37,141 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:37,156 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:37,360 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:04:39,201 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:41,057 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:04:41,058 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:04:41,241 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:42,317 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:43,353 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:45,370 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:47,385 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:04:47,394 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:48,234 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:49,440 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:51,473 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:53,297 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:53,520 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:55,567 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:56,073 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:04:56,074 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:04:57,610 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:04:58,359 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:04:59,649 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:01,696 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:05:01,697 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:03,759 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:04,068 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:05,805 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:07,367 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:05:07,861 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:09,384 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:09,899 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
05:05:11,091 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:05:11,092 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:05:11,959 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:12,961 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:05:14,069 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:14,807 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:16,097 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:18,115 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:19,860 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:20,138 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:22,166 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:24,210 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:25,211 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:05:25,446 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:26,089 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:05:26,089 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:05:26,258 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:28,298 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:30,347 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:31,392 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:32,405 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:34,453 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:36,445 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:36,487 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:37,378 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:05:38,526 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:39,533 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:05:40,574 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:41,084 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:05:41,084 DEBUG SenderThread:34348 [sender.py:send_request():402] 
send_request: stop_status +2023-04-23 05:05:42,349 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:42,612 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:44,717 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:46,730 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:47,386 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:48,747 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:50,771 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:51,771 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:05:52,813 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:53,116 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:54,867 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:56,103 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:05:56,103 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:05:56,914 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:05:58,360 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:05:58,954 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:01,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:03,064 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:03,898 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:05,101 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:06:05,109 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:07,159 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:07,382 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:06:09,207 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:09,397 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:11,104 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:06:11,105 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:06:11,240 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:13,305 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:14,399 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:15,426 
ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:17,403 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:06:17,443 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:19,455 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:19,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:21,479 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:23,526 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:24,748 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:25,567 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:26,133 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:06:26,133 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:06:27,602 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:29,644 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:06:29,651 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:30,414 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:31,696 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:33,748 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:35,469 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:35,784 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:37,390 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:06:37,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:39,871 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:40,498 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:41,131 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:06:41,132 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:06:41,198 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:06:41,387 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:06:41,387 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:06:41,388 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:06:41,921 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:06:41,932 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:43,972 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:06:43,980 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:45,539 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:46,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:48,106 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:50,130 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:50,554 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:52,190 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:54,223 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:56,140 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:06:56,141 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:06:56,256 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:06:56,264 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:06:56,407 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:06:58,303 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:00,337 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:01,462 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:07:02,375 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:04,415 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:06,460 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:06,520 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:07:07,393 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:07:08,513 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:09,525 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:07:10,555 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:07:11,142 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:07:11,142 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:07:12,405 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
send: stats +2023-04-23 05:18:09,190 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:10,641 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:11,235 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:11,597 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:18:11,597 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:18:13,277 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:15,320 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:18:15,330 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:16,467 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:17,379 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:19,423 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:21,466 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:21,517 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:23,513 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:25,546 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:26,545 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:18:26,592 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:18:26,593 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:18:26,842 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:27,590 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:29,643 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:31,767 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:31,869 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:33,788 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:35,817 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:36,913 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:37,628 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:18:37,867 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:39,905 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:40,902 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:18:41,594 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:18:41,594 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:18:41,959 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:42,845 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:44,001 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:46,044 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:47,892 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:48,088 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:50,133 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:52,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:53,190 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:18:53,378 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:18:54,228 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:56,274 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:56,591 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:18:56,592 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:18:58,311 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:18:58,859 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:00,358 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:02,472 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:03,891 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:04,482 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:05,465 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:19:06,505 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:07,629 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:19:08,548 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:09,653 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:10,584 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:11,596 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:19:11,596 DEBUG SenderThread:34348 
[sender.py:send_request():402] send_request: stop_status +2023-04-23 05:19:12,631 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:14,677 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:14,859 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:16,732 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:18,775 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:19:18,776 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:20,709 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:20,832 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:22,879 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:24,931 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:25,751 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:26,602 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:19:26,603 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:19:26,981 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:29,011 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:29,294 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:19:29,296 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:19:29,297 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:19:29,298 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:19:30,013 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:19:31,057 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:19:31,070 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:31,558 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:33,215 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:35,251 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:36,599 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:37,271 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:37,642 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:19:39,290 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
05:19:41,322 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:41,624 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:19:41,625 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:19:41,881 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:43,382 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:45,410 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:19:45,422 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:47,455 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:47,667 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:49,495 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:51,543 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:52,714 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:53,586 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:55,635 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:56,631 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:19:56,632 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:19:57,679 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:19:57,680 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:19:57,896 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:19:59,729 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:01,766 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:02,937 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:03,872 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:05,897 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:07,655 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:20:07,913 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:08,886 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:08,887 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:20:09,936 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:11,654 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: stop_status +2023-04-23 05:20:11,655 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:20:11,989 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:13,927 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:14,030 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:16,073 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:18,117 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:18,969 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:20,164 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:21,167 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:20:22,209 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:24,255 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:24,557 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:26,301 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:26,652 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:20:26,652 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:20:28,346 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:29,933 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:30,391 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:32,447 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:34,550 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:35,196 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:35,510 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:20:36,565 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:37,669 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:20:38,582 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:40,619 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:40,704 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:41,670 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:20:41,671 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:20:42,663 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:44,707 ERROR 
gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:45,842 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:46,754 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:47,755 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:20:48,806 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:50,850 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:50,888 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:52,892 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:54,942 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:55,932 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:20:56,686 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:20:56,686 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:20:56,979 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:20:59,024 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:00,024 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:21:01,080 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:01,526 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:03,115 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:05,212 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:06,577 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:07,232 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:07,682 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:21:09,256 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:11,280 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:11,682 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:21:11,683 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:21:11,934 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:13,305 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:21:13,313 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:15,353 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-23 05:21:16,971 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:17,399 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:19,436 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:21,481 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:22,020 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:23,520 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:25,553 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:21:25,563 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:26,687 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:21:26,688 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:21:27,600 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:28,002 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:29,640 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:31,684 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:33,057 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:33,735 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:35,857 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:36,566 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:21:36,567 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:21:36,568 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:21:36,569 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:21:36,820 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:21:37,688 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:21:37,833 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:21:37,867 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:38,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:38,843 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:21:39,872 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log 
+2023-04-23 05:21:39,887 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:41,704 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:21:41,705 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:21:41,919 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:43,978 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:43,980 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:46,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:48,071 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:49,050 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:50,118 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:52,165 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:21:52,181 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:54,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:54,869 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:21:56,271 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:56,716 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:21:56,716 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:21:58,315 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:21:59,990 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:00,361 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:02,409 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:03,408 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:22:04,458 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:05,453 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:06,554 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:07,695 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:22:08,571 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:10,586 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:10,720 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:11,722 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:22:11,723 DEBUG SenderThread:34348 
[sender.py:send_request():402] send_request: stop_status +2023-04-23 05:22:12,613 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:14,664 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:16,048 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:16,712 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:17,712 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:22:18,748 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:20,789 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:21,111 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:22,833 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:24,868 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:26,714 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:22:26,714 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:22:26,915 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:26,955 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:28,960 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:29,963 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:22:31,022 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:32,682 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:33,068 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:35,111 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:37,226 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:37,707 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:37,708 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:22:39,245 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:41,265 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:41,724 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:22:41,725 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:22:42,271 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:22:42,992 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 
05:22:43,305 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:45,342 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:47,384 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:48,036 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:49,456 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:51,508 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:53,558 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:53,997 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:54,556 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:22:55,602 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:56,733 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:22:56,734 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:22:57,646 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:22:59,000 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:22:59,686 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:01,729 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:03,787 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:04,034 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:05,840 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:07,721 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:23:07,901 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:23:07,950 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:09,742 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:09,976 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:11,741 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:23:11,741 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:23:11,999 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:14,015 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:15,019 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:16,056 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:18,100 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:20,141 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:23:20,149 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:20,287 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:22,201 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:24,242 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:25,316 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:26,282 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:26,751 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:23:26,751 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:23:28,321 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:30,366 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:30,935 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:32,411 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:23:32,419 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:34,480 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:36,006 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:36,518 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:37,730 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:23:38,622 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:40,637 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:41,051 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:41,751 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:23:41,752 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:23:42,648 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:43,626 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:23:43,628 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:23:43,628 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:23:43,629 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:23:43,630 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json 
+2023-04-23 05:23:44,681 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:45,686 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:23:46,725 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:23:46,733 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:46,921 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:48,771 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:50,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:51,956 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:52,859 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:54,907 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:56,764 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:23:56,765 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:23:56,949 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:23:57,022 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:23:57,956 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:23:58,995 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:01,057 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:02,077 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:03,091 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:05,147 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:07,172 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:07,183 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:07,744 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:24:09,282 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:10,267 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:24:11,314 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:11,767 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:24:11,767 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:24:13,045 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: 
status_report +2023-04-23 05:24:13,333 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:15,359 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:17,394 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:18,102 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:19,447 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:21,490 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:23,141 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:23,537 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:24,552 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:24:25,574 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:26,767 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:24:26,767 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:24:27,612 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:29,032 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:29,674 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:31,708 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:33,750 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:34,066 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:35,784 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:36,782 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:24:37,745 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:24:37,823 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:39,761 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:39,944 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:41,782 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:24:41,782 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:24:41,970 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:43,996 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:45,051 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:24:46,026 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:24:48,069 ERROR gpu 
[wandb debug-internal.log excerpt, run-20230423_022503-g3y4djvd, 2023-04-23 05:24–05:35: repeated "ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported" entries from the GPU monitor, interleaved with periodic DEBUG status_report/stop_status handling, stats sends, output.log file-modified notices, and occasional history/summary_record saves of wandb-summary.json.]
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:35:35,598 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:35:36,538 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:37,921 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:35:38,581 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:40,630 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:40,951 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:35:42,106 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:35:42,107 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:35:42,671 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:44,711 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:46,198 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:35:46,744 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:47,749 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:35:48,790 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:50,832 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:51,247 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:35:52,882 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:55,002 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:56,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:35:57,027 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:35:57,108 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:35:57,108 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:35:59,045 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:00,002 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:36:01,067 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:01,797 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:03,116 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:05,158 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:06,846 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:07,201 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:07,935 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:36:09,245 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:11,289 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:12,127 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:36:12,127 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:36:12,374 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:13,317 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:36:13,326 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:15,365 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:17,407 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:17,409 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:19,470 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:21,524 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:22,456 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:23,562 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:24,156 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:36:24,158 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:36:24,158 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:36:24,158 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:36:24,558 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:36:24,559 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:36:25,627 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:36:25,663 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:27,128 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:36:27,129 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:36:27,671 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:28,390 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:29,699 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:31,738 ERROR gpu 
:34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:33,435 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:33,784 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:35,828 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:37,873 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:37,949 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:36:38,958 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:39,927 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:36:39,941 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:41,984 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:42,128 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:36:42,128 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:36:44,028 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:44,388 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:46,071 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:48,105 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:49,463 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:50,163 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:52,197 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:36:52,211 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:54,243 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:55,005 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:36:56,348 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:36:57,139 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:36:57,140 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:36:58,359 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:00,379 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:00,419 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:02,422 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:04,454 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log 
+2023-04-23 05:37:04,461 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:05,611 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:06,502 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:07,955 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:37:08,537 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:10,590 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:10,982 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:12,147 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:37:12,147 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:37:12,648 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:14,689 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:15,692 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:37:16,208 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:16,730 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:18,775 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:20,812 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:21,246 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:22,854 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:24,894 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:26,288 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:26,995 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:27,158 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:37:27,159 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:37:29,008 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:29,998 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:37:31,044 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:31,801 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:33,081 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:35,136 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:36,851 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:37,178 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:37,962 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:37:39,216 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:41,271 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:42,155 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:37:42,156 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:37:42,265 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:37:42,406 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:43,313 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:45,366 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:47,408 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:47,440 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:49,451 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:51,501 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:52,482 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:53,557 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:54,567 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:37:55,614 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:57,161 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:37:57,162 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:37:57,738 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:37:58,424 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:37:59,766 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:01,793 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:03,487 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:03,812 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:05,833 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:38:05,847 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:07,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:07,972 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:38:08,981 DEBUG 
HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:09,926 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:11,980 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:12,175 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:38:12,175 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:38:14,018 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:14,446 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:16,067 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:18,106 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:20,149 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:38:20,157 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:20,176 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:22,200 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:24,238 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:25,244 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:26,285 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:27,186 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:38:27,187 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:38:28,426 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:30,455 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:30,472 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:30,704 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:38:30,705 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:38:30,705 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:38:30,706 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:38:31,410 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:38:31,410 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:38:32,440 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:38:32,477 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not 
Supported +2023-04-23 05:38:34,511 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:36,066 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:36,568 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:37,978 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:38:38,633 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:40,687 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:41,614 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:42,183 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:38:42,184 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:38:42,747 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:44,788 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:46,825 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:38:46,837 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:47,040 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:48,876 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:50,920 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:52,074 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:52,968 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:55,004 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:57,044 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:38:57,188 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:38:57,189 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:38:57,444 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:38:58,051 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:38:59,167 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:01,196 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:02,484 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:03,222 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:05,267 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:07,316 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:07,535 DEBUG 
HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:07,992 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:39:09,353 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:10,360 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:39:11,411 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:12,183 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:39:12,184 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:39:13,459 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:13,461 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:15,510 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:17,569 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:18,491 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:19,610 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:21,666 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:23,713 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:23,766 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:24,719 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:39:25,762 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:27,196 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:39:27,197 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:39:27,810 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:29,462 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:29,918 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:31,935 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:33,957 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:35,313 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:36,006 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:36,999 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:39:37,992 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:39:38,051 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
05:39:40,093 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:41,020 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:42,126 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:42,207 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:39:42,207 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:39:44,166 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:46,206 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:46,509 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:48,245 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:39:48,253 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:50,289 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:51,902 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:52,348 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:54,389 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:56,435 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:39:57,213 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:39:57,214 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:39:57,471 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:39:58,472 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:00,570 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:40:00,617 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:02,497 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:02,634 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:04,662 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:06,689 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:07,549 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:08,008 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:40:08,734 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:10,769 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:12,236 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:40:12,237 DEBUG SenderThread:34348 
[sender.py:send_request():402] send_request: stop_status +2023-04-23 05:40:12,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:13,504 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:14,842 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:40:14,852 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:16,904 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:18,543 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:18,948 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:21,001 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:23,047 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:23,586 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:25,086 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:27,122 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:40:27,131 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:27,252 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:40:27,253 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:40:29,176 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:29,523 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:31,271 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:33,291 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:34,563 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:35,315 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:37,267 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:40:37,269 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:40:37,270 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:40:37,270 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:40:37,328 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:40:37,340 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:38,042 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:40:39,370 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:40:39,382 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:40,071 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:40,386 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:40:41,432 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:42,263 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:40:42,264 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:40:43,479 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:45,533 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:45,535 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:47,574 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:49,614 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:50,545 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:51,661 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:52,658 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:40:53,697 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:55,596 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:40:55,748 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:57,271 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:40:57,272 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:40:57,784 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:40:59,827 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:01,582 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:01,945 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:03,972 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:04,936 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:41:05,983 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:07,129 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:08,025 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:08,031 DEBUG SenderThread:34348 
[sender.py:send():375] send: stats +2023-04-23 05:41:10,065 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:12,120 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:12,287 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:41:12,287 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:41:12,542 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:14,152 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:16,192 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:17,190 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:41:17,707 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:18,236 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:20,276 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:22,313 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:22,763 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:24,355 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:26,388 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:27,303 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:41:27,303 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:41:28,251 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:28,429 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:29,426 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:41:30,463 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:32,579 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:33,307 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:34,590 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:36,617 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:38,043 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:41:38,638 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:39,052 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:40,669 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:42,319 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 
05:41:42,320 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:41:42,721 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:43,715 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:41:44,593 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:44,774 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:46,825 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:48,864 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:49,652 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:50,933 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:52,963 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:54,993 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:41:55,000 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:55,513 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:41:57,035 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:41:57,323 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:41:57,323 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:41:59,071 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:00,605 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:42:01,111 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:03,212 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:05,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:05,641 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:42:07,239 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:42:07,247 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:08,057 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:42:09,281 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:11,085 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:42:11,319 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:42:12,334 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:42:12,334 DEBUG SenderThread:34348 [sender.py:send_request():402] 
+[wandb debug-internal.log for run-20230423_022503-g3y4djvd (2023-04-23 05:42–05:52), truncated: repeated "Failed to sample metric: Not Supported" errors from the GPU monitor (interfaces.py:monitor():144), interleaved with periodic HandlerThread/SenderThread status_report, stop_status, stats, and history/summary_record messages, plus dir_watcher notifications for output.log and wandb-summary.json.]
05:52:27,033 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:27,744 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:52:27,745 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:52:28,040 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:52:29,077 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:31,131 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:32,036 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:52:33,179 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:35,218 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:37,082 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:52:37,266 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:38,259 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:52:39,319 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:41,382 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:42,177 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:52:42,383 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:52:42,758 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:52:42,758 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:52:43,428 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:45,486 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:47,595 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:48,053 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:52:49,605 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:51,628 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:53,509 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:52:53,665 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:54,670 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:52:55,713 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:57,755 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:52:57,775 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: stop_status +2023-04-23 05:52:57,775 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:52:59,041 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:52:59,795 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:01,828 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:03,876 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:04,088 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:05,920 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:53:05,921 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:07,977 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:08,261 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:53:09,269 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:10,023 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:12,073 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:12,793 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:53:12,794 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:53:14,116 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:15,041 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:16,158 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:17,668 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:53:17,669 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:53:17,670 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:53:17,670 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:53:18,267 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:53:18,294 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:19,265 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:53:20,274 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:53:20,306 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:20,925 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:22,324 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 05:53:24,369 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:25,957 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:26,402 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:27,816 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:53:27,816 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:53:28,450 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:30,490 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:31,908 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:32,520 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:53:32,528 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:34,576 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:36,634 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:36,958 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:38,276 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:53:38,663 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:40,714 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:42,185 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:42,759 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:42,813 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:53:42,813 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:53:44,802 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:53:44,814 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:46,859 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:47,567 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:48,969 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:50,995 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:52,615 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:53,019 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:55,039 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:53:57,081 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 
05:53:57,807 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:53:57,808 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:53:58,061 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:53:59,114 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:53:59,121 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:01,165 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:03,095 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:03,219 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:05,264 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:07,298 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:08,167 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:08,283 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:54:09,339 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:10,333 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:54:11,388 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:12,818 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:54:12,819 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:54:13,432 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:14,079 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:15,479 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:17,519 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:19,106 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:19,641 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:21,669 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:22,624 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:54:23,697 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:24,537 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:25,715 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:27,739 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:27,824 DEBUG HandlerThread:34348 [handler.py:handle_request():144] 
handle_request: stop_status +2023-04-23 05:54:27,825 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:54:29,781 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:30,086 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:31,821 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:33,865 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:34,864 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:54:35,097 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:35,911 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:37,963 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:38,295 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:54:39,997 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:40,320 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:42,057 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:42,829 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:54:42,829 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:54:44,098 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:46,123 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:46,153 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:48,205 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:49,200 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:54:50,309 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:51,707 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:52,328 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:54,346 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:56,391 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:54:56,768 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:54:57,835 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:54:57,836 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:54:58,438 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:00,483 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:01,479 INFO 
Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:55:02,318 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:02,530 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:04,576 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:06,629 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:07,373 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:08,305 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:55:08,674 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:10,717 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:12,751 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:55:12,761 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:12,841 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:55:12,842 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:55:13,085 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:14,803 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:16,840 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:18,125 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:18,880 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:20,985 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:23,000 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:23,184 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:24,607 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:55:24,609 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:55:24,609 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:55:24,610 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:55:25,027 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:55:25,040 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:26,033 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:55:27,093 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:55:27,107 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:27,843 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:55:27,843 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:55:29,089 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:29,148 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:31,192 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:33,227 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:34,132 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:35,277 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:37,317 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:38,316 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:55:39,322 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:39,362 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:55:39,368 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:41,415 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:42,859 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:55:42,860 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:55:43,470 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:45,136 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:45,508 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:47,554 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:49,585 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:50,190 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:51,658 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:55:51,694 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:53,710 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:55,297 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:55:55,724 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:57,785 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:55:57,869 DEBUG HandlerThread:34348 
[handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:55:57,869 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:55:59,820 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:01,139 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:01,856 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:03,904 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:04,902 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:56:05,946 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:06,945 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:07,996 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:08,330 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:56:10,043 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:12,095 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:12,254 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:12,880 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:56:12,880 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:56:14,137 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:16,186 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:17,183 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:56:17,453 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:18,232 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:20,261 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:22,369 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:22,511 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:24,382 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:26,396 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:27,886 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:56:27,886 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:56:28,136 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:28,423 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:29,419 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:56:30,460 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:32,499 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:33,190 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:34,538 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:36,568 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:38,229 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:38,340 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:56:38,617 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:40,673 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:41,667 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:56:42,712 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:42,894 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:56:42,894 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:56:44,139 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:44,766 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:46,799 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:48,844 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:49,172 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:50,898 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:53,017 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:54,210 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:56:55,027 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:56,006 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:56:57,054 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:56:57,893 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:56:57,893 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:56:59,109 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:00,157 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:01,152 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:03,196 ERROR gpu :34348 [interfaces.py:monitor():144] 
Failed to sample metric: Not Supported +2023-04-23 05:57:05,236 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:05,247 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:07,282 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:57:07,291 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:08,355 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:57:09,326 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:10,371 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:11,366 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:12,904 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:57:12,905 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:57:13,407 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:15,450 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:16,182 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:17,494 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:19,538 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:57:19,539 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:21,326 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:21,584 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:23,701 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:25,721 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:26,371 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:27,740 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:27,921 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:57:27,922 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:57:29,764 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:30,922 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:57:30,924 DEBUG SenderThread:34348 [sender.py:send():375] send: history +2023-04-23 05:57:30,924 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: summary_record +2023-04-23 05:57:30,925 INFO SenderThread:34348 [sender.py:_save_file():1378] saving file wandb-summary.json with policy end +2023-04-23 05:57:31,804 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: 
E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:57:31,804 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json +2023-04-23 05:57:31,813 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:32,218 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:33,846 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:57:33,855 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:35,909 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:37,262 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:37,958 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:38,369 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:57:40,009 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:42,056 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:42,312 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:42,929 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:57:42,930 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:57:44,099 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:46,137 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:57:46,145 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:48,211 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:48,222 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:50,282 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:52,339 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:53,252 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:57:54,446 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:56,457 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:57:57,424 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:57:57,942 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:57:57,942 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:57:58,494 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported 
+2023-04-23 05:57:59,207 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:00,560 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:02,602 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:04,264 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:04,642 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:06,681 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:08,375 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:58:08,725 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:09,721 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:58:10,224 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:10,762 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:12,800 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:12,931 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:58:12,931 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:58:14,848 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:16,203 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:16,893 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:18,938 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:20,993 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:21,245 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:23,046 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:24,041 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:58:25,168 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:26,897 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:27,187 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:27,938 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:58:27,938 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:58:29,211 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:31,242 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:32,241 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:33,283 ERROR gpu :34348 
[interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:35,343 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:36,338 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:58:37,390 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:37,459 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:38,377 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:58:39,433 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:41,483 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:42,934 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:58:42,935 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:58:43,190 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:43,541 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:45,573 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:47,615 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:58:47,623 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:49,022 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:49,664 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:51,712 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:53,750 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:54,068 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:55,861 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:57,879 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:58:57,950 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:58:57,950 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:58:59,198 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:58:59,904 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:01,906 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:59:01,927 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:03,954 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:04,620 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report 
+2023-04-23 05:59:05,989 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:08,044 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:08,387 DEBUG SenderThread:34348 [sender.py:send():375] send: stats +2023-04-23 05:59:10,093 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:10,411 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:59:12,147 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:12,950 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:59:12,950 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:59:14,185 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:59:14,194 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:16,233 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:59:16,236 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:18,298 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:20,330 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:21,271 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:59:22,375 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:24,422 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:26,493 INFO Thread-16 :34348 [dir_watcher.py:_on_file_modified():295] file/dir modified: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log +2023-04-23 05:59:26,530 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:26,879 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:59:27,952 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: stop_status +2023-04-23 05:59:27,953 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: stop_status +2023-04-23 05:59:28,558 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:30,593 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:32,240 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:59:32,635 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:34,679 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:36,722 ERROR gpu :34348 [interfaces.py:monitor():144] Failed to sample metric: Not Supported +2023-04-23 05:59:37,290 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: status_report +2023-04-23 05:59:37,547 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: partial_history +2023-04-23 05:59:37,548 DEBUG 
+2023-04-23 05:59:38,188 DEBUG SenderThread:34348 [sender.py:send():375] send: exit
+2023-04-23 05:59:38,188 INFO SenderThread:34348 [sender.py:send_exit():598] handling exit code: 0
+2023-04-23 05:59:38,188 INFO SenderThread:34348 [sender.py:send_exit():600] handling runtime: 12873
+2023-04-23 05:59:38,193 INFO HandlerThread:34348 [system_monitor.py:finish():190] Stopping system monitor
+2023-04-23 05:59:38,194 INFO HandlerThread:34348 [interfaces.py:finish():202] Joined cpu monitor
+2023-04-23 05:59:38,194 INFO HandlerThread:34348 [interfaces.py:finish():202] Joined disk monitor
+2023-04-23 05:59:38,256 INFO HandlerThread:34348 [interfaces.py:finish():202] Joined gpu monitor
+2023-04-23 05:59:38,257 INFO HandlerThread:34348 [interfaces.py:finish():202] Joined memory monitor
+2023-04-23 05:59:38,257 INFO HandlerThread:34348 [interfaces.py:finish():202] Joined network monitor
+[05:59:38 to 05:59:44: defer-state handshake entries (handle defer / sender defer, states 0 through 9), summary and stats sends, status_report, poll_exit and keepalive requests, config.yaml, output.log and wandb-summary.json modification notices, send: artifact, and staging-artifact upload entries elided]
+2023-04-23 05:59:44,727 INFO SenderThread:34348 [sender.py:send_artifact():1474] sent artifact job-https___github.com_THUDM_ChatGLM-6B_ptuning_main.py - {'id': 'QXJ0aWZhY3Q6NDMxMDA2NTAw', 'digest': 'd16240778017459800c83cd904d36ea4', 'state': 'PENDING', 'aliases': [], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjYzNDcyMzc3', 'latestArtifact': {'id': 'QXJ0aWZhY3Q6NDI5OTE5NTc5', 'versionIndex': 5}}, 'version': 'latest'}
+2023-04-23 05:59:44,727 INFO SenderThread:34348 [dir_watcher.py:finish():365] shutting down directory watcher
+[05:59:44: directory-watcher scan save of config.yaml, output.log, requirements.txt, wandb-metadata.json and wandb-summary.json, and defer state 10, elided]
+2023-04-23 05:59:44,783 INFO SenderThread:34348 [file_pusher.py:finish():168] shutting down file pusher
+2023-04-23 05:59:45,501 INFO wandb-upload_0:34348 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\config.yaml
+2023-04-23 05:59:46,044 INFO wandb-upload_3:34348 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\wandb-summary.json
+2023-04-23 05:59:46,136 INFO wandb-upload_2:34348 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\requirements.txt
+2023-04-23 05:59:46,313 INFO wandb-upload_1:34348 [upload_job.py:push():137] Uploaded file E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\files\output.log
+2023-04-23 05:59:46,520 INFO SenderThread:34348 [file_pusher.py:join():173] waiting for file pusher
+[defer states 11 and 12 elided]
+2023-04-23 05:59:47,695 INFO
SenderThread:34348 [sender.py:transition_state():626] send defer: 13 +2023-04-23 05:59:47,697 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: defer +2023-04-23 05:59:47,697 INFO HandlerThread:34348 [handler.py:handle_request_defer():170] handle defer: 13 +2023-04-23 05:59:47,697 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: defer +2023-04-23 05:59:47,697 INFO SenderThread:34348 [sender.py:send_request_defer():622] handle sender defer: 13 +2023-04-23 05:59:47,697 INFO SenderThread:34348 [sender.py:transition_state():626] send defer: 14 +2023-04-23 05:59:47,698 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: defer +2023-04-23 05:59:47,698 INFO HandlerThread:34348 [handler.py:handle_request_defer():170] handle defer: 14 +2023-04-23 05:59:47,698 DEBUG SenderThread:34348 [sender.py:send():375] send: final +2023-04-23 05:59:47,698 DEBUG SenderThread:34348 [sender.py:send():375] send: footer +2023-04-23 05:59:47,698 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: defer +2023-04-23 05:59:47,698 INFO SenderThread:34348 [sender.py:send_request_defer():622] handle sender defer: 14 +2023-04-23 05:59:47,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: poll_exit +2023-04-23 05:59:47,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: server_info +2023-04-23 05:59:47,699 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: poll_exit +2023-04-23 05:59:47,699 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: get_summary +2023-04-23 05:59:47,699 DEBUG SenderThread:34348 [sender.py:send_request():402] send_request: server_info +2023-04-23 05:59:47,700 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: sampled_history +2023-04-23 05:59:47,945 INFO MainThread:34348 [wandb_run.py:_footer_history_summary_info():3476] rendering history +2023-04-23 05:59:47,959 INFO MainThread:34348 [wandb_run.py:_footer_history_summary_info():3508] rendering summary +2023-04-23 05:59:47,964 INFO MainThread:34348 [wandb_run.py:_footer_sync_info():3434] logging synced files +2023-04-23 05:59:47,966 DEBUG HandlerThread:34348 [handler.py:handle_request():144] handle_request: shutdown +2023-04-23 05:59:47,966 INFO HandlerThread:34348 [handler.py:finish():845] shutting down handler +2023-04-23 05:59:48,713 INFO WriterThread:34348 [datastore.py:close():298] close: E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\run-g3y4djvd.wandb +2023-04-23 05:59:48,950 INFO SenderThread:34348 [sender.py:finish():1550] shutting down sender +2023-04-23 05:59:48,950 INFO SenderThread:34348 [file_pusher.py:finish():168] shutting down file pusher +2023-04-23 05:59:48,950 INFO SenderThread:34348 [file_pusher.py:join():173] waiting for file pusher diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/logs/debug.log b/ptuning/wandb/run-20230423_022503-g3y4djvd/logs/debug.log new file mode 100644 index 0000000000000000000000000000000000000000..3c25b9f26095ee5769d63b1cd7fe735698b36eb3 --- /dev/null +++ b/ptuning/wandb/run-20230423_022503-g3y4djvd/logs/debug.log @@ -0,0 +1,28 @@ +2023-04-23 02:25:03,128 INFO MainThread:34748 [wandb_setup.py:_flush():76] Configure stats pid to 34748 +2023-04-23 02:25:03,128 INFO MainThread:34748 [wandb_setup.py:_flush():76] Loading settings from C:\Users\Lenovo\.config\wandb\settings +2023-04-23 02:25:03,128 INFO MainThread:34748 [wandb_setup.py:_flush():76] Loading settings 
from E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\settings +2023-04-23 02:25:03,128 INFO MainThread:34748 [wandb_setup.py:_flush():76] Loading settings from environment variables: {} +2023-04-23 02:25:03,128 INFO MainThread:34748 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False} +2023-04-23 02:25:03,128 INFO MainThread:34748 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program_relpath': 'ptuning\\main.py', 'program': 'main.py'} +2023-04-23 02:25:03,129 INFO MainThread:34748 [wandb_init.py:_log_setup():507] Logging user logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\logs\debug.log +2023-04-23 02:25:03,129 INFO MainThread:34748 [wandb_init.py:_log_setup():508] Logging internal logs to E:\Documents\Desktop\ChatGLM-6B\ptuning\wandb\run-20230423_022503-g3y4djvd\logs\debug-internal.log +2023-04-23 02:25:03,129 INFO MainThread:34748 [wandb_init.py:init():547] calling init triggers +2023-04-23 02:25:03,129 INFO MainThread:34748 [wandb_init.py:init():554] wandb.init called with sweep_config: {} +config: {} +2023-04-23 02:25:03,129 INFO MainThread:34748 [wandb_init.py:init():595] starting backend +2023-04-23 02:25:03,129 INFO MainThread:34748 [wandb_init.py:init():599] setting up manager +2023-04-23 02:25:03,132 INFO MainThread:34748 [backend.py:_multiprocessing_setup():106] multiprocessing start_methods=spawn, using: spawn +2023-04-23 02:25:03,150 INFO MainThread:34748 [wandb_init.py:init():605] backend started and connected +2023-04-23 02:25:03,151 INFO MainThread:34748 [wandb_init.py:init():695] updated telemetry +2023-04-23 02:25:03,216 INFO MainThread:34748 [wandb_init.py:init():732] communicating run to backend with 60.0 second timeout +2023-04-23 02:25:03,918 INFO MainThread:34748 [wandb_run.py:_on_init():2176] communicating current version +2023-04-23 02:25:04,560 INFO MainThread:34748 [wandb_run.py:_on_init():2185] got version response upgrade_message: "wandb version 0.15.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade" + +2023-04-23 02:25:04,560 INFO MainThread:34748 [wandb_init.py:init():782] starting run threads in backend +2023-04-23 02:25:04,820 INFO MainThread:34748 [wandb_run.py:_console_start():2157] atexit reg +2023-04-23 02:25:04,820 INFO MainThread:34748 [wandb_run.py:_redirect():2012] redirect: SettingsConsole.WRAP_RAW +2023-04-23 02:25:04,820 INFO MainThread:34748 [wandb_run.py:_redirect():2077] Wrapping output streams. +2023-04-23 02:25:04,820 INFO MainThread:34748 [wandb_run.py:_redirect():2102] Redirects installed. 
+2023-04-23 02:25:04,821 INFO MainThread:34748 [wandb_init.py:init():824] run started, returning control to user process +2023-04-23 02:25:04,823 INFO MainThread:34748 [wandb_run.py:_config_callback():1285] config_cb None None {'num_layers': 28, 'vocab_size': 130528, 'hidden_size': 4096, 'num_attention_heads': 32, 'max_sequence_length': 2048, 'layernorm_epsilon': 1e-05, 'inner_hidden_size': 16384, 'use_cache': True, 'bos_token_id': 130004, 'eos_token_id': 130005, 'pad_token_id': 3, 'mask_token_id': 130000, 'gmask_token_id': 130001, 'position_encoding_2d': True, 'quantization_bit': 4, 'quantization_embeddings': False, 'pre_seq_len': 128, 'prefix_projection': False, 'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float16', 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['ChatGLMModel'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': '..\\models\\chatglm-6b-int4', 'transformers_version': '4.27.1', 'auto_map': {'AutoConfig': 'configuration_chatglm.ChatGLMConfig', 'AutoModel': 'modeling_chatglm.ChatGLMForConditionalGeneration', 'AutoModelForSeq2SeqLM': 'modeling_chatglm.ChatGLMForConditionalGeneration'}, 'model_type': 'chatglm', 'output_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': False, 'do_predict': False, 'evaluation_strategy': 'no', 'prediction_loss_only': False, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 16, 'eval_accumulation_steps': 'None', 'eval_delay': 0, 'learning_rate': 0.02, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 3.0, 'max_steps': 1000, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 0, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': 'output\\adgen-chatglm-6b-pt-128-2e-2\\runs\\Apr23_02-24-50_LAPTOP-U8KCJD82', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 10, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 10, 'save_total_limit': 'None', 'save_on_each_node': False, 'no_cuda': False, 'use_mps_device': False, 'seed': 42, 'data_seed': 'None', 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': False, 'fp16_opt_level': 'O1', 'half_precision_backend': 
'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 'None', 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'output\\adgen-chatglm-6b-pt-128-2e-2', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'fsdp': '[]', 'fsdp_min_num_params': 0, 'fsdp_config': "{'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False}", 'fsdp_transformer_layer_cls_to_wrap': 'None', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'optim_args': 'None', 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': "['tensorboard', 'wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '', 'hub_private_repo': False, 'gradient_checkpointing': False, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': 'None', 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': 'None', 'torch_compile_mode': 'None', 'sortish_sampler': False, 'predict_with_generate': True, 'generation_max_length': 64, 'generation_num_beams': 'None', 'train_batch_size': 1, 'eval_batch_size': 1} +2023-04-23 05:59:49,831 WARNING MsgRouterThr:34748 [router.py:message_loop():77] message_loop has been closed diff --git a/ptuning/wandb/run-20230423_022503-g3y4djvd/run-g3y4djvd.wandb b/ptuning/wandb/run-20230423_022503-g3y4djvd/run-g3y4djvd.wandb new file mode 100644 index 0000000000000000000000000000000000000000..3eff76d15b42417bcdd2f272589aee43bd79fbba Binary files /dev/null and b/ptuning/wandb/run-20230423_022503-g3y4djvd/run-g3y4djvd.wandb differ