commit with good tokenizer
- README.md +103 -0
- added_tokens.json +1 -0
- all_results.json +14 -0
- config.json +107 -0
- eval.py +153 -0
- eval_results.json +9 -0
- preprocessor_config.json +9 -0
- pytorch_model.bin +3 -0
- run.sh +41 -0
- run_speech_recognition_ctc.py +748 -0
- special_tokens_map.json +1 -0
- tokenizer_config.json +1 -0
- train_results.json +8 -0
- trainer_state.json +1096 -0
- training_args.bin +3 -0
- vocab.json +1 -0
- wandb/debug-internal.log +1 -0
- wandb/debug.log +1 -0
- wandb/latest-run +1 -0
- wandb/run-20220129_131141-h6nhqm30/files/conda-environment.yaml +0 -0
- wandb/run-20220129_131141-h6nhqm30/files/config.yaml +0 -0
- wandb/run-20220129_131141-h6nhqm30/files/output.log +9003 -0
- wandb/run-20220129_131141-h6nhqm30/files/requirements.txt +180 -0
- wandb/run-20220129_131141-h6nhqm30/files/wandb-metadata.json +64 -0
- wandb/run-20220129_131141-h6nhqm30/files/wandb-summary.json +0 -0
- wandb/run-20220129_131141-h6nhqm30/logs/debug-internal.log +0 -0
- wandb/run-20220129_131141-h6nhqm30/logs/debug.log +24 -0
- wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb +3 -0
- wandb/run-20220129_215451-1vipdbow/files/conda-environment.yaml +0 -0
- wandb/run-20220129_215451-1vipdbow/files/config.yaml +0 -0
- wandb/run-20220129_215451-1vipdbow/files/output.log +0 -0
- wandb/run-20220129_215451-1vipdbow/files/requirements.txt +180 -0
- wandb/run-20220129_215451-1vipdbow/files/wandb-metadata.json +65 -0
- wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json +0 -0
- wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log +0 -0
- wandb/run-20220129_215451-1vipdbow/logs/debug.log +24 -0
- wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb +3 -0
README.md
ADDED
@@ -0,0 +1,103 @@
---
language:
- fr
license: apache-2.0
tags:
- automatic-speech-recognition
- mozilla-foundation/common_voice_8_0
- generated_from_trainer
- robust-speech-event
datasets:
- common_voice
model-index:
- name: xls-r-300m-fr
  results:
  - task:
      name: Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Common Voice 8.0 fr
      type: mozilla-foundation/common_voice_8_0
      args: fr
    metrics:
    - name: Test WER
      type: wer
      value: 36.81
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# xls-r-300m-fr

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - FR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2388
- Wer: 0.3681

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 2.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.3748        | 0.07  | 500   | 3.8784          | 1.0    |
| 2.8068        | 0.14  | 1000  | 2.8289          | 0.9826 |
| 1.6698        | 0.22  | 1500  | 0.8811          | 0.7127 |
| 1.3488        | 0.29  | 2000  | 0.5166          | 0.5369 |
| 1.2239        | 0.36  | 2500  | 0.4105          | 0.4741 |
| 1.1537        | 0.43  | 3000  | 0.3585          | 0.4448 |
| 1.1184        | 0.51  | 3500  | 0.3336          | 0.4292 |
| 1.0968        | 0.58  | 4000  | 0.3195          | 0.4180 |
| 1.0737        | 0.65  | 4500  | 0.3075          | 0.4141 |
| 1.0677        | 0.72  | 5000  | 0.3015          | 0.4089 |
| 1.0462        | 0.8   | 5500  | 0.2971          | 0.4077 |
| 1.0392        | 0.87  | 6000  | 0.2870          | 0.3997 |
| 1.0178        | 0.94  | 6500  | 0.2805          | 0.3963 |
| 0.992         | 1.01  | 7000  | 0.2748          | 0.3935 |
| 1.0197        | 1.09  | 7500  | 0.2691          | 0.3884 |
| 1.0056        | 1.16  | 8000  | 0.2682          | 0.3889 |
| 0.9826        | 1.23  | 8500  | 0.2647          | 0.3868 |
| 0.9815        | 1.3   | 9000  | 0.2603          | 0.3832 |
| 0.9717        | 1.37  | 9500  | 0.2561          | 0.3807 |
| 0.9605        | 1.45  | 10000 | 0.2523          | 0.3783 |
| 0.96          | 1.52  | 10500 | 0.2494          | 0.3788 |
| 0.9442        | 1.59  | 11000 | 0.2478          | 0.3760 |
| 0.9564        | 1.66  | 11500 | 0.2454          | 0.3733 |
| 0.9436        | 1.74  | 12000 | 0.2439          | 0.3747 |
| 0.938         | 1.81  | 12500 | 0.2411          | 0.3716 |
| 0.9353        | 1.88  | 13000 | 0.2397          | 0.3698 |
| 0.9271        | 1.95  | 13500 | 0.2388          | 0.3681 |

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
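As a usage note (not part of the auto-generated card above): a minimal inference sketch, assuming a local clone of this repository is the working directory and that `sample_fr.wav` is a hypothetical 16 kHz French recording.

```python
# Minimal sketch: transcribe one audio file with this fine-tuned checkpoint.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="./")  # local clone; a Hub repo id also works
print(asr("sample_fr.wav")["text"])  # "sample_fr.wav" is a hypothetical example file
```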
added_tokens.json
ADDED
@@ -0,0 +1 @@
{}
all_results.json
ADDED
@@ -0,0 +1,14 @@
{
    "epoch": 2.0,
    "eval_loss": 0.23875188827514648,
    "eval_runtime": 294.1776,
    "eval_samples": 5792,
    "eval_samples_per_second": 19.689,
    "eval_steps_per_second": 0.309,
    "eval_wer": 0.3680797679950471,
    "train_loss": 1.442369053426242,
    "train_runtime": 53680.5392,
    "train_samples": 442265,
    "train_samples_per_second": 16.478,
    "train_steps_per_second": 0.257
}
config.json
ADDED
@@ -0,0 +1,107 @@
{
  "_name_or_path": "facebook/wav2vec2-xls-r-300m",
  "activation_dropout": 0.05,
  "adapter_kernel_size": 3,
  "adapter_stride": 2,
  "add_adapter": false,
  "apply_spec_augment": true,
  "architectures": [
    "Wav2Vec2ForCTC"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "classifier_proj_size": 256,
  "codevector_dim": 768,
  "contrastive_logits_temperature": 0.1,
  "conv_bias": true,
  "conv_dim": [
    512,
    512,
    512,
    512,
    512,
    512,
    512
  ],
  "conv_kernel": [
    10,
    3,
    3,
    3,
    3,
    2,
    2
  ],
  "conv_stride": [
    5,
    2,
    2,
    2,
    2,
    2,
    2
  ],
  "ctc_loss_reduction": "mean",
  "ctc_zero_infinity": false,
  "diversity_loss_weight": 0.1,
  "do_stable_layer_norm": true,
  "eos_token_id": 2,
  "feat_extract_activation": "gelu",
  "feat_extract_dropout": 0.0,
  "feat_extract_norm": "layer",
  "feat_proj_dropout": 0.0,
  "feat_quantizer_dropout": 0.0,
  "final_dropout": 0.0,
  "hidden_act": "gelu",
  "hidden_dropout": 0.0,
  "hidden_size": 1024,
  "initializer_range": 0.02,
  "intermediate_size": 4096,
  "layer_norm_eps": 1e-05,
  "layerdrop": 0.0,
  "mask_feature_length": 10,
  "mask_feature_min_masks": 0,
  "mask_feature_prob": 0.4,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
  "mask_time_prob": 0.75,
  "model_type": "wav2vec2",
  "num_adapter_layers": 3,
  "num_attention_heads": 16,
  "num_codevector_groups": 2,
  "num_codevectors_per_group": 320,
  "num_conv_pos_embedding_groups": 16,
  "num_conv_pos_embeddings": 128,
  "num_feat_extract_layers": 7,
  "num_hidden_layers": 24,
  "num_negatives": 100,
  "output_hidden_size": 1024,
  "pad_token_id": 216,
  "proj_codevector_dim": 768,
  "tdnn_dilation": [
    1,
    2,
    3,
    1,
    1
  ],
  "tdnn_dim": [
    512,
    512,
    512,
    512,
    1500
  ],
  "tdnn_kernel": [
    5,
    3,
    3,
    1,
    1
  ],
  "torch_dtype": "float32",
  "transformers_version": "4.17.0.dev0",
  "use_weighted_layer_sum": false,
  "vocab_size": 218,
  "xvector_output_dim": 512
}
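One mechanically useful reading of this config: the seven feature-extractor convolutions downsample the raw waveform by the product of `conv_stride`, 5·2·2·2·2·2·2 = 320, so at the 16 kHz input rate each CTC frame covers 20 ms. A small sketch of that arithmetic, with values copied from the files above:

```python
# Sketch: frame rate implied by the conv_stride values in config.json.
from functools import reduce
from operator import mul

conv_stride = [5, 2, 2, 2, 2, 2, 2]           # "conv_stride" in config.json
samples_per_frame = reduce(mul, conv_stride)  # 320 samples per output frame
sampling_rate = 16_000                        # "sampling_rate" in preprocessor_config.json
print(samples_per_frame)                      # 320, i.e. 20 ms per frame
print(sampling_rate / samples_per_frame)      # 50.0 CTC frames per second
```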
eval.py
ADDED
@@ -0,0 +1,153 @@
#!/usr/bin/env python3
import argparse
import re
import unicodedata
from typing import Dict

import torch
from datasets import Audio, Dataset, load_dataset, load_metric

from transformers import AutoFeatureExtractor, pipeline


def log_results(result: Dataset, args: Dict[str, str]):
    """DO NOT CHANGE. This function computes and logs the result metrics."""

    log_outputs = args.log_outputs
    dataset_id = "_".join(args.dataset.split("/") + [args.config, args.split])

    # load metrics
    wer = load_metric("wer")
    cer = load_metric("cer")

    # compute metrics
    wer_result = wer.compute(references=result["target"], predictions=result["prediction"])
    cer_result = cer.compute(references=result["target"], predictions=result["prediction"])

    # print & log results
    result_str = f"WER: {wer_result}\n" f"CER: {cer_result}"
    print(result_str)

    with open(f"{dataset_id}_eval_results.txt", "w") as f:
        f.write(result_str)

    # log all results in a text file. Possibly interesting for analysis
    if log_outputs is not None:
        pred_file = f"log_{dataset_id}_predictions.txt"
        target_file = f"log_{dataset_id}_targets.txt"

        with open(pred_file, "w") as p, open(target_file, "w") as t:

            # mapping function to write output
            def write_to_file(batch, i):
                p.write(f"{i}" + "\n")
                p.write(batch["prediction"] + "\n")
                t.write(f"{i}" + "\n")
                t.write(batch["target"] + "\n")

            result.map(write_to_file, with_indices=True)


chars_to_remove_regex = r'[\,\?\.\!\-\_\;\:\"\“\%\‘\”\�\^]'


def remove_accents(text):
    nfkd_form = unicodedata.normalize("NFKD", text)
    return "".join([c for c in nfkd_form if not unicodedata.combining(c)])


def remove_special_characters(text):
    # spell out decimal separators first (e.g. "3,5" -> "3 virgule 5"), before the
    # general punctuation strip below removes "," and "." altogether
    text = re.sub(r"([0-9]+)([,.])([0-9])", r"\1 virgule \3", text)
    text = re.sub(chars_to_remove_regex, "", text).lower()
    # shield the cedilla behind a placeholder so remove_accents() below does not turn "ç" into "c"
    text = re.sub("ç", "[cedille]", text)
    text = re.sub("&", "et", text)
    # note: "%" is already stripped by chars_to_remove_regex above, so this substitution is a no-op
    text = re.sub("%", " pourcents", text)
    text = re.sub(r"\$", "dollar", text)
    text = re.sub("£", "livre", text)
    text = re.sub("€", "euro", text)
    text = remove_accents(text)
    text = re.sub(r"\[cedille\]", "ç", text) + " "
    return text


def normalize_text(text: str) -> str:
    text = remove_special_characters(text)

    # In addition, we can normalize the target text, e.g. removing new-line characters etc...
    # note that order is important here!
    token_sequences_to_ignore = ["\n\n", "\n", "   ", "  "]

    for t in token_sequences_to_ignore:
        text = " ".join(text.split(t))

    return text


def main(args):
    # load dataset
    dataset = load_dataset(args.dataset, args.config, split=args.split, use_auth_token=True)

    # for testing: only process the first 20 examples
    dataset = dataset.select(range(20))

    # load processor
    feature_extractor = AutoFeatureExtractor.from_pretrained(args.model_id)
    sampling_rate = feature_extractor.sampling_rate

    # resample audio
    dataset = dataset.cast_column("audio", Audio(sampling_rate=sampling_rate))

    # load eval pipeline
    if args.device is None:
        args.device = 0 if torch.cuda.is_available() else -1
    asr = pipeline("automatic-speech-recognition", model=args.model_id, device=args.device)

    # map function to decode audio
    def map_to_pred(batch):
        prediction = asr(
            batch["audio"]["array"], chunk_length_s=args.chunk_length_s, stride_length_s=args.stride_length_s
        )

        batch["prediction"] = prediction["text"]  # previously: "".join(prediction["text"].split("<s>"))
        batch["target"] = normalize_text(batch["sentence"])
        return batch

    # run inference on all examples
    result = dataset.map(map_to_pred, remove_columns=dataset.column_names)

    # compute and log_results
    # do not change function below
    log_results(result, args)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()

    parser.add_argument(
        "--model_id", type=str, required=True, help="Model identifier. Should be loadable with 🤗 Transformers"
    )
    parser.add_argument(
        "--dataset",
        type=str,
        required=True,
        help="Dataset name to evaluate the `model_id`. Should be loadable with 🤗 Datasets",
    )
    parser.add_argument(
        "--config", type=str, required=True, help="Config of the dataset. *E.g.* `'en'` for Common Voice"
    )
    parser.add_argument("--split", type=str, required=True, help="Split of the dataset. *E.g.* `'test'`")
    parser.add_argument(
        "--chunk_length_s", type=float, default=None, help="Chunk length in seconds. Defaults to 5 seconds."
    )
    parser.add_argument(
        "--stride_length_s", type=float, default=None, help="Stride of the audio chunks. Defaults to 1 second."
    )
    parser.add_argument(
        "--log_outputs", action="store_true", help="If defined, write outputs to log file for analysis."
    )
    parser.add_argument(
        "--device",
        type=int,
        default=None,
        help="The device to run the pipeline on. -1 for CPU (default), 0 for the first GPU and so on.",
    )
    args = parser.parse_args()

    main(args)
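The `[cedille]` placeholder round-trip in `remove_special_characters` is easy to miss: `ç` is shielded before `remove_accents` strips combining marks (which would otherwise turn it into plain `c`), then restored at the end. A quick check, assuming this repo is the working directory so the helpers above are importable:

```python
# Sketch: normalization behavior of the helpers in eval.py.
from eval import normalize_text

print(normalize_text("Ça coûte 3,5 €"))
# -> "ça coute 3 virgule 5 euro "  (û loses its accent, ç survives, the decimal is spelled out)
```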
eval_results.json
ADDED
@@ -0,0 +1,9 @@
{
    "epoch": 2.0,
    "eval_loss": 0.23875188827514648,
    "eval_runtime": 294.1776,
    "eval_samples": 5792,
    "eval_samples_per_second": 19.689,
    "eval_steps_per_second": 0.309,
    "eval_wer": 0.3680797679950471
}
preprocessor_config.json
ADDED
@@ -0,0 +1,9 @@
{
  "do_normalize": true,
  "feature_extractor_type": "Wav2Vec2FeatureExtractor",
  "feature_size": 1,
  "padding_side": "right",
  "padding_value": 0,
  "return_attention_mask": true,
  "sampling_rate": 16000
}
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7be356c0416d66c909300c8a24b255d6bc972bb2572b661bdcc3e0167f8aaba0
size 1262817457
run.sh
ADDED
@@ -0,0 +1,41 @@
export WANDB_PROJECT=auto-speech-recognition-french
python run_speech_recognition_ctc.py \
	--dataset_name="mozilla-foundation/common_voice_8_0" \
	--model_name_or_path="facebook/wav2vec2-xls-r-300m" \
	--dataset_config_name="fr" \
	--tokenizer_name_or_path="./" \
	--output_dir="./" \
	--overwrite_output_dir \
	--num_train_epochs="2" \
	--per_device_train_batch_size="64" \
	--per_device_eval_batch_size="64" \
	--gradient_accumulation_steps="1" \
	--learning_rate="1e-4" \
	--warmup_steps="1500" \
	--length_column_name="input_length" \
	--evaluation_strategy="steps" \
	--text_column_name="sentence" \
	--save_steps="500" \
	--eval_steps="500" \
	--logging_steps="100" \
	--layerdrop="0.0" \
	--activation_dropout="0.05" \
	--save_total_limit="2" \
	--freeze_feature_encoder \
	--feat_proj_dropout="0.0" \
	--mask_time_prob="0.75" \
	--mask_time_length="10" \
	--mask_feature_prob="0.4" \
	--mask_feature_length="10" \
	--gradient_checkpointing \
	--report_to="wandb" \
	--run_name="xls-r-300m-fr" \
	--max_eval_samples="6000" \
	--max_duration_in_seconds="9" \
	--use_auth_token \
	--fp16 \
	--group_by_length \
	--preprocessing_num_workers="64" \
	--do_train --do_eval \
	--load_best_model_at_end \
	--push_to_hub
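As a sanity check on these flags, the step counts in the training log follow directly from them: 442,265 training samples at batch size 64 give roughly 6,911 optimizer steps per epoch, consistent with the README table ending at step 13,500 at epoch 1.95. A sketch of the arithmetic, assuming a single GPU (the effective batch size would scale with the number of devices):

```python
# Sketch: step counts implied by run.sh, using figures from train_results.json.
import math

train_samples = 442_265   # "train_samples" in train_results.json
batch_size = 64           # --per_device_train_batch_size (assumes one device, accumulation 1)
num_epochs = 2            # --num_train_epochs

steps_per_epoch = math.ceil(train_samples / batch_size)
print(steps_per_epoch)               # 6911
print(steps_per_epoch * num_epochs)  # 13822 optimizer steps over the full run
```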
run_speech_recognition_ctc.py
ADDED
@@ -0,0 +1,748 @@
#!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

""" Fine-tuning a 🤗 Transformers CTC model for automatic speech recognition"""

import functools
import json
import logging
import os
import re
import sys
import warnings
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Union
import unicodedata

import datasets
import numpy as np
import torch
from datasets import DatasetDict, load_dataset, load_metric

import transformers
from transformers import (
    AutoConfig,
    AutoFeatureExtractor,
    AutoModelForCTC,
    AutoProcessor,
    AutoTokenizer,
    HfArgumentParser,
    Trainer,
    TrainingArguments,
    Wav2Vec2Processor,
    Wav2Vec2CTCTokenizer,
    set_seed,
)
from transformers.trainer_utils import get_last_checkpoint, is_main_process
from transformers.utils import check_min_version
from transformers.utils.versions import require_version


# Will error if the minimal version of Transformers is not installed. Remove at your own risks.
check_min_version("4.16.0.dev0")

require_version("datasets>=1.13.3", "To fix: pip install -r examples/pytorch/text-classification/requirements.txt")


logger = logging.getLogger(__name__)


def list_field(default=None, metadata=None):
    return field(default_factory=lambda: default, metadata=metadata)


@dataclass
class ModelArguments:
    """
    Arguments pertaining to which model/config/tokenizer we are going to fine-tune from.
    """

    model_name_or_path: str = field(
        metadata={"help": "Path to pretrained model or model identifier from huggingface.co/models"}
    )
    tokenizer_name_or_path: Optional[str] = field(
        default=None,
        metadata={"help": "Path to pretrained tokenizer or tokenizer identifier from huggingface.co/models"},
    )
    cache_dir: Optional[str] = field(
        default=None,
        metadata={"help": "Where do you want to store the pretrained models downloaded from huggingface.co"},
    )
    freeze_feature_encoder: bool = field(
        default=True, metadata={"help": "Whether to freeze the feature encoder layers of the model."}
    )
    attention_dropout: float = field(
        default=0.0, metadata={"help": "The dropout ratio for the attention probabilities."}
    )
    activation_dropout: float = field(
        default=0.0, metadata={"help": "The dropout ratio for activations inside the fully connected layer."}
    )
    feat_proj_dropout: float = field(default=0.0, metadata={"help": "The dropout ratio for the projected features."})
    hidden_dropout: float = field(
        default=0.0,
        metadata={
            "help": "The dropout probability for all fully connected layers in the embeddings, encoder, and pooler."
        },
    )
    final_dropout: float = field(
        default=0.0,
        metadata={"help": "The dropout probability for the final projection layer."},
    )
    mask_time_prob: float = field(
        default=0.05,
        metadata={
            "help": "Probability of each feature vector along the time axis to be chosen as the start of the vector "
            "span to be masked. Approximately ``mask_time_prob * sequence_length // mask_time_length`` feature "
            "vectors will be masked along the time axis."
        },
    )
    mask_time_length: int = field(
        default=10,
        metadata={"help": "Length of vector span to mask along the time axis."},
    )
    mask_feature_prob: float = field(
        default=0.0,
        metadata={
            "help": "Probability of each feature vector along the feature axis to be chosen as the start of the vector "
            "span to be masked. Approximately ``mask_feature_prob * sequence_length // mask_feature_length`` feature bins will be masked along the time axis."
        },
    )
    mask_feature_length: int = field(
        default=10,
        metadata={"help": "Length of vector span to mask along the feature axis."},
    )
    layerdrop: float = field(default=0.0, metadata={"help": "The LayerDrop probability."})
    ctc_loss_reduction: Optional[str] = field(
        default="mean", metadata={"help": "The way the ctc loss should be reduced. Should be one of 'mean' or 'sum'."}
    )


@dataclass
class DataTrainingArguments:
    """
    Arguments pertaining to what data we are going to input our model for training and eval.

    Using `HfArgumentParser` we can turn this class
    into argparse arguments to be able to specify them on
    the command line.
    """

    dataset_name: str = field(
        metadata={"help": "The name of the dataset to use (via the datasets library)."}
    )
    dataset_config_name: str = field(
        default=None, metadata={"help": "The configuration name of the dataset to use (via the datasets library)."}
    )
    train_split_name: str = field(
        default="train+validation",
        metadata={
            "help": "The name of the training data set split to use (via the datasets library). Defaults to 'train+validation'"
        },
    )
    eval_split_name: str = field(
        default="test",
        metadata={
            "help": "The name of the evaluation data set split to use (via the datasets library). Defaults to 'test'"
        },
    )
    audio_column_name: str = field(
        default="audio",
        metadata={"help": "The name of the dataset column containing the audio data. Defaults to 'audio'"},
    )
    text_column_name: str = field(
        default="text",
        metadata={"help": "The name of the dataset column containing the text data. Defaults to 'text'"},
    )
    overwrite_cache: bool = field(
        default=False, metadata={"help": "Overwrite the cached preprocessed datasets or not."}
    )
    preprocessing_num_workers: Optional[int] = field(
        default=None,
        metadata={"help": "The number of processes to use for the preprocessing."},
    )
    max_train_samples: Optional[int] = field(
        default=None,
        metadata={
            "help": "For debugging purposes or quicker training, truncate the number of training examples to this "
            "value if set."
        },
    )
    max_eval_samples: Optional[int] = field(
        default=None,
        metadata={
            "help": "For debugging purposes or quicker training, truncate the number of validation examples to this "
            "value if set."
        },
    )
    chars_to_ignore: Optional[List[str]] = list_field(
        default=None,
        metadata={"help": "A list of characters to remove from the transcripts."},
    )
    eval_metrics: List[str] = list_field(
        default=["wer"],
        metadata={"help": "A list of metrics the model should be evaluated on. E.g. `'wer cer'`"},
    )
    max_duration_in_seconds: float = field(
        default=20.0,
        metadata={
            "help": "Filter out audio files that are longer than `max_duration_in_seconds` seconds"
        },
    )
    min_duration_in_seconds: float = field(
        default=0.0, metadata={"help": "Filter audio files that are shorter than `min_duration_in_seconds` seconds"}
    )
    preprocessing_only: bool = field(
        default=False,
        metadata={
            "help": "Whether to only do data preprocessing and skip training. "
            "This is especially useful when data preprocessing errors out in distributed training due to timeout. "
            "In this case, one should run the preprocessing in a non-distributed setup with `preprocessing_only=True` "
            "so that the cached datasets can consequently be loaded in distributed training"
        },
    )
    use_auth_token: bool = field(
        default=False,
        metadata={
            "help": "If :obj:`True`, will use the token generated when running"
            ":obj:`transformers-cli login` as HTTP bearer authorization for remote files."
        },
    )
    unk_token: str = field(
        default="[UNK]",
        metadata={"help": "The unk token for the tokenizer"},
    )
    pad_token: str = field(
        default="[PAD]",
        metadata={"help": "The padding token for the tokenizer"},
    )
    word_delimiter_token: str = field(
        default="|",
        metadata={"help": "The word delimiter token for the tokenizer"},
    )
    phoneme_language: Optional[str] = field(
        default=None,
        metadata={
            "help": "The target language that should be passed to the tokenizer"
            " for tokenization. Note that this is only relevant if the model"
            " classifies the input audio to a sequence of phoneme sequences."
        },
    )


@dataclass
class DataCollatorCTCWithPadding:
    """
    Data collator that will dynamically pad the inputs received.
    Args:
        processor (:class:`~transformers.AutoProcessor`)
            The processor used for processing the data.
        padding (:obj:`bool`, :obj:`str` or :class:`~transformers.tokenization_utils_base.PaddingStrategy`, `optional`, defaults to :obj:`True`):
            Select a strategy to pad the returned sequences (according to the model's padding side and padding index)
            among:
            * :obj:`True` or :obj:`'longest'`: Pad to the longest sequence in the batch (or no padding if only a single
              sequence is provided).
            * :obj:`'max_length'`: Pad to a maximum length specified with the argument :obj:`max_length` or to the
              maximum acceptable input length for the model if that argument is not provided.
            * :obj:`False` or :obj:`'do_not_pad'` (default): No padding (i.e., can output a batch with sequences of
              different lengths).
        max_length (:obj:`int`, `optional`):
            Maximum length of the ``input_values`` of the returned list and optionally padding length (see above).
        max_length_labels (:obj:`int`, `optional`):
            Maximum length of the ``labels`` returned list and optionally padding length (see above).
        pad_to_multiple_of (:obj:`int`, `optional`):
            If set will pad the sequence to a multiple of the provided value.
            This is especially useful to enable the use of Tensor Cores on NVIDIA hardware with compute capability >=
            7.5 (Volta).
    """

    processor: AutoProcessor
    padding: Union[bool, str] = "longest"
    pad_to_multiple_of: Optional[int] = None
    pad_to_multiple_of_labels: Optional[int] = None

    def __call__(self, features: List[Dict[str, Union[List[int], torch.Tensor]]]) -> Dict[str, torch.Tensor]:
        # split inputs and labels since they have to be of different lengths and need
        # different padding methods
        input_features = [{"input_values": feature["input_values"]} for feature in features]
        label_features = [{"input_ids": feature["labels"]} for feature in features]

        batch = self.processor.pad(
            input_features,
            padding=self.padding,
            pad_to_multiple_of=self.pad_to_multiple_of,
            return_tensors="pt",
        )

        with self.processor.as_target_processor():
            labels_batch = self.processor.pad(
                label_features,
                padding=self.padding,
                pad_to_multiple_of=self.pad_to_multiple_of_labels,
                return_tensors="pt",
            )

        # replace padding with -100 to ignore loss correctly
        labels = labels_batch["input_ids"].masked_fill(labels_batch.attention_mask.ne(1), -100)

        batch["labels"] = labels

        return batch


def create_vocabulary_from_data(
    datasets: DatasetDict,
    word_delimiter_token: Optional[str] = None,
    unk_token: Optional[str] = None,
    pad_token: Optional[str] = None,
):
    # Given training and test labels create vocabulary
    def extract_all_chars(batch):
        all_text = " ".join(batch["target_text"])
        vocab = list(set(all_text))
        return {"vocab": [vocab], "all_text": [all_text]}

    vocabs = datasets.map(
        extract_all_chars,
        batched=True,
        batch_size=-1,
        keep_in_memory=True,
        remove_columns=datasets["train"].column_names,
    )

    # take union of all unique characters in each dataset
    vocab_set = functools.reduce(
        lambda vocab_1, vocab_2: set(vocab_1["vocab"][0]) | set(vocab_2["vocab"][0]), vocabs.values()
    )

    vocab_dict = {v: k for k, v in enumerate(sorted(list(vocab_set)))}

    # replace white space with delimiter token
    if word_delimiter_token is not None:
        vocab_dict[word_delimiter_token] = vocab_dict[" "]
        del vocab_dict[" "]

    # add unk and pad token
    if unk_token is not None:
        vocab_dict[unk_token] = len(vocab_dict)

    if pad_token is not None:
        vocab_dict[pad_token] = len(vocab_dict)

    return vocab_dict


def main():
    # See all possible arguments in src/transformers/training_args.py
    # or by passing the --help flag to this script.
    # We now keep distinct sets of args, for a cleaner separation of concerns.

    parser = HfArgumentParser((ModelArguments, DataTrainingArguments, TrainingArguments))
    if len(sys.argv) == 2 and sys.argv[1].endswith(".json"):
        # If we pass only one argument to the script and it's the path to a json file,
        # let's parse it to get our arguments.
        model_args, data_args, training_args = parser.parse_json_file(json_file=os.path.abspath(sys.argv[1]))
    else:
        model_args, data_args, training_args = parser.parse_args_into_dataclasses()

    # Detecting last checkpoint.
    last_checkpoint = None
    if os.path.isdir(training_args.output_dir) and training_args.do_train and not training_args.overwrite_output_dir:
        last_checkpoint = get_last_checkpoint(training_args.output_dir)
        if last_checkpoint is None and len(os.listdir(training_args.output_dir)) > 0:
            raise ValueError(
                f"Output directory ({training_args.output_dir}) already exists and is not empty. "
                "Use --overwrite_output_dir to overcome."
            )
        elif last_checkpoint is not None:
            logger.info(
                f"Checkpoint detected, resuming training at {last_checkpoint}. To avoid this behavior, change "
                "the `--output_dir` or add `--overwrite_output_dir` to train from scratch."
            )

    # Setup logging
    logging.basicConfig(
        format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
        datefmt="%m/%d/%Y %H:%M:%S",
        handlers=[logging.StreamHandler(sys.stdout)],
    )
    logger.setLevel(logging.INFO if is_main_process(training_args.local_rank) else logging.WARN)

    # Log on each process the small summary:
    logger.warning(
        f"Process rank: {training_args.local_rank}, device: {training_args.device}, n_gpu: {training_args.n_gpu}, "
        f"distributed training: {bool(training_args.local_rank != -1)}, 16-bits training: {training_args.fp16}"
    )
    # Set the verbosity to info of the Transformers logger (on main process only):
    if is_main_process(training_args.local_rank):
        transformers.utils.logging.set_verbosity_info()
    logger.info("Training/evaluation parameters %s", training_args)

    # Set seed before initializing model.
    set_seed(training_args.seed)

    # 1. First, let's load the dataset
    raw_datasets = DatasetDict()

    if training_args.do_train:
        raw_datasets["train"] = load_dataset(
            data_args.dataset_name,
            data_args.dataset_config_name,
            split=data_args.train_split_name,
            use_auth_token=data_args.use_auth_token,
        )

        if data_args.audio_column_name not in raw_datasets["train"].column_names:
            raise ValueError(
                f"--audio_column_name '{data_args.audio_column_name}' not found in dataset '{data_args.dataset_name}'. "
                "Make sure to set `--audio_column_name` to the correct audio column - one of "
                f"{', '.join(raw_datasets['train'].column_names)}."
            )

        if data_args.text_column_name not in raw_datasets["train"].column_names:
            raise ValueError(
                f"--text_column_name {data_args.text_column_name} not found in dataset '{data_args.dataset_name}'. "
                "Make sure to set `--text_column_name` to the correct text column - one of "
                f"{', '.join(raw_datasets['train'].column_names)}."
            )

        if data_args.max_train_samples is not None:
            raw_datasets["train"] = raw_datasets["train"].select(range(data_args.max_train_samples))

    if training_args.do_eval:
        raw_datasets["eval"] = load_dataset(
            data_args.dataset_name,
            data_args.dataset_config_name,
            split=data_args.eval_split_name,
            use_auth_token=data_args.use_auth_token,
        )

        if data_args.max_eval_samples is not None:
            raw_datasets["eval"] = raw_datasets["eval"].shuffle(seed=42).select(range(data_args.max_eval_samples))

    # 2. We remove some special characters from the datasets
    # that make training complicated and do not help in transcribing the speech
    # E.g. characters, such as `,` and `.` do not really have an acoustic characteristic
    # that could be easily picked up by the model
    text_column_name = data_args.text_column_name

    chars_to_remove_regex = r'[\,\?\.\!\-\_\;\:\"\“\%\‘\”\�\^]'

    def remove_accents(input_str):
        nfkd_form = unicodedata.normalize("NFKD", input_str)
        return "".join([c for c in nfkd_form if not unicodedata.combining(c)])

    def remove_special_characters(batch):
        # spell out decimal separators first (e.g. "3,5" -> "3 virgule 5"), before the
        # general punctuation strip below removes "," and "." altogether
        batch["target_text"] = re.sub(r"([0-9]+)([,.])([0-9])", r"\1 virgule \3", batch[text_column_name])
        batch["target_text"] = re.sub(chars_to_remove_regex, "", batch["target_text"]).lower()
        # shield the cedilla behind a placeholder so remove_accents() below does not turn "ç" into "c"
        batch["target_text"] = re.sub("ç", "[cedille]", batch["target_text"])
        batch["target_text"] = re.sub("&", "et", batch["target_text"])
        # note: "%" is already stripped by chars_to_remove_regex above, so this substitution is a no-op
        batch["target_text"] = re.sub("%", " pourcents", batch["target_text"])
        batch["target_text"] = re.sub(r"\$", "dollar", batch["target_text"])
        batch["target_text"] = re.sub("£", "livre", batch["target_text"])
        batch["target_text"] = re.sub("€", "euro", batch["target_text"])
        batch["target_text"] = remove_accents(batch["target_text"])
        batch["target_text"] = re.sub(r"\[cedille\]", "ç", batch["target_text"]) + " "
        return batch

    with training_args.main_process_first(desc="dataset map special characters removal"):
        raw_datasets = raw_datasets.map(
            remove_special_characters,
            remove_columns=[text_column_name],
            desc="remove special characters from datasets",
        )

    # save special tokens for tokenizer
    word_delimiter_token = data_args.word_delimiter_token
    unk_token = data_args.unk_token
    pad_token = data_args.pad_token

    # 3. Next, let's load the config as we might need it to create
    # the tokenizer
    # load config
    config = AutoConfig.from_pretrained(
        model_args.model_name_or_path, cache_dir=model_args.cache_dir, use_auth_token=data_args.use_auth_token
    )

    # 4. Next, if no tokenizer file is defined,
    # we create the vocabulary of the model by extracting all unique characters from
    # the training and evaluation datasets
    # We need to make sure that only first rank saves vocabulary
    # make sure all processes wait until vocab is created
    tokenizer_name_or_path = model_args.tokenizer_name_or_path
    tokenizer_kwargs = {}
    if tokenizer_name_or_path is None:
        # save vocab in training output dir
        tokenizer_name_or_path = training_args.output_dir

    vocab_file = os.path.join(tokenizer_name_or_path, "vocab.json")

    with training_args.main_process_first():
        if training_args.overwrite_output_dir and os.path.isfile(vocab_file):
            os.remove(vocab_file)

    with training_args.main_process_first(desc="dataset map vocabulary creation"):
        if not os.path.isfile(vocab_file):
            os.makedirs(tokenizer_name_or_path, exist_ok=True)
            vocab_dict = create_vocabulary_from_data(
                raw_datasets,
                word_delimiter_token=word_delimiter_token,
                unk_token=unk_token,
                pad_token=pad_token,
            )

            # save vocab dict to be loaded into tokenizer
            with open(vocab_file, "w") as file:
                json.dump(vocab_dict, file)

    # if the tokenizer has just been created,
    # it is defined by `tokenizer_class` if present in config else by `model_type`
    tokenizer_kwargs = {
        "config": config if config.tokenizer_class is not None else None,
        "tokenizer_type": config.model_type if config.tokenizer_class is None else None,
        "unk_token": unk_token,
        "pad_token": pad_token,
        "word_delimiter_token": word_delimiter_token,
    }

    # 5. Now we can instantiate the feature extractor, tokenizer and model
    # Note for distributed training, the .from_pretrained methods guarantee that only
    # one local process can concurrently download model & vocab.

    # load feature_extractor and tokenizer
    tokenizer = Wav2Vec2CTCTokenizer.from_pretrained(
        tokenizer_name_or_path,
        use_auth_token=data_args.use_auth_token,
        **tokenizer_kwargs,
    )
    feature_extractor = AutoFeatureExtractor.from_pretrained(
        model_args.model_name_or_path, cache_dir=model_args.cache_dir, use_auth_token=data_args.use_auth_token
    )

    # adapt config
    config.update(
        {
            "feat_proj_dropout": model_args.feat_proj_dropout,
            "attention_dropout": model_args.attention_dropout,
            "hidden_dropout": model_args.hidden_dropout,
            "final_dropout": model_args.final_dropout,
            "mask_time_prob": model_args.mask_time_prob,
            "mask_time_length": model_args.mask_time_length,
            "mask_feature_prob": model_args.mask_feature_prob,
            "mask_feature_length": model_args.mask_feature_length,
            "gradient_checkpointing": training_args.gradient_checkpointing,
            "layerdrop": model_args.layerdrop,
            "ctc_loss_reduction": model_args.ctc_loss_reduction,
            "pad_token_id": tokenizer.pad_token_id,
            "vocab_size": len(tokenizer),
            "activation_dropout": model_args.activation_dropout,
        }
    )

    # create model
    model = AutoModelForCTC.from_pretrained(
        model_args.model_name_or_path,
        cache_dir=model_args.cache_dir,
        config=config,
        use_auth_token=data_args.use_auth_token,
    )

    # freeze encoder
    if model_args.freeze_feature_encoder:
        model.freeze_feature_encoder()

    # 6. Now we preprocess the datasets including loading the audio, resampling and normalization
    # Thankfully, `datasets` takes care of automatically loading and resampling the audio,
    # so that we just need to set the correct target sampling rate and normalize the input
    # via the `feature_extractor`

    # make sure that dataset decodes audio with correct sampling rate
    dataset_sampling_rate = next(iter(raw_datasets.values())).features[data_args.audio_column_name].sampling_rate
    if dataset_sampling_rate != feature_extractor.sampling_rate:
        raw_datasets = raw_datasets.cast_column(
            data_args.audio_column_name, datasets.features.Audio(sampling_rate=feature_extractor.sampling_rate)
        )

    # derive max & min input length for sample rate & max duration
    max_input_length = data_args.max_duration_in_seconds * feature_extractor.sampling_rate
    min_input_length = data_args.min_duration_in_seconds * feature_extractor.sampling_rate
    audio_column_name = data_args.audio_column_name
    num_workers = data_args.preprocessing_num_workers

    # `phoneme_language` is only relevant if the model is fine-tuned on phoneme classification
    phoneme_language = data_args.phoneme_language

    # Preprocessing the datasets.
    # We need to read the audio files as arrays and tokenize the targets.
    def prepare_dataset(batch):
        # load audio
        sample = batch[audio_column_name]

        inputs = feature_extractor(sample["array"], sampling_rate=sample["sampling_rate"])
        batch["input_values"] = inputs.input_values[0]
        batch["input_length"] = len(batch["input_values"])

        # encode targets
        additional_kwargs = {}
        if phoneme_language is not None:
            additional_kwargs["phonemizer_lang"] = phoneme_language

        batch["labels"] = tokenizer(batch["target_text"], **additional_kwargs).input_ids
        return batch

    with training_args.main_process_first(desc="dataset map preprocessing"):
        vectorized_datasets = raw_datasets.map(
            prepare_dataset,
            remove_columns=next(iter(raw_datasets.values())).column_names,
            batch_size=-1,
            desc="preprocess datasets",
        )

    def is_audio_in_length_range(length):
        return length > min_input_length and length < max_input_length

    # filter data that is outside the [min_input_length, max_input_length] range
    vectorized_datasets = vectorized_datasets.filter(
        is_audio_in_length_range,
        num_proc=num_workers,
        input_columns=["input_length"],
    )

    # 7. Next, we can prepare the training.
    # Let's use word error rate (WER) as our evaluation metric,
    # instantiate a data collator and the trainer

    # Define evaluation metrics during training, *i.e.* word error rate, character error rate
    eval_metrics = {metric: load_metric(metric) for metric in data_args.eval_metrics}

    # for large datasets it is advised to run the preprocessing on a
    # single machine first with ``args.preprocessing_only`` since there will most likely
    # be a timeout when running the script in distributed mode.
    # In a second step ``args.preprocessing_only`` can then be set to `False` to load the
    # cached dataset
    if data_args.preprocessing_only:
        logger.info(f"Data preprocessing finished. Files cached at {vectorized_datasets.cache_files}")
        return

    def compute_metrics(pred):
        pred_logits = pred.predictions
        pred_ids = np.argmax(pred_logits, axis=-1)

        pred.label_ids[pred.label_ids == -100] = tokenizer.pad_token_id

        # decode predictions, being sure to remove <s> from the output
        pred_str = tokenizer.batch_decode(pred_ids, skip_special_tokens=True)
        # we do not want to group tokens when computing the metrics
        label_str = tokenizer.batch_decode(pred.label_ids, group_tokens=False)

        metrics = {k: v.compute(predictions=pred_str, references=label_str) for k, v in eval_metrics.items()}

        return metrics

    # Now save everything to be able to create a single processor later
    if is_main_process(training_args.local_rank):
        # save feature extractor, tokenizer and config
        feature_extractor.save_pretrained(training_args.output_dir)
        tokenizer.save_pretrained(training_args.output_dir)
        config.save_pretrained(training_args.output_dir)

    try:
        processor = AutoProcessor.from_pretrained(training_args.output_dir)
    except (OSError, KeyError):
        warnings.warn(
            "Loading a processor from a feature extractor config that does not"
            " include a `processor_class` attribute is deprecated and will be removed in v5. Please add the following "
            " attribute to your `preprocessor_config.json` file to suppress this warning: "
            " `'processor_class': 'Wav2Vec2Processor'`",
            FutureWarning,
        )
        processor = Wav2Vec2Processor.from_pretrained(training_args.output_dir)

    # Instantiate custom data collator
    data_collator = DataCollatorCTCWithPadding(processor=processor)

    # Initialize Trainer
    trainer = Trainer(
        model=model,
        data_collator=data_collator,
        args=training_args,
        compute_metrics=compute_metrics,
        train_dataset=vectorized_datasets["train"] if training_args.do_train else None,
        eval_dataset=vectorized_datasets["eval"] if training_args.do_eval else None,
        tokenizer=feature_extractor,
    )

    # 8. Finally, we can start training

    # Training
    if training_args.do_train:

        # use the last checkpoint if it exists
        if last_checkpoint is not None:
            checkpoint = last_checkpoint
        elif os.path.isdir(model_args.model_name_or_path):
            checkpoint = model_args.model_name_or_path
        else:
            checkpoint = None

        train_result = trainer.train(resume_from_checkpoint=checkpoint)
        trainer.save_model()

        metrics = train_result.metrics
        max_train_samples = (
            data_args.max_train_samples
            if data_args.max_train_samples is not None
            else len(vectorized_datasets["train"])
        )
        metrics["train_samples"] = min(max_train_samples, len(vectorized_datasets["train"]))

        trainer.log_metrics("train", metrics)
        trainer.save_metrics("train", metrics)
        trainer.save_state()

    # Evaluation
    results = {}
    if training_args.do_eval:
        logger.info("*** Evaluate ***")
        metrics = trainer.evaluate()
        max_eval_samples = (
            data_args.max_eval_samples if data_args.max_eval_samples is not None else len(vectorized_datasets["eval"])
        )
        metrics["eval_samples"] = min(max_eval_samples, len(vectorized_datasets["eval"]))

        trainer.log_metrics("eval", metrics)
        trainer.save_metrics("eval", metrics)

    # Write model card and (optionally) push to hub
    config_name = data_args.dataset_config_name if data_args.dataset_config_name is not None else "na"
    kwargs = {
        "finetuned_from": model_args.model_name_or_path,
        "tasks": "speech-recognition",
        "tags": ["automatic-speech-recognition", data_args.dataset_name],
        "dataset_args": f"Config: {config_name}, Training split: {data_args.train_split_name}, Eval split: {data_args.eval_split_name}",
        "dataset": f"{data_args.dataset_name.upper()} - {config_name.upper()}",
    }
    if "common_voice" in data_args.dataset_name:
        kwargs["language"] = config_name

    if training_args.push_to_hub:
        trainer.push_to_hub(**kwargs)
    else:
        trainer.create_model_card(**kwargs)

    return results


if __name__ == "__main__":
    main()
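The `-100` trick in `DataCollatorCTCWithPadding.__call__` is what keeps padded label positions out of the loss: positions where the label attention mask is 0 are overwritten with `-100`, the index PyTorch loss functions ignore. A standalone sketch of just that step, with made-up tensors:

```python
# Sketch: replicate the collator's padding-masking step on toy tensors.
import torch

label_ids = torch.tensor([[5, 9, 2, 0, 0],
                          [7, 3, 8, 4, 1]])
attention_mask = torch.tensor([[1, 1, 1, 0, 0],
                               [1, 1, 1, 1, 1]])

labels = label_ids.masked_fill(attention_mask.ne(1), -100)
print(labels)
# tensor([[   5,    9,    2, -100, -100],
#         [   7,    3,    8,    4,    1]])
```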
special_tokens_map.json
ADDED
@@ -0,0 +1 @@
{"bos_token": null, "eos_token": null, "unk_token": "[UNK]", "pad_token": "[PAD]", "additional_special_tokens": [{"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}]}
tokenizer_config.json
ADDED
@@ -0,0 +1 @@
{"unk_token": "[UNK]", "bos_token": "<s>", "eos_token": "</s>", "pad_token": "[PAD]", "do_lower_case": false, "word_delimiter_token": "|", "config": null, "tokenizer_type": "wav2vec2", "special_tokens_map_file": null, "tokenizer_file": null, "name_or_path": "./", "tokenizer_class": "Wav2Vec2CTCTokenizer"}
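These two files, together with the vocab.json added further down in this commit, fully describe the character-level CTC tokenizer: [PAD] doubles as the CTC blank and | stands in for spaces. A small sketch of loading it on its own, assuming a local clone of the repo:

from transformers import Wav2Vec2CTCTokenizer

tokenizer = Wav2Vec2CTCTokenizer.from_pretrained("./")

ids = tokenizer("bonjour le monde").input_ids
print(ids)
# Note: decode() applies CTC-style grouping, so repeated ids in the
# sequence are collapsed before conversion back to text.
print(tokenizer.decode(ids))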
train_results.json
ADDED
@@ -0,0 +1,8 @@
{
    "epoch": 2.0,
    "train_loss": 1.442369053426242,
    "train_runtime": 53680.5392,
    "train_samples": 442265,
    "train_samples_per_second": 16.478,
    "train_steps_per_second": 0.257
}
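The throughput numbers above are internally consistent with the 13822 optimizer steps recorded in trainer_state.json below, which implies an effective batch size of about 64. A quick arithmetic check (all constants copied from train_results.json):

# Sanity-check the reported throughput.
train_samples = 442265
num_epochs = 2.0
train_runtime = 53680.5392          # seconds
samples_per_second = 16.478
steps_per_second = 0.257

print(samples_per_second / steps_per_second)       # ~64.1 samples per step = effective batch size
print(steps_per_second * train_runtime)            # ~13796 steps, matching global_step = 13822
print(train_samples * num_epochs / train_runtime)  # ~16.48, matches train_samples_per_second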
trainer_state.json
ADDED
@@ -0,0 +1,1096 @@
{
    "best_metric": 0.23875188827514648,
    "best_model_checkpoint": "./checkpoint-13500",
    "epoch": 2.0,
    "global_step": 13822,
    "is_hyper_param_search": false,
    "is_local_process_zero": true,
    "is_world_process_zero": true,
    "log_history": [
        {"epoch": 0.01, "learning_rate": 6.533333333333333e-06, "loss": 17.2403, "step": 100},
        {"epoch": 0.03, "learning_rate": 1.32e-05, "loss": 10.2311, "step": 200},
        {"epoch": 0.04, "learning_rate": 1.9800000000000004e-05, "loss": 7.834, "step": 300},
        {"epoch": 0.06, "learning_rate": 2.646666666666667e-05, "loss": 6.0656, "step": 400},
        {"epoch": 0.07, "learning_rate": 3.313333333333333e-05, "loss": 4.3748, "step": 500},
        {"epoch": 0.07, "eval_loss": 3.878422975540161, "eval_runtime": 285.8223, "eval_samples_per_second": 20.264, "eval_steps_per_second": 0.318, "eval_wer": 1.0, "step": 500},
        {"epoch": 0.09, "learning_rate": 3.9800000000000005e-05, "loss": 3.2923, "step": 600},
        {"epoch": 0.1, "learning_rate": 4.646666666666667e-05, "loss": 2.9475, "step": 700},
        {"epoch": 0.12, "learning_rate": 5.3133333333333335e-05, "loss": 2.8639, "step": 800},
        {"epoch": 0.13, "learning_rate": 5.9800000000000003e-05, "loss": 2.8265, "step": 900},
        {"epoch": 0.14, "learning_rate": 6.646666666666667e-05, "loss": 2.8068, "step": 1000},
        {"epoch": 0.14, "eval_loss": 2.828850746154785, "eval_runtime": 292.3877, "eval_samples_per_second": 19.809, "eval_steps_per_second": 0.311, "eval_wer": 0.9826485059793412, "step": 1000},
        {"epoch": 0.16, "learning_rate": 7.306666666666668e-05, "loss": 2.779, "step": 1100},
        {"epoch": 0.17, "learning_rate": 7.973333333333334e-05, "loss": 2.6402, "step": 1200},
        {"epoch": 0.19, "learning_rate": 8.64e-05, "loss": 2.1119, "step": 1300},
        {"epoch": 0.2, "learning_rate": 9.306666666666667e-05, "loss": 1.7965, "step": 1400},
        {"epoch": 0.22, "learning_rate": 9.973333333333334e-05, "loss": 1.6698, "step": 1500},
        {"epoch": 0.22, "eval_loss": 0.881136417388916, "eval_runtime": 297.1806, "eval_samples_per_second": 19.49, "eval_steps_per_second": 0.306, "eval_wer": 0.7127472384241911, "step": 1500},
        {"epoch": 0.23, "learning_rate": 9.92209056971271e-05, "loss": 1.5882, "step": 1600},
        {"epoch": 0.25, "learning_rate": 9.840934913163448e-05, "loss": 1.5172, "step": 1700},
        {"epoch": 0.26, "learning_rate": 9.759779256614186e-05, "loss": 1.4579, "step": 1800},
        {"epoch": 0.27, "learning_rate": 9.678623600064926e-05, "loss": 1.3829, "step": 1900},
        {"epoch": 0.29, "learning_rate": 9.597467943515663e-05, "loss": 1.3488, "step": 2000},
        {"epoch": 0.29, "eval_loss": 0.516592800617218, "eval_runtime": 301.2842, "eval_samples_per_second": 19.224, "eval_steps_per_second": 0.302, "eval_wer": 0.5369024731988661, "step": 2000},
        {"epoch": 0.3, "learning_rate": 9.516312286966402e-05, "loss": 1.2981, "step": 2100},
        {"epoch": 0.32, "learning_rate": 9.43515663041714e-05, "loss": 1.2845, "step": 2200},
        {"epoch": 0.33, "learning_rate": 9.354000973867879e-05, "loss": 1.2459, "step": 2300},
        {"epoch": 0.35, "learning_rate": 9.272845317318618e-05, "loss": 1.2255, "step": 2400},
        {"epoch": 0.36, "learning_rate": 9.191689660769356e-05, "loss": 1.2239, "step": 2500},
        {"epoch": 0.36, "eval_loss": 0.4104757010936737, "eval_runtime": 299.1395, "eval_samples_per_second": 19.362, "eval_steps_per_second": 0.304, "eval_wer": 0.474111245071524, "step": 2500},
        {"epoch": 0.38, "learning_rate": 9.110534004220094e-05, "loss": 1.2024, "step": 2600},
        {"epoch": 0.39, "learning_rate": 9.030189904236326e-05, "loss": 1.1851, "step": 2700},
        {"epoch": 0.41, "learning_rate": 8.949034247687063e-05, "loss": 1.1768, "step": 2800},
        {"epoch": 0.42, "learning_rate": 8.867878591137803e-05, "loss": 1.1641, "step": 2900},
        {"epoch": 0.43, "learning_rate": 8.786722934588541e-05, "loss": 1.1537, "step": 3000},
        {"epoch": 0.43, "eval_loss": 0.35850802063941956, "eval_runtime": 299.662, "eval_samples_per_second": 19.328, "eval_steps_per_second": 0.304, "eval_wer": 0.4448499462348073, "step": 3000},
        {"epoch": 0.45, "learning_rate": 8.70556727803928e-05, "loss": 1.1449, "step": 3100},
        {"epoch": 0.46, "learning_rate": 8.624411621490018e-05, "loss": 1.1379, "step": 3200},
        {"epoch": 0.48, "learning_rate": 8.543255964940758e-05, "loss": 1.1331, "step": 3300},
        {"epoch": 0.49, "learning_rate": 8.462100308391495e-05, "loss": 1.1205, "step": 3400},
        {"epoch": 0.51, "learning_rate": 8.380944651842234e-05, "loss": 1.1184, "step": 3500},
        {"epoch": 0.51, "eval_loss": 0.333638072013855, "eval_runtime": 297.0402, "eval_samples_per_second": 19.499, "eval_steps_per_second": 0.306, "eval_wer": 0.42922545537489004, "step": 3500},
        {"epoch": 0.52, "learning_rate": 8.299788995292971e-05, "loss": 1.1014, "step": 3600},
        {"epoch": 0.54, "learning_rate": 8.218633338743711e-05, "loss": 1.1114, "step": 3700},
        {"epoch": 0.55, "learning_rate": 8.13747768219445e-05, "loss": 1.117, "step": 3800},
        {"epoch": 0.56, "learning_rate": 8.056322025645188e-05, "loss": 1.102, "step": 3900},
        {"epoch": 0.58, "learning_rate": 7.975166369095926e-05, "loss": 1.0968, "step": 4000},
        {"epoch": 0.58, "eval_loss": 0.31949570775032043, "eval_runtime": 296.0172, "eval_samples_per_second": 19.566, "eval_steps_per_second": 0.307, "eval_wer": 0.4180162273127179, "step": 4000},
        {"epoch": 0.59, "learning_rate": 7.894822269112158e-05, "loss": 1.0942, "step": 4100},
        {"epoch": 0.61, "learning_rate": 7.813666612562897e-05, "loss": 1.0859, "step": 4200},
        {"epoch": 0.62, "learning_rate": 7.732510956013635e-05, "loss": 1.0767, "step": 4300},
        {"epoch": 0.64, "learning_rate": 7.652166856029866e-05, "loss": 1.0766, "step": 4400},
        {"epoch": 0.65, "learning_rate": 7.571011199480604e-05, "loss": 1.0737, "step": 4500},
        {"epoch": 0.65, "eval_loss": 0.30754634737968445, "eval_runtime": 296.2378, "eval_samples_per_second": 19.552, "eval_steps_per_second": 0.307, "eval_wer": 0.41408973899442797, "step": 4500},
        {"epoch": 0.67, "learning_rate": 7.489855542931342e-05, "loss": 1.0807, "step": 4600},
        {"epoch": 0.68, "learning_rate": 7.40869988638208e-05, "loss": 1.071, "step": 4700},
        {"epoch": 0.69, "learning_rate": 7.32754422983282e-05, "loss": 1.0613, "step": 4800},
        {"epoch": 0.71, "learning_rate": 7.246388573283557e-05, "loss": 1.0635, "step": 4900},
        {"epoch": 0.72, "learning_rate": 7.165232916734297e-05, "loss": 1.0677, "step": 5000},
        {"epoch": 0.72, "eval_loss": 0.30150118470191956, "eval_runtime": 297.4742, "eval_samples_per_second": 19.471, "eval_steps_per_second": 0.306, "eval_wer": 0.4089250219948516, "step": 5000},
        {"epoch": 0.74, "learning_rate": 7.084077260185034e-05, "loss": 1.0707, "step": 5100},
        {"epoch": 0.75, "learning_rate": 7.002921603635774e-05, "loss": 1.0617, "step": 5200},
        {"epoch": 0.77, "learning_rate": 6.921765947086512e-05, "loss": 1.0566, "step": 5300},
        {"epoch": 0.78, "learning_rate": 6.84061029053725e-05, "loss": 1.0518, "step": 5400},
        {"epoch": 0.8, "learning_rate": 6.760266190553481e-05, "loss": 1.0462, "step": 5500},
        {"epoch": 0.8, "eval_loss": 0.297052800655365, "eval_runtime": 296.1592, "eval_samples_per_second": 19.557, "eval_steps_per_second": 0.307, "eval_wer": 0.4077193782788621, "step": 5500},
        {"epoch": 0.81, "learning_rate": 6.679110534004221e-05, "loss": 1.0514, "step": 5600},
        {"epoch": 0.82, "learning_rate": 6.597954877454959e-05, "loss": 1.0446, "step": 5700},
        {"epoch": 0.84, "learning_rate": 6.516799220905698e-05, "loss": 1.0358, "step": 5800},
        {"epoch": 0.85, "learning_rate": 6.435643564356436e-05, "loss": 1.0364, "step": 5900},
        {"epoch": 0.87, "learning_rate": 6.354487907807174e-05, "loss": 1.0392, "step": 6000},
        {"epoch": 0.87, "eval_loss": 0.2870033383369446, "eval_runtime": 295.9814, "eval_samples_per_second": 19.569, "eval_steps_per_second": 0.307, "eval_wer": 0.3997034768157972, "step": 6000},
        {"epoch": 0.88, "learning_rate": 6.273332251257913e-05, "loss": 1.0375, "step": 6100},
        {"epoch": 0.9, "learning_rate": 6.192176594708652e-05, "loss": 1.0408, "step": 6200},
        {"epoch": 0.91, "learning_rate": 6.11102093815939e-05, "loss": 1.0382, "step": 6300},
        {"epoch": 0.93, "learning_rate": 6.0298652816101284e-05, "loss": 1.0335, "step": 6400},
        {"epoch": 0.94, "learning_rate": 5.948709625060867e-05, "loss": 1.0178, "step": 6500},
        {"epoch": 0.94, "eval_loss": 0.28046590089797974, "eval_runtime": 297.8045, "eval_samples_per_second": 19.449, "eval_steps_per_second": 0.306, "eval_wer": 0.39629834794225943, "step": 6500},
        {"epoch": 0.95, "learning_rate": 5.867553968511605e-05, "loss": 1.0312, "step": 6600},
        {"epoch": 0.97, "learning_rate": 5.786398311962344e-05, "loss": 1.033, "step": 6700},
        {"epoch": 0.98, "learning_rate": 5.7052426554130825e-05, "loss": 1.0289, "step": 6800},
        {"epoch": 1.0, "learning_rate": 5.624086998863821e-05, "loss": 1.027, "step": 6900},
        {"epoch": 1.01, "learning_rate": 5.54293134231456e-05, "loss": 0.992, "step": 7000},
        {"epoch": 1.01, "eval_loss": 0.2747785747051239, "eval_runtime": 298.4841, "eval_samples_per_second": 19.405, "eval_steps_per_second": 0.305, "eval_wer": 0.39352862589201343, "step": 7000},
        {"epoch": 1.03, "learning_rate": 5.461775685765298e-05, "loss": 1.0025, "step": 7100},
        {"epoch": 1.04, "learning_rate": 5.3806200292160366e-05, "loss": 1.0122, "step": 7200},
        {"epoch": 1.06, "learning_rate": 5.299464372666775e-05, "loss": 1.018, "step": 7300},
        {"epoch": 1.07, "learning_rate": 5.218308716117514e-05, "loss": 0.9936, "step": 7400},
        {"epoch": 1.09, "learning_rate": 5.137153059568252e-05, "loss": 1.0197, "step": 7500},
        {"epoch": 1.09, "eval_loss": 0.26907604932785034, "eval_runtime": 298.796, "eval_samples_per_second": 19.384, "eval_steps_per_second": 0.305, "eval_wer": 0.3884453713056796, "step": 7500},
        {"epoch": 1.1, "learning_rate": 5.055997403018991e-05, "loss": 1.008, "step": 7600},
        {"epoch": 1.11, "learning_rate": 4.97484174646973e-05, "loss": 1.0028, "step": 7700},
        {"epoch": 1.13, "learning_rate": 4.893686089920468e-05, "loss": 0.9929, "step": 7800},
        {"epoch": 1.14, "learning_rate": 4.8125304333712064e-05, "loss": 0.995, "step": 7900},
        {"epoch": 1.16, "learning_rate": 4.731374776821945e-05, "loss": 1.0056, "step": 8000},
        {"epoch": 1.16, "eval_loss": 0.26817116141319275, "eval_runtime": 298.1504, "eval_samples_per_second": 19.426, "eval_steps_per_second": 0.305, "eval_wer": 0.3888526833718922, "step": 8000},
        {"epoch": 1.17, "learning_rate": 4.6510306768381754e-05, "loss": 0.9971, "step": 8100},
        {"epoch": 1.19, "learning_rate": 4.5698750202889145e-05, "loss": 0.9976, "step": 8200},
        {"epoch": 1.2, "learning_rate": 4.488719363739653e-05, "loss": 1.0014, "step": 8300},
        {"epoch": 1.22, "learning_rate": 4.407563707190391e-05, "loss": 0.9835, "step": 8400},
        {"epoch": 1.23, "learning_rate": 4.3272196072066225e-05, "loss": 0.9826, "step": 8500},
        {"epoch": 1.23, "eval_loss": 0.26473307609558105, "eval_runtime": 299.0161, "eval_samples_per_second": 19.37, "eval_steps_per_second": 0.304, "eval_wer": 0.38675095311023494, "step": 8500},
        {"epoch": 1.24, "learning_rate": 4.2460639506573615e-05, "loss": 0.9838, "step": 8600},
        {"epoch": 1.26, "learning_rate": 4.1649082941081e-05, "loss": 0.9836, "step": 8700},
        {"epoch": 1.27, "learning_rate": 4.0845641941243305e-05, "loss": 0.9824, "step": 8800},
        {"epoch": 1.29, "learning_rate": 4.003408537575069e-05, "loss": 0.9715, "step": 8900},
        {"epoch": 1.3, "learning_rate": 3.922252881025807e-05, "loss": 0.9815, "step": 9000},
        {"epoch": 1.3, "eval_loss": 0.26034271717071533, "eval_runtime": 299.6782, "eval_samples_per_second": 19.327, "eval_steps_per_second": 0.304, "eval_wer": 0.3831828994102121, "step": 9000},
        {"epoch": 1.32, "learning_rate": 3.841097224476546e-05, "loss": 0.9757, "step": 9100},
        {"epoch": 1.33, "learning_rate": 3.7599415679272846e-05, "loss": 0.9689, "step": 9200},
        {"epoch": 1.35, "learning_rate": 3.678785911378023e-05, "loss": 0.9778, "step": 9300},
        {"epoch": 1.36, "learning_rate": 3.5976302548287614e-05, "loss": 0.9794, "step": 9400},
        {"epoch": 1.37, "learning_rate": 3.5164745982795e-05, "loss": 0.9717, "step": 9500},
        {"epoch": 1.37, "eval_loss": 0.25609487295150757, "eval_runtime": 299.6976, "eval_samples_per_second": 19.326, "eval_steps_per_second": 0.304, "eval_wer": 0.3807064420476392, "step": 9500},
        {"epoch": 1.39, "learning_rate": 3.435318941730239e-05, "loss": 0.9752, "step": 9600},
        {"epoch": 1.4, "learning_rate": 3.354163285180977e-05, "loss": 0.965, "step": 9700},
        {"epoch": 1.42, "learning_rate": 3.2730076286317155e-05, "loss": 0.9522, "step": 9800},
        {"epoch": 1.43, "learning_rate": 3.191851972082454e-05, "loss": 0.9718, "step": 9900},
        {"epoch": 1.45, "learning_rate": 3.110696315533193e-05, "loss": 0.9605, "step": 10000},
        {"epoch": 1.45, "eval_loss": 0.25231894850730896, "eval_runtime": 297.5796, "eval_samples_per_second": 19.464, "eval_steps_per_second": 0.306, "eval_wer": 0.3782951546156603, "step": 10000},
        {"epoch": 1.46, "learning_rate": 3.0295406589839315e-05, "loss": 0.9635, "step": 10100},
        {"epoch": 1.48, "learning_rate": 2.94838500243467e-05, "loss": 0.9632, "step": 10200},
        {"epoch": 1.49, "learning_rate": 2.8672293458854082e-05, "loss": 0.9548, "step": 10300},
        {"epoch": 1.5, "learning_rate": 2.7868852459016392e-05, "loss": 0.9554, "step": 10400},
        {"epoch": 1.52, "learning_rate": 2.7057295893523783e-05, "loss": 0.96, "step": 10500},
        {"epoch": 1.52, "eval_loss": 0.24939315021038055, "eval_runtime": 300.036, "eval_samples_per_second": 19.304, "eval_steps_per_second": 0.303, "eval_wer": 0.3787513441298185, "step": 10500},
        {"epoch": 1.53, "learning_rate": 2.6245739328031166e-05, "loss": 0.9611, "step": 10600},
        {"epoch": 1.55, "learning_rate": 2.543418276253855e-05, "loss": 0.9594, "step": 10700},
        {"epoch": 1.56, "learning_rate": 2.4622626197045937e-05, "loss": 0.9589, "step": 10800},
        {"epoch": 1.58, "learning_rate": 2.381106963155332e-05, "loss": 0.9441, "step": 10900},
        {"epoch": 1.59, "learning_rate": 2.2999513066060704e-05, "loss": 0.9442, "step": 11000},
        {"epoch": 1.59, "eval_loss": 0.24783751368522644, "eval_runtime": 297.7741, "eval_samples_per_second": 19.451, "eval_steps_per_second": 0.306, "eval_wer": 0.3760142070448695, "step": 11000},
        {"epoch": 1.61, "learning_rate": 2.218795650056809e-05, "loss": 0.9496, "step": 11100},
        {"epoch": 1.62, "learning_rate": 2.1376399935075474e-05, "loss": 0.9486, "step": 11200},
        {"epoch": 1.64, "learning_rate": 2.056484336958286e-05, "loss": 0.9558, "step": 11300},
        {"epoch": 1.65, "learning_rate": 1.9753286804090245e-05, "loss": 0.9486, "step": 11400},
        {"epoch": 1.66, "learning_rate": 1.8941730238597632e-05, "loss": 0.9564, "step": 11500},
        {"epoch": 1.66, "eval_loss": 0.2454409897327423, "eval_runtime": 296.0265, "eval_samples_per_second": 19.566, "eval_steps_per_second": 0.307, "eval_wer": 0.3733096549252175, "step": 11500},
        {"epoch": 1.68, "learning_rate": 1.8130173673105015e-05, "loss": 0.9427, "step": 11600},
        {"epoch": 1.69, "learning_rate": 1.73186171076124e-05, "loss": 0.9423, "step": 11700},
        {"epoch": 1.71, "learning_rate": 1.6507060542119786e-05, "loss": 0.9503, "step": 11800},
        {"epoch": 1.72, "learning_rate": 1.5695503976627173e-05, "loss": 0.9383, "step": 11900},
        {"epoch": 1.74, "learning_rate": 1.4883947411134558e-05, "loss": 0.9436, "step": 12000},
        {"epoch": 1.74, "eval_loss": 0.24390804767608643, "eval_runtime": 295.4584, "eval_samples_per_second": 19.603, "eval_steps_per_second": 0.308, "eval_wer": 0.37467822346769203, "step": 12000},
        {"epoch": 1.75, "learning_rate": 1.4072390845641942e-05, "loss": 0.9491, "step": 12100},
        {"epoch": 1.77, "learning_rate": 1.3260834280149325e-05, "loss": 0.9419, "step": 12200},
        {"epoch": 1.78, "learning_rate": 1.2449277714656712e-05, "loss": 0.9517, "step": 12300},
        {"epoch": 1.79, "learning_rate": 1.1637721149164097e-05, "loss": 0.9367, "step": 12400},
        {"epoch": 1.81, "learning_rate": 1.0826164583671483e-05, "loss": 0.938, "step": 12500},
        {"epoch": 1.81, "eval_loss": 0.24111612141132355, "eval_runtime": 296.9314, "eval_samples_per_second": 19.506, "eval_steps_per_second": 0.306, "eval_wer": 0.37159894424712436, "step": 12500},
        {"epoch": 1.82, "learning_rate": 1.0014608018178868e-05, "loss": 0.9337, "step": 12600},
        {"epoch": 1.84, "learning_rate": 9.203051452686253e-06, "loss": 0.9284, "step": 12700},
        {"epoch": 1.85, "learning_rate": 8.391494887193638e-06, "loss": 0.938, "step": 12800},
        {"epoch": 1.87, "learning_rate": 7.579938321701023e-06, "loss": 0.9365, "step": 12900},
        {"epoch": 1.88, "learning_rate": 6.768381756208409e-06, "loss": 0.9353, "step": 13000},
        {"epoch": 1.88, "eval_loss": 0.23965783417224884, "eval_runtime": 296.6078, "eval_samples_per_second": 19.527, "eval_steps_per_second": 0.307, "eval_wer": 0.3697904786731402, "step": 13000},
        {"epoch": 1.9, "learning_rate": 5.956825190715793e-06, "loss": 0.9413, "step": 13100},
        {"epoch": 1.91, "learning_rate": 5.153384190878104e-06, "loss": 0.9356, "step": 13200},
        {"epoch": 1.92, "learning_rate": 4.34182762538549e-06, "loss": 0.9209, "step": 13300},
        {"epoch": 1.94, "learning_rate": 3.530271059892875e-06, "loss": 0.9362, "step": 13400},
        {"epoch": 1.95, "learning_rate": 2.71871449440026e-06, "loss": 0.9271, "step": 13500},
        {"epoch": 1.95, "eval_loss": 0.23875188827514648, "eval_runtime": 296.9414, "eval_samples_per_second": 19.506, "eval_steps_per_second": 0.306, "eval_wer": 0.3680797679950471, "step": 13500},
        {"epoch": 1.97, "learning_rate": 1.907157928907645e-06, "loss": 0.9288, "step": 13600},
        {"epoch": 1.98, "learning_rate": 1.0956013634150302e-06, "loss": 0.9345, "step": 13700},
        {"epoch": 2.0, "learning_rate": 2.840447979224152e-07, "loss": 0.9326, "step": 13800},
        {"epoch": 2.0, "step": 13822, "total_flos": 1.2600843645735263e+20, "train_loss": 1.442369053426242, "train_runtime": 53680.5392, "train_samples_per_second": 16.478, "train_steps_per_second": 0.257}
    ],
    "max_steps": 13822,
    "num_train_epochs": 2,
    "total_flos": 1.2600843645735263e+20,
    "trial_name": null,
    "trial_params": null
}
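Beyond book-keeping for resumed runs, log_history makes trainer_state.json a convenient machine-readable record: eval WER drops from 1.0 at step 500 to roughly 0.368 at the best checkpoint (step 13500). A stdlib-only sketch that recovers the evaluation curve, assuming the file is read from a local clone:

import json

with open("trainer_state.json") as f:
    state = json.load(f)

# Evaluation entries are the ones that carry an "eval_wer" key.
evals = [entry for entry in state["log_history"] if "eval_wer" in entry]
for entry in evals:
    print(f"step {entry['step']:>5}  eval_loss {entry['eval_loss']:.4f}  eval_wer {entry['eval_wer']:.4f}")

best = min(evals, key=lambda entry: entry["eval_wer"])
print("best checkpoint:", best["step"], best["eval_wer"])  # 13500, ~0.3681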
training_args.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3515fcb3146dcedc28056bf8ff0c0b0a7c0ebdb32789529ea25826588fbd9491
size 3055
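Only the Git LFS pointer shows up in the diff; the 3 kB binary behind it is the torch-pickled TrainingArguments object that Trainer writes next to every checkpoint. A sketch of inspecting it, assuming git lfs pull has fetched the real file and that a compatible transformers version is installed:

import torch

# TrainingArguments is saved with torch.save(), so torch.load() restores it.
training_args = torch.load("training_args.bin")
print(training_args.num_train_epochs)
print(training_args.learning_rate)
print(training_args.per_device_train_batch_size)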
vocab.json
ADDED
@@ -0,0 +1 @@
{"'": 1, "(": 2, ")": 3, "*": 4, ".": 5, "/": 6, "1": 7, "2": 8, "=": 9, "C": 10, "E": 11, "N": 12, "Q": 13, "R": 14, "Z": 15, "`": 16, "a": 17, "b": 18, "c": 19, "d": 20, "e": 21, "f": 22, "g": 23, "h": 24, "i": 25, "j": 26, "k": 27, "l": 28, "m": 29, "n": 30, "o": 31, "p": 32, "q": 33, "r": 34, "s": 35, "t": 36, "u": 37, "v": 38, "w": 39, "x": 40, "y": 41, "z": 42, "{": 43, "|": 0, "}": 45, "~": 46, "§": 47, "«": 48, "®": 49, "°": 50, "±": 51, "·": 52, "»": 53, "×": 54, "ß": 55, "æ": 56, "ç": 57, "ð": 58, "ø": 59, "þ": 60, "đ": 61, "ħ": 62, "ı": 63, "ł": 64, "œ": 65, "ǀ": 66, "ǃ": 67, "ɑ": 68, "ə": 69, "ɨ": 70, "ʉ": 71, "ʔ": 72, "ʻ": 73, "ʼ": 74, "ʽ": 75, "ʾ": 76, "ʿ": 77, "ː": 78, "α": 79, "β": 80, "γ": 81, "δ": 82, "ε": 83, "ζ": 84, "η": 85, "θ": 86, "ι": 87, "κ": 88, "λ": 89, "μ": 90, "ν": 91, "ο": 92, "π": 93, "ρ": 94, "ς": 95, "σ": 96, "τ": 97, "υ": 98, "φ": 99, "χ": 100, "ψ": 101, "ω": 102, "а": 103, "г": 104, "е": 105, "з": 106, "и": 107, "к": 108, "м": 109, "н": 110, "о": 111, "п": 112, "р": 113, "ц": 114, "ч": 115, "э": 116, "я": 117, "є": 118, "і": 119, "ј": 120, "џ": 121, "ҫ": 122, "ӌ": 123, "գ": 124, "զ": 125, "ا": 126, "ب": 127, "ة": 128, "د": 129, "ر": 130, "ل": 131, "م": 132, "ن": 133, "و": 134, "ي": 135, "ᄀ": 136, "ᄆ": 137, "ᄉ": 138, "ᄌ": 139, "ᅡ": 140, "ᅢ": 141, "ᅥ": 142, "ᅩ": 143, "ᅵ": 144, "ᆨ": 145, "ᆷ": 146, "ᆸ": 147, "ᆼ": 148, "ቀ": 149, "ከ": 150, "ወ": 151, "ደ": 152, "ጀ": 153, "‐": 154, "–": 155, "—": 156, "―": 157, "’": 158, "„": 159, "†": 160, "′": 161, "‹": 162, "›": 163, "⁄": 164, "₽": 165, "→": 166, "↔": 167, "∅": 168, "∆": 169, "∈": 170, "−": 171, "∞": 172, "∨": 173, "∼": 174, "≥": 175, "⊨": 176, "⋅": 177, "─": 178, "☉": 179, "ⱅ": 180, "ⱎ": 181, "い": 182, "う": 183, "た": 184, "つ": 185, "の": 186, "ひ": 187, "へ": 188, "ま": 189, "む": 190, "め": 191, "も": 192, "や": 193, "三": 194, "丹": 195, "乃": 196, "京": 197, "保": 198, "北": 199, "厳": 200, "宇": 201, "扬": 202, "文": 203, "星": 204, "术": 205, "杜": 206, "津": 207, "牡": 208, "甌": 209, "美": 210, "西": 211, "貴": 212, "青": 213, "馆": 214, "ꝑ": 215, "[UNK]": 215, "[PAD]": 216}
wandb/debug-internal.log
ADDED
@@ -0,0 +1 @@
run-20220129_215451-1vipdbow/logs/debug-internal.log
wandb/debug.log
ADDED
@@ -0,0 +1 @@
run-20220129_215451-1vipdbow/logs/debug.log
wandb/latest-run
ADDED
@@ -0,0 +1 @@
run-20220129_215451-1vipdbow
wandb/run-20220129_131141-h6nhqm30/files/conda-environment.yaml
ADDED
File without changes
wandb/run-20220129_131141-h6nhqm30/files/config.yaml
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_131141-h6nhqm30/files/output.log
ADDED
@@ -0,0 +1,9003 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
wandb/run-20220129_131141-h6nhqm30/files/output.log (excerpt):

 0%|▋         | 100/21520 [04:25<9:53:42, 1.66s/it]
 1%|█▍        | 199/21520 [08:51<10:08:31, 1.71s/it]
 1%|██▏       | 300/21520 [13:20<9:42:10, 1.65s/it]
 2%|██▉       | 399/21520 [17:44<9:49:47, 1.68s/it]
 2%|███▋      | 500/21520 [22:14<9:36:18, 1.65s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
  Num examples = 1839
  Batch size = 64
{'loss': 5.4101, 'learning_rate': 2.3239999999999998e-05, 'epoch': 0.12}
Configuration saved in ./checkpoint-500/config.json
{'eval_loss': 5.053680896759033, 'eval_wer': 1.0, 'eval_runtime': 62.0508, 'eval_samples_per_second': 29.637, 'eval_steps_per_second': 0.467, 'epoch': 0.12}
Model weights saved in ./checkpoint-500/pytorch_model.bin
Configuration saved in ./checkpoint-500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
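
The `eval_wer` figures in the evaluation block above come from a word-error-rate hook passed to the Trainer. Below is a minimal sketch of how such a `compute_metrics` hook is commonly wired for a CTC model; the `processor` name and the use of the `datasets` WER metric are illustrative assumptions, not values read from this log.

```python
import numpy as np
from datasets import load_metric

wer_metric = load_metric("wer")

def compute_metrics(pred):
    # Greedy CTC decoding: argmax over the vocabulary at each frame.
    pred_ids = np.argmax(pred.predictions, axis=-1)

    # Replace the ignored label (-100) with the pad token before decoding references.
    # Assumption: `processor` is the Wav2Vec2Processor used during training.
    pred.label_ids[pred.label_ids == -100] = processor.tokenizer.pad_token_id

    pred_str = processor.batch_decode(pred_ids)
    # Do not group tokens when decoding the reference transcriptions.
    label_str = processor.batch_decode(pred.label_ids, group_tokens=False)

    return {"wer": wer_metric.compute(predictions=pred_str, references=label_str)}
```

An `eval_wer` of 1.0 at step 500, as logged above, simply means the model still predicts no correct words this early in training; it falls steadily in the later evaluation blocks.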
 3%|████▍     | 600/21520 [28:28<9:27:51, 1.63s/it]
 3%|█████▏    | 700/21520 [32:54<9:54:50, 1.71s/it]
 4%|█████▉    | 799/21520 [37:16<9:45:40, 1.70s/it]
 4%|██████▋   | 899/21520 [41:40<9:40:54, 1.69s/it]
 5%|███████▍  | 999/21520 [46:03<9:58:48, 1.75s/it]
 5%|███████▎  | 1000/21520 [46:05<9:34:03, 1.68s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
  Num examples = 1839
  Batch size = 64
100%|██████████| 29/29 [00:55<00:00, 1.82s/it]
Configuration saved in ./checkpoint-1000/config.json
Model weights saved in ./checkpoint-1000/pytorch_model.bin
Configuration saved in ./checkpoint-1000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
 5%|████████  | 1099/21520 [53:19<9:33:07, 1.68s/it]
 6%|████████▊ | 1199/21520 [57:45<9:59:52, 1.77s/it]
 6%|█████████▍| 1299/21520 [1:02:09<9:15:12, 1.65s/it]
 7%|██████████▏| 1399/21520 [1:06:35<9:28:45, 1.70s/it]
 7%|██████████▊| 1499/21520 [1:10:59<9:13:59, 1.66s/it]
 7%|██████████▊| 1500/21520 [1:11:00<8:58:38, 1.61s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
  Num examples = 1839
  Batch size = 64
Configuration saved in ./checkpoint-1500/config.json
{'eval_loss': 1.2048969268798828, 'eval_wer': 0.8062855432505238, 'eval_runtime': 64.3059, 'eval_samples_per_second': 28.598, 'eval_steps_per_second': 0.451, 'epoch': 0.35}
Model weights saved in ./checkpoint-1500/pytorch_model.bin
Configuration saved in ./checkpoint-1500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Adding files tracked by Git LFS: ['wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb']. This may take a bit of time if the files are large.
01/29/2022 14:25:13 - WARNING - huggingface_hub.repository - Adding files tracked by Git LFS: ['wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb']. This may take a bit of time if the files are large.
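
The two Git LFS lines above come from the Trainer's hub integration: when intermediate checkpoints are pushed during training, large binary files (such as the `.wandb` run file) are auto-tracked with git-lfs before being staged. A minimal sketch of the underlying (now legacy) `huggingface_hub` Repository calls follows; the repo id is a placeholder, not taken from this log.

```python
from huggingface_hub import Repository

# Placeholder repo id, for illustration only.
repo = Repository(local_dir=".", clone_from="your-username/xls-r-300m-fr")

# auto_lfs_track=True is what emits the "Adding files tracked by Git LFS"
# warning: files above the LFS size threshold are registered with git-lfs
# before being staged, so the push does not fail on large binaries.
repo.git_add(auto_lfs_track=True)
repo.git_commit("Training in progress, step 1500")
repo.git_push()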
 7%|███████████▌| 1599/21520 [1:18:18<9:12:21, 1.66s/it]
 8%|████████████▎| 1700/21520 [1:22:44<8:47:54, 1.60s/it]
 8%|█████████████| 1799/21520 [1:27:06<9:14:58, 1.69s/it]
 9%|█████████████▊| 1900/21520 [1:31:34<8:50:40, 1.62s/it]
 9%|██████████████▍| 2000/21520 [1:35:55<8:46:26, 1.62s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
  Num examples = 1839
  Batch size = 64
{'loss': 1.5498, 'learning_rate': 6.826923076923076e-05, 'epoch': 0.46}
100%|██████████| 29/29 [00:55<00:00, 1.81s/it]
Configuration saved in ./checkpoint-2000/config.json
Model weights saved in ./checkpoint-2000/pytorch_model.bin
Configuration saved in ./checkpoint-2000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-500] due to args.save_total_limit
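
The eval/save/delete cadence visible throughout this log (an evaluation and a checkpoint every 500 steps, with the oldest checkpoint rotated out once three exist) maps directly onto Trainer arguments. A sketch of `TrainingArguments` that would reproduce it is below; these values are inferred from the log itself, not copied from run.sh, so treat them as assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir=".",
    evaluation_strategy="steps",
    eval_steps=500,                 # evaluations fire at steps 500, 1000, 1500, ...
    save_steps=500,                 # a checkpoint-<step> directory is written at the same cadence
    save_total_limit=3,             # checkpoint-500 is deleted once checkpoint-2000 lands
    per_device_eval_batch_size=64,  # matches "Batch size = 64" above (assuming a single GPU)
    push_to_hub=True,               # enables the Git LFS pushes seen earlier in the log
)
```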
|
2117 |
+
|
2118 |
+
|
2119 |
+
|
2120 |
+
|
2121 |
+
|
2122 |
+
|
2123 |
+
|
2124 |
+
|
2125 |
+
|
2126 |
+
|
2127 |
+
|
2128 |
+
|
2129 |
+
|
2130 |
+
|
2131 |
+
|
2132 |
+
|
2133 |
+
|
2134 |
+
|
2135 |
+
|
2136 |
+
|
2137 |
+
|
2138 |
+
|
2139 |
+
|
2140 |
+
|
2141 |
+
|
2142 |
+
|
2143 |
+
|
2144 |
+
|
2145 |
+
|
2146 |
+
|
2147 |
+
|
2148 |
+
|
2149 |
+
|
2150 |
+
|
2151 |
+
|
2152 |
+
|
2153 |
+
|
2154 |
+
|
2155 |
+
|
2156 |
+
|
2157 |
+
|
2158 |
+
|
2159 |
+
|
2160 |
+
|
2161 |
+
|
2162 |
+
|
2163 |
+
|
2164 |
+
|
2165 |
+
|
2166 |
+
|
2167 |
+
|
2168 |
+
|
2169 |
+
|
2170 |
+
|
2171 |
+
|
2172 |
+
|
2173 |
+
|
2174 |
+
|
2175 |
+
|
2176 |
+
|
2177 |
+
|
2178 |
+
|
2179 |
+
|
2180 |
+
|
2181 |
+
|
2182 |
+
|
2183 |
+
|
2184 |
+
|
2185 |
+
|
2186 |
+
|
2187 |
+
|
2188 |
+
|
2189 |
+
|
2190 |
+
|
2191 |
+
|
2192 |
+
|
2193 |
+
|
2194 |
+
|
2195 |
+
|
2196 |
+
|
2197 |
+
|
2198 |
+
|
2199 |
+
|
2200 |
+
|
2201 |
+
|
2202 |
+
|
2203 |
+
|
2204 |
+
|
2205 |
+
|
2206 |
+
|
2207 |
+
|
2208 |
+
|
2209 |
+
|
2210 |
+
|
2211 |
+
|
2212 |
+
|
2213 |
+
10%|███████████████▏ | 2100/21520 [1:43:15<8:45:13, 1.62s/it]
|
2214 |
+
|
2215 |
+
|
2216 |
+
|
2217 |
+
|
2218 |
+
|
2219 |
+
|
2220 |
+
|
2221 |
+
|
2222 |
+
|
2223 |
+
|
2224 |
+
|
2225 |
+
|
2226 |
+
|
2227 |
+
|
2228 |
+
|
2229 |
+
|
2230 |
+
|
2231 |
+
|
2232 |
+
|
2233 |
+
|
2234 |
+
|
2235 |
+
|
2236 |
+
|
2237 |
+
|
2238 |
+
|
2239 |
+
|
2240 |
+
|
2241 |
+
|
2242 |
+
|
2243 |
+
|
2244 |
+
|
2245 |
+
|
2246 |
+
|
2247 |
+
|
2248 |
+
|
2249 |
+
|
2250 |
+
|
2251 |
+
|
2252 |
+
|
2253 |
+
|
2254 |
+
|
2255 |
+
|
2256 |
+
|
2257 |
+
|
2258 |
+
|
2259 |
+
|
2260 |
+
|
2261 |
+
|
2262 |
+
|
2263 |
+
|
2264 |
+
|
2265 |
+
|
2266 |
+
|
2267 |
+
|
2268 |
+
|
2269 |
+
|
2270 |
+
|
2271 |
+
|
2272 |
+
|
2273 |
+
|
2274 |
+
|
2275 |
+
|
2276 |
+
|
2277 |
+
|
2278 |
+
|
2279 |
+
|
2280 |
+
|
2281 |
+
|
2282 |
+
|
2283 |
+
|
2284 |
+
|
2285 |
+
|
2286 |
+
|
2287 |
+
|
2288 |
+
|
2289 |
+
|
2290 |
+
|
2291 |
+
|
2292 |
+
|
2293 |
+
|
2294 |
+
|
2295 |
+
|
2296 |
+
|
2297 |
+
|
2298 |
+
|
2299 |
+
|
2300 |
+
|
2301 |
+
|
2302 |
+
|
2303 |
+
|
2304 |
+
|
2305 |
+
|
2306 |
+
|
2307 |
+
|
2308 |
+
|
2309 |
+
|
2310 |
+
|
2311 |
+
|
2312 |
+
10%|███████████████▉ | 2200/21520 [1:47:39<8:48:39, 1.64s/it]
|
2313 |
+
|
2314 |
+
|
2315 |
+
|
2316 |
+
|
2317 |
+
|
2318 |
+
|
2319 |
+
|
2320 |
+
|
2321 |
+
|
2322 |
+
|
2323 |
+
|
2324 |
+
|
2325 |
+
|
2326 |
+
|
2327 |
+
|
2328 |
+
|
2329 |
+
|
2330 |
+
|
2331 |
+
|
2332 |
+
|
2333 |
+
|
2334 |
+
|
2335 |
+
|
2336 |
+
|
2337 |
+
|
2338 |
+
|
2339 |
+
|
2340 |
+
|
2341 |
+
|
2342 |
+
|
2343 |
+
|
2344 |
+
|
2345 |
+
|
2346 |
+
|
2347 |
+
|
2348 |
+
|
2349 |
+
|
2350 |
+
|
2351 |
+
|
2352 |
+
|
2353 |
+
|
2354 |
+
|
2355 |
+
|
2356 |
+
|
2357 |
+
|
2358 |
+
|
2359 |
+
|
2360 |
+
|
2361 |
+
|
2362 |
+
|
2363 |
+
|
2364 |
+
|
2365 |
+
|
2366 |
+
|
2367 |
+
|
2368 |
+
|
2369 |
+
|
2370 |
+
|
2371 |
+
|
2372 |
+
|
2373 |
+
|
2374 |
+
|
2375 |
+
|
2376 |
+
|
2377 |
+
|
2378 |
+
|
2379 |
+
|
2380 |
+
|
2381 |
+
|
2382 |
+
|
2383 |
+
|
2384 |
+
|
2385 |
+
|
2386 |
+
|
2387 |
+
|
2388 |
+
|
2389 |
+
|
2390 |
+
|
2391 |
+
|
2392 |
+
|
2393 |
+
|
2394 |
+
|
2395 |
+
|
2396 |
+
|
2397 |
+
|
2398 |
+
|
2399 |
+
|
2400 |
+
|
2401 |
+
|
2402 |
+
|
2403 |
+
|
2404 |
+
|
2405 |
+
|
2406 |
+
|
2407 |
+
|
2408 |
+
|
2409 |
+
|
2410 |
+
11%|████████████████▋ | 2299/21520 [1:52:02<8:59:39, 1.68s/it]
|
2411 |
+
|
2412 |
+
|
2413 |
+
|
2414 |
+
|
2415 |
+
|
2416 |
+
|
2417 |
+
|
2418 |
+
|
2419 |
+
|
2420 |
+
|
2421 |
+
|
2422 |
+
|
2423 |
+
|
2424 |
+
|
2425 |
+
|
2426 |
+
|
2427 |
+
|
2428 |
+
|
2429 |
+
|
2430 |
+
|
2431 |
+
|
2432 |
+
|
2433 |
+
|
2434 |
+
|
2435 |
+
|
2436 |
+
|
2437 |
+
|
2438 |
+
|
2439 |
+
|
2440 |
+
|
2441 |
+
|
2442 |
+
|
2443 |
+
|
2444 |
+
|
2445 |
+
|
2446 |
+
|
2447 |
+
|
2448 |
+
|
2449 |
+
|
2450 |
+
|
2451 |
+
|
2452 |
+
|
2453 |
+
|
2454 |
+
|
2455 |
+
|
2456 |
+
|
2457 |
+
|
2458 |
+
|
2459 |
+
|
2460 |
+
|
2461 |
+
|
2462 |
+
|
2463 |
+
|
2464 |
+
|
2465 |
+
|
2466 |
+
|
2467 |
+
|
2468 |
+
|
2469 |
+
|
2470 |
+
|
2471 |
+
|
2472 |
+
|
2473 |
+
|
2474 |
+
|
2475 |
+
|
2476 |
+
|
2477 |
+
|
2478 |
+
|
2479 |
+
|
2480 |
+
|
2481 |
+
|
2482 |
+
|
2483 |
+
|
2484 |
+
|
2485 |
+
|
2486 |
+
|
2487 |
+
|
2488 |
+
|
2489 |
+
|
2490 |
+
|
2491 |
+
|
2492 |
+
|
2493 |
+
|
2494 |
+
|
2495 |
+
|
2496 |
+
|
2497 |
+
|
2498 |
+
|
2499 |
+
|
2500 |
+
|
2501 |
+
|
2502 |
+
|
2503 |
+
|
2504 |
+
|
2505 |
+
|
2506 |
+
|
2507 |
+
|
2508 |
+
|
2509 |
+
11%|█████████████████▍ | 2400/21520 [1:56:27<8:53:22, 1.67s/it]
|
2510 |
+
|
2511 |
+
|
2512 |
+
|
2513 |
+
|
2514 |
+
|
2515 |
+
|
2516 |
+
|
2517 |
+
|
2518 |
+
|
2519 |
+
|
2520 |
+
|
2521 |
+
|
2522 |
+
|
2523 |
+
|
2524 |
+
|
2525 |
+
|
2526 |
+
|
2527 |
+
|
2528 |
+
|
2529 |
+
|
2530 |
+
|
2531 |
+
|
2532 |
+
|
2533 |
+
|
2534 |
+
|
2535 |
+
|
2536 |
+
|
2537 |
+
|
2538 |
+
|
2539 |
+
|
2540 |
+
|
2541 |
+
|
2542 |
+
|
2543 |
+
|
2544 |
+
|
2545 |
+
|
2546 |
+
|
2547 |
+
|
2548 |
+
|
2549 |
+
|
2550 |
+
|
2551 |
+
|
2552 |
+
|
2553 |
+
|
2554 |
+
|
2555 |
+
|
2556 |
+
|
2557 |
+
|
2558 |
+
|
2559 |
+
|
2560 |
+
|
2561 |
+
|
2562 |
+
|
2563 |
+
|
2564 |
+
|
2565 |
+
|
2566 |
+
|
2567 |
+
|
2568 |
+
|
2569 |
+
|
2570 |
+
|
2571 |
+
|
2572 |
+
|
2573 |
+
|
2574 |
+
|
2575 |
+
|
2576 |
+
|
2577 |
+
|
2578 |
+
|
2579 |
+
|
2580 |
+
|
2581 |
+
|
2582 |
+
|
2583 |
+
|
2584 |
+
|
2585 |
+
|
2586 |
+
|
2587 |
+
|
2588 |
+
|
2589 |
+
|
2590 |
+
|
2591 |
+
|
2592 |
+
|
2593 |
+
|
2594 |
+
|
2595 |
+
|
2596 |
+
|
2597 |
+
|
2598 |
+
|
2599 |
+
|
2600 |
+
|
2601 |
+
|
2602 |
+
|
2603 |
+
|
2604 |
+
|
2605 |
+
|
2606 |
+
|
2607 |
+
12%|██████████████████ | 2499/21520 [2:00:52<9:23:31, 1.78s/it]
|
2608 |
+
12%|██████████████████ | 2500/21520 [2:00:53<8:59:59, 1.70s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
|
2609 |
+
***** Running Evaluation *****
|
2610 |
+
Num examples = 1839
|
2611 |
+
Batch size = 64
|
2612 |
+
|
2613 |
+
|
2614 |
+
|
2615 |
+
|
2616 |
+
|
2617 |
+
|
2618 |
+
|
2619 |
+
|
2620 |
+
|
2621 |
+
|
2622 |
+
|
2623 |
+
|
2624 |
+
|
2625 |
+
|
2626 |
+
|
2627 |
+
|
2628 |
+
|
2629 |
+
|
2630 |
+
|
2631 |
+
|
2632 |
+
|
2633 |
+
|
2634 |
+
|
2635 |
+
|
2636 |
+
|
2637 |
+
|
2638 |
+
|
2639 |
+
|
2640 |
+
|
2641 |
+
Configuration saved in ./checkpoint-2500/config.json
|
2642 |
+
{'eval_loss': 0.4956221282482147, 'eval_wer': 0.5256510026938043, 'eval_runtime': 65.2452, 'eval_samples_per_second': 28.186, 'eval_steps_per_second': 0.444, 'epoch': 0.58}
|
2643 |
+
Model weights saved in ./checkpoint-2500/pytorch_model.bin
|
2644 |
+
Configuration saved in ./checkpoint-2500/preprocessor_config.json
|
2645 |
+
Configuration saved in ./preprocessor_config.json
|
2646 |
+
Deleting older checkpoint [checkpoint-1000] due to args.save_total_limit
|
2647 |
+
|
2648 |
+
|
2649 |
+
|
2650 |
+
|
2651 |
+
|
2652 |
+
|
2653 |
+
|
2654 |
+
|
2655 |
+
|
2656 |
+
|
2657 |
+
|
2658 |
+
|
2659 |
+
|
2660 |
+
|
2661 |
+
|
2662 |
+
|
2663 |
+
|
2664 |
+
|
2665 |
+
|
2666 |
+
|
2667 |
+
|
2668 |
+
|
2669 |
+
|
2670 |
+
|
2671 |
+
|
2672 |
+
|
2673 |
+
|
2674 |
+
|
2675 |
+
|
2676 |
+
|
2677 |
+
|
2678 |
+
|
2679 |
+
|
2680 |
+
|
2681 |
+
|
2682 |
+
|
2683 |
+
|
2684 |
+
|
2685 |
+
|
2686 |
+
|
2687 |
+
|
2688 |
+
|
2689 |
+
|
2690 |
+
|
2691 |
+
|
2692 |
+
|
2693 |
+
|
2694 |
+
|
2695 |
+
|
2696 |
+
|
2697 |
+
|
2698 |
+
|
2699 |
+
|
2700 |
+
|
2701 |
+
|
2702 |
+
|
2703 |
+
|
2704 |
+
|
2705 |
+
|
2706 |
+
|
2707 |
+
|
2708 |
+
|
2709 |
+
|
2710 |
+
|
2711 |
+
|
2712 |
+
|
2713 |
+
|
2714 |
+
|
2715 |
+
|
2716 |
+
|
2717 |
+
|
2718 |
+
|
2719 |
+
|
2720 |
+
|
2721 |
+
|
2722 |
+
|
2723 |
+
|
2724 |
+
|
2725 |
+
|
2726 |
+
|
2727 |
+
|
2728 |
+
|
2729 |
+
|
2730 |
+
|
2731 |
+
|
2732 |
+
|
2733 |
+
|
2734 |
+
|
2735 |
+
|
2736 |
+
|
2737 |
+
|
2738 |
+
|
2739 |
+
|
2740 |
+
|
2741 |
+
|
2742 |
+
|
2743 |
+
|
2744 |
+
12%|██████████████████▊ | 2600/21520 [2:08:12<8:38:33, 1.64s/it]
|
2745 |
+
|
2746 |
+
|
2747 |
+
|
2748 |
+
|
2749 |
+
|
2750 |
+
|
2751 |
+
|
2752 |
+
|
2753 |
+
|
2754 |
+
|
2755 |
+
|
2756 |
+
|
2757 |
+
|
2758 |
+
|
2759 |
+
|
2760 |
+
|
2761 |
+
|
2762 |
+
|
2763 |
+
|
2764 |
+
|
2765 |
+
|
2766 |
+
|
2767 |
+
|
2768 |
+
|
2769 |
+
|
2770 |
+
|
2771 |
+
|
2772 |
+
|
2773 |
+
|
2774 |
+
|
2775 |
+
|
2776 |
+
|
2777 |
+
|
2778 |
+
|
2779 |
+
|
2780 |
+
|
2781 |
+
|
2782 |
+
|
2783 |
+
|
2784 |
+
|
2785 |
+
|
2786 |
+
|
2787 |
+
|
2788 |
+
|
2789 |
+
|
2790 |
+
|
2791 |
+
|
2792 |
+
|
2793 |
+
|
2794 |
+
|
2795 |
+
|
2796 |
+
|
2797 |
+
|
2798 |
+
|
2799 |
+
|
2800 |
+
|
2801 |
+
|
2802 |
+
|
2803 |
+
|
2804 |
+
|
2805 |
+
|
2806 |
+
|
2807 |
+
|
2808 |
+
|
2809 |
+
|
2810 |
+
|
2811 |
+
|
2812 |
+
|
2813 |
+
|
2814 |
+
|
2815 |
+
|
2816 |
+
|
2817 |
+
|
2818 |
+
|
2819 |
+
|
2820 |
+
|
2821 |
+
|
2822 |
+
|
2823 |
+
|
2824 |
+
|
2825 |
+
|
2826 |
+
|
2827 |
+
|
2828 |
+
|
2829 |
+
|
2830 |
+
|
2831 |
+
|
2832 |
+
|
2833 |
+
|
2834 |
+
|
2835 |
+
|
2836 |
+
|
2837 |
+
|
2838 |
+
|
2839 |
+
|
2840 |
+
|
2841 |
+
13%|███████████████████▌ | 2699/21520 [2:12:34<9:20:50, 1.79s/it]
|
2842 |
+
|
2843 |
+
|
2844 |
+
|
2845 |
+
|
2846 |
+
|
2847 |
+
|
2848 |
+
|
2849 |
+
|
2850 |
+
|
2851 |
+
|
2852 |
+
|
2853 |
+
|
2854 |
+
|
2855 |
+
|
2856 |
+
|
2857 |
+
|
2858 |
+
|
2859 |
+
|
2860 |
+
|
2861 |
+
|
2862 |
+
|
2863 |
+
|
2864 |
+
|
2865 |
+
|
2866 |
+
|
2867 |
+
|
2868 |
+
|
2869 |
+
|
2870 |
+
|
2871 |
+
|
2872 |
+
|
2873 |
+
|
2874 |
+
|
2875 |
+
|
2876 |
+
|
2877 |
+
|
2878 |
+
|
2879 |
+
|
2880 |
+
|
2881 |
+
|
2882 |
+
|
2883 |
+
|
2884 |
+
|
2885 |
+
|
2886 |
+
|
2887 |
+
|
2888 |
+
|
2889 |
+
|
2890 |
+
|
2891 |
+
|
2892 |
+
|
2893 |
+
|
2894 |
+
|
2895 |
+
|
2896 |
+
|
2897 |
+
|
2898 |
+
|
2899 |
+
|
2900 |
+
|
2901 |
+
|
2902 |
+
|
2903 |
+
|
2904 |
+
|
2905 |
+
|
2906 |
+
|
2907 |
+
|
2908 |
+
|
2909 |
+
|
2910 |
+
|
2911 |
+
|
2912 |
+
|
2913 |
+
|
2914 |
+
|
2915 |
+
|
2916 |
+
|
2917 |
+
|
2918 |
+
|
2919 |
+
|
2920 |
+
|
2921 |
+
|
2922 |
+
|
2923 |
+
|
2924 |
+
|
2925 |
+
|
2926 |
+
|
2927 |
+
|
2928 |
+
|
2929 |
+
|
2930 |
+
|
2931 |
+
|
2932 |
+
|
2933 |
+
|
2934 |
+
|
2935 |
+
|
2936 |
+
|
2937 |
+
|
2938 |
+
|
2939 |
+
|
2940 |
+
13%|████████████████████▎ | 2799/21520 [2:16:58<8:44:37, 1.68s/it]
|
2941 |
+
|
2942 |
+
|
2943 |
+
|
2944 |
+
|
2945 |
+
|
2946 |
+
|
2947 |
+
|
2948 |
+
|
2949 |
+
|
2950 |
+
|
2951 |
+
|
2952 |
+
|
2953 |
+
|
2954 |
+
|
2955 |
+
|
2956 |
+
|
2957 |
+
|
2958 |
+
|
2959 |
+
|
2960 |
+
|
2961 |
+
|
2962 |
+
|
2963 |
+
|
2964 |
+
|
2965 |
+
|
2966 |
+
|
2967 |
+
|
2968 |
+
|
2969 |
+
|
2970 |
+
|
2971 |
+
|
2972 |
+
|
2973 |
+
|
2974 |
+
|
2975 |
+
|
2976 |
+
|
2977 |
+
|
2978 |
+
|
2979 |
+
|
2980 |
+
|
2981 |
+
|
2982 |
+
|
2983 |
+
|
2984 |
+
|
2985 |
+
|
2986 |
+
|
2987 |
+
|
2988 |
+
|
2989 |
+
|
2990 |
+
|
2991 |
+
|
2992 |
+
|
2993 |
+
|
2994 |
+
|
2995 |
+
|
2996 |
+
|
2997 |
+
|
2998 |
+
|
2999 |
+
|
3000 |
+
|
3001 |
+
|
3002 |
+
|
3003 |
+
|
3004 |
+
|
3005 |
+
|
3006 |
+
|
3007 |
+
|
3008 |
+
|
3009 |
+
|
3010 |
+
|
3011 |
+
|
3012 |
+
|
3013 |
+
|
3014 |
+
|
3015 |
+
|
3016 |
+
|
3017 |
+
|
3018 |
+
|
3019 |
+
|
3020 |
+
|
3021 |
+
|
3022 |
+
|
3023 |
+
|
3024 |
+
|
3025 |
+
|
3026 |
+
|
3027 |
+
|
3028 |
+
|
3029 |
+
|
3030 |
+
|
3031 |
+
|
3032 |
+
|
3033 |
+
|
3034 |
+
|
3035 |
+
|
3036 |
+
|
3037 |
+
|
3038 |
+
|
3039 |
+
|
3040 |
+
13%|█████████████████████ | 2900/21520 [2:21:21<8:26:00, 1.63s/it]
|
3041 |
+
|
3042 |
+
|
3043 |
+
|
3044 |
+
|
3045 |
+
|
3046 |
+
|
3047 |
+
|
3048 |
+
|
3049 |
+
|
3050 |
+
|
3051 |
+
|
3052 |
+
|
3053 |
+
|
3054 |
+
|
3055 |
+
|
3056 |
+
|
3057 |
+
|
3058 |
+
|
3059 |
+
|
3060 |
+
|
3061 |
+
|
3062 |
+
|
3063 |
+
|
3064 |
+
|
3065 |
+
|
3066 |
+
|
3067 |
+
|
3068 |
+
|
3069 |
+
|
3070 |
+
|
3071 |
+
|
3072 |
+
|
3073 |
+
|
3074 |
+
|
3075 |
+
|
3076 |
+
|
3077 |
+
|
3078 |
+
|
3079 |
+
|
3080 |
+
|
3081 |
+
|
3082 |
+
|
3083 |
+
|
3084 |
+
|
3085 |
+
|
3086 |
+
|
3087 |
+
|
3088 |
+
|
3089 |
+
|
3090 |
+
|
3091 |
+
|
3092 |
+
|
3093 |
+
|
3094 |
+
|
3095 |
+
|
3096 |
+
|
3097 |
+
|
3098 |
+
|
3099 |
+
|
3100 |
+
|
3101 |
+
|
3102 |
+
|
3103 |
+
|
3104 |
+
|
3105 |
+
|
3106 |
+
|
3107 |
+
|
3108 |
+
|
3109 |
+
|
3110 |
+
|
3111 |
+
|
3112 |
+
|
3113 |
+
|
3114 |
+
|
3115 |
+
|
3116 |
+
|
3117 |
+
|
3118 |
+
|
3119 |
+
|
3120 |
+
|
3121 |
+
|
3122 |
+
|
3123 |
+
|
3124 |
+
|
3125 |
+
|
3126 |
+
|
3127 |
+
|
3128 |
+
|
3129 |
+
|
3130 |
+
|
3131 |
+
|
3132 |
+
|
3133 |
+
|
3134 |
+
|
3135 |
+
|
3136 |
+
|
3137 |
+
|
3138 |
+
14%|█████████████████████▋ | 2998/21520 [2:25:39<9:03:14, 1.76s/it]
|
3139 |
+
14%|█████████████████████▋ | 3000/21520 [2:25:42<8:12:20, 1.60s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
|
3140 |
+
***** Running Evaluation *****
|
3141 |
+
Num examples = 1839
|
3142 |
+
Batch size = 64
|
3143 |
+
|
3144 |
+
|
3145 |
+
|
3146 |
+
|
3147 |
+
|
3148 |
+
|
3149 |
+
|
3150 |
+
|
3151 |
+
|
3152 |
+
|
3153 |
+
|
3154 |
+
|
3155 |
+
|
3156 |
+
|
3157 |
+
|
3158 |
+
|
3159 |
+
|
3160 |
+
|
3161 |
+
|
3162 |
+
|
3163 |
+
|
3164 |
+
|
3165 |
+
|
3166 |
+
|
3167 |
+
|
3168 |
+
|
3169 |
+
|
3170 |
+
|
3171 |
+
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 29/29 [00:55<00:00, 1.82s/it]
|
3172 |
+
|
3173 |
+
Configuration saved in ./checkpoint-3000/config.json
|
3174 |
+
Model weights saved in ./checkpoint-3000/pytorch_model.bin
|
3175 |
+
Configuration saved in ./checkpoint-3000/preprocessor_config.json
|
3176 |
+
Configuration saved in ./preprocessor_config.json
|
3177 |
+
Deleting older checkpoint [checkpoint-1500] due to args.save_total_limit
14%|██████████████████████▍ | 3100/21520 [2:33:03<8:22:01, 1.64s/it]
15%|███████████████████████▏ | 3199/21520 [2:37:24<8:25:51, 1.66s/it]
15%|███████████████████████▉ | 3300/21520 [2:41:52<8:21:22, 1.65s/it]
16%|████████████████████████▋ | 3398/21520 [2:46:11<8:41:39, 1.73s/it]
16%|█████████████████████████▎ | 3499/21520 [2:50:38<8:32:27, 1.71s/it]
16%|█████████████████████████▎ | 3500/21520 [2:50:39<8:11:10, 1.64s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
Configuration saved in ./checkpoint-3500/config.json
{'eval_loss': 0.38118353486061096, 'eval_wer': 0.4635139179886262, 'eval_runtime': 65.5254, 'eval_samples_per_second': 28.065, 'eval_steps_per_second': 0.443, 'epoch': 0.81}
Model weights saved in ./checkpoint-3500/pytorch_model.bin
Configuration saved in ./checkpoint-3500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-2000] due to args.save_total_limit
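The `eval_wer` figure above is a word error rate: (substitutions + insertions + deletions) divided by the number of reference words. A minimal sketch of that computation using `jiwer` — an illustration only, with made-up French strings; the repository's own eval.py may compute it differently:

```python
# Sketch: word error rate as reported in the 'eval_wer' field above.
# The reference/hypothesis strings here are hypothetical examples.
import jiwer

references = ["bonjour tout le monde", "la voiture est rouge"]
hypotheses = ["bonjour tous le monde", "la voiture rouge"]
print(f"WER = {jiwer.wer(references, hypotheses):.4f}")  # fraction; x100 for percent
```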
17%|██████████████████████████ | 3600/21520 [2:58:00<8:11:02, 1.64s/it]
17%|██████████████████████████▊ | 3699/21520 [3:02:22<8:11:14, 1.65s/it]
18%|███████████████████████████▌ | 3800/21520 [3:06:46<7:57:07, 1.62s/it]
18%|████████████████████████████▎ | 3898/21520 [3:11:08<8:29:00, 1.73s/it]
19%|████████████████████████████▉ | 4000/21520 [3:15:36<7:50:45, 1.61s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.2001, 'learning_rate': 6.128321678321677e-05, 'epoch': 0.93}
100%|██████████| 29/29 [00:55<00:00, 1.79s/it]
Configuration saved in ./checkpoint-4000/config.json
Model weights saved in ./checkpoint-4000/pytorch_model.bin
Configuration saved in ./checkpoint-4000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-2500] due to args.save_total_limit
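The logged `learning_rate` falls by a constant ≈1.75e-6 every 500 steps (6.128e-05 → 5.953e-05 → 5.779e-05 → ...), the signature of a linear decay schedule. A sketch of how such a schedule is typically built; the peak LR and warmup length below are illustrative assumptions, and only the total step count (21520) comes from the progress bar:

```python
# Sketch: linear LR decay consistent with the learning_rate values logged here.
# peak LR (7e-5) and num_warmup_steps (1500) are assumptions, not run.sh values.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(8, 8)  # stand-in for the real model
optimizer = torch.optim.AdamW(model.parameters(), lr=7e-5)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=1500,      # assumed
    num_training_steps=21520,   # total steps, from the progress bar
)
```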
19%|█████████████████████████████▋ | 4100/21520 [3:22:57<7:45:28, 1.60s/it]
20%|██████████████████████████████▍ | 4199/21520 [3:27:18<7:58:53, 1.66s/it]
20%|███████████████████████████████▏ | 4299/21520 [3:31:42<8:00:50, 1.68s/it]
20%|███████████████████████████████▉ | 4400/21520 [3:36:10<9:34:20, 2.01s/it]
21%|████████████████████████████████▌ | 4500/21520 [3:40:32<9:18:48, 1.97s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.1671, 'learning_rate': 5.953496503496503e-05, 'epoch': 1.05}
100%|██████████| 29/29 [00:56<00:00, 1.81s/it]
Configuration saved in ./checkpoint-4500/config.json
Model weights saved in ./checkpoint-4500/pytorch_model.bin
Configuration saved in ./checkpoint-4500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-3000] due to args.save_total_limit
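Each checkpoint directory written above holds config.json, pytorch_model.bin and preprocessor_config.json, so an intermediate model can be reloaded for inspection. A minimal sketch, assuming the tokenizer files (vocab.json, tokenizer_config.json) sit at the repository root as listed in this commit:

```python
# Sketch: reload the intermediate checkpoint named in the log above.
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model = Wav2Vec2ForCTC.from_pretrained("./checkpoint-4500")
processor = Wav2Vec2Processor.from_pretrained("./")  # tokenizer files at repo root
```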
21%|█████████████████████████████████▎ | 4599/21520 [3:47:51<9:07:25, 1.94s/it]
22%|██████████████████████████████████ | 4700/21520 [3:52:17<9:25:30, 2.02s/it]
22%|██████████████████████████████████▊ | 4800/21520 [3:56:42<9:16:50, 2.00s/it]
23%|███████████████████████████████████▌ | 4900/21520 [4:01:06<9:15:05, 2.00s/it]
23%|████████████████████████████████████▏ | 5000/21520 [4:05:30<8:56:08, 1.95s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.1599, 'learning_rate': 5.779020979020979e-05, 'epoch': 1.16}
Configuration saved in ./checkpoint-5000/config.json
{'eval_loss': 0.3355213701725006, 'eval_wer': 0.4330439988027537, 'eval_runtime': 63.7849, 'eval_samples_per_second': 28.831, 'eval_steps_per_second': 0.455, 'epoch': 1.16}
Model weights saved in ./checkpoint-5000/pytorch_model.bin
Configuration saved in ./checkpoint-5000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-3500] due to args.save_total_limit
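The recurring "columns ... have been ignored: input_length" notice is the Trainer's default `remove_unused_columns=True` dropping dataset columns that `Wav2Vec2ForCTC.forward` does not accept; it is informational, not an error. If such columns were needed, keeping them is a one-flag change:

```python
# Sketch: silence the "columns ... have been ignored" behavior by keeping
# all dataset columns instead of dropping those forward() cannot accept.
from transformers import TrainingArguments

args = TrainingArguments(output_dir="./", remove_unused_columns=False)
```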
24%|████████████████████████████████████▉ | 5100/21520 [4:12:48<8:56:31, 1.96s/it]
24%|█████████████████████████████████████▋ | 5200/21520 [4:17:12<9:28:46, 2.09s/it]
25%|██████████████████████████████████████▍ | 5299/21520 [4:21:32<8:51:36, 1.97s/it]
25%|███████████████████████████████████████▏ | 5399/21520 [4:25:55<9:07:16, 2.04s/it]
26%|███████████████████████████████████████▊ | 5500/21520 [4:30:20<8:58:27, 2.02s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.1568, 'learning_rate': 5.6041958041958036e-05, 'epoch': 1.28}
Configuration saved in ./checkpoint-5500/config.json
{'eval_loss': 0.327897846698761, 'eval_wer': 0.42490272373540855, 'eval_runtime': 64.3828, 'eval_samples_per_second': 28.564, 'eval_steps_per_second': 0.45, 'epoch': 1.28}
Model weights saved in ./checkpoint-5500/pytorch_model.bin
Configuration saved in ./checkpoint-5500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-4000] due to args.save_total_limit
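The 29/29 evaluation bars follow directly from the numbers printed above: with 1839 eval examples and a batch size of 64, one pass is ceil(1839 / 64) = 29 batches.

```python
import math

assert math.ceil(1839 / 64) == 29  # one evaluation pass = 29 batches of up to 64
```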
26%|████████████████████████████████████████▌ | 5599/21520 [4:37:38<8:44:34, 1.98s/it]
26%|█████████████████████████████████████████▎ | 5699/21520 [4:42:01<8:33:59, 1.95s/it]
27%|██████████████████████████████████████████ | 5799/21520 [4:46:25<8:30:31, 1.95s/it]
27%|██████████████████████████████████████████▊ | 5899/21520 [4:50:48<8:50:56, 2.04s/it]
28%|███████████████████████████████████████████▍ | 6000/21520 [4:55:17<8:35:28, 1.99s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.0994, 'learning_rate': 5.4297202797202796e-05, 'epoch': 1.39}
Configuration saved in ./checkpoint-6000/config.json
{'eval_loss': 0.318962037563324, 'eval_wer': 0.4240646513020054, 'eval_runtime': 65.8584, 'eval_samples_per_second': 27.924, 'eval_steps_per_second': 0.44, 'epoch': 1.39}
Model weights saved in ./checkpoint-6000/pytorch_model.bin
Configuration saved in ./checkpoint-6000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-4500] due to args.save_total_limit
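Collected from the metric lines logged above, the evaluation WER is still improving at this point in the run, though the gains are flattening:

```python
# eval_wer values copied verbatim (rounded) from the metric dicts above.
wer_by_step = {3500: 0.4635, 5000: 0.4330, 5500: 0.4249, 6000: 0.4241}
for step, wer in sorted(wer_by_step.items()):
    print(f"step {step:>4}: WER {wer:.2%}")
```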
28%|████████████████████████████████████████████▏ | 6099/21520 [5:02:30<8:27:43, 1.98s/it]
29%|████████████████████████████████████████████▉ | 6199/21520 [5:06:54<8:34:39, 2.02s/it]
29%|█████████████████████████████████████████████▋ | 6299/21520 [5:11:18<8:31:19, 2.02s/it]
30%|██████████████████████████████████████████████▍ | 6400/21520 [5:15:44<8:25:03, 2.00s/it]
30%|███████████████████████████████████████████████ | 6500/21520 [5:20:08<8:19:08, 1.99s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.1201, 'learning_rate': 5.2548951048951044e-05, 'epoch': 1.51}
100%|██████████| 29/29 [00:57<00:00, 1.83s/it]
Configuration saved in ./checkpoint-6500/config.json
Model weights saved in ./checkpoint-6500/pytorch_model.bin
Configuration saved in ./checkpoint-6500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-5000] due to args.save_total_limit
31%|███████████████████████████████████████████████▊ | 6600/21520 [5:27:24<8:15:18, 1.99s/it]
31%|████████████████████████████████████████████████▌ | 6700/21520 [5:31:47<8:03:10, 1.96s/it]
32%|█████████████████████████████████████████████████▎ | 6799/21520 [5:36:08<7:56:35, 1.94s/it]
32%|██████████████████████████████████████████████████ | 6900/21520 [5:40:33<8:02:43, 1.98s/it]
33%|██████████████████████████████████████████████████▋ | 6999/21520 [5:44:52<7:54:39, 1.96s/it]
33%|██████████████████████████████████████████████████▋ | 7000/21520 [5:44:54<7:55:25, 1.96s/it]
The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
100%|██████████| 29/29 [00:54<00:00, 1.80s/it]
Configuration saved in ./checkpoint-7000/config.json
Model weights saved in ./checkpoint-7000/pytorch_model.bin
Configuration saved in ./checkpoint-7000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-5500] due to args.save_total_limit
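For anyone mining this output.log for metrics, a minimal parsing sketch; it assumes the metric lines carry Python-dict-style `{'loss': ...}` / `{'eval_loss': ...}` payloads exactly as printed above:

```python
# Sketch: extract the metric dicts from a raw trainer log like this one.
import ast
import re

METRIC = re.compile(r"\{'(?:eval_)?loss': [^}]*\}")

def parse_metrics(path: str):
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            match = METRIC.search(line)
            if match:
                yield ast.literal_eval(match.group(0))

# usage: for record in parse_metrics("output.log"): print(record)
```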
|
7417 |
+
|
7418 |
+
|
7419 |
+
|
7420 |
+
|
7421 |
+
|
7422 |
+
|
7423 |
+
|
7424 |
+
|
7425 |
+
|
7426 |
+
|
7427 |
+
|
7428 |
+
|
7429 |
+
|
7430 |
+
|
7431 |
+
|
7432 |
+
|
7433 |
+
|
7434 |
+
|
7435 |
+
|
7436 |
+
|
7437 |
+
|
7438 |
+
|
7439 |
+
|
7440 |
+
|
7441 |
+
|
7442 |
+
|
7443 |
+
|
7444 |
+
|
7445 |
+
|
7446 |
+
|
7447 |
+
|
7448 |
+
|
7449 |
+
|
7450 |
+
|
7451 |
+
|
7452 |
+
|
7453 |
+
|
7454 |
+
|
7455 |
+
|
7456 |
+
|
7457 |
+
|
7458 |
+
|
7459 |
+
|
7460 |
+
|
7461 |
+
|
7462 |
+
|
7463 |
+
|
7464 |
+
|
7465 |
+
|
7466 |
+
|
7467 |
+
|
7468 |
+
|
7469 |
+
|
7470 |
+
|
7471 |
+
|
7472 |
+
|
7473 |
+
|
7474 |
+
|
7475 |
+
|
7476 |
+
|
7477 |
+
|
7478 |
+
|
7479 |
+
|
7480 |
+
|
7481 |
+
|
7482 |
+
|
7483 |
+
|
7484 |
+
|
7485 |
+
|
7486 |
+
|
7487 |
+
|
7488 |
+
|
7489 |
+
|
7490 |
+
|
7491 |
+
|
7492 |
+
|
7493 |
+
|
7494 |
+
|
7495 |
+
|
7496 |
+
|
7497 |
+
|
7498 |
+
|
7499 |
+
|
7500 |
+
|
7501 |
+
|
7502 |
+
|
7503 |
+
|
7504 |
+
|
7505 |
+
|
7506 |
+
|
7507 |
+
|
7508 |
+
|
7509 |
+
|
7510 |
+
|
7511 |
+
|
7512 |
+
|
7513 |
+
|
7514 |
+
33%|███████████████████████████████████████████████████▍ | 7100/21520 [5:52:08<8:10:16, 2.04s/it]
33%|████████████████████████████████████████████████████▏ | 7199/21520 [5:56:30<8:01:29, 2.02s/it]
34%|████████████████████████████████████████████████████▉ | 7299/21520 [6:00:54<7:59:22, 2.02s/it]
34%|█████████████████████████████████████████████████████▋ | 7399/21520 [6:05:20<7:52:51, 2.01s/it]
35%|██████████████████████████████████████████████████████▎ | 7499/21520 [6:09:43<7:42:55, 1.98s/it]
35%|██████████████████████████████████████████████████████▎            | 7500/21520 [6:09:45<7:45:02, 1.99s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
Configuration saved in ./checkpoint-7500/config.json
{'eval_loss': 0.3057877719402313, 'eval_wer': 0.4125112241843759, 'eval_runtime': 65.7311, 'eval_samples_per_second': 27.978, 'eval_steps_per_second': 0.441, 'epoch': 1.74}
Model weights saved in ./checkpoint-7500/pytorch_model.bin
Configuration saved in ./checkpoint-7500/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-6000] due to args.save_total_limit
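The `eval_wer` of 0.4125 logged at step 7500 is a word error rate over the 1,839 held-out examples. The training script computes it through the `wer` metric from `datasets`, which wraps `jiwer` (pinned as jiwer==2.3.0 in the requirements files below). A self-contained sketch, with made-up French sentences for illustration:

```python
from jiwer import wer

# Hypothetical reference transcripts and model outputs, for illustration only.
references = ["bonjour tout le monde", "la voiture est rouge"]
hypotheses = ["bonjour tous le monde", "la voiture est rouge"]

# 1 substituted word out of 8 reference words -> WER = 0.125
print(wer(references, hypotheses))
```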
35%|███████████████████████████████████████████████████████ | 7600/21520 [6:16:59<7:31:32, 1.95s/it]
36%|███████████████████████████████████████████████████████▊ | 7699/21520 [6:21:22<7:59:09, 2.08s/it]
36%|████████████████████████████████████████████████████████▌ | 7800/21520 [6:25:47<7:34:58, 1.99s/it]
37%|█████████████████████████████████████████████████████████▎ | 7899/21520 [6:30:08<7:34:20, 2.00s/it]
37%|█████████████████████████████████████████████████████████▉        | 8000/21520 [6:34:33<7:26:34, 1.98s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
{'loss': 1.1101, 'learning_rate': 4.7311188811188806e-05, 'epoch': 1.86}
Configuration saved in ./checkpoint-8000/config.json
{'eval_loss': 0.3026394248008728, 'eval_wer': 0.4099970068841664, 'eval_runtime': 64.4321, 'eval_samples_per_second': 28.542, 'eval_steps_per_second': 0.45, 'epoch': 1.86}
Model weights saved in ./checkpoint-8000/pytorch_model.bin
Configuration saved in ./checkpoint-8000/preprocessor_config.json
Configuration saved in ./preprocessor_config.json
Deleting older checkpoint [checkpoint-6500] due to args.save_total_limit
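The `learning_rate` of roughly 4.73e-05 logged around step 8000 is consistent with the configured schedule: linear warmup over 1,500 steps to the 7e-5 peak, then linear decay to zero across the 21,520-step run. A sketch of that shape (transformers implements it as `get_linear_schedule_with_warmup`):

```python
def linear_lr(step: int, peak: float = 7e-5, warmup: int = 1500, total: int = 21520) -> float:
    """Linear warmup to `peak`, then linear decay to zero at `total` steps."""
    if step < warmup:
        return peak * step / warmup
    return peak * max(0, total - step) / (total - warmup)

print(linear_lr(8000))  # ~4.7e-05, in line with the value logged above
```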
38%|██████████████████████████████████████████████████████████▋ | 8099/21520 [6:41:43<7:19:00, 1.96s/it]
38%|███████████████████████████████████████████████████████████▍ | 8200/21520 [6:46:08<7:12:30, 1.95s/it]
39%|████████████████████████████████████████████████████████████▏ | 8301/21520 [6:50:33<7:07:31, 1.94s/it]
39%|████████████████████████████████████████████████████████████▉ | 8400/21520 [6:54:55<7:36:42, 2.09s/it]
39%|█████████████████████████████████████████████████████████████▌ | 8499/21520 [6:59:15<7:12:41, 1.99s/it]
39%|█████████████████████████████████████████████████████████████▌    | 8500/21520 [6:59:17<7:12:02, 1.99s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
***** Running Evaluation *****
Num examples = 1839
Batch size = 64
100%|████████████████████████████████████████████████████████████████████| 29/29 [00:56<00:00, 1.81s/it]
Configuration saved in ./checkpoint-8500/config.json
Model weights saved in ./checkpoint-8500/pytorch_model.bin
Configuration saved in ./checkpoint-8500/preprocessor_config.json
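The captured output.log ends here, partway through the checkpoint-8500 save. As a usage note (not part of this commit), a saved checkpoint can be loaded for greedy CTC decoding roughly as below; the checkpoint path and the 16 kHz sampling rate are assumptions:

```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("./")          # tokenizer + feature extractor from the repo root
model = Wav2Vec2ForCTC.from_pretrained("./checkpoint-8500")  # weights saved in the log above
model.eval()

def transcribe(waveform, sampling_rate: int = 16_000) -> str:
    inputs = processor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```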
wandb/run-20220129_131141-h6nhqm30/files/requirements.txt
ADDED
@@ -0,0 +1,180 @@
aiohttp==3.8.1
aiosignal==1.2.0
analytics-python==1.4.0
anyio==3.5.0
appdirs==1.4.4
argon2-cffi-bindings==21.2.0
argon2-cffi==21.3.0
asgiref==3.5.0
asttokens==2.0.5
async-timeout==4.0.2
attrs==21.4.0
audioread==2.1.9
backcall==0.2.0
backoff==1.10.0
bcrypt==3.2.0
beautifulsoup4==4.9.3
black==21.12b0
bleach==4.1.0
brotlipy==0.7.0
certifi==2020.12.5
cffi==1.14.3
chardet==3.0.4
charset-normalizer==2.0.10
click==8.0.3
conda-build==3.21.4
conda-package-handling==1.7.2
conda==4.9.2
configparser==5.2.0
cryptography==3.2.1
cycler==0.11.0
datasets==1.18.2.dev0
debugpy==1.5.1
decorator==4.4.2
defusedxml==0.7.1
dill==0.3.4
dnspython==2.1.0
docker-pycreds==0.4.0
entrypoints==0.3
executing==0.8.2
fastapi==0.73.0
ffmpy==0.3.0
filelock==3.0.12
fonttools==4.29.0
frozenlist==1.3.0
fsspec==2022.1.0
gitdb==4.0.9
gitpython==3.1.26
glob2==0.7
gradio==2.7.5.2
h11==0.13.0
huggingface-hub==0.4.0
idna==2.10
importlib-resources==5.4.0
ipykernel==6.7.0
ipython-genutils==0.2.0
ipython==8.0.1
ipywidgets==7.6.3
jedi==0.17.0
jinja2==2.11.3
jiwer==2.3.0
joblib==1.1.0
json5==0.9.6
jsonschema==4.4.0
jupyter-client==7.1.2
jupyter-core==4.9.1
jupyterlab-pygments==0.1.2
jupyterlab-server==1.2.0
jupyterlab-widgets==1.0.2
jupyterlab==2.2.9
kiwisolver==1.3.2
libarchive-c==2.9
librosa==0.8.1
llvmlite==0.38.0
markdown2==2.4.2
markupsafe==1.1.1
matplotlib-inline==0.1.3
matplotlib==3.5.1
mistune==0.8.4
mkl-fft==1.3.0
mkl-random==1.1.1
mkl-service==2.3.0
monotonic==1.6
multidict==6.0.2
multiprocess==0.70.12.2
mypy-extensions==0.4.3
nano==0.10.0
nbclient==0.5.10
nbconvert==6.4.1
nbformat==5.1.3
nest-asyncio==1.5.4
notebook==6.4.8
numba==0.55.1
numpy==1.19.2
olefile==0.46
packaging==21.3
pandas==1.4.0
pandocfilters==1.5.0
paramiko==2.9.2
parso==0.8.1
pathspec==0.9.0
pathtools==0.1.2
pexpect==4.8.0
pickleshare==0.7.5
pillow==8.1.2
pip==21.3.1
pkginfo==1.7.0
platformdirs==2.4.1
pooch==1.6.0
prometheus-client==0.13.0
promise==2.3
prompt-toolkit==3.0.8
protobuf==3.19.4
psutil==5.8.0
ptyprocess==0.7.0
pure-eval==0.2.2
pyarrow==6.0.1
pycosat==0.6.3
pycparser==2.20
pycryptodome==3.13.0
pydantic==1.9.0
pydub==0.25.1
pygments==2.8.0
pynacl==1.5.0
pyopenssl==19.1.0
pyparsing==3.0.7
pyrsistent==0.18.1
pysocks==1.7.1
python-dateutil==2.8.2
python-etcd==0.4.5
python-levenshtein==0.12.2
python-multipart==0.0.5
pytz==2021.1
pyyaml==5.4.1
pyzmq==22.3.0
regex==2022.1.18
requests==2.24.0
resampy==0.2.2
ruamel-yaml==0.15.87
sacremoses==0.0.47
scikit-learn==1.0.2
scipy==1.7.3
send2trash==1.8.0
sentry-sdk==1.5.4
setuptools==50.3.1.post20201107
shortuuid==1.0.8
six==1.15.0
smmap==5.0.0
sniffio==1.2.0
soundfile==0.10.3.post1
soupsieve==2.2
stack-data==0.1.4
starlette==0.17.1
subprocess32==3.5.4
termcolor==1.1.0
terminado==0.13.1
testpath==0.5.0
threadpoolctl==3.0.0
tokenizers==0.11.4
tomli==1.2.3
torch==1.10.2
torchaudio==0.10.2
torchelastic==0.2.2
torchtext==0.9.1
torchvision==0.9.1
tornado==6.1
tqdm==4.62.3
traitlets==5.1.1
transformers==4.17.0.dev0
typing-extensions==4.0.1
urllib3==1.25.11
uvicorn==0.17.1
wandb==0.12.9
wcwidth==0.2.5
webencodings==0.5.1
wheel==0.35.1
widgetsnbextension==3.5.2
xxhash==2.0.2
yarl==1.7.2
yaspin==2.1.0
zipp==3.7.0
wandb/run-20220129_131141-h6nhqm30/files/wandb-metadata.json
ADDED
@@ -0,0 +1,64 @@
{
    "os": "Linux-4.15.0-151-generic-x86_64-with-glibc2.10",
    "python": "3.8.8",
    "heartbeatAt": "2022-01-29T13:11:43.886807",
    "startedAt": "2022-01-29T13:11:41.626227",
    "docker": null,
    "gpu": "Tesla V100S-PCIE-32GB",
    "gpu_count": 1,
    "cpu_count": 60,
    "cuda": null,
    "args": [
        "--dataset_name=mozilla-foundation/common_voice_8_0",
        "--model_name_or_path=facebook/wav2vec2-xls-r-300m",
        "--dataset_config_name=fr",
        "--output_dir=./",
        "--overwrite_output_dir",
        "--num_train_epochs=5",
        "--per_device_train_batch_size=64",
        "--per_device_eval_batch_size=64",
        "--gradient_accumulation_steps=1",
        "--learning_rate=7e-5",
        "--warmup_steps=1500",
        "--length_column_name=input_length",
        "--evaluation_strategy=steps",
        "--text_column_name=sentence",
        "--save_steps=500",
        "--eval_steps=500",
        "--logging_steps=100",
        "--layerdrop=0.0",
        "--activation_dropout=0.1",
        "--save_total_limit=3",
        "--freeze_feature_encoder",
        "--feat_proj_dropout=0.0",
        "--mask_time_prob=0.75",
        "--mask_time_length=10",
        "--mask_feature_prob=0.33",
        "--mask_feature_length=10",
        "--gradient_checkpointing",
        "--report_to=wandb",
        "--run_name=xls-r-300m-fr",
        "--max_eval_samples=4000",
        "--max_duration_in_seconds=5",
        "--use_auth_token",
        "--fp16",
        "--group_by_length",
        "--preprocessing_num_workers=64",
        "--do_train",
        "--do_eval",
        "--load_best_model_at_end",
        "--push_to_hub"
    ],
    "state": "running",
    "program": "run_speech_recognition_ctc.py",
    "codePath": "run_speech_recognition_ctc.py",
    "git": {
        "remote": "https://huggingface.co/AlexN/xls-r-300m-fr",
        "commit": "44c47390840b2e07543aa79514aa51086a0ff3cf"
    },
    "email": "[email protected]",
    "root": "/workspace/xls-r-300m-fr",
    "host": "job-1abccd0a-3293-4ffe-8274-9e8f841f653f",
    "username": "ovh",
    "executable": "/opt/conda/bin/python"
}
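The `args` array above records the exact command line of this run. As a rough illustration only (the script actually parses these flags with `HfArgumentParser`, and model-level flags such as `--mask_time_prob` belong to its model arguments rather than `TrainingArguments`), the trainer-level flags correspond to:

```python
from transformers import TrainingArguments

# Sketch only; values copied from the args list above.
training_args = TrainingArguments(
    output_dir="./",
    overwrite_output_dir=True,
    num_train_epochs=5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=1,
    learning_rate=7e-5,
    warmup_steps=1500,
    evaluation_strategy="steps",
    eval_steps=500,
    save_steps=500,
    save_total_limit=3,
    logging_steps=100,
    length_column_name="input_length",
    group_by_length=True,
    gradient_checkpointing=True,
    fp16=True,
    load_best_model_at_end=True,
    report_to="wandb",
    run_name="xls-r-300m-fr",
    push_to_hub=True,
)
```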
wandb/run-20220129_131141-h6nhqm30/files/wandb-summary.json
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_131141-h6nhqm30/logs/debug-internal.log
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_131141-h6nhqm30/logs/debug.log
ADDED
@@ -0,0 +1,24 @@
2022-01-29 13:11:41,630 INFO MainThread:44227 [wandb_setup.py:_flush():71] setting env: {}
2022-01-29 13:11:41,630 INFO MainThread:44227 [wandb_setup.py:_flush():71] setting login settings: {}
2022-01-29 13:11:41,631 INFO MainThread:44227 [wandb_init.py:_log_setup():371] Logging user logs to /workspace/xls-r-300m-fr/wandb/run-20220129_131141-h6nhqm30/logs/debug.log
2022-01-29 13:11:41,631 INFO MainThread:44227 [wandb_init.py:_log_setup():372] Logging internal logs to /workspace/xls-r-300m-fr/wandb/run-20220129_131141-h6nhqm30/logs/debug-internal.log
2022-01-29 13:11:41,631 INFO MainThread:44227 [wandb_init.py:init():404] calling init triggers
2022-01-29 13:11:41,631 INFO MainThread:44227 [wandb_init.py:init():409] wandb.init called with sweep_config: {}
config: {}
2022-01-29 13:11:41,631 INFO MainThread:44227 [wandb_init.py:init():460] starting backend
2022-01-29 13:11:41,631 INFO MainThread:44227 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-01-29 13:11:42,043 INFO MainThread:44227 [backend.py:ensure_launched():216] starting backend process...
2022-01-29 13:11:42,437 INFO MainThread:44227 [backend.py:ensure_launched():221] started backend process with pid: 13445
2022-01-29 13:11:42,439 INFO MainThread:44227 [wandb_init.py:init():469] backend started and connected
2022-01-29 13:11:42,449 INFO MainThread:44227 [wandb_init.py:init():533] updated telemetry
2022-01-29 13:11:43,278 INFO MainThread:44227 [wandb_init.py:init():563] communicating current version
2022-01-29 13:11:43,680 INFO MainThread:44227 [wandb_init.py:init():568] got version response
2022-01-29 13:11:43,681 INFO MainThread:44227 [wandb_init.py:init():578] communicating run to backend with 30 second timeout
2022-01-29 13:11:43,876 INFO MainThread:44227 [wandb_init.py:init():606] starting run threads in backend
2022-01-29 13:11:44,463 INFO MainThread:44227 [wandb_run.py:_console_start():1810] atexit reg
2022-01-29 13:11:44,464 INFO MainThread:44227 [wandb_run.py:_redirect():1684] redirect: SettingsConsole.REDIRECT
2022-01-29 13:11:44,464 INFO MainThread:44227 [wandb_run.py:_redirect():1689] Redirecting console.
2022-01-29 13:11:44,471 INFO MainThread:44227 [wandb_run.py:_redirect():1745] Redirects installed.
2022-01-29 13:11:44,471 INFO MainThread:44227 [wandb_init.py:init():633] run started, returning control to user process
2022-01-29 13:11:44,474 INFO MainThread:44227 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 216, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.17.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.1, 'feat_proj_dropout': 0.0, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 218, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.75, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.33, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 64, 'per_device_eval_batch_size': 64, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': 'None', 'learning_rate': 7e-05, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 5.0, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'warmup_ratio': 
0.0, 'warmup_steps': 1500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan29_11-53-08_job-1abccd0a-3293-4ffe-8274-9e8f841f653f', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 3, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'xls-r-300m-fr', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': True, 'metric_for_best_model': 'loss', 'greater_is_better': False, 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 64, 'eval_batch_size': 64}
2022-01-29 13:11:44,479 INFO MainThread:44227 [wandb_watch.py:watch():43] Watching
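This debug.log is what the wandb client emits when the Trainer's `WandbCallback` starts a run (`--report_to=wandb`); the final "Watching" line corresponds to the callback hooking gradient logging onto the model. A standalone equivalent of that init sequence, with a hypothetical project name since the Trainer normally takes it from the `WANDB_PROJECT` environment variable:

```python
import wandb

# Project name is an assumption for illustration.
run = wandb.init(project="xls-r-300m-fr", name="xls-r-300m-fr")
# ... training happens here; the Trainer also registers wandb.watch(model) ...
run.finish()
```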
wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:624404f6778fa63cf42e5d65fc81506fe3d0200ae05be92d55229cbe4a878ab4
size 62254538
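The .wandb run file is stored through Git LFS, so the diff only shows the pointer: `oid` is the SHA-256 of the file contents and `size` its length in bytes. A small sketch that verifies a locally checked-out file against its pointer:

```python
import hashlib
import os

def lfs_oid_and_size(path: str):
    """Return (sha256 hex digest, byte size) of a file, as in an LFS pointer."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    return sha.hexdigest(), os.path.getsize(path)

oid, size = lfs_oid_and_size("wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb")
assert oid == "624404f6778fa63cf42e5d65fc81506fe3d0200ae05be92d55229cbe4a878ab4"
assert size == 62254538
```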
wandb/run-20220129_215451-1vipdbow/files/conda-environment.yaml
ADDED
File without changes
wandb/run-20220129_215451-1vipdbow/files/config.yaml
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_215451-1vipdbow/files/output.log
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_215451-1vipdbow/files/requirements.txt
ADDED
@@ -0,0 +1,180 @@
aiohttp==3.8.1
aiosignal==1.2.0
analytics-python==1.4.0
anyio==3.5.0
appdirs==1.4.4
argon2-cffi-bindings==21.2.0
argon2-cffi==21.3.0
asgiref==3.5.0
asttokens==2.0.5
async-timeout==4.0.2
attrs==21.4.0
audioread==2.1.9
backcall==0.2.0
backoff==1.10.0
bcrypt==3.2.0
beautifulsoup4==4.9.3
black==21.12b0
bleach==4.1.0
brotlipy==0.7.0
certifi==2020.12.5
cffi==1.14.3
chardet==3.0.4
charset-normalizer==2.0.10
click==8.0.3
conda-build==3.21.4
conda-package-handling==1.7.2
conda==4.9.2
configparser==5.2.0
cryptography==3.2.1
cycler==0.11.0
datasets==1.18.2.dev0
debugpy==1.5.1
decorator==4.4.2
defusedxml==0.7.1
dill==0.3.4
dnspython==2.1.0
docker-pycreds==0.4.0
entrypoints==0.3
executing==0.8.2
fastapi==0.73.0
ffmpy==0.3.0
filelock==3.0.12
fonttools==4.29.0
frozenlist==1.3.0
fsspec==2022.1.0
gitdb==4.0.9
gitpython==3.1.26
glob2==0.7
gradio==2.7.5.2
h11==0.13.0
huggingface-hub==0.4.0
idna==2.10
importlib-resources==5.4.0
ipykernel==6.7.0
ipython-genutils==0.2.0
ipython==8.0.1
ipywidgets==7.6.3
jedi==0.17.0
jinja2==2.11.3
jiwer==2.3.0
joblib==1.1.0
json5==0.9.6
jsonschema==4.4.0
jupyter-client==7.1.2
jupyter-core==4.9.1
jupyterlab-pygments==0.1.2
jupyterlab-server==1.2.0
jupyterlab-widgets==1.0.2
jupyterlab==2.2.9
kiwisolver==1.3.2
libarchive-c==2.9
librosa==0.8.1
llvmlite==0.38.0
markdown2==2.4.2
markupsafe==1.1.1
matplotlib-inline==0.1.3
matplotlib==3.5.1
mistune==0.8.4
mkl-fft==1.3.0
mkl-random==1.1.1
mkl-service==2.3.0
monotonic==1.6
multidict==6.0.2
multiprocess==0.70.12.2
mypy-extensions==0.4.3
nano==0.10.0
nbclient==0.5.10
nbconvert==6.4.1
nbformat==5.1.3
nest-asyncio==1.5.4
notebook==6.4.8
numba==0.55.1
numpy==1.19.2
olefile==0.46
packaging==21.3
pandas==1.4.0
pandocfilters==1.5.0
paramiko==2.9.2
parso==0.8.1
pathspec==0.9.0
pathtools==0.1.2
pexpect==4.8.0
pickleshare==0.7.5
pillow==8.1.2
pip==21.3.1
pkginfo==1.7.0
platformdirs==2.4.1
pooch==1.6.0
prometheus-client==0.13.0
promise==2.3
prompt-toolkit==3.0.8
protobuf==3.19.4
psutil==5.8.0
ptyprocess==0.7.0
pure-eval==0.2.2
pyarrow==6.0.1
pycosat==0.6.3
pycparser==2.20
pycryptodome==3.13.0
pydantic==1.9.0
pydub==0.25.1
pygments==2.8.0
pynacl==1.5.0
pyopenssl==19.1.0
pyparsing==3.0.7
pyrsistent==0.18.1
pysocks==1.7.1
python-dateutil==2.8.2
python-etcd==0.4.5
python-levenshtein==0.12.2
python-multipart==0.0.5
pytz==2021.1
pyyaml==5.4.1
pyzmq==22.3.0
regex==2022.1.18
requests==2.24.0
resampy==0.2.2
ruamel-yaml==0.15.87
sacremoses==0.0.47
scikit-learn==1.0.2
scipy==1.7.3
send2trash==1.8.0
sentry-sdk==1.5.4
setuptools==50.3.1.post20201107
shortuuid==1.0.8
six==1.15.0
smmap==5.0.0
sniffio==1.2.0
soundfile==0.10.3.post1
soupsieve==2.2
stack-data==0.1.4
starlette==0.17.1
subprocess32==3.5.4
termcolor==1.1.0
terminado==0.13.1
testpath==0.5.0
threadpoolctl==3.0.0
tokenizers==0.11.4
tomli==1.2.3
torch==1.10.2
torchaudio==0.10.2
torchelastic==0.2.2
torchtext==0.9.1
torchvision==0.9.1
tornado==6.1
tqdm==4.62.3
traitlets==5.1.1
transformers==4.17.0.dev0
typing-extensions==4.0.1
urllib3==1.25.11
uvicorn==0.17.1
wandb==0.12.9
wcwidth==0.2.5
webencodings==0.5.1
wheel==0.35.1
widgetsnbextension==3.5.2
xxhash==2.0.2
yarl==1.7.2
yaspin==2.1.0
zipp==3.7.0
wandb/run-20220129_215451-1vipdbow/files/wandb-metadata.json
ADDED
@@ -0,0 +1,65 @@
{
    "os": "Linux-4.15.0-151-generic-x86_64-with-glibc2.10",
    "python": "3.8.8",
    "heartbeatAt": "2022-01-29T21:54:53.025463",
    "startedAt": "2022-01-29T21:54:51.704109",
    "docker": null,
    "gpu": "Tesla V100S-PCIE-32GB",
    "gpu_count": 1,
    "cpu_count": 60,
    "cuda": null,
    "args": [
        "--dataset_name=mozilla-foundation/common_voice_8_0",
        "--model_name_or_path=facebook/wav2vec2-xls-r-300m",
        "--dataset_config_name=fr",
        "--tokenizer_name_or_path=./",
        "--output_dir=./",
        "--overwrite_output_dir",
        "--num_train_epochs=2",
        "--per_device_train_batch_size=64",
        "--per_device_eval_batch_size=64",
        "--gradient_accumulation_steps=1",
        "--learning_rate=1e-4",
        "--warmup_steps=1500",
        "--length_column_name=input_length",
        "--evaluation_strategy=steps",
        "--text_column_name=sentence",
        "--save_steps=500",
        "--eval_steps=500",
        "--logging_steps=100",
        "--layerdrop=0.0",
        "--activation_dropout=0.05",
        "--save_total_limit=2",
        "--freeze_feature_encoder",
        "--feat_proj_dropout=0.0",
        "--mask_time_prob=0.75",
        "--mask_time_length=10",
        "--mask_feature_prob=0.4",
        "--mask_feature_length=10",
        "--gradient_checkpointing",
        "--report_to=wandb",
        "--run_name=xls-r-300m-fr",
        "--max_eval_samples=6000",
        "--max_duration_in_seconds=9",
        "--use_auth_token",
        "--fp16",
        "--group_by_length",
        "--preprocessing_num_workers=64",
        "--do_train",
        "--do_eval",
        "--load_best_model_at_end",
        "--push_to_hub"
    ],
    "state": "running",
    "program": "run_speech_recognition_ctc.py",
    "codePath": "run_speech_recognition_ctc.py",
    "git": {
        "remote": "https://huggingface.co/AlexN/xls-r-300m-fr",
        "commit": "c58dbaf4476093ff9758e2d11bbf63f828a0ecc3"
    },
    "email": "[email protected]",
    "root": "/workspace/xls-r-300m-fr",
    "host": "job-1abccd0a-3293-4ffe-8274-9e8f841f653f",
    "username": "ovh",
    "executable": "/opt/conda/bin/python"
}
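Comparing this metadata with the first run's shows what changed between the two training jobs in this commit: the second run points the tokenizer at the repo itself (`--tokenizer_name_or_path=./`, matching the "good tokenizer" commit message), trains for 2 epochs instead of 5 at a 1e-4 peak learning rate, and uses lighter activation dropout (0.05), stronger feature masking (0.4), longer clips (9 s), more eval samples (6000), and `save_total_limit=2`. A hypothetical helper that surfaces the differences automatically:

```python
import json

def arg_map(path: str) -> dict:
    """Parse a wandb-metadata.json 'args' array into {flag: value}."""
    with open(path) as f:
        args = json.load(f)["args"]
    pairs = (a.lstrip("-").split("=", 1) for a in args)
    return {p[0]: (p[1] if len(p) == 2 else True) for p in pairs}

first = arg_map("wandb/run-20220129_131141-h6nhqm30/files/wandb-metadata.json")
second = arg_map("wandb/run-20220129_215451-1vipdbow/files/wandb-metadata.json")
for flag in sorted(first.keys() | second.keys()):
    if first.get(flag) != second.get(flag):
        print(f"{flag}: {first.get(flag)} -> {second.get(flag)}")
# e.g. learning_rate: 7e-5 -> 1e-4, num_train_epochs: 5 -> 2, tokenizer_name_or_path: None -> ./
```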
wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log
ADDED
The diff for this file is too large to render.
See raw diff
wandb/run-20220129_215451-1vipdbow/logs/debug.log
ADDED
@@ -0,0 +1,24 @@
2022-01-29 21:54:51,708 INFO MainThread:7453 [wandb_setup.py:_flush():71] setting env: {}
2022-01-29 21:54:51,708 INFO MainThread:7453 [wandb_setup.py:_flush():71] setting login settings: {}
2022-01-29 21:54:51,708 INFO MainThread:7453 [wandb_init.py:_log_setup():371] Logging user logs to /workspace/xls-r-300m-fr/wandb/run-20220129_215451-1vipdbow/logs/debug.log
2022-01-29 21:54:51,709 INFO MainThread:7453 [wandb_init.py:_log_setup():372] Logging internal logs to /workspace/xls-r-300m-fr/wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log
2022-01-29 21:54:51,709 INFO MainThread:7453 [wandb_init.py:init():404] calling init triggers
2022-01-29 21:54:51,709 INFO MainThread:7453 [wandb_init.py:init():409] wandb.init called with sweep_config: {}
config: {}
2022-01-29 21:54:51,709 INFO MainThread:7453 [wandb_init.py:init():460] starting backend
2022-01-29 21:54:51,709 INFO MainThread:7453 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-01-29 21:54:51,806 INFO MainThread:7453 [backend.py:ensure_launched():216] starting backend process...
2022-01-29 21:54:51,893 INFO MainThread:7453 [backend.py:ensure_launched():221] started backend process with pid: 37713
2022-01-29 21:54:51,897 INFO MainThread:7453 [wandb_init.py:init():469] backend started and connected
2022-01-29 21:54:51,906 INFO MainThread:7453 [wandb_init.py:init():533] updated telemetry
2022-01-29 21:54:52,074 INFO MainThread:7453 [wandb_init.py:init():563] communicating current version
2022-01-29 21:54:52,818 INFO MainThread:7453 [wandb_init.py:init():568] got version response
2022-01-29 21:54:52,818 INFO MainThread:7453 [wandb_init.py:init():578] communicating run to backend with 30 second timeout
2022-01-29 21:54:53,017 INFO MainThread:7453 [wandb_init.py:init():606] starting run threads in backend
2022-01-29 21:54:53,621 INFO MainThread:7453 [wandb_run.py:_console_start():1810] atexit reg
2022-01-29 21:54:53,622 INFO MainThread:7453 [wandb_run.py:_redirect():1684] redirect: SettingsConsole.REDIRECT
2022-01-29 21:54:53,623 INFO MainThread:7453 [wandb_run.py:_redirect():1689] Redirecting console.
2022-01-29 21:54:53,628 INFO MainThread:7453 [wandb_run.py:_redirect():1745] Redirects installed.
2022-01-29 21:54:53,628 INFO MainThread:7453 [wandb_init.py:init():633] run started, returning control to user process
2022-01-29 21:54:53,630 INFO MainThread:7453 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 216, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.17.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.05, 'feat_proj_dropout': 0.0, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 218, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.75, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.4, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 64, 'per_device_eval_batch_size': 64, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': 'None', 'learning_rate': 0.0001, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 2.0, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'warmup_ratio': 
0.0, 'warmup_steps': 1500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan29_20-37-14_job-1abccd0a-3293-4ffe-8274-9e8f841f653f', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'xls-r-300m-fr', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': True, 'metric_for_best_model': 'loss', 'greater_is_better': False, 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 64, 'eval_batch_size': 64}
2022-01-29 21:54:53,634 INFO MainThread:7453 [wandb_watch.py:watch():43] Watching
wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:51e1b275cfd48d2b806baadf5c43eb7f3d8efffce1a2b58da8b720f5bf6e721d
size 104538108