icefall-asr-multidataset-pruned_transducer_stateless7-2023-05-04/decoding-results/greedy_search/log-decode-epoch-30-avg-1-context-2-max-sym-per-frame-1-use-averaged-model-2023-05-03-11-35-18
2023-05-03 11:35:18,697 INFO [decode.py:777] Decoding started
2023-05-03 11:35:18,697 INFO [decode.py:783] Device: cuda:0
2023-05-03 11:35:18,699 INFO [decode.py:793] {'frame_shift_ms': 10.0, 'allowed_excess_duration_ratio': 0.1, 'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'warm_step': 2000, 'env_info': {'k2-version': '1.23.4', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'a23383c5a381713b51e9014f3f05d096f8aceec3', 'k2-git-date': 'Wed Apr 26 15:33:33 2023', 'lhotse-version': '1.14.0.dev+git.b61b917.dirty', 'torch-version': '1.13.1', 'torch-cuda-available': True, 'torch-cuda-version': '11.6', 'python-version': '3.1', 'icefall-git-branch': 'master', 'icefall-git-sha1': '45c13e9-dirty', 'icefall-git-date': 'Mon Apr 24 15:00:02 2023', 'icefall-path': '/k2-dev/yangyifan/icefall-master', 'k2-path': '/k2-dev/yangyifan/anaconda3/envs/icefall/lib/python3.10/site-packages/k2-1.23.4.dev20230427+cuda11.6.torch1.13.1-py3.10-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/k2-dev/yangyifan/anaconda3/envs/icefall/lib/python3.10/site-packages/lhotse-1.14.0.dev0+git.b61b917.dirty-py3.10.egg/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-1-1220091118-57c4d55446-mlpzc', 'IP address': '10.177.22.19'}, 'epoch': 30, 'iter': 0, 'avg': 1, 'use_averaged_model': True, 'exp_dir': PosixPath('pruned_transducer_stateless7/exp_multidataset'), 'bpe_model': 'data/lang_bpe_500/bpe.model', 'lang_dir': PosixPath('data/lang_bpe_500'), 'decoding_method': 'greedy_search', 'beam_size': 4, 'beam': 20.0, 'ngram_lm_scale': 0.01, 'max_contexts': 8, 'max_states': 64, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'use_shallow_fusion': False, 'lm_type': 'rnn', 'lm_scale': 0.3, 'tokens_ngram': 3, 'backoff_id': 500, 'num_encoder_layers': '2,4,3,2,4', 'feedforward_dims': '1024,1024,2048,2048,1024', 'nhead': '8,8,8,8,8', 'encoder_dims': '384,384,384,384,384', 'attention_dims': '192,192,192,192,192', 'encoder_unmasked_dims': '256,256,256,256,256', 'zipformer_downsampling_factors': '1,2,4,8,2', 'cnn_module_kernels': '31,31,31,31,31', 'decoder_dim': 512, 'joiner_dim': 512, 'full_libri': True, 'manifest_dir': PosixPath('data/fbank'), 'cv_manifest_dir': PosixPath('data/en/fbank'), 'max_duration': 600, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'lm_vocab_size': 500, 'lm_epoch': 7, 'lm_avg': 1, 'lm_exp_dir': None, 'rnn_lm_embedding_dim': 2048, 'rnn_lm_hidden_dim': 2048, 'rnn_lm_num_layers': 3, 'rnn_lm_tie_weights': True, 'transformer_lm_exp_dir': None, 'transformer_lm_dim_feedforward': 2048, 'transformer_lm_encoder_dim': 768, 'transformer_lm_embedding_dim': 768, 'transformer_lm_nhead': 8, 'transformer_lm_num_layers': 16, 'transformer_lm_tie_weights': True, 'res_dir': PosixPath('pruned_transducer_stateless7/exp_multidataset/greedy_search'), 'suffix': 'epoch-30-avg-1-context-2-max-sym-per-frame-1-use-averaged-model', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 500}
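The parameters above point at a 500-token BPE model ('data/lang_bpe_500/bpe.model') with blank_id 0 and unk_id 2. A minimal sketch, assuming the standard sentencepiece API and the icefall convention of a "<blk>" piece, for sanity-checking those values against the model file:

import sentencepiece as spm

# Load the BPE model referenced in the config dump above.
sp = spm.SentencePieceProcessor()
sp.load("data/lang_bpe_500/bpe.model")

print("vocab_size:", sp.get_piece_size())    # expected 500
print("blank_id:", sp.piece_to_id("<blk>"))  # expected 0 (icefall convention)
print("unk_id:", sp.unk_id())                # expected 2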
2023-05-03 11:35:18,700 INFO [decode.py:795] About to create model
2023-05-03 11:35:19,508 INFO [zipformer.py:178] At encoder stack 4, which has downsampling_factor=2, we will combine the outputs of layers 1 and 3, with downsampling_factors=2 and 8.
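The message above reports zipformer's output combination for encoder stack 4 (downsampling_factor=2), which reuses the outputs of earlier stacks running at downsampling factors 2 and 8. The sketch below is only a schematic of blending two feature streams at different frame rates (repeat-upsample the coarser one, then a weighted sum); it is not the actual zipformer combination code.

import torch

def combine_streams(x_fine: torch.Tensor, x_coarse: torch.Tensor,
                    ratio: int, w: float = 0.5) -> torch.Tensor:
    # x_fine: (T, C) at the target frame rate; x_coarse: (T // ratio, C).
    # Schematic only: repeat-upsample the coarser stream to length T, then blend.
    x_up = x_coarse.repeat_interleave(ratio, dim=0)[: x_fine.size(0)]
    if x_up.size(0) < x_fine.size(0):  # pad by repeating the last frame
        pad = x_fine.size(0) - x_up.size(0)
        x_up = torch.cat([x_up, x_up[-1:].expand(pad, -1)], dim=0)
    return w * x_fine + (1 - w) * x_up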
2023-05-03 11:35:19,521 INFO [decode.py:862] Calculating the averaged model over epoch range from 29 (excluded) to 30
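use_averaged_model=True means the decoded weights are an average taken over part of training rather than a single checkpoint; here the range is (epoch 29, epoch 30]. icefall reconstructs this from running-average statistics stored in the checkpoints (average_checkpoints_with_averaged_model in icefall/checkpoint.py); the sketch below is a simplified stand-in that just averages plain checkpoint state_dicts to illustrate the idea.

import torch

def average_checkpoints(paths):
    # Assumes each checkpoint stores its weights under a "model" key,
    # as icefall checkpoints do; non-float tensors are copied from the
    # first checkpoint unchanged.
    n = len(paths)
    avg = None
    for p in paths:
        sd = torch.load(p, map_location="cpu")["model"]
        if avg is None:
            avg = {k: v.clone().float() if v.is_floating_point() else v.clone()
                   for k, v in sd.items()}
        else:
            for k, v in sd.items():
                if v.is_floating_point():
                    avg[k] += v.float()
    return {k: v / n if v.is_floating_point() else v for k, v in avg.items()}

# Usage idea (hypothetical paths):
#   model.load_state_dict(average_checkpoints(["exp/epoch-29.pt", "exp/epoch-30.pt"]))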
2023-05-03 11:35:33,975 INFO [decode.py:924] Number of model parameters: 70369391
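The parameter count is the total number of elements across all parameter tensors of the model; a self-contained helper that reproduces this kind of figure for any torch.nn.Module:

import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    # Sum the element counts of every parameter tensor.
    return sum(p.numel() for p in model.parameters())

# Stand-in module, not the zipformer itself: 384*512 weights + 512 biases.
print(count_parameters(nn.Linear(384, 512)))  # 197120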
2023-05-03 11:35:33,975 INFO [asr_datamodule.py:449] About to get test-clean cuts
2023-05-03 11:35:33,980 INFO [asr_datamodule.py:456] About to get test-other cuts
2023-05-03 11:35:38,422 INFO [decode.py:674] batch 0/?, cuts processed until now is 44
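Decoding here uses greedy_search with max_sym_per_frame=1 and a stateless decoder with context_size=2. A minimal sketch of that search for a single utterance, assuming hypothetical decoder_fn and joiner_fn callables standing in for the decoder and joiner networks (the real implementation is in icefall's beam_search.py):

import torch

def greedy_search(encoder_out: torch.Tensor, decoder_fn, joiner_fn,
                  blank_id: int = 0, context_size: int = 2):
    # encoder_out: (T, encoder_dim) for one utterance.
    hyp = [blank_id] * context_size  # decoder context, padded with blanks
    for t in range(encoder_out.size(0)):
        context = torch.tensor(hyp[-context_size:]).unsqueeze(0)  # (1, 2)
        dec_out = decoder_fn(context)                             # (1, decoder_dim)
        logits = joiner_fn(encoder_out[t : t + 1], dec_out)       # (1, vocab_size)
        y = int(logits.argmax(dim=-1))
        if y != blank_id:  # emit at most one symbol per frame
            hyp.append(y)
    return hyp[context_size:]  # strip the initial blank context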
2023-05-03 11:36:27,968 INFO [decode.py:688] The transcripts are stored in pruned_transducer_stateless7/exp_multidataset/greedy_search/recogs-test-clean-epoch-30-avg-1-context-2-max-sym-per-frame-1-use-averaged-model.txt
2023-05-03 11:36:28,062 INFO [utils.py:558] [test-clean-greedy_search] %WER 1.90% [999 / 52576, 93 ins, 118 del, 788 sub ]
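The WER line counts insertions, deletions and substitutions against the number of reference words: (93 + 118 + 788) / 52576 = 999 / 52576 ≈ 1.90%. A quick check of that arithmetic:

# Reproduces the WER arithmetic from the log line above.
ins, dele, sub, ref_words = 93, 118, 788, 52576
errors = ins + dele + sub                           # 999
wer = 100.0 * errors / ref_words
print(f"%WER {wer:.2f}% [{errors} / {ref_words}]")  # %WER 1.90% [999 / 52576]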
2023-05-03 11:36:28,256 INFO [decode.py:699] Wrote detailed error stats to pruned_transducer_stateless7/exp_multidataset/greedy_search/errs-test-clean-epoch-30-avg-1-context-2-max-sym-per-frame-1-use-averaged-model.txt
2023-05-03 11:36:28,256 INFO [decode.py:713]
For test-clean, WER of different settings are:
greedy_search 1.9 best for test-clean
2023-05-03 11:36:30,075 INFO [decode.py:674] batch 0/?, cuts processed until now is 52
2023-05-03 11:36:50,980 INFO [zipformer.py:1454] attn_weights_entropy = tensor([3.7565, 3.6993, 3.8258, 3.8986, 3.9204, 3.8740, 3.8114, 3.9465],
device='cuda:0'), covar=tensor([0.1241, 0.0872, 0.0934, 0.0534, 0.0523, 0.0545, 0.0672, 0.0753],
device='cuda:0'), in_proj_covar=tensor([0.0686, 0.0833, 0.0962, 0.0853, 0.0646, 0.0669, 0.0708, 0.0823],
device='cuda:0'), out_proj_covar=tensor([0.0001, 0.0002, 0.0002, 0.0002, 0.0002, 0.0002, 0.0002, 0.0002],
device='cuda:0')
2023-05-03 11:37:15,613 INFO [decode.py:688] The transcripts are stored in pruned_transducer_stateless7/exp_multidataset/greedy_search/recogs-test-other-epoch-30-avg-1-context-2-max-sym-per-frame-1-use-averaged-model.txt
2023-05-03 11:37:15,708 INFO [utils.py:558] [test-other-greedy_search] %WER 4.14% [2166 / 52343, 218 ins, 218 del, 1730 sub ]
2023-05-03 11:37:15,905 INFO [decode.py:699] Wrote detailed error stats to pruned_transducer_stateless7/exp_multidataset/greedy_search/errs-test-other-epoch-30-avg-1-context-2-max-sym-per-frame-1-use-averaged-model.txt
2023-05-03 11:37:15,905 INFO [decode.py:713]
For test-other, WER of different settings are:
greedy_search 4.14 best for test-other
2023-05-03 11:37:15,906 INFO [decode.py:958] Done!