
scream_non_large_timestamp_test

This model is a fine-tuned version of NbAiLab/scream_non_large_1e06_beams5_constantlr_long on the NbAiLab/NCC_whisper_both_timestamp dataset.
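For reference, a minimal transcription sketch is shown below. It assumes this checkpoint is published under the NbAiLab organization (the exact repository id is not stated on this card) and that the standard transformers ASR pipeline for Whisper-style checkpoints applies; return_timestamps=True requests the segment timestamps this fine-tune targets.

```python
from transformers import pipeline

# Hypothetical repository id; adjust to the actual location of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLab/scream_non_large_timestamp_test",
)

# return_timestamps=True asks the Whisper decoder to emit segment-level
# timestamps alongside the transcript text.
result = asr("audio.wav", return_timestamps=True)
print(result["text"])
for chunk in result["chunks"]:
    print(chunk["timestamp"], chunk["text"])
```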

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough configuration sketch follows the list):

  • learning_rate: 1e-06
  • lr_scheduler_type: linear
  • per_device_train_batch_size: 16
  • total_train_batch_size_per_node: 64
  • total_train_batch_size: 64
  • total_optimization_steps: 5,000
  • starting_optimization_step: None
  • finishing_optimization_step: 5,000
  • num_train_dataset_workers: 32
  • num_hosts: 1
  • total_num_training_examples: 320,000
  • steps_per_epoch: 475
  • num_beams: 5
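
The training script itself is not included on this card, so purely as an illustration, here is how these values might map onto transformers.Seq2SeqTrainingArguments; output_dir is a placeholder, and the original run may well have used a different (e.g. Flax/JAX) training loop.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the listed hyperparameters; not the actual script.
training_args = Seq2SeqTrainingArguments(
    output_dir="./scream_non_large_timestamp_test",  # placeholder
    learning_rate=1e-6,
    lr_scheduler_type="linear",
    per_device_train_batch_size=16,  # 64 total across devices/accumulation
    max_steps=5_000,                 # total_optimization_steps
    dataloader_num_workers=32,       # num_train_dataset_workers
    generation_num_beams=5,          # beams used when generating for eval
    predict_with_generate=True,      # generate transcripts to score WER/CER
)
```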

Training results

Step   Eval loss   Train loss   Eval WER (%)   Eval CER (%)
0      0.5499      0.6409       10.5055        5.6538
500    0.4301      0.1060       10.1401        4.9232
1000   0.4168      0.0776       10.3837        4.8123
1500   0.4132      0.0689       10.5968        4.8123
2000   0.4176      0.0511       10.5968        4.6712
2500   0.4261      0.0454       11.0536        4.7871
3000   0.4385      0.0441       11.1449        4.7770
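
A minimal sketch of how such WER/CER scores are typically computed with the evaluate library follows; the exact evaluation code is not part of this card, and the transcripts below are placeholders.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder transcripts; in practice these come from model generation
# and the NCC_whisper_both_timestamp reference texts.
predictions = ["dette er en test av modelen"]
references = ["dette er en test av modellen"]

# evaluate returns a fraction; multiply by 100 to match the table's percent.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```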

Framework versions

  • Transformers 4.29.0.dev0
  • Datasets 2.12.0
  • Tokenizers 0.13.3