
Conformer-ctc-medium-ko

ν•΄λ‹Ή λͺ¨λΈμ€ RIVA Conformer ASR Korean을 AI hub dataset에 λŒ€ν•΄ νŒŒμΈνŠœλ‹μ„ μ§„ν–‰ν–ˆμŠ΅λ‹ˆλ‹€.
Conformer 기반의 λͺ¨λΈμ€ whisper와 같은 attention 기반 λͺ¨λΈκ³Ό 달리 streaming을 μ§„ν–‰ν•˜μ—¬λ„ μ„±λŠ₯이 크게 떨어지지 μ•Šκ³ , 속도가 λΉ λ₯΄λ‹€λŠ” μž₯점이 μžˆμŠ΅λ‹ˆλ‹€.
V100 GPUμ—μ„œλŠ” RTFκ°€ 0.05, CPU(7 cores)μ—μ„œλŠ” 0.35 정도 λ‚˜μ˜€λŠ” 것을 확인할 수 μžˆμ—ˆμŠ΅λ‹ˆλ‹€.
μ˜€λ””μ˜€ chunk size 2초의 streaming ν…ŒμŠ€νŠΈμ—μ„œλŠ” 전체 μ˜€λ””μ˜€λ₯Ό λ„£λŠ” 것에 λΉ„ν•΄μ„œλŠ” 20% 정도 μ„±λŠ₯μ €ν•˜κ°€ μžˆμœΌλ‚˜ μΆ©λΆ„νžˆ μ‚¬μš©ν•  수 μžˆλŠ” μ„±λŠ₯μž…λ‹ˆλ‹€.
μΆ”κ°€λ‘œ open domain이 μ•„λ‹Œ 고객 μ‘λŒ€ μŒμ„±κ³Ό 같은 domainμ—μ„œλŠ” kenlm을 μΆ”κ°€ν•˜μ˜€μ„ λ•Œ WER 13.45μ—μ„œ WER 5.27둜 크게 μ„±λŠ₯ ν–₯상이 μžˆμ—ˆμŠ΅λ‹ˆλ‹€.
ν•˜μ§€λ§Œ κ·Έ μ™Έμ˜ domainμ—μ„œλŠ” kenlm의 μΆ”κ°€κ°€ 큰 μ„±λŠ₯ ν–₯μƒμœΌλ‘œ 이어지지 μ•Šμ•˜μŠ΅λ‹ˆλ‹€.

Streaming μ½”λ“œμ™€ Denoise model이 ν¬ν•¨λœ μ½”λ“œλŠ” μ•„λž˜ κΉƒν—™μ—μ„œ 확인할 수 μžˆμŠ΅λ‹ˆλ‹€. https://github.com/SUNGBEOMCHOI/Korean-Streaming-ASR

Training results

| Training Loss | Epoch | WER   |
|---------------|-------|-------|
| 9.09          | 1.0   | 11.51 |
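The WER above is the standard word error rate; a quick, hedged way to compute the same metric on your own reference/hypothesis pairs is the jiwer package (an assumption, not necessarily the tool used to produce this table).

```python
# Hedged example: computing word error rate with jiwer on illustrative pairs.
from jiwer import wer

references = ["안녕하세요 무엇을 도와드릴까요", "예약 시간을 바꾸고 싶습니다"]
hypotheses = ["안녕하세요 무엇을 도와드릴까요", "예약 시간을 바꾸고 싶어요"]
print(wer(references, hypotheses))  # fraction of word errors; the table above reports 11.51 (%)
```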

Dataset

| Dataset name | Samples (train/test) |
|--------------|----------------------|
| 고객응대음성 (customer-service speech) | 2,067,668 / 21,092 |
| 한국어 음성 (Korean speech) | 620,000 / 3,000 |
| 한국인 대화 음성 (Korean conversation speech) | 2,483,570 / 142,399 |
| 자유대화음성(일반남녀) (free conversation, general adults) | 1,886,882 / 263,371 |
| 복지 분야 콜센터 상담데이터 (welfare call-center counseling data) | 1,096,704 / 206,470 |
| 차량내 대화 데이터 (in-vehicle conversation data) | 2,624,132 / 332,787 |
| 명령어 음성(노인남여) (command speech, elderly) | 137,467 / 237,469 |
| Total | 10,916,423 (13,946 h) / 1,206,588 (1,474 h) |
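For reference, one common way to feed AI Hub data into a NeMo CTC model is a JSONL manifest per split. The sketch below builds one entry per utterance; the directory layout and file names are made up, and only the `audio_filepath`/`duration`/`text` fields follow the standard NeMo manifest format.

```python
# Sketch of writing a NeMo-style JSONL manifest from (wav, transcript) pairs.
# The paths and transcripts here are illustrative placeholders.
import json

import soundfile as sf

pairs = [
    ("data/aihub/sample_0001.wav", "안녕하세요 무엇을 도와드릴까요"),
    ("data/aihub/sample_0002.wav", "예약 시간을 바꾸고 싶습니다"),
]

with open("train_manifest.json", "w", encoding="utf-8") as f:
    for wav_path, transcript in pairs:
        entry = {
            "audio_filepath": wav_path,
            "duration": sf.info(wav_path).duration,
            "text": transcript,
        }
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")
```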

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the fine-tuning sketch after the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • num_train_epoch: 1
  • sample_rate: 16000
  • max_duration: 20.0
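Below is a rough sketch of how these values could be wired into a NeMo fine-tuning run. The checkpoint name, manifest paths, optimizer choice, and exact config keys are assumptions, and details differ across NeMo/PyTorch Lightning versions.

```python
# Hedged fine-tuning sketch using the hyperparameters listed above.
# Checkpoint name, manifest paths, and optimizer settings are assumptions.
import pytorch_lightning as pl
import nemo.collections.asr as nemo_asr
from omegaconf import OmegaConf

trainer = pl.Trainer(max_epochs=1, accelerator="gpu", devices=1)  # num_train_epoch: 1

model = nemo_asr.models.EncDecCTCModelBPE.restore_from("conformer_ctc_medium_ko.nemo")
model.set_trainer(trainer)

model.setup_training_data(OmegaConf.create({
    "manifest_filepath": "train_manifest.json",
    "sample_rate": 16000,       # sample_rate
    "batch_size": 16,           # train_batch_size
    "max_duration": 20.0,       # skip utterances longer than 20 s
    "shuffle": True,
}))
model.setup_validation_data(OmegaConf.create({
    "manifest_filepath": "test_manifest.json",
    "sample_rate": 16000,
    "batch_size": 16,           # eval_batch_size
    "shuffle": False,
}))
model.setup_optimization(OmegaConf.create({"name": "adamw", "lr": 1e-5}))  # learning_rate

trainer.fit(model)
```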