xls-r-300m-fr-0 / eval_results.json
{
    "epoch": 2.0,
    "eval_loss": 0.23875188827514648,
    "eval_runtime": 294.1776,
    "eval_samples": 5792,
    "eval_samples_per_second": 19.689,
    "eval_steps_per_second": 0.309,
    "eval_wer": 0.3680797679950471
}