wiki-finetuned-pythia-70m-deduped / train_results.json
{
"epoch": 3.0,
"train_loss": 3.1020872242466644,
"train_runtime": 9159.1702,
"train_samples": 114937,
"train_samples_per_second": 37.647,
"train_steps_per_second": 0.294
}
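
This summary is the kind of file the Hugging Face `Trainer` writes via `trainer.save_metrics("train", metrics)` after `trainer.train()` completes; that producer is an assumption here, since the training script itself is not part of this page. The sketch below simply loads the JSON and derives a couple of figures implied by it (the file path and printed labels are illustrative).

```python
import json

# Load the saved training summary (path assumed: repo root).
with open("train_results.json") as f:
    metrics = json.load(f)

# Figures implied by the summary:
# total optimizer steps ~= runtime (s) * steps per second.
total_steps = metrics["train_runtime"] * metrics["train_steps_per_second"]

print(f"epochs:             {metrics['epoch']}")
print(f"final train loss:   {metrics['train_loss']:.4f}")
print(f"approx. total steps: {total_steps:.0f}")  # ~2693 for the values above
```

With the numbers recorded here, 9159.17 s at 0.294 steps/s works out to roughly 2,693 optimizer steps over 3 epochs of 114,937 training samples.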