Llama-2-7b-ultrachat-syn200k-2e / train_results.json
{
"epoch": 1.99936,
"total_flos": 534097819484160.0,
"train_loss": 0.14969347108265196,
"train_runtime": 45991.4783,
"train_samples": 100000,
"train_samples_per_second": 4.349,
"train_steps_per_second": 0.034
}