
Uploaded model

  • Developed by: xxxxxccc
  • License: apache-2.0
  • Fine-tuned from model: unsloth/Qwen2-7b-bnb-4bit

This Qwen2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
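
A minimal sketch of that recipe is below. Only the base model name and the Unsloth/TRL stack come from this card; the dataset file, LoRA settings, and hyperparameters are illustrative assumptions, not the actual training script.

```python
# Minimal sketch of the Unsloth + TRL SFT recipe this card describes.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

max_seq_length = 2048  # assumption

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2-7b-bnb-4bit",  # base model named on this card
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)

# Attach LoRA adapters; rank/alpha/target modules are common defaults,
# not taken from the card.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Hypothetical dataset: a JSONL file whose records carry a "text" field.
data = load_dataset("json", data_files="train.jsonl", split="train")
splits = data.train_test_split(test_size=0.01, seed=42)

trainer = SFTTrainer(  # older TRL-style API; newer TRL moves these into SFTConfig
    model=model,
    tokenizer=tokenizer,
    train_dataset=splits["train"],
    eval_dataset=splits["test"],
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        max_steps=690,          # matches the last step in the log below
        logging_steps=10,       # the log reports losses every 10 steps
        eval_strategy="steps",  # "evaluation_strategy" in older transformers
        eval_steps=10,
        bf16=True,              # matches the BF16 tensor type listed below
    ),
)
trainer.train()
```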

Training log

| Step | Training Loss | Validation Loss |
|-----:|--------------:|----------------:|
| 10 | 1.969100 | 2.004664 |
| 20 | 1.912000 | 2.000807 |
| 30 | 1.973500 | 1.997514 |
| 40 | 1.987000 | 1.995450 |
| 50 | 2.035200 | 1.992437 |
| 60 | 1.947100 | 1.989926 |
| 70 | 1.913200 | 1.988254 |
| 80 | 1.984200 | 1.985698 |
| 90 | 1.829400 | 1.984344 |
| 100 | 1.925600 | 1.982281 |
| 110 | 1.924200 | 1.980534 |
| 120 | 1.946400 | 1.979197 |
| 130 | 1.886500 | 1.977808 |
| 140 | 1.911200 | 1.976381 |
| 150 | 1.855700 | 1.974918 |
| 160 | 1.906900 | 1.973701 |
| 170 | 1.827500 | 1.972471 |
| 180 | 1.905400 | 1.972400 |
| 190 | 1.864500 | 1.972158 |
| 200 | 1.974000 | 1.971486 |
| 210 | 2.020100 | 1.970601 |
| 220 | 1.835600 | 1.969159 |
| 230 | 1.873000 | 1.969961 |
| 240 | 1.853200 | 1.968564 |
| 250 | 1.892800 | 1.968765 |
| 260 | 1.808400 | 1.967971 |
| 270 | 1.818600 | 1.967605 |
| 280 | 1.866600 | 1.967552 |
| 290 | 1.761000 | 1.966953 |
| 300 | 1.860300 | 1.966536 |
| 310 | 1.793400 | 1.966086 |
| 320 | 1.814500 | 1.965425 |
| 330 | 1.978200 | 1.965850 |
| 340 | 1.868600 | 1.965540 |
| 350 | 1.834300 | 1.966008 |
| 360 | 1.822400 | 1.966800 |
| 370 | 1.896100 | 1.968465 |
| 380 | 1.883600 | 1.967751 |
| 390 | 1.810500 | 1.967558 |
| 400 | 1.808000 | 1.967848 |
| 410 | 1.771100 | 1.968701 |
| 420 | 1.877800 | 1.967933 |
| 430 | 1.838300 | 1.968531 |
| 440 | 1.717500 | 1.968299 |
| 450 | 1.848500 | 1.969323 |
| 460 | 1.794400 | 1.969219 |
| 470 | 1.864300 | 1.969595 |
| 480 | 1.768400 | 1.968718 |
| 490 | 1.682400 | 1.969312 |
| 500 | 1.835200 | 1.967268 |
| 510 | 1.754400 | 1.968593 |
| 520 | 1.870700 | 1.968871 |
| 530 | 1.810000 | 1.972527 |
| 540 | 1.813400 | 1.972523 |
| 550 | 1.767600 | 1.973855 |
| 560 | 1.874200 | 1.974136 |
| 570 | 1.791200 | 1.973645 |
| 580 | 1.904100 | 1.974470 |
| 590 | 1.792400 | 1.972956 |
| 600 | 1.841000 | 1.974010 |
| 610 | 1.769700 | 1.974349 |
| 620 | 2.016400 | 1.974549 |
| 630 | 1.804100 | 1.974385 |
| 640 | 1.891100 | 1.975045 |
| 650 | 1.785100 | 1.975142 |
| 660 | 1.760900 | 1.974409 |
| 670 | 1.822400 | 1.974042 |
| 680 | 1.783800 | 1.974177 |
| 690 | 1.757100 | 1.974227 |

Model details

  • Format: Safetensors
  • Model size: 7.62B params
  • Tensor type: BF16
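
The parameter count can be reproduced from the config alone, without downloading the weights, by instantiating the architecture on empty tensors. A minimal sketch, assuming accelerate is installed; "REPO_ID" is a placeholder, since the card does not spell out the repository id:

```python
# Sketch: verify the 7.62B parameter count from the model config alone.
from accelerate import init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("REPO_ID")  # placeholder repo id
with init_empty_weights():  # allocates meta tensors only; no weight download
    model = AutoModelForCausalLM.from_config(config)

total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e9:.2f}B params")  # expected: ~7.62B
```
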
Inference Examples
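
The model can be run locally with transformers. A minimal generation sketch; "REPO_ID" is a placeholder for this repository's id, and the prompt and sampling settings are illustrative:

```python
# Sketch: local text generation with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("REPO_ID")  # placeholder repo id
model = AutoModelForCausalLM.from_pretrained(
    "REPO_ID",
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type above
    device_map="auto",           # requires accelerate
)

prompt = "Explain LoRA fine-tuning in one short paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True,
                        temperature=0.7, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```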