---
tags:
- generated_from_trainer
datasets:
- kanishka/babylm2-subset
metrics:
- accuracy
model-index:
- name: cria-babylm2-subset-default-1e-3
  results:
  - task:
      name: Causal Language Modeling
      type: text-generation
    dataset:
      name: kanishka/babylm2-subset
      type: kanishka/babylm2-subset
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.5203706477236009
---

# cria-babylm2-subset-default-1e-3

This model was trained from scratch on the kanishka/babylm2-subset dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6626
- Accuracy: 0.5204

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 32000
- num_epochs: 10.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|
| 2.4397        | 1.0   | 14142  | 2.6684          | 0.4955   |
| 2.3085        | 2.0   | 28284  | 2.5420          | 0.5093   |
| 2.19          | 3.0   | 42426  | 2.4397          | 0.5215   |
| 2.0865        | 4.0   | 56568  | 2.3943          | 0.5276   |
| 1.9957        | 5.0   | 70710  | 2.3786          | 0.5305   |
| 1.9161        | 6.0   | 84852  | 2.3910          | 0.5313   |
| 1.8361        | 7.0   | 98994  | 2.4205          | 0.5304   |
| 1.7477        | 8.0   | 113136 | 2.4748          | 0.5283   |
| 1.6549        | 9.0   | 127278 | 2.5582          | 0.5249   |
| 1.5611        | 10.0  | 141420 | 2.6626          | 0.5204   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.2.0+cu121
- Datasets 2.16.1
- Tokenizers 0.19.1
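
For reference, the hyperparameters listed under "Training hyperparameters" can be expressed as a `transformers.TrainingArguments` configuration. This is a minimal sketch, not the original training script: it assumes the listed batch sizes are per-device values and that "Native AMP" corresponds to `fp16=True`; the `output_dir` is a placeholder, and only the numeric values come from this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cria-babylm2-subset-default-1e-3",  # placeholder path, not from the card
    learning_rate=1e-3,
    per_device_train_batch_size=32,   # assumes train_batch_size is per device
    per_device_eval_batch_size=64,    # assumes eval_batch_size is per device
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=32000,
    num_train_epochs=10.0,
    fp16=True,  # mixed precision via native AMP (assumed fp16 rather than bf16)
)
```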