---
language:
  - en
license: mit
tags:
  - generated_from_trainer
datasets:
  - tomekkorbak/detoxify-pile-chunk3-0-50000
  - tomekkorbak/detoxify-pile-chunk3-50000-100000
  - tomekkorbak/detoxify-pile-chunk3-100000-150000
  - tomekkorbak/detoxify-pile-chunk3-150000-200000
  - tomekkorbak/detoxify-pile-chunk3-200000-250000
  - tomekkorbak/detoxify-pile-chunk3-250000-300000
  - tomekkorbak/detoxify-pile-chunk3-300000-350000
  - tomekkorbak/detoxify-pile-chunk3-350000-400000
  - tomekkorbak/detoxify-pile-chunk3-400000-450000
  - tomekkorbak/detoxify-pile-chunk3-450000-500000
  - tomekkorbak/detoxify-pile-chunk3-500000-550000
  - tomekkorbak/detoxify-pile-chunk3-550000-600000
  - tomekkorbak/detoxify-pile-chunk3-600000-650000
  - tomekkorbak/detoxify-pile-chunk3-650000-700000
  - tomekkorbak/detoxify-pile-chunk3-700000-750000
  - tomekkorbak/detoxify-pile-chunk3-750000-800000
  - tomekkorbak/detoxify-pile-chunk3-800000-850000
  - tomekkorbak/detoxify-pile-chunk3-850000-900000
  - tomekkorbak/detoxify-pile-chunk3-900000-950000
  - tomekkorbak/detoxify-pile-chunk3-950000-1000000
  - tomekkorbak/detoxify-pile-chunk3-1000000-1050000
  - tomekkorbak/detoxify-pile-chunk3-1050000-1100000
  - tomekkorbak/detoxify-pile-chunk3-1100000-1150000
  - tomekkorbak/detoxify-pile-chunk3-1150000-1200000
  - tomekkorbak/detoxify-pile-chunk3-1200000-1250000
  - tomekkorbak/detoxify-pile-chunk3-1250000-1300000
  - tomekkorbak/detoxify-pile-chunk3-1300000-1350000
  - tomekkorbak/detoxify-pile-chunk3-1350000-1400000
  - tomekkorbak/detoxify-pile-chunk3-1400000-1450000
  - tomekkorbak/detoxify-pile-chunk3-1450000-1500000
  - tomekkorbak/detoxify-pile-chunk3-1500000-1550000
  - tomekkorbak/detoxify-pile-chunk3-1550000-1600000
  - tomekkorbak/detoxify-pile-chunk3-1600000-1650000
  - tomekkorbak/detoxify-pile-chunk3-1650000-1700000
  - tomekkorbak/detoxify-pile-chunk3-1700000-1750000
  - tomekkorbak/detoxify-pile-chunk3-1750000-1800000
  - tomekkorbak/detoxify-pile-chunk3-1800000-1850000
  - tomekkorbak/detoxify-pile-chunk3-1850000-1900000
  - tomekkorbak/detoxify-pile-chunk3-1900000-1950000
model-index:
  - name: goofy_pasteur
    results: []
---

# goofy_pasteur

## Model description

This model was trained using pile-detoxify: data from The Pile, annotated for toxicity with Detoxify.
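
For reference, here is a minimal sketch of scoring text with the `detoxify` package. It illustrates the kind of scores used for annotation; it is not the exact pipeline that produced pile-detoxify:

```python
# Illustrative only: score a string for toxicity with Detoxify.
# Not the exact pipeline used to annotate pile-detoxify.
from detoxify import Detoxify

scores = Detoxify("original").predict("You are a wonderful person.")
print(scores["toxicity"])  # float in [0, 1]; lower means less toxic
```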

## Intended uses & limitations

This model has been trained to generate text that receives a low score for toxicity from Detoxify.
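
As a usage sketch, the model can be loaded with 🤗 Transformers. The Hub id `tomekkorbak/goofy_pasteur` is an assumption based on the `hub_model_id` in the full config below:

```python
# Sketch: sample from the model with the text-generation pipeline.
# The Hub id is an assumption based on hub_model_id='goofy_pasteur'.
from transformers import pipeline

generator = pipeline("text-generation", model="tomekkorbak/goofy_pasteur")
sample = generator("The weather today", do_sample=True, max_length=50)
print(sample[0]["generated_text"])
```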

While the methods used to avoid toxic text show promising results, we cannot guarantee that the model's output will be non-toxic in every situation. This model and its associated datasets are intended for research purposes only and should not be deployed anywhere.

Please take care to avoid misusing the datasets used to train this model (in which toxicity and personally identifiable information are annotated) and to avoid putting anybody in danger by publicizing their information.

## Training and evaluation data

This model was trained using pile-detoxify.
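
The training set is split across the chunk datasets listed in the metadata above. A sketch of assembling them, assuming each chunk is available on the Hub with a `train` split:

```python
# Sketch: load and concatenate the 39 detoxify-pile chunks.
# Assumes each chunk dataset exposes a 'train' split.
from datasets import concatenate_datasets, load_dataset

chunk_names = [
    f"tomekkorbak/detoxify-pile-chunk3-{start}-{start + 50_000}"
    for start in range(0, 1_950_000, 50_000)
]
train_data = concatenate_datasets(
    [load_dataset(name, split="train") for name in chunk_names]
)
```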

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.01
- training_steps: 50354
- mixed_precision_training: Native AMP
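
A sketch of these settings expressed as 🤗 `TrainingArguments`; `output_dir` is a placeholder here, and the authoritative record of the run is the full config below:

```python
# Sketch: the listed hyperparameters as TrainingArguments.
# output_dir is a placeholder, not the actual run directory.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="training_output",   # placeholder
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,  # 16 * 4 = 64 effective batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.01,
    max_steps=50354,
    fp16=True,                      # Native AMP mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```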

### Framework versions

- Transformers 4.20.1
- Pytorch 1.11.0+cu113
- Datasets 2.5.1
- Tokenizers 0.11.6

## Full config

```
{'dataset': {'datasets': ['tomekkorbak/detoxify-pile-chunk3-0-50000', 'tomekkorbak/detoxify-pile-chunk3-50000-100000', 'tomekkorbak/detoxify-pile-chunk3-100000-150000', 'tomekkorbak/detoxify-pile-chunk3-150000-200000', 'tomekkorbak/detoxify-pile-chunk3-200000-250000', 'tomekkorbak/detoxify-pile-chunk3-250000-300000', 'tomekkorbak/detoxify-pile-chunk3-300000-350000', 'tomekkorbak/detoxify-pile-chunk3-350000-400000', 'tomekkorbak/detoxify-pile-chunk3-400000-450000', 'tomekkorbak/detoxify-pile-chunk3-450000-500000', 'tomekkorbak/detoxify-pile-chunk3-500000-550000', 'tomekkorbak/detoxify-pile-chunk3-550000-600000', 'tomekkorbak/detoxify-pile-chunk3-600000-650000', 'tomekkorbak/detoxify-pile-chunk3-650000-700000', 'tomekkorbak/detoxify-pile-chunk3-700000-750000', 'tomekkorbak/detoxify-pile-chunk3-750000-800000', 'tomekkorbak/detoxify-pile-chunk3-800000-850000', 'tomekkorbak/detoxify-pile-chunk3-850000-900000', 'tomekkorbak/detoxify-pile-chunk3-900000-950000', 'tomekkorbak/detoxify-pile-chunk3-950000-1000000', 'tomekkorbak/detoxify-pile-chunk3-1000000-1050000', 'tomekkorbak/detoxify-pile-chunk3-1050000-1100000', 'tomekkorbak/detoxify-pile-chunk3-1100000-1150000', 'tomekkorbak/detoxify-pile-chunk3-1150000-1200000', 'tomekkorbak/detoxify-pile-chunk3-1200000-1250000', 'tomekkorbak/detoxify-pile-chunk3-1250000-1300000', 'tomekkorbak/detoxify-pile-chunk3-1300000-1350000', 'tomekkorbak/detoxify-pile-chunk3-1350000-1400000', 'tomekkorbak/detoxify-pile-chunk3-1400000-1450000', 'tomekkorbak/detoxify-pile-chunk3-1450000-1500000', 'tomekkorbak/detoxify-pile-chunk3-1500000-1550000', 'tomekkorbak/detoxify-pile-chunk3-1550000-1600000', 'tomekkorbak/detoxify-pile-chunk3-1600000-1650000', 'tomekkorbak/detoxify-pile-chunk3-1650000-1700000', 'tomekkorbak/detoxify-pile-chunk3-1700000-1750000', 'tomekkorbak/detoxify-pile-chunk3-1750000-1800000', 'tomekkorbak/detoxify-pile-chunk3-1800000-1850000', 'tomekkorbak/detoxify-pile-chunk3-1850000-1900000', 'tomekkorbak/detoxify-pile-chunk3-1900000-1950000'], 'is_split_by_sentences': True}, 'generation': {'force_call_on': [25354], 'metrics_configs': [{}, {'n': 1}, {'n': 2}, {'n': 5}], 'scenario_configs': [{'generate_kwargs': {'do_sample': True, 'max_length': 128, 'min_length': 10, 'temperature': 0.7, 'top_k': 0, 'top_p': 0.9}, 'name': 'unconditional', 'num_samples': 2048}, {'generate_kwargs': {'do_sample': True, 'max_length': 128, 'min_length': 10, 'temperature': 0.7, 'top_k': 0, 'top_p': 0.9}, 'name': 'challenging_rtp', 'num_samples': 2048, 'prompts_path': 'resources/challenging_rtp.jsonl'}], 'scorer_config': {'device': 'cuda:0'}}, 'kl_gpt3_callback': {'force_call_on': [25354], 'max_tokens': 64, 'num_samples': 4096}, 'model': {'from_scratch': True, 'gpt2_config_kwargs': {'reorder_and_upcast_attn': True, 'scale_attn_by': True}, 'path_or_name': 'gpt2'}, 'objective': {'name': 'MLE'}, 'tokenizer': {'path_or_name': 'gpt2'}, 'training': {'dataloader_num_workers': 0, 'effective_batch_size': 64, 'evaluation_strategy': 'no', 'fp16': True, 'hub_model_id': 'goofy_pasteur', 'hub_strategy': 'all_checkpoints', 'learning_rate': 0.0005, 'logging_first_step': True, 'logging_steps': 1, 'num_tokens': 3300000000, 'output_dir': 'training_output104340', 'per_device_train_batch_size': 16, 'push_to_hub': True, 'remove_unused_columns': False, 'save_steps': 25354, 'save_strategy': 'steps', 'seed': 42, 'warmup_ratio': 0.01, 'weight_decay': 0.1}}
```
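
For reference, the `unconditional` generation scenario above can be reproduced with the following sketch (the Hub id is again an assumption based on `hub_model_id`):

```python
# Sketch: unconditional sampling with the generate_kwargs from the config.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tomekkorbak/goofy_pasteur")
model = AutoModelForCausalLM.from_pretrained("tomekkorbak/goofy_pasteur")

# Seed generation with the BOS token ('<|endoftext|>' for GPT-2).
inputs = tokenizer(tokenizer.bos_token, return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,
    min_length=10,
    max_length=128,
    temperature=0.7,
    top_k=0,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```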

## Wandb URL

https://wandb.ai/tomekkorbak/apo/runs/20d87pk8