---
license: apache-2.0
pipeline_tag: text2text-generation
tags:
- llama
- llm
---
This is a LoRA checkpoint fine-tuned with the CLI below. The fine-tuning process is logged in the [W&B dashboard](https://wandb.ai/chansung18/alpaca_lora/runs/l6lepk3g?workspace=user-chansung18). I used a DGX workstation with 8 x A100 (40G).
```console
python finetune.py \
--base_model='elinas/llama-30b-hf-transformers-4.29' \
--data_path='alpaca_data.json' \
--num_epochs=10 \
--cutoff_len=1024 \
--group_by_length \
--output_dir='./lora-alpaca-30b-elinas' \
--lora_target_modules='[q_proj,k_proj,v_proj,o_proj]' \
--lora_r=16 \
--lora_alpha=32 \
--batch_size=1024 \
--micro_batch_size=20
```
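For reference, a minimal sketch of the `peft` `LoraConfig` these flags roughly correspond to; the `lora_dropout`, `bias`, and `task_type` values are assumptions (library defaults), not settings read from the training run.
```python
# Rough peft equivalent of the LoRA flags above.
# lora_dropout, bias, and task_type are assumed, not taken from the run.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                                                      # --lora_r
    lora_alpha=32,                                             # --lora_alpha
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],   # --lora_target_modules
    lora_dropout=0.05,                                         # assumed default
    bias="none",
    task_type="CAUSAL_LM",
)
```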
This LoRA checkpoint should be used with `transformers >= 4.29`. As of 4/30/2023, that version has to be installed from source with the following command:
```console
pip install git+https://github.com/huggingface/transformers.git
```
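A minimal inference sketch for loading the adapter on top of the base model with `peft`; the adapter repo id below is illustrative, so replace it with this repository's actual id.
```python
# Sketch: load the base model, attach the LoRA adapter, and generate.
# The adapter repo id is a placeholder, not the confirmed id of this checkpoint.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_model_id = "elinas/llama-30b-hf-transformers-4.29"
adapter_id = "chansung/alpaca-lora-30b"  # placeholder; use this repo's id

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, adapter_id)

prompt = "Below is an instruction that describes a task. Write a response.\n\n### Instruction:\nExplain LoRA briefly.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```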