chuanli-lambda committed 25d0be4 (parent: 7bf9ada): Update README.md
The model was trained on the [`Dahoas/synthetic-instruct-gptj-pairwise`](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise) dataset. We split the original dataset into a training subset (the first 32,000 examples) and a validation subset (the remaining 1,144 examples).
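The split above can be sketched in a few lines. This is a minimal illustration using hypothetical stand-in records in plain Python lists (the real data would come from the `Dahoas/synthetic-instruct-gptj-pairwise` dataset on the Hub), not the actual preprocessing code used for training:

```python
# Sketch of the train/validation split described above.
# `examples` is hypothetical stand-in data with the same total size (33,144)
# as the real Dahoas/synthetic-instruct-gptj-pairwise dataset.
TRAIN_SIZE = 32000

examples = [{"prompt": f"q{i}", "chosen": f"a{i}"} for i in range(33144)]

train_set = examples[:TRAIN_SIZE]  # first 32,000 examples
val_set = examples[TRAIN_SIZE:]    # remaining 1,144 examples

print(len(train_set), len(val_set))  # 32000 1144
```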
We finetuned the model for 4 epochs, which took 2 hours on 8xA100 80GB GPUs. We set `batch_size_per_gpu` to `8` (for a global batch size of 64) and the learning rate to `0.00002`, with linear decay to zero at the final training step. A Weights and Biases record of the run is available [here](https://wandb.ai/chuanli11/public-ft-synthetic-instruct-gptj-pairwise-pythia1-4b?workspace=user-).
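The hyperparameters above fit together as follows. This is a minimal sketch, not the actual training loop; the `linear_decay_lr` helper is hypothetical and only illustrates a schedule that decays linearly from the peak learning rate to zero at the last step:

```python
# Hypothetical sketch of the hyperparameters described above.
NUM_GPUS = 8
BATCH_SIZE_PER_GPU = 8
GLOBAL_BATCH_SIZE = NUM_GPUS * BATCH_SIZE_PER_GPU  # 8 GPUs x 8 per GPU = 64
PEAK_LR = 0.00002

def linear_decay_lr(step: int, total_steps: int, peak_lr: float = PEAK_LR) -> float:
    """Learning rate decaying linearly from peak_lr to zero at the last step."""
    return peak_lr * (1 - step / total_steps)

print(GLOBAL_BATCH_SIZE)            # 64
print(linear_decay_lr(0, 1000))     # peak LR at the first step
print(linear_decay_lr(1000, 1000))  # 0.0 at the last step
```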