Commit f1c4afe (parent: 3c5bf87) by chrisociepa: Update README.md

README.md CHANGED
@@ -1,8 +1,15 @@
 ---
 license: mit
+language:
+- pl
+tags:
+- llama
+- alpaca
+- lora
+- self-instruct
 ---
 
-This repo contains a low-rank adapter for LLaMA-7B trained on generated (not translated!) 55125 instructions in Polish.
+This repo contains a low-rank adapter for LLaMA-7B trained on 55,125 generated (not translated!) [instructions](https://huggingface.co/datasets/chrisociepa/self-generated-instructions-pl) in Polish.
 
 The training took almost 16 hours on a single RTX 4090 with the following hyperparameters:
 
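The "low-rank adapter" mentioned in the README is a LoRA module: the frozen pretrained weight W is augmented by a trainable rank-r update B @ A, which is why the adapter is tiny compared to the 7B base model. A minimal NumPy sketch of that idea (the dimensions and rank here are illustrative assumptions, not the repo's actual training configuration):

```python
# Sketch of low-rank adaptation (LoRA): the frozen weight W is adjusted
# by a low-rank product B @ A; only A and B are trained.
import numpy as np

d, k, r = 64, 64, 8                      # illustrative layer dims and rank (r << d)
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))          # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01   # trainable down-projection factor
B = np.zeros((d, r))                     # B is zero-initialized, so the adapter
                                         # starts as a no-op

W_adapted = W + B @ A                    # effective weight after merging the adapter

# Before any training step, the merged weight equals the original.
assert np.allclose(W_adapted, W)
```

Because B starts at zero, the adapted model reproduces the base model exactly until training updates A and B; at inference time the product B @ A can be merged into W, so the adapter adds no latency.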