|
--- |
|
license: apache-2.0 |
|
datasets: |
|
- nampdn-ai/tiny-textbooks |
|
library_name: transformers |
|
tags: |
|
- general |
|
language: |
|
- en |
|
pipeline_tag: text-generation |
|
metrics: |
|
- accuracy |
|
--- |
|
# LLaMA-2-7b-Tinytext
|
![LLaMA-2-7b-Tinytext](https://cdn.midjourney.com/3ded8b9e-a1e1-4893-8dff-26d82d81db22/0_1.png)
|
## Model Details
|
Llama2-7b-tinytext is a language model fine-tuned on top of TinyPixel/Llama-2-7B-bf16-sharded using the nampdn-ai/tiny-textbooks dataset.
|
|
|
### Model Description |
|
|
|
Llama2-7b-tinytext is a 7B-parameter causal language model for English text generation, fine-tuned from TinyPixel/Llama-2-7B-bf16-sharded on the nampdn-ai/tiny-textbooks dataset.
|
|
|
|
|
- **Developed by:** Collin Heenan |
|
- **Model type:** Causal decoder-only transformer (Llama 2 architecture)
|
- **Language(s) (NLP):** English |
|
- **Finetuned from model:** TinyPixel/Llama-2-7B-bf16-sharded |
|
|
|
### Model Sources
|
|
|
|
|
|
- **Repository:** [More Information Needed]

- **Paper:** [More Information Needed]

- **Demo:** [More Information Needed]
|
|
|
## Uses |
|
|
|
TODO |
|
|
|
### Direct Use |
|
TODO |
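
Since the card declares `library_name: transformers` and `pipeline_tag: text-generation`, the model can presumably be loaded with the standard `transformers` API. The sketch below is a minimal example; the `MODEL_ID` shown is the base checkpoint named above, used as a placeholder — substitute the actual fine-tuned repository id.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Placeholder: base checkpoint from the card; replace with the
# fine-tuned model's repository id once it is published.
MODEL_ID = "TinyPixel/Llama-2-7B-bf16-sharded"

def load_generator(model_id: str = MODEL_ID):
    """Return a text-generation pipeline for the model.

    device_map="auto" places the weights on a GPU when one is
    available; torch_dtype="auto" keeps the checkpoint's dtype (bf16).
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    return pipeline("text-generation", model=model, tokenizer=tokenizer)

# Usage (note: downloads the full 7B checkpoint):
# generator = load_generator()
# out = generator("The water cycle begins when", max_new_tokens=64)
# print(out[0]["generated_text"])
```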
|
|
|
|
|
|
|
## Bias, Risks, and Limitations |
|
|
|
TODO |