Model Details
- Model Description: This model is a test model for data ordering.
- Developed by: Jisu Kim
- Model Type: Large Language Model
Model Architecture
This model is based on Falcon-7B, which we fine-tuned for the data ordering task (a minimal loading sketch follows the list below).
Falcon-7B is a transformer model with the following architecture choices:
- Rotary positional embeddings
- Multi-Query Attention
- Parallel attention and MLP blocks with a single layer norm
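As a minimal sketch of the starting point, the base model can be loaded from the Hugging Face Hub with transformers. The `tiiuae/falcon-7b` checkpoint name is assumed here; the exact fine-tuning setup (hyperparameters, training code) is not documented in this card.

```python
# Minimal sketch: load the Falcon-7B base model before fine-tuning.
# Assumption: the standard public checkpoint "tiiuae/falcon-7b" is the base.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")
model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b",
    trust_remote_code=True,  # older Falcon revisions ship custom modeling code
)
```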
Dataset
We randomly sampled 100,000 examples from the OpenOrca dataset and fine-tuned on this subset.
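A minimal sketch of the sampling step using the `datasets` library, assuming the `Open-Orca/OpenOrca` Hub dataset; the shuffle seed is an illustrative assumption, not the one used for this model:

```python
# Minimal sketch: randomly sample 100,000 examples from OpenOrca.
# The seed below is illustrative; the actual seed is not documented.
from datasets import load_dataset

orca = load_dataset("Open-Orca/OpenOrca", split="train")
sample = orca.shuffle(seed=42).select(range(100_000))
print(sample)
```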
GitHub
License
Apache License 2.0