risangpanggalih/Bokap-Betawi-v0-Qwen2.5-7B-fp16
Tags: Text Generation · Transformers · PyTorch · risangpanggalih/betawi-v0 · qwen2 · unsloth · trl · sft · qwen2.5 · text-generation-inference · Inference Endpoints
Files and versions
Bokap-Betawi-v0-Qwen2.5-7B-fp16 · 1 contributor · History: 6 commits
Latest commit: b9b28af (verified) · risangpanggalih · "Upload betawi.png" · about 1 month ago
File                               Size       Last commit            Updated
.gitattributes                     1.52 kB    initial commit         about 1 month ago
README.md                          1.46 kB    Update README.md       about 1 month ago
added_tokens.json                  632 Bytes  Upload tokenizer       about 1 month ago
betawi.png                         248 kB     Upload betawi.png      about 1 month ago
config.json                        820 Bytes  Trained with Unsloth   about 1 month ago
generation_config.json             167 Bytes  Trained with Unsloth   about 1 month ago
merges.txt                         1.67 MB    Upload tokenizer       about 1 month ago
pytorch_model-00001-of-00004.bin   4.88 GB    Trained with Unsloth   about 1 month ago   LFS, pickle
pytorch_model-00002-of-00004.bin   4.93 GB    Trained with Unsloth   about 1 month ago   LFS, pickle
pytorch_model-00003-of-00004.bin   4.33 GB    Trained with Unsloth   about 1 month ago   LFS, pickle
pytorch_model-00004-of-00004.bin   1.09 GB    Trained with Unsloth   about 1 month ago   LFS, pickle
pytorch_model.bin.index.json       27.8 kB    Trained with Unsloth   about 1 month ago
special_tokens_map.json            616 Bytes  Upload tokenizer       about 1 month ago
tokenizer.json                     7.03 MB    Upload tokenizer       about 1 month ago
tokenizer_config.json              4.87 kB    Upload tokenizer       about 1 month ago
vocab.json                         2.78 MB    Upload tokenizer       about 1 month ago

Each pytorch_model-*.bin shard is a pickle-serialized PyTorch checkpoint stored via Git LFS. The pickle scanner reports the same three imports in every shard: torch._utils._rebuild_tensor_v2, torch.HalfStorage, and collections.OrderedDict.
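
Because the shards are plain pickle files, loading one runs the unpickler; the three imports listed above are the standard, benign calls PyTorch uses to rebuild fp16 tensors. If you ever load a shard directly rather than through transformers, a minimal sketch (assuming the shard has already been downloaded into the working directory, and a PyTorch version recent enough to support weights_only) is to restrict the unpickler:

```python
import torch

# Load one checkpoint shard on CPU. weights_only=True restricts the
# unpickler to tensor/container types and rejects arbitrary code objects.
state_dict = torch.load(
    "pytorch_model-00001-of-00004.bin",
    map_location="cpu",
    weights_only=True,
)

# Keys are parameter names; values are fp16 tensors (hence torch.HalfStorage).
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape), tensor.dtype)
```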
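pytorch_model.bin.index.json ties the four shards together: in the standard Hugging Face shard-index layout it holds a metadata block with the total checkpoint size plus a weight_map from each parameter name to the shard file containing it. A short sketch of inspecting it, assuming the file is local and follows that standard layout:

```python
import json

with open("pytorch_model.bin.index.json") as f:
    index = json.load(f)

# Total serialized size across all shards, in bytes.
print(index["metadata"]["total_size"])

# Which shard each parameter lives in.
shards = sorted(set(index["weight_map"].values()))
print(shards)  # pytorch_model-00001-of-00004.bin ... pytorch_model-00004-of-00004.bin
```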
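Given the tags (Transformers, PyTorch, text-generation), the repository should load with the standard transformers API, which resolves the shard index and downloads all four .bin files automatically. A minimal sketch, assuming transformers and accelerate are installed, the tokenizer ships a chat template (typical for an SFT'd Qwen2.5 model), and there is enough memory for roughly 15 GB of fp16 weights; the prompt is a hypothetical example:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "risangpanggalih/Bokap-Betawi-v0-Qwen2.5-7B-fp16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the fp16 checkpoint
    device_map="auto",          # requires accelerate
)

# Hypothetical prompt; see the model card for the intended usage and format.
messages = [{"role": "user", "content": "Halo! Apa kabar?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```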