imohammad12/GRS-complex-simple-classifier-DeBerta
Tags: Text Classification · Transformers · PyTorch · English · deberta · grs · Inference Endpoints
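Given the Text Classification task tag and the Transformers/PyTorch stack declared above, the checkpoint should load with the stock transformers pipeline. A minimal usage sketch follows; the example sentence is illustrative, and the actual label names come from the repo's config.json, which is not shown on this page.

    # Minimal sketch: load this checkpoint with the standard transformers
    # text-classification pipeline, matching the task tag on this page.
    from transformers import pipeline

    classifier = pipeline(
        "text-classification",
        model="imohammad12/GRS-complex-simple-classifier-DeBerta",
    )

    # Example input is illustrative; label names are defined in config.json.
    print(classifier("The cat sat on the mat."))
    # -> [{'label': ..., 'score': ...}]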
Files and versions (branch: main)
1 contributor · History: 5 commits
Latest commit: "Update README.md" (6900816) by imohammad12, over 2 years ago
File                      Size       Last commit       Age
.gitattributes            1.17 kB    initial commit    over 2 years ago
README.md                 1.33 kB    Update README.md  over 2 years ago
config.json               758 Bytes  model uploaded    over 2 years ago
merges.txt                456 kB     add tokenizer     over 2 years ago
optimizer.pt (LFS)        1.11 GB    model uploaded    over 2 years ago
pytorch_model.bin (LFS)   557 MB     model uploaded    over 2 years ago
rng_state.pth (LFS)       16.7 kB    model uploaded    over 2 years ago
scheduler.pt (LFS)        623 Bytes  model uploaded    over 2 years ago
special_tokens_map.json   778 Bytes  add tokenizer     over 2 years ago
tokenizer.json            2.11 MB    add tokenizer     over 2 years ago
tokenizer_config.json     1.2 kB     add tokenizer     over 2 years ago
trainer_state.json        108 kB     model uploaded    over 2 years ago
training_args.bin (LFS)   2.67 kB    model uploaded    over 2 years ago
vocab.json                798 kB     add tokenizer     over 2 years ago

Pickle scan results:
- optimizer.pt: Safe; 3 pickle imports detected (collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage)
- pytorch_model.bin: Safe; 4 pickle imports detected (collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage, torch.LongStorage)
- rng_state.pth: not flagged Safe; 7 pickle imports detected (numpy.core.multiarray._reconstruct, torch._utils._rebuild_tensor_v2, collections.OrderedDict, numpy.ndarray, _codecs.encode, numpy.dtype, torch.ByteStorage)
- scheduler.pt: Safe; no problematic pickle imports detected
- training_args.bin: not flagged Safe; 4 pickle imports detected (transformers.trainer_utils.IntervalStrategy, transformers.training_args.TrainingArguments, transformers.trainer_utils.SchedulerType, torch.device)
- All other files are scanned Safe.
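The pickle warnings above arise because .pt/.bin/.pth checkpoints are Python pickle archives, which can execute arbitrary code when unpickled. One common remedy (an assumption here, not something this repo does) is to load the weights with torch.load(..., weights_only=True), which restricts unpickling to plain tensors and containers, and re-save them in the safetensors format:

    # Hedged sketch (not part of this repo): re-save pickle-based weights as
    # safetensors so downstream users never need to unpickle them.
    import torch
    from safetensors.torch import save_file

    # weights_only=True limits unpickling to tensors and basic containers,
    # which covers the imports the scanner detected for pytorch_model.bin.
    state_dict = torch.load(
        "pytorch_model.bin", map_location="cpu", weights_only=True
    )

    # save_file expects a flat dict of contiguous, non-shared tensors.
    save_file(
        {k: v.contiguous() for k, v in state_dict.items()},
        "model.safetensors",
    )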