xiaoou/am-sentence
Tags: Token Classification · Transformers · Safetensors · roberta · Inference Endpoints
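The repository ships a RoBERTa checkpoint for token classification together with its tokenizer files. The card itself documents no label set or intended domain, so the sketch below is a minimal, assumption-laden example: it loads the checkpoint through the standard transformers token-classification pipeline, and the entity names it prints are simply whatever the checkpoint's config.json defines.

```python
# Minimal sketch: run the checkpoint through the standard transformers
# token-classification pipeline. The label names come from the model's
# config.json; this card does not document what they mean.
from transformers import pipeline

classifier = pipeline("token-classification", model="xiaoou/am-sentence")

# The input sentence is illustrative only; the model's intended
# domain is not documented on this card.
for token in classifier("Hugging Face hosts community models."):
    print(token["word"], token["entity"], round(token["score"], 3))
```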
Files and versions (branch: main)

1 contributor · 3 commits · latest commit 2d09bad ("Upload tokenizer" by xiaoou, 12 months ago)
| File | Size | Last commit | Updated |
|------|------|-------------|---------|
| .gitattributes | 1.52 kB | initial commit | 12 months ago |
| config.json | 1.01 kB | Upload RobertaForTokenClassification | 12 months ago |
| merges.txt | 456 kB | Upload tokenizer | 12 months ago |
| model.safetensors | 496 MB (LFS) | Upload RobertaForTokenClassification | 12 months ago |
| special_tokens_map.json | 958 Bytes | Upload tokenizer | 12 months ago |
| tokenizer.json | 2.11 MB | Upload tokenizer | 12 months ago |
| tokenizer_config.json | 1.32 kB | Upload tokenizer | 12 months ago |
| vocab.json | 798 kB | Upload tokenizer | 12 months ago |
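Individual files from the listing above can also be fetched without cloning the whole repository. A sketch using huggingface_hub's standard download helper follows; the filenames match the table, and hf_hub_download transparently resolves the LFS pointer behind model.safetensors (496 MB) while caching everything locally.

```python
# Sketch: fetch single files from the repo without a full git clone.
# hf_hub_download caches files locally and resolves LFS-backed files
# such as model.safetensors.
from huggingface_hub import hf_hub_download

config_path = hf_hub_download(repo_id="xiaoou/am-sentence", filename="config.json")
weights_path = hf_hub_download(repo_id="xiaoou/am-sentence", filename="model.safetensors")
print(config_path)
print(weights_path)
```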