xiaoou/am-sentence
Tags: Token Classification · Transformers · Safetensors · roberta · Inference Endpoints
Files and versions
Branch: main · 1 contributor · History: 3 commits
Latest commit: 2d09bad — xiaoou, "Upload tokenizer", 10 months ago
File                      Size            Last commit                             When
.gitattributes            1.52 kB         initial commit                          10 months ago
config.json               1.01 kB         Upload RobertaForTokenClassification    10 months ago
merges.txt                456 kB          Upload tokenizer                        10 months ago
model.safetensors         496 MB (LFS)    Upload RobertaForTokenClassification    10 months ago
special_tokens_map.json   958 Bytes       Upload tokenizer                        10 months ago
tokenizer.json            2.11 MB         Upload tokenizer                        10 months ago
tokenizer_config.json     1.32 kB         Upload tokenizer                        10 months ago
vocab.json                798 kB          Upload tokenizer                        10 months ago