MendelAI/nv-embed-v2-ontada-twab-peft
Tags: Sentence Similarity · sentence-transformers · Safetensors · nvembed · feature-extraction · Generated from Trainer · dataset_size:16186 · loss:MultipleNegativesRankingLoss · custom_code · Eval Results · Inference Endpoints
arxiv:1908.10084 · arxiv:1705.00652
nv-embed-v2-ontada-twab-peft: 1 contributor, 3 commits
Latest commit: simran-t, "Upload tokenizer" (e9d7a9f, verified, 3 days ago)
Files:

1_Pooling/                          --               Add new SentenceTransformer model   3 days ago
.gitattributes                      1.52 kB          initial commit                      3 days ago
README.md                           26.6 kB          Add new SentenceTransformer model   3 days ago
config.json                         2.82 kB          Add new SentenceTransformer model   3 days ago
config_sentence_transformers.json   210 Bytes        Add new SentenceTransformer model   3 days ago
configuration_nvembed.py            3.16 kB          Add new SentenceTransformer model   3 days ago
model-00001-of-00007.safetensors    5 GB    (LFS)    Add new SentenceTransformer model   3 days ago
model-00002-of-00007.safetensors    5 GB    (LFS)    Add new SentenceTransformer model   3 days ago
model-00003-of-00007.safetensors    5 GB    (LFS)    Add new SentenceTransformer model   3 days ago
model-00004-of-00007.safetensors    4.83 GB (LFS)    Add new SentenceTransformer model   3 days ago
model-00005-of-00007.safetensors    5 GB    (LFS)    Add new SentenceTransformer model   3 days ago
model-00006-of-00007.safetensors    5 GB    (LFS)    Add new SentenceTransformer model   3 days ago
model-00007-of-00007.safetensors    1.58 GB (LFS)    Add new SentenceTransformer model   3 days ago
model.safetensors.index.json        28.2 kB          Add new SentenceTransformer model   3 days ago
modeling_nvembed.py                 18.7 kB          Add new SentenceTransformer model   3 days ago
modules.json                        349 Bytes        Add new SentenceTransformer model   3 days ago
sentence_bert_config.json           54 Bytes         Add new SentenceTransformer model   3 days ago
special_tokens_map.json             551 Bytes        Add new SentenceTransformer model   3 days ago
tokenizer.json                      3.51 MB          Add new SentenceTransformer model   3 days ago
tokenizer.model                     493 kB  (LFS)    Add new SentenceTransformer model   3 days ago
tokenizer_config.json               1.16 kB          Upload tokenizer                    3 days ago
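The repo is tagged sentence-transformers and custom_code, and ships its own modeling files (configuration_nvembed.py, modeling_nvembed.py), so loading it should require trust_remote_code=True. A minimal loading sketch, assuming the standard SentenceTransformer API; the example sentences are illustrative, not from the model card:

```python
import numpy as np


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


if __name__ == "__main__":
    from sentence_transformers import SentenceTransformer

    # trust_remote_code is needed because the repo includes custom
    # nvembed modeling code rather than a stock architecture.
    model = SentenceTransformer(
        "MendelAI/nv-embed-v2-ontada-twab-peft",
        trust_remote_code=True,
    )

    # Illustrative inputs only; the model is ~31 GB across 7 shards,
    # so expect a large download and significant GPU memory use.
    sentences = [
        "The patient reports intermittent chest pain.",
        "Chest discomfort noted during examination.",
    ]
    embeddings = model.encode(sentences)
    print(cosine_similarity(embeddings[0], embeddings[1]))
```

Scores from the loss tag (MultipleNegativesRankingLoss) are typically compared with cosine similarity, which is why the helper above is used to rank sentence pairs.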