InstaDeepAI/nucleotide-transformer-2.5b-multi-species
A model by InstaDeep Ltd
Task: Fill-Mask
Libraries: Transformers · PyTorch · TensorFlow · Joblib · esm
Tags: DNA · biology · genomics · Inference Endpoints
Datasets: InstaDeepAI/multi_species_genomes, InstaDeepAI/nucleotide_transformer_downstream_tasks
License: cc-by-nc-sa-4.0
Files and versions (3 contributors · history: 15 commits)
Latest commit: SFconvertbot — Adding `safetensors` variant of this model (8e653a9, verified, 8 months ago)
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 1.48 kB | initial commit | over 1 year ago |
| README.md | 6.11 kB | Update README.md | about 1 year ago |
| config.json | 707 Bytes | Upload EsmForMaskedLM | over 1 year ago |
| model-00001-of-00002.safetensors | 9.91 GB (LFS) | Adding `safetensors` variant of this model | 8 months ago |
| model-00002-of-00002.safetensors | 278 MB (LFS) | Adding `safetensors` variant of this model | 8 months ago |
| model.safetensors.index.json | 48.2 kB | Adding `safetensors` variant of this model | 8 months ago |
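The checkpoint is split across two `.safetensors` shards, and `model.safetensors.index.json` ties them together: its `weight_map` field maps each parameter name to the shard file that stores it. A minimal sketch of resolving a parameter to its shard — the parameter names below are illustrative, not copied from this repo's actual index:

```python
import json

# Illustrative excerpt of a sharded-checkpoint index; the real
# model.safetensors.index.json in this repo has far more entries.
index_json = """
{
  "metadata": {"total_size": 10188000000},
  "weight_map": {
    "esm.embeddings.word_embeddings.weight": "model-00001-of-00002.safetensors",
    "lm_head.decoder.weight": "model-00002-of-00002.safetensors"
  }
}
"""

index = json.loads(index_json)

def shard_for(param_name: str) -> str:
    """Return the shard file that stores a given parameter."""
    return index["weight_map"][param_name]

print(shard_for("lm_head.decoder.weight"))
# model-00002-of-00002.safetensors
```

Loaders use this index to open only the shard(s) containing the tensors they need, rather than reading all ~10 GB at once.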
| File | Size | Last commit | Updated |
|---|---|---|---|
| pytorch_model-00001-of-00002.bin | 9.91 GB (LFS, pickle) | Upload EsmForMaskedLM | over 1 year ago |
| pytorch_model-00002-of-00002.bin | 278 MB (LFS, pickle) | Upload EsmForMaskedLM | over 1 year ago |
| pytorch_model.bin.index.json | 46 kB | Upload EsmForMaskedLM | over 1 year ago |

Detected pickle imports: `collections.OrderedDict`, `torch.FloatStorage`, and `torch._utils._rebuild_tensor_v2` in both shards, plus `torch.LongStorage` in shard 1.
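The "detected pickle imports" above come from scanning the `.bin` files for the globals a pickle would import when unpickled — pickle can execute arbitrary callables at load time, which is why the import list matters and why the `safetensors` variant is preferable. The same kind of static scan can be done locally with the standard library. A minimal sketch (approximate: it ignores memo re-use, so repeated globals may be undercounted):

```python
import collections
import pickle
import pickletools

# A small pickle whose payload references a class; pickle records
# collections.OrderedDict as a global to import at load time.
payload = pickle.dumps(collections.OrderedDict(a=1))

def detect_imports(data: bytes) -> set[str]:
    """Statically list the module.name globals a pickle would import."""
    found = set()
    strings = []  # string-valued opcode arguments seen so far
    for op, arg, _pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # Older protocols: "module name" in a single argument.
            found.add(arg.replace(" ", "."))
        elif op.name == "STACK_GLOBAL":
            # Protocol 4+: module and name are the last two strings pushed.
            found.add(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return found

print(detect_imports(payload))
# {'collections.OrderedDict'}
```

Because this only walks opcodes with `pickletools.genops`, nothing is ever unpickled, so it is safe to run on an untrusted file.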
| File | Size | Last commit | Updated |
|---|---|---|---|
| special_tokens_map.json | 101 Bytes | Upload tokenizer | over 1 year ago |
| tf_model-00001-of-00002.h5 | 9.91 GB (LFS) | Add TF weights (#2) | over 1 year ago |
| tf_model-00002-of-00002.h5 | 278 MB (LFS) | Add TF weights (#2) | over 1 year ago |
| tf_model.h5.index.json | 56.9 kB | Add TF weights (#2) | over 1 year ago |
| tokenizer_config.json | 129 Bytes | Upload tokenizer | over 1 year ago |
| vocab.txt | 28.7 kB | Upload tokenizer | over 1 year ago |
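`vocab.txt` holds the token vocabulary. The Nucleotide Transformer family tokenizes DNA largely as non-overlapping 6-mers, so the bulk of the vocabulary is the 4^6 = 4096 possible six-nucleotide words, alongside single-nucleotide and special tokens. A sketch of that tokenization scheme — the single-base fallback for trailing remainders is an assumption about edge-case behavior, not taken from this repo's tokenizer code:

```python
from itertools import product

BASES = "ACGT"

# Every possible 6-mer token: 4**6 = 4096 of them.
kmers = ["".join(p) for p in product(BASES, repeat=6)]
print(len(kmers))  # 4096

def tokenize(seq: str, k: int = 6) -> list[str]:
    """Greedy non-overlapping k-mer split; a trailing remainder shorter
    than k falls back to single-nucleotide tokens (assumption)."""
    cut = len(seq) - len(seq) % k
    tokens = [seq[i:i + k] for i in range(0, cut, k)]
    tokens.extend(seq[cut:])  # single-base fallback for the remainder
    return tokens

print(tokenize("ATTCCGATTCCG"))  # ['ATTCCG', 'ATTCCG']
```

This is why a file of only 28.7 kB suffices for a 2.5B-parameter model: the vocabulary is a few thousand short strings, not the tens of thousands typical of natural-language models.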