# nicholasKluge/Aira-2-1B5
**Pipeline tag:** Text Generation
**Libraries:** Transformers, PyTorch, Safetensors
**Dataset:** nicholasKluge/instruct-aira-dataset
**Language:** English
**Tags:** gpt2, alignment, instruction tuned, text generation, conversation, assistant, Carbon Emissions, text-generation-inference, Inference Endpoints
**arXiv:** 1803.05457, 2109.07958, 2203.09509
**License:** apache-2.0
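
The page tags this repository for text generation with the Transformers library, so a minimal loading sketch follows. The model id is taken from the page; the prompt and sampling parameters are illustrative assumptions, and the exact instruction/response template Aira expects should be taken from the model card (README.md).

```python
# Minimal sketch of loading and sampling from the model with Transformers.
# The repo ships both pytorch_model.bin and model.safetensors; recent
# Transformers versions pick the safetensors weights automatically.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nicholasKluge/Aira-2-1B5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt only: consult README.md for the instruction
# template this model was tuned on (see also added_tokens.json).
prompt = "What is the capital of Brazil?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```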
## Files and versions

Branch: `main` · 4 contributors · History: 59 commits
Latest commit: `a066b2b` (verified) by nicholasKluge, "Update README.md", 5 months ago
| File | Size | Last commit | When |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | over 1 year ago |
| Aira_emissions.csv | 771 Bytes | Upload 19 files | about 1 year ago |
| LICENSE | 10.7 kB | Upload 19 files | about 1 year ago |
| README.md | 7.02 kB | Update README.md | 5 months ago |
| added_tokens.json | 123 Bytes | Upload 19 files | about 1 year ago |
| config.json | 951 Bytes | Update config.json | 12 months ago |
| generation_config.json | 338 Bytes | Update generation_config.json | 11 months ago |
| lm_evaluation_harness.ipynb | 143 kB | Upload lm_evaluation_harness.ipynb | about 1 year ago |
| merges.txt | 456 kB | Upload 19 files | about 1 year ago |
| model.safetensors | 6.23 GB (LFS) | Upload 19 files | about 1 year ago |
| optimizer.pt | 3.59 GB (LFS) | Upload 19 files | about 1 year ago |
| pytorch_model.bin | 6.23 GB (LFS) | Upload 19 files | about 1 year ago |
| rng_state.pt | 5.81 kB (LFS) | Upload 19 files | about 1 year ago |
| scheduler.pt | 563 Bytes (LFS) | Upload 19 files | about 1 year ago |
| special_tokens_map.json | 631 Bytes | Upload 19 files | about 1 year ago |
| tokenizer.json | 2.11 MB | Upload 19 files | about 1 year ago |
| tokenizer_config.json | 927 Bytes | Upload 19 files | about 1 year ago |
| training_stats.parquet | 2.33 kB (LFS) | Upload 19 files | about 1 year ago |
| vocab.json | 798 kB | Upload 19 files | about 1 year ago |

The four pickle-serialized files were scanned for pickle imports:

- `optimizer.pt`: 3 imports detected (`collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`, `torch.FloatStorage`)
- `pytorch_model.bin`: 3 imports detected (`collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`, `torch.FloatStorage`)
- `rng_state.pt`: 3 imports detected (`collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`, `torch.ByteStorage`)
- `scheduler.pt`: no problematic imports detected
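
These files are flagged because unpickling can execute arbitrary code; the import lists above show which module attributes each file would load. As a rough sketch of how such a scan can work (not the scanner Hugging Face actually runs), the snippet below walks a pickle stream's opcodes and reports `GLOBAL` references. The assumptions: `torch.save()` writes a zip archive whose pickle member ends in `data.pkl`, and the pickle uses protocol 2 or lower, where imports appear as plain `GLOBAL` opcodes.

```python
# Rough sketch of a pickle import scanner; assumptions noted above.
import pickletools
import zipfile

def pickle_imports(path: str):
    """Yield (module, name) pairs referenced by GLOBAL opcodes in a pickle."""
    if zipfile.is_zipfile(path):
        # torch checkpoints are zip archives; the pickle member's prefix
        # varies, so match on the "data.pkl" suffix.
        with zipfile.ZipFile(path) as zf:
            member = next(n for n in zf.namelist() if n.endswith("data.pkl"))
            data = zf.read(member)
    else:
        with open(path, "rb") as fh:
            data = fh.read()
    for opcode, arg, _pos in pickletools.genops(data):
        # Protocol <= 3 records imports as GLOBAL "module name" opcodes.
        if opcode.name == "GLOBAL":
            module, name = arg.split(" ", 1)
            yield module, name

# Hypothetical local usage against one of the checkpoints listed above:
# print(sorted(set(pickle_imports("optimizer.pt"))))
```

This risk is also why `model.safetensors` is the preferred weights file: the safetensors format stores raw tensors and cannot execute code on load.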