# EleutherAI/pile-t5-xxl
Tags: Text2Text Generation · Transformers · Safetensors · EleutherAI/pile · English · umt5 · t5x · encoder-decoder · Inference Endpoints

Papers: arXiv:2101.00027 · arXiv:2201.07311
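The tags above describe a UMT5-style encoder-decoder checkpoint with safetensors weights, loadable through the Transformers library. Below is a minimal loading sketch using the standard auto-class API; the dtype, device placement, and example prompt are illustrative assumptions rather than settings taken from this repository.

```python
# Minimal sketch: load Pile-T5 XXL with the Transformers auto classes.
# Assumes `transformers`, `torch`, and `accelerate` are installed; the dtype,
# device_map, and prompt below are illustrative assumptions, not settings
# taken from this repository.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "EleutherAI/pile-t5-xxl"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the bf16 weights alone are roughly 22 GB
    device_map="auto",           # shard across available devices (requires accelerate)
)

# T5-style span-corruption prompt; the sentinel token name is an assumption.
inputs = tokenizer(
    "The Pile is a large, diverse <extra_id_0> dataset.", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```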
## Files and versions
1 contributor · 8 commits · latest commit `1177fe6` (verified) by lintang, "Upload xxl.zip", 7 months ago
| File | Size | Last commit | When |
| --- | --- | --- | --- |
| .gitattributes | 1.52 kB | initial commit | 10 months ago |
| README.md | 6.35 kB | Update README.md | 7 months ago |
| config.json | 801 Bytes | fix tokenizer settings for step 2000000 | 8 months ago |
| generation_config.json | 156 Bytes | add files for step 1000000 | 10 months ago |
| model-00001-of-00009.safetensors | 4.99 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00002-of-00009.safetensors | 5 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00003-of-00009.safetensors | 4.87 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00004-of-00009.safetensors | 4.9 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00005-of-00009.safetensors | 4.97 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00006-of-00009.safetensors | 5 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00007-of-00009.safetensors | 5 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00008-of-00009.safetensors | 4.97 GB (LFS) | add files for step 1000000 | 10 months ago |
| model-00009-of-00009.safetensors | 4.86 GB (LFS) | add files for step 1000000 | 10 months ago |
| model.safetensors.index.json | 55.8 kB | add files for step 1000000 | 10 months ago |
| special_tokens_map.json | 2.2 kB | fix tokenizer settings for step 2000000 | 8 months ago |
| spiece.model | 500 kB (LFS) | add files for step 1000000 | 10 months ago |
| tokenizer.model | 500 kB (LFS) | fix tokenizer settings for step 2000000 | 8 months ago |
| tokenizer_config.json | 2.88 kB | fix tokenizer settings for step 2000000 | 8 months ago |
| xxl.zip | 370 kB (LFS) | Upload xxl.zip | 7 months ago |

`xxl.zip` contains pickled data; the Hub's pickle-import scan detected no problematic imports.
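Because the weights are split into nine safetensors shards plus an index file (roughly 45 GB in total), it can be convenient to fetch the whole repository up front rather than let `from_pretrained` stream shard by shard. A sketch using `huggingface_hub.snapshot_download`; the `allow_patterns` filter is an illustrative assumption, not something required by the repository.

```python
# Sketch: pre-download the sharded checkpoint (~45 GB total) with huggingface_hub.
# Assumes the `huggingface_hub` package is installed; the allow_patterns filter
# below is an illustrative choice, not something mandated by the repository.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="EleutherAI/pile-t5-xxl",
    allow_patterns=[
        "*.json",         # config, generation config, tokenizer config, safetensors index
        "*.model",        # spiece.model / tokenizer.model
        "*.safetensors",  # the nine ~5 GB weight shards
    ],
)
print("Checkpoint downloaded to:", local_dir)
```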