aws-neuron/optimum-neuron-cache
AWS Inferentia and Trainium
License: apache-2.0
Revision: d109adf
optimum-neuron-cache/neuronxcc-2.9.0.16+fa12ba55a/MODULE_7e6f5f7b13890111e1c5+b32fba9d/compile_flags.txt
Jingya (HF staff): Synchronizing local compiler cache. (commit b65a0a1, verified, 8 months ago)
compile_flags.txt (35 Bytes):
--model-type=transformer-inference
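
compile_flags.txt stores the extra command-line flags that were used when this cached module was built with the Neuron compiler (neuronx-cc). Below is a minimal sketch of how such a flags file could be read and replayed; the neuronx-cc subcommand, the input file name, and the --target value are illustrative assumptions, not values recorded in this repository.

import shlex
from pathlib import Path

# Read the flags exactly as they are stored in compile_flags.txt
# (for this cache entry: "--model-type=transformer-inference").
flags = shlex.split(Path("compile_flags.txt").read_text())

# Assemble a hypothetical neuronx-cc invocation. The input file name and
# the --target value are placeholders for illustration only.
cmd = ["neuronx-cc", "compile", "model.hlo", "--target", "trn1", *flags]
print(" ".join(cmd))
# On a machine with the Neuron SDK installed, this command line could be
# passed to subprocess.run(cmd, check=True).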