|
--- |
|
license: apache-2.0 |
|
library_name: pruna-engine |
|
thumbnail: "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg" |
|
metrics: |
|
- memory_disk |
|
- memory_inference |
|
- inference_latency |
|
- inference_throughput |
|
- inference_CO2_emissions |
|
- inference_energy_consumption |
|
--- |
|
<!-- header start --> |
|
<!-- 200823 --> |
|
<div style="width: auto; margin-left: auto; margin-right: auto"> |
|
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer"> |
|
<img src="https://i.imgur.com/eDAlcgk.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;"> |
|
</a> |
|
</div> |
|
<!-- header end --> |
|
|
|
# Simply make AI models cheaper, smaller, faster, and greener! |
|
|
|
## Results |
|
|
|
![image info](./plots.png) |
|
|
|
## Setup |
|
|
|
You can run the smashed model on an A100 GPU by:
|
1. Installing and importing the `pruna-engine` (version 0.2.6) package. Use `pip install pruna-engine==0.2.6 --extra-index-url https://pypi.nvidia.com --extra-index-url https://pypi.ngc.nvidia.com` for installation. See [PyPI](https://pypi.org/project/pruna-engine/) for details on the package.
|
2. Downloading the model files to `model_path`. This can be done via the Hugging Face Hub using this repository name, or by downloading them manually.
|
3. Loading the model.
|
4. Running the model. |
|
|
|
You can achieve this by running the following code: |
|
|
|
```python |
|
from transformers.utils.hub import cached_file |
|
from pruna_engine.PrunaModel import PrunaModel # Step (1): install and import `pruna-engine` package. |
|
|
|
... |
|
model_path = cached_file("PrunaAI/REPO", "model") # Step (2): download the model files at `model_path`. |
|
smashed_model = PrunaModel.load_model(model_path) # Step (3): load the model. |
|
y = smashed_model(prompt="a photo of an astronaut riding a horse on mars", image_height=1024, image_width=1024)[0] # Step (4): run the model. |
|
``` |
|
|
|
## Configurations |
|
|
|
The configuration details are in `config.json`.
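To inspect that configuration after downloading the model files, here is a minimal sketch. It assumes `config.json` sits in the current working directory; the keys it contains depend on which compression methods were applied to this model.

```python
import json
from pathlib import Path

# Minimal sketch: read the smash configuration shipped with the model.
# Assumes config.json was downloaded alongside the model files into the
# current working directory.
config_path = Path("config.json")
if config_path.exists():
    config = json.loads(config_path.read_text())
    for key, value in config.items():
        print(f"{key}: {value}")
```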
|
|
|
## License |
|
|
|
We follow the same license as the original model. Please check the license of the original model before using this model. |
|
|
|
## Want to compress other models? |
|
|
|
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact). |
|
- Request access to easily compress your own AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). |