rephrase export steps
app.py
CHANGED
@@ -78,21 +78,24 @@ TITLE = """
 "
 >
 <h1 style="font-weight: 900; margin-bottom: 10px; margin-top: 10px;">
-Export your
+Export your model to OpenVINO with Optimum Intel
 </h1>
 </div>
 """
 
 DESCRIPTION = """
-This Space allows you to automatically export to the OpenVINO format
+This Space allows you to automatically export your model to the OpenVINO format.
 
-
+To export your model you need:
+- A read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens).
+Read access is enough given that we will open a PR against the source repo.
+- A model id of the model you'd like to export (for example: [distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english))
+- A [task](https://huggingface.co/docs/optimum/main/en/exporters/task_manager#pytorch) that will be used to load the model before exporting it. If set to "auto", the task will be automatically inferred.
 
-
-
-
-
-- That’s it! You’ll get feedback if it works or not, and if it worked, you’ll get the URL of the opened PR 🔥
+That's it! 🔥
+
+After the model conversion, we will open a PR against the source repo.
+You will then be able to load the resulting model and run inference using [Optimum Intel](https://huggingface.co/docs/optimum/intel/inference).
 """
 
 with gr.Blocks() as demo: