fix export
app.py
CHANGED
@@ -19,8 +19,8 @@ if HF_TOKEN:
     repo = Repository(local_dir=DATA_DIR, clone_from=DATASET_REPO_URL, token=HF_TOKEN)


-def export(
-    if
+def export(model_id: str, token: str, task: str = "auto") -> str:
+    if model_id == "" or token == "":
         return """
         ### Invalid input 🐞
         Please fill a token and model name.
@@ -87,14 +87,13 @@ TITLE = """
 DESCRIPTION = """
 This Space uses [Optimum Intel](https://huggingface.co/docs/optimum/intel/inference) to automatically export your model to the OpenVINO format.

+After the model conversion, we will open a PR against the source repo to add the resulting model.
+
 To export your model you need:
-- A read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens).
 - A Model ID from the Hub
-
+- A read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens).

 That's it ! 🔥
-
-After the model conversion, we will open a PR against the source repo.
 """

 interface = gr.Interface(
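For context, here is a minimal sketch of what an export entry point with the new signature could look like: it validates the inputs, converts the model with Optimum Intel, and opens a PR against the source repo, as the description promises. The use of the `optimum-cli export openvino` command, the `huggingface_hub` `create_commit` call, and the commit message are illustrative assumptions, not the Space's actual implementation.

```python
import subprocess
import tempfile
from pathlib import Path

from huggingface_hub import CommitOperationAdd, HfApi


def export(model_id: str, token: str, task: str = "auto") -> str:
    # Same validation as in the diff above.
    if model_id == "" or token == "":
        return """
        ### Invalid input 🐞
        Please fill a token and model name.
        """

    with tempfile.TemporaryDirectory() as tmp:
        out_dir = Path(tmp) / "openvino"
        # Convert the model with Optimum Intel; "auto" lets the exporter infer the task.
        cmd = ["optimum-cli", "export", "openvino", "--model", model_id, str(out_dir)]
        if task != "auto":
            cmd += ["--task", task]
        subprocess.run(cmd, check=True)

        # Upload the converted files as a pull request against the source repo.
        operations = [
            CommitOperationAdd(path_in_repo=f.name, path_or_fileobj=str(f))
            for f in out_dir.iterdir()
            if f.is_file()
        ]
        commit = HfApi().create_commit(
            repo_id=model_id,
            operations=operations,
            commit_message="Add OpenVINO export",  # illustrative message
            create_pr=True,
            token=token,
        )
    return f"Opened a PR with the OpenVINO model: {commit.pr_url}"
```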
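Once the PR adding the OpenVINO files is merged, the model can be loaded through Optimum Intel's documented inference API; with `export=True` the conversion also happens on the fly if the repo has no OpenVINO files yet. `gpt2` below is only a placeholder model ID.

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer, pipeline

model_id = "gpt2"  # placeholder; use the repo that received the OpenVINO PR
# export=True converts on the fly when the repo does not yet contain OpenVINO files.
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("OpenVINO makes inference", max_new_tokens=20)[0]["generated_text"])
```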