---
tags:
- text-generation-inference
- whisper
- audio
base_model:
- openai/whisper-large-v2
---

# Whisper Large v2 with Key-Value Cache enabled in ONNX fp16 format
- Model creator: [OpenAI](https://huggingface.co/openai)
- Original model: [Whisper Large v2](https://huggingface.co/openai/whisper-large-v2)

<!-- description start -->
## Description

This repo contains the ONNX files for the conversion of Whisper Large v2 done by Esperanto Technologies.
The model is in fp16 format and has the key-value cache (KVC) enabled.

<!-- description end -->

## How to download ONNX model and weight files

The easiest way to obtain the model is to clone this whole repo.
Alternatively, you can download the individual files using the `huggingface-hub` Python library.

```shell
pip3 install "huggingface-hub>=0.17.1"
```

Then you can download any individual model file to the current directory, at high speed, with a command like this:

```shell
huggingface-cli download Esperanto/whisper-large-v2-kvc-fp16-onnx --local-dir whisper-large-v2-kvc-fp16-onnx --local-dir-use-symlinks False
```

For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
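
If you prefer to stay in Python, a minimal sketch of the same download using `snapshot_download` from `huggingface_hub` (the local directory name is just an example):

```python
from huggingface_hub import snapshot_download

# Download the whole repo (ONNX graphs and weight files) into a local folder.
snapshot_download(
    repo_id="Esperanto/whisper-large-v2-kvc-fp16-onnx",
    local_dir="whisper-large-v2-kvc-fp16-onnx",
)
```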

## How to run from Python code using ONNXRuntime

This model can easily be run on a CPU using [ONNXRuntime](https://onnxruntime.ai/).

Scripts showing how to run these models will be provided soon.
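
In the meantime, here is a minimal sketch of loading one of the ONNX files with ONNX Runtime on a CPU. The file path below is a placeholder; substitute the actual file names shipped in this repo.

```python
import onnxruntime as ort

# Placeholder path: replace with one of the ONNX files from this repo.
session = ort.InferenceSession(
    "whisper-large-v2-kvc-fp16-onnx/encoder_model.onnx",
    providers=["CPUExecutionProvider"],
)

# Inspect the inputs and outputs the graph expects.
for inp in session.get_inputs():
    print("input:", inp.name, inp.shape, inp.type)
for out in session.get_outputs():
    print("output:", out.name, out.shape, out.type)
```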