Update README.md
README.md CHANGED
@@ -28,4 +28,4 @@ model = AutoGPTQForCausalLM.from_quantized(model_name)
 |[smpanaro/gpt2-medium-AutoGPTQ-4bit-128g](https://huggingface.co/smpanaro/gpt2-medium-AutoGPTQ-4bit-128g)|19.1719|18.4739|0.698|
 |[smpanaro/gpt2-large-AutoGPTQ-4bit-128g](https://huggingface.co/smpanaro/gpt2-large-AutoGPTQ-4bit-128g)|16.6875|16.4541|0.2334|
 |[smpanaro/gpt2-xl-AutoGPTQ-4bit-128g](https://huggingface.co/smpanaro/gpt2-xl-AutoGPTQ-4bit-128g)|14.9297|14.7951|0.1346|
-<sub>Wikitext perplexity measured as in the [huggingface docs](https://huggingface.co/docs/transformers/en/perplexity)</sub>
+<sub>Wikitext perplexity measured as in the [huggingface docs](https://huggingface.co/docs/transformers/en/perplexity), lower is better</sub>
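
For reference, a minimal sketch of how perplexity numbers like these can be reproduced, following the sliding-window recipe from the linked huggingface guide and loading the quantized model with `AutoGPTQForCausalLM.from_quantized` as shown in the diff context above. The chosen model name, `max_length`, and `stride` are illustrative assumptions, not taken from this repo's evaluation code, and the loop assumes the AutoGPTQ wrapper forwards `labels` through to the underlying transformers model.

```python
import torch
from auto_gptq import AutoGPTQForCausalLM
from datasets import load_dataset
from transformers import AutoTokenizer

# Assumption: any model from the table above works here.
model_name = "smpanaro/gpt2-xl-AutoGPTQ-4bit-128g"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# May need device="cuda:0" depending on your auto-gptq install,
# since the GPTQ kernels generally target CUDA.
model = AutoGPTQForCausalLM.from_quantized(model_name)

# Wikitext-2 test set, tokenized as one long sequence (per the HF perplexity guide).
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
encodings = tokenizer("\n\n".join(test["text"]), return_tensors="pt")

max_length = 1024  # GPT-2 context window
stride = 512       # sliding-window stride; illustrative, the HF guide uses 512
seq_len = encodings.input_ids.size(1)

nlls = []
prev_end = 0
for begin in range(0, seq_len, stride):
    end = min(begin + max_length, seq_len)
    trg_len = end - prev_end  # only score tokens not seen in the previous window
    input_ids = encodings.input_ids[:, begin:end]
    target_ids = input_ids.clone()
    target_ids[:, :-trg_len] = -100  # mask context-only tokens out of the loss
    with torch.no_grad():
        outputs = model(input_ids, labels=target_ids)
        nlls.append(outputs.loss * trg_len)
    prev_end = end
    if end == seq_len:
        break

ppl = torch.exp(torch.stack(nlls).sum() / prev_end)
print(f"wikitext-2 perplexity: {ppl.item():.4f}")
```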