TheBloke committed
Commit f6fae7a
Parent: 6a1575b

Upload README.md
Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -109,7 +109,7 @@ Documentation on installing and using vLLM [can be found here](https://vllm.read
 - When using vLLM as a server, pass the `--quantization awq` parameter, for example:
 
 ```shell
-python3 python -m vllm.entrypoints.api_server --model TheBloke/Llama-2-7b-Chat-GPTQ --quantization awq
+python3 python -m vllm.entrypoints.api_server --model TheBloke/Llama-2-7b-Chat-AWQ --quantization awq
 ```
 
 When using vLLM from Python code, pass the `quantization=awq` parameter, for example:
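Note that the committed line (unchanged by this commit, which only swaps the model name from GPTQ to AWQ) appears to double the interpreter: `python3 python -m …`. Assuming that is a typo rather than intentional, a corrected server invocation would presumably be:

```shell
# Launch the vLLM API server with AWQ quantization
# (drops the duplicated "python" from the committed command)
python3 -m vllm.entrypoints.api_server --model TheBloke/Llama-2-7b-Chat-AWQ --quantization awq
```

This is a sketch of the intended command, not part of the commit itself; the `--model` and `--quantization` arguments are taken verbatim from the diff.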