# Python backend

## Setup

```
pip install -r requirements.txt
chmod +x launch.sh
```

## Execution

`./launch.sh`
## Usage

The API listens on port `6006` at the route `/autocomplete`, and accepts `POST` requests only.

Query it like this: `POST http://<url>:6006/autocomplete`
The required argument is `context`, a string (ideally a sentence) that will be tokenized and fed to GPT-2.

The optional arguments are detailed below:

- `length` (unsigned int): sets the maximum length, in tokens, of each generated sentence. __default: 100__
- `n_samples` (int, `0 < n_samples <= 3`): sets the maximum number of samples generated. __default: 3__
- `max_time` (unsigned float): sets a heuristic limit on the time spent generating sentences. It is a heuristic because it is not exact: generation can slightly overrun it. __default: infinite__
- `model_size` (`"small"` or `"medium"`): the GPT-2 model size. __default: small__
- `temperature` (float): the sampling temperature of the model. __default: 1__
- `max_tokens` (int): the maximum number of tokens fed into the model. __default: 256__
- `top_p` (float, `0 < top_p < 1`): nucleus sampling; only tokens within a cumulative probability of `top_p` are kept for multinomial sampling. __default: 0.9__
- `top_k` (int): only the top `k` tokens are kept for multinomial sampling. __default: 256__
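As a sketch, a client payload for this endpoint could be assembled and validated like this. The helper name is hypothetical; the parameter names and bounds come from the list above:

```python
import json

def build_autocomplete_payload(context, **options):
    """Build the JSON body for a POST to /autocomplete.

    `context` is required; optional keys follow the documented
    parameters (length, n_samples, max_time, model_size,
    temperature, max_tokens, top_p, top_k).
    """
    allowed = {"length", "n_samples", "max_time", "model_size",
               "temperature", "max_tokens", "top_p", "top_k"}
    unknown = set(options) - allowed
    if unknown:
        raise ValueError(f"unknown parameters: {sorted(unknown)}")
    # Enforce the documented bounds before sending anything.
    if "n_samples" in options and not 0 < options["n_samples"] <= 3:
        raise ValueError("n_samples must satisfy 0 < n_samples <= 3")
    if "top_p" in options and not 0 < options["top_p"] < 1:
        raise ValueError("top_p must satisfy 0 < top_p < 1")
    if "model_size" in options and options["model_size"] not in ("small", "medium"):
        raise ValueError('model_size must be "small" or "medium"')
    return json.dumps({"context": context, **options})

# The resulting string is sent as the body of a POST request to
# http://<url>:6006/autocomplete (e.g. with urllib.request or curl).
payload = build_autocomplete_payload("That man is just another", n_samples=3)
```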
## Return format

The server returns a set of sentences generated from the context, in this format:

```
{sentences: {value: string, time: number}[], time: number}
```
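Given that shape, a response can be unpacked in a few lines. A minimal sketch, using a response dict that mirrors the example in this README (helper name is hypothetical):

```python
def extract_sentences(response):
    """Return the generated strings and the total server time
    from an /autocomplete response dict."""
    values = [s["value"] for s in response["sentences"]]
    return values, response["time"]

# Response shaped like the documented return format.
response = {
    "sentences": [
        {"value": " handicapped working man.", "time": 0.154},
        {"value": " guy, Mohr said.", "time": 0.175},
    ],
    "time": 0.265,
}
values, total_time = extract_sentences(response)
```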
Example:

With POST parameters as:

```json
{
    "context": "That man is just another",
    "n_samples": 3
}
```
The response is as follows:

```json
{
    "sentences": [
        {"value": " handicapped working man.", "time": 0.15415167808532715},
        {"value": " guy, doing everything his manly nature requires.", "time": 0.2581148147583008},
        {"value": " guy, Mohr said.", "time": 0.17547011375427246}
    ],
    "time": 0.264873743057251
}
```