MoritzLaurer (HF staff) committed
Commit 1ebde70
Parent(s): 2b80f8c

Update README.md

Files changed (1): README.md (+58, -1)
---
license: mit
tags:
- prompt
---

This repo illustrates how you can use the hf_hub_prompts library to load prompts from YAML files in open-weight model repositories.

Several open-weight models have been tuned for specific tasks with specific prompts.
For example, the InternVL2 vision-language models are among the very few VLMs trained for zero-shot bounding box prediction of arbitrary objects.
To elicit this capability, users need to use this special prompt: `Please provide the bounding box coordinate of the region this sentence describes: <ref>{region_to_detect}</ref>`

These kinds of task-specific special prompts are currently reported unsystematically in model cards, GitHub repos, .txt files, etc.
The hf_hub_prompts library standardises the sharing of prompts in YAML files.
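
For illustration, a prompt file for the InternVL2 bounding box task could look roughly like the sketch below. The field names here are assumptions for illustration, not the library's confirmed schema; see the actual internvl2-bbox-prompt.yaml in this repo for the real format.

```yaml
# hypothetical sketch of a prompt YAML file; field names are illustrative
prompt:
  messages:
    - role: "user"
      content:
        - type: "image_url"
          image_url:
            url: "{image_url}"
        - type: "text"
          text: "Please provide the bounding box coordinate of the region this sentence describes: <ref>{region_to_detect}</ref>"
  input_variables:
    - image_url
    - region_to_detect
```

Loading and populating such a prompt then takes a few lines: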

```py
#!pip install hf_hub_prompts
from hf_hub_prompts import download_prompt

# download the image prompt template
prompt_template = download_prompt(repo_id="MoritzLaurer/open_models_special_prompts", filename="internvl2-bbox-prompt.yaml")

# populate the prompt
image_url = "https://unsplash.com/photos/ZVw3HmHRhv0/download?ixid=M3wxMjA3fDB8MXxhbGx8NHx8fHx8fDJ8fDE3MjQ1NjAzNjl8&force=true&w=1920"
region_to_detect = "the bird"
messages = prompt_template.format_messages(image_url=image_url, region_to_detect=region_to_detect, client="openai")

print(messages)
# out: [{'role': 'user',
#        'content': [{'type': 'image_url',
#                     'image_url': {'url': 'https://unsplash.com/photos/ZVw3HmHRhv0/download?ixid=M3wxMjA3fDB8MXxhbGx8NHx8fHx8fDJ8fDE3MjQ1NjAzNjl8&force=true&w=1920'}},
#                    {'type': 'text',
#                     'text': 'Please provide the bounding box coordinate of the region this sentence describes: <ref>the bird</ref>'}]
#       }]
```

These populated prompts in the OpenAI messages format are directly compatible with vLLM or TGI containers.
When you host one of these containers on a Hugging Face Inference Endpoint, for example, you can call the model with the OpenAI client or with the HF InferenceClient.

```py
from openai import OpenAI
import os

ENDPOINT_URL = "https://tkuaxiztuv9pl4po.us-east-1.aws.endpoints.huggingface.cloud" + "/v1/"

# initialize the OpenAI client, but point it to an endpoint running vLLM or TGI
client = OpenAI(
    base_url=ENDPOINT_URL,
    api_key=os.getenv("HF_TOKEN")
)

response = client.chat.completions.create(
    model="/repository",  # with vLLM deployed on an HF endpoint, this must be "/repository", since the model artifacts are stored there
    messages=messages,
)

response.choices[0].message.content
# out: 'the bird[[54, 402, 515, 933]]'
```
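
The prose above also mentions the HF InferenceClient as an alternative. As a sketch, assuming a recent huggingface_hub release that supports OpenAI-compatible chat completion and reusing the `messages` from above, the same call could look like this:

```py
from huggingface_hub import InferenceClient
import os

# point the client at the endpoint root; the client handles the
# chat-completion routing itself (assumes huggingface_hub >= 0.23)
client = InferenceClient(
    base_url="https://tkuaxiztuv9pl4po.us-east-1.aws.endpoints.huggingface.cloud",
    api_key=os.getenv("HF_TOKEN"),
)

response = client.chat_completion(messages=messages)
print(response.choices[0].message.content)
```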
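
The model returns the bounding box inline in its text output (e.g. `'the bird[[54, 402, 515, 933]]'`). Purely as an illustration, a small helper like the one below (hypothetical, not part of hf_hub_prompts) could extract the coordinates:

```py
import re

# hypothetical helper, not part of hf_hub_prompts: pull the
# "[[x1, y1, x2, y2]]" coordinates out of the model's text output
def parse_bbox(output: str) -> list[int]:
    match = re.search(r"\[\[(\d+),\s*(\d+),\s*(\d+),\s*(\d+)\]\]", output)
    if match is None:
        raise ValueError(f"no bounding box found in: {output!r}")
    return [int(coord) for coord in match.groups()]

print(parse_bbox("the bird[[54, 402, 515, 933]]"))
# out: [54, 402, 515, 933]
```

Note that InternVL2 reportedly outputs coordinates on a normalized 0-1000 scale, so they may need rescaling to the image's pixel dimensions.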