Vivien Chappelier committed on
Commit ab3d5c6
1 Parent(s): 24fcf8d

update README

README.md CHANGED
@@ -13,28 +13,31 @@ You can use this classifier to detect watermarks generated with our [SDXL-turbo
  ## Usage
  
  ```py
- from transformers import AutoModel, BlipImageProcessor
+ from transformers import AutoModelForImageClassification, BlipImageProcessor
+ 
  from PIL import Image
  import sys
- import torch
  
  image_processor = BlipImageProcessor.from_pretrained("imatag/stable-signature-bzh-detector-resnet18")
- commit_hash = "584a7bc01dc0f02e53bf8b8b295717ed09ed7294"
- model = AutoModel.from_pretrained("imatag/stable-signature-bzh-detector-resnet18", trust_remote_code=True, revision=commit_hash)
+ model = AutoModelForImageClassification.from_pretrained("imatag/stable-signature-bzh-detector-resnet18")
+ model.eval()
  
  img = Image.open(sys.argv[1]).convert("RGB")
  inputs = image_processor(img, return_tensors="pt")
- with torch.no_grad():
-     p = torch.sigmoid(model(**inputs).logits).item()
+ p = model(**inputs).logits[0,0] < 0
  
- print(f"approximate p-value: {p}")
+ print(f"watermarked: {p}")
  ```
  
  ## Purpose
  
  This model is an approximate version of [IMATAG](https://www.imatag.com/)'s BZH decoder for the watermark embedded in our [SDXL-turbo watermarking demo](https://huggingface.co/spaces/imatag/stable-signature-bzh).
  It works on this watermark only and cannot be used to decode other watermarks.
- It will produce an approximate p-value measuring the risk of mistakenly detecting a watermark on a benign (non-watermarked) image. For an exact p-value and improved robustness, please use the [API](https://huggingface.co/spaces/imatag/stable-signature-bzh/resolve/main/detect_api.py) instead.
+ 
+ It will catch most altered versions of a watermarked image while making roughly one mistake in one thousand on non-watermarked images.
+ Alternatively, it can produce an approximate p-value measuring the risk of mistakenly detecting a watermark on a benign (non-watermarked) image, by recalibrating the output as in [this script](https://huggingface.co/imatag/stable-signature-bzh-detector-resnet18/resolve/main/detect_demo_pvalue.py).
+ 
+ To get an exact p-value and for improved robustness, please use the [API](https://huggingface.co/spaces/imatag/stable-signature-bzh/resolve/main/detect_api.py) instead.
  
  For more details on this watermarking technique, check out our [announcement](https://www.imatag.com/blog/unlocking-the-future-of-content-authentication-imatags-breakthrough-in-ai-generated-image-watermarking) and our lab's [blog post](https://imatag-lab.medium.com/stable-signature-meets-bzh-53ad0ba13691).
 
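The README diff replaces the sigmoid-based score with a sign test on the raw logit. The two decisions are consistent: sigmoid(x) < 0.5 exactly when x < 0, so thresholding the logit at zero matches thresholding the old sigmoid output at 0.5. A minimal, model-free sketch of that equivalence (the helper names are illustrative, not part of the repository):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# Thresholding the raw logit at 0 gives the same decision as thresholding
# its sigmoid at 0.5, since sigmoid is monotonic and sigmoid(0) == 0.5.
def detect_by_logit(logit: float) -> bool:
    return logit < 0.0

def detect_by_sigmoid(logit: float) -> bool:
    return sigmoid(logit) < 0.5
```

Skipping the sigmoid avoids an unneeded transform when only the binary watermarked/not-watermarked decision is reported.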
detect_demo_pvalue.py ADDED
@@ -0,0 +1,22 @@
+ from transformers import AutoModelForImageClassification, BlipImageProcessor
+ from huggingface_hub import hf_hub_download
+ from safetensors import safe_open
+ 
+ from PIL import Image
+ import sys
+ import torch
+ 
+ image_processor = BlipImageProcessor.from_pretrained("imatag/stable-signature-bzh-detector-resnet18")
+ model = AutoModelForImageClassification.from_pretrained("imatag/stable-signature-bzh-detector-resnet18")
+ calibration = hf_hub_download("imatag/stable-signature-bzh-detector-resnet18", filename="calibration.safetensors")
+ with safe_open(calibration, framework="pt") as f:
+     calibration_logits = f.get_tensor("logits")
+ 
+ img = Image.open(sys.argv[1]).convert("RGB")
+ inputs = image_processor(img, return_tensors="pt")
+ with torch.no_grad():
+     p = model(**inputs).logits[...,0]
+ p = (1 + torch.searchsorted(calibration_logits, p)) / calibration_logits.shape[0]
+ p = p.item()
+ 
+ print(f"approximate p-value: {p}")
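The calibration step in this script is an empirical p-value: the observed logit is ranked by binary search against pre-computed reference logits (stored in `calibration.safetensors`), and the rank is normalized by the sample size. A minimal sketch of the same ranking using only the standard library, with synthetic calibration values made up for illustration:

```python
import bisect

# Synthetic stand-in for the calibration logits tensor: detector outputs on
# reference images, sorted ascending so that binary-search ranking works.
calibration_logits = sorted([-3.2, -1.5, -0.7, 0.1, 0.9, 1.8, 2.4, 3.0, 3.6, 4.1])

def approximate_p_value(logit: float) -> float:
    # Mirrors `(1 + torch.searchsorted(calibration_logits, p)) / N`:
    # count calibration logits strictly below the observed one (left-side
    # insertion point), add one, and normalize by the calibration size.
    rank = bisect.bisect_left(calibration_logits, logit)
    return (1 + rank) / len(calibration_logits)
```

A strongly negative logit (watermark evidence for this detector) falls below the whole calibration set and gets the smallest attainable p-value, 1/N, so larger calibration sets give finer resolution.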
detect_demo.py → detect_demo_simple.py RENAMED
@@ -1,14 +1,14 @@
- from transformers import AutoModel, BlipImageProcessor
+ from transformers import AutoModelForImageClassification, BlipImageProcessor
+ 
  from PIL import Image
  import sys
- import torch
  
  image_processor = BlipImageProcessor.from_pretrained("imatag/stable-signature-bzh-detector-resnet18")
- commit_hash = "584a7bc01dc0f02e53bf8b8b295717ed09ed7294"
- model = AutoModel.from_pretrained("imatag/stable-signature-bzh-detector-resnet18", trust_remote_code=True, revision=commit_hash)
+ model = AutoModelForImageClassification.from_pretrained("imatag/stable-signature-bzh-detector-resnet18")
+ model.eval()
  
  img = Image.open(sys.argv[1]).convert("RGB")
  inputs = image_processor(img, return_tensors="pt")
- with torch.no_grad():
-     p = torch.sigmoid(model(**inputs).logits).item()
- print(f"approximate p-value: {p}")
+ p = model(**inputs).logits[0,0] < 0
+ 
+ print(f"watermarked: {p}")