Update README.md
README.md
CHANGED
@@ -22,19 +22,15 @@ tags:
<em>Details of our multimodal language model, ColonGPT.</em>
</p>

[Paper](https://arxiv.org) | [Home](https://github.com/ai4colonoscopy/IntelliScope)

> These are the merged weights of [ColonGPT-v1-phi1.5-siglip-lora](https://drive.google.com/drive/folders/1Emi7o7DpN0zlCPIYqsCfNMr9LTPt3SCT?usp=sharing), including the vision encoder (SigLIP), the language model (Phi-1.5), and the other weights fine-tuned on our ColonINST.
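
Here, "merged" means the LoRA adapters have already been folded back into the base weights, so the checkpoint loads without any adapter plumbing. The sketch below shows what that merging step generally looks like with peft; it is illustrative only (the paths are placeholders), and it is not something you need to run, because the weights on this page are already merged.

```python
# Generic illustration of LoRA merging with peft -- NOT the exact ColonGPT merge script.
# The adapter path and output directory below are placeholders.
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5")          # base language model
lora = PeftModel.from_pretrained(base, "path/to/ColonGPT-lora-adapters")  # placeholder adapter dir
merged = lora.merge_and_unload()           # fold the LoRA deltas into the base weights
merged.save_pretrained("ColonGPT-merged")  # placeholder output directory
```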

Our ColonGPT is a standard multimodal language model, which contains four basic components: a language tokenizer, a visual encoder (🤗 [SigLIP-SO](https://huggingface.co/google/siglip-so400m-patch14-384)), a multimodal connector, and a language model (🤗 [Phi1.5](https://huggingface.co/microsoft/phi-1_5)). On this Hugging Face page, we provide a quick start for the convenience of new users. For further details about ColonGPT, we highly recommend visiting our [homepage](https://github.com/BAAI-DCAI/Bunny). There, you'll find comprehensive usage instructions for our model and the latest advancements in intelligent colonoscopy technology.
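
As a rough sketch of how those four components fit together, assuming a plain linear projector as the multimodal connector (an illustration only, not ColonGPT's actual connector or fine-tuned weights):

```python
# Hedged sketch of the tokenizer + vision encoder + connector + language model layout.
# The linear connector is a placeholder assumption; ColonGPT's real connector differs.
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    SiglipImageProcessor,
    SiglipVisionModel,
)

tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5")
language_model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5")
vision_encoder = SiglipVisionModel.from_pretrained("google/siglip-so400m-patch14-384")
image_processor = SiglipImageProcessor.from_pretrained("google/siglip-so400m-patch14-384")

# Placeholder connector: project SigLIP patch features into the Phi-1.5 embedding space.
connector = torch.nn.Linear(
    vision_encoder.config.hidden_size, language_model.config.hidden_size
)

@torch.no_grad()
def encode_image(pil_image):
    """Turn one image into a sequence of visual tokens the language model can attend to."""
    pixel_values = image_processor(images=pil_image, return_tensors="pt").pixel_values
    patch_features = vision_encoder(pixel_values).last_hidden_state  # (1, num_patches, vision_dim)
    return connector(patch_features)                                 # (1, num_patches, lm_dim)

# In the full model, these projected visual tokens are concatenated with the text
# embeddings before being fed to the language model.
```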
# Quick start

Here is a code snippet to show you how to quickly try out our ColonGPT model with transformers. For convenience, we manually combined some configuration and code files and merged the weights. Please note that this is only a quick-start script; we recommend installing [ColonGPT's source code](https://github.com/ai4colonoscopy/IntelliScope/blob/main/docs/guideline-for-ColonGPT.md) to explore more.

- Before running the snippet, you only need to install the following minimum dependencies.
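
As a hedged illustration of the kind of transformers-based call that quick start boils down to (not the official snippet): the repo id below is a placeholder for this model page, `trust_remote_code=True` is assumed because the merged checkpoint ships with custom modeling code, and image-grounded prompts should follow the format defined in the ColonGPT guideline linked above.

```python
# Hedged sketch of a transformers-based quick try. Placeholders: the repo id must be
# replaced with the id of this model page; the text-only prompt is only a smoke test.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<this-model-page-id>"  # placeholder: the "user/repo" id shown at the top of this page

model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)  # custom modeling code assumed
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

prompt = "Describe the main finding in this colonoscopy image."
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```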