tttoaster committed
Commit: 6d0dd2a
Parent: ac93ae7

Update app.py

Files changed (1): app.py (+2 -2)
app.py CHANGED
@@ -649,7 +649,7 @@ def load_demo(request: gr.Request):

title = ("""
# SEED-X-I
- [[Paper]](https://arxiv.org/abs/2404.14396) [[Code]](https://github.com/AILab-CVC/SEED-X)
+ [[Paper]](https://arxiv.org/abs/2404.14396) [[Code]](https://github.com/AILab-CVC/SEED-X) [[Faster Demo]](https://arc.tencent.com/en/ai-demos/multimodal)

Demo of a general instruction-tuned model SEED-X-I (17B) from the foundation model SEED-X.

@@ -657,7 +657,7 @@ SEED-X-I can follow multimodal instruction (including images with **dynamic reso

SEED-X-I **does not support image manipulation**. If you want to experience **SEED-X-Edit** for high-precision image editing, please refer to [[Inference Code]](https://github.com/AILab-CVC/SEED-X).

- If you want to experience the normal model inference speed, you can run [[Inference Code]](https://github.com/AILab-CVC/SEED-X) locally.
+ If you want to experience the normal model inference speed, you can use [[Faster Demo]](https://arc.tencent.com/en/ai-demos/multimodal) or run [[Inference Code]](https://github.com/AILab-CVC/SEED-X) locally.


## Tips:
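For context, a minimal sketch of how a Markdown title string like the one edited above is typically wired into a Gradio demo. This is an illustration only, assuming a standard `gr.Blocks` layout with `gr.Markdown`; it is not the repository's actual app.py.

```python
# Minimal sketch, not the actual SEED-X app.py: shows how the edited `title`
# Markdown (with the added [[Faster Demo]] link) would typically be rendered
# in a Gradio Blocks demo. The layout below is an assumption for illustration.
import gradio as gr

title = ("""
# SEED-X-I
[[Paper]](https://arxiv.org/abs/2404.14396) [[Code]](https://github.com/AILab-CVC/SEED-X) [[Faster Demo]](https://arc.tencent.com/en/ai-demos/multimodal)

Demo of a general instruction-tuned model SEED-X-I (17B) from the foundation model SEED-X.
""")

with gr.Blocks() as demo:
    # gr.Markdown renders the string as Markdown, so the [[...]](...) links
    # appear as clickable text at the top of the page.
    gr.Markdown(title)

if __name__ == "__main__":
    demo.launch()
```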