BasicNp committed on
Commit c1dfd6f
1 Parent(s): e8aa256

Upload README.md

Files changed (1)
  1. README.md +10 -414
README.md CHANGED
@@ -1,417 +1,13 @@
1
- # DragAnything
2
-
3
- ### <div align="center"> DragAnything: Motion Control for Anything using Entity Representation <div>
4
-
5
- <div align="center">
6
- <a href="https://weijiawu.github.io/draganything_page/"><img src="https://img.shields.io/static/v1?label=Project%20Page&message=Github&color=blue&logo=github-pages"></a> &ensp;
7
- <a href="https://arxiv.org/abs/2403.07420/"><img src="https://img.shields.io/static/v1?label=Paper&message=Arxiv&color=red&logo=arxiv"></a> &ensp;
8
- </div>
9
-
10
- <p align="center">
11
- <img src="./assets/1709656085862.jpg" width="800px"/>
12
- <br>
13
- </p>
14
-
15
-
16
- ## :notes: **Updates**
17
-
18
- <!--- [ ] Mar. 13, 2024. Release the train code in **three month**.-->
19
- - [x] Mar. 24, 2024. Support interactive demo with gradio.
20
- - [x] Mar. 13, 2024. Release the inference code.
21
- [x] Mar. 12, 2024. Repo initialization.
22
-
23
-
24
  ---
25
-
26
- ## 🐱 Abstract
27
- We introduce DragAnything, which utilizes an entity representation to achieve motion control for any object in controllable video generation. Compared with existing motion control methods, DragAnything offers several advantages. Firstly, trajectory-based interaction is more user-friendly, since acquiring other guidance signals (e.g., masks, depth maps) is labor-intensive; users only need to draw a line (trajectory) during interaction. Secondly, our entity representation serves as an open-domain embedding capable of representing any object, enabling motion control for diverse entities, including the background. Lastly, our entity representation allows simultaneous and distinct motion control for multiple objects. Extensive experiments demonstrate that DragAnything achieves state-of-the-art performance on FVD, FID, and user studies, particularly for object motion control, where our method surpasses the previous state of the art (DragNUWA) by 26% in human voting.
28
-
29
  ---
30
- ## User-Trajectory Interaction with SAM
31
- <table class="center">
32
- <tr>
33
- <td style="text-align:center;"><b>Input Image</b></td>
34
- <td style="text-align:center;"><b>Drag point with SAM</b></td>
35
- <td style="text-align:center;"><b>2D Gaussian Trajectory</b></td>
36
- <td style="text-align:center;"><b>Generated Video</b></td>
37
- </tr>
38
- <tr>
39
- <td><img src="./assets/1709660422197.jpg" width="177" height="100"></td>
40
- <td><img src="./assets/1709660459944.jpg" width="177" height="100"></td>
41
- <td><img src="./assets/image28 (3).gif" width="177" height="100"></td>
42
- <td><img src="./assets/image28 (2).gif" width="177" height="100"></td>
43
- </tr>
44
- <tr>
45
- <td><img src="./assets/1709660422197.jpg" width="177" height="100"></td>
46
- <td><img src="./assets/1709660471568.jpg" width="177" height="100"></td>
47
- <td><img src="./assets/image2711.gif" width="177" height="100"></td>
48
- <td><img src="./assets/image27 (1)1.gif" width="177" height="100"></td>
49
- </tr>
50
- <tr>
51
- <td><img src="./assets/1709660422197.jpg" width="177" height="100"></td>
52
- <td><img src="./assets/1709660965701.jpg" width="177" height="100"></td>
53
- <td><img src="./assets/image29111.gif" width="177" height="100"></td>
54
- <td><img src="./assets/image29 (1)1.gif" width="177" height="100"></td>
55
- </tr>
56
- <tr>
57
- <td><img src="./assets/1709660422197.jpg" width="177" height="100"></td>
58
- <td><img src="./assets/1709661150250.jpg" width="177" height="100"></td>
59
- <td><img src="./assets/image30 (1)1.gif" width="177" height="100"></td>
60
- <td><img src="./assets/image3011.gif" width="177" height="100"></td>
61
- </tr>
62
-
63
- </table>
64
-
65
-
66
- ## Comparison with DragNUWA
67
- <table class="center">
68
- <tr>
69
- <td style="text-align:center;"><b>Model</b></td>
70
- <td style="text-align:center;"><b>Input Image and Drag</b></td>
71
- <td style="text-align:center;"><b>Generated Video</b></td>
72
- <td style="text-align:center;"><b>Visualization for Pixel Motion</b></td>
73
- </tr>
74
- <tr>
75
- <td style="text-align:center;"><b>DragNUWA</b></td>
76
- <td><img src="./assets/1709661872632.jpg" width="177" height="100"></td>
77
- <td><img src="./assets/image63111.gif" width="177" height="100"></td>
78
- <td><img src="./assets/image6411.gif" width="177" height="100"></td>
79
- </tr>
80
- <tr>
81
- <td style="text-align:center;"><b>Ours</b></td>
82
- <td><img src="./assets/1709662077471.jpg" width="177" height="100"></td>
83
- <td><img src="./assets/image65111.gif" width="177" height="100"></td>
84
- <td><img src="./assets/image6611.gif" width="177" height="100"></td>
85
- </tr>
86
- <tr>
87
- <td style="text-align:center;"><b>DragNUWA</b></td>
88
- <td><img src="./assets/1709662293661.jpg" width="177" height="100"></td>
89
- <td><img src="./assets/image77.gif" width="177" height="100"></td>
90
- <td><img src="./assets/image76.gif" width="177" height="100"></td>
91
- </tr>
92
- <tr>
93
- <td style="text-align:center;"><b>Ours</b></td>
94
- <td><img src="./assets/1709662429867.jpg" width="177" height="100"></td>
95
- <td><img src="./assets/image75.gif" width="177" height="100"></td>
96
- <td><img src="./assets/image74.gif" width="177" height="100"></td>
97
- </tr>
98
- <tr>
99
- <td style="text-align:center;"><b>DragNUWA</b></td>
100
- <td><img src="./assets/1709662596207.jpg" width="177" height="100"></td>
101
- <td><img src="./assets/image84.gif" width="177" height="100"></td>
102
- <td><img src="./assets/image85.gif" width="177" height="100"></td>
103
- </tr>
104
- <tr>
105
- <td style="text-align:center;"><b>Ours</b></td>
106
- <td><img src="./assets/1709662724643.jpg" width="177" height="100"></td>
107
- <td><img src="./assets/image87.gif" width="177" height="100"></td>
108
- <td><img src="./assets/image88.gif" width="177" height="100"></td>
109
- </tr>
110
-
111
-
112
-
113
- </table>
114
-
115
-
116
-
117
- ## More Demo
118
-
119
-
120
- <table class="center">
121
- <tr>
122
- <td style="text-align:center;"><b>Drag point with SAM</b></td>
123
- <td style="text-align:center;"><b>2D Gaussian</b></td>
124
- <td style="text-align:center;"><b>Generated Video</b></td>
125
- <td style="text-align:center;"><b>Visualization for Pixel Motion</b></td>
126
- </tr>
127
- <tr>
128
- <td><img src="./assets/1709656550343.jpg" width="177" height="100"></td>
129
- <td><img src="./assets/image188.gif" width="177" height="100"></td>
130
- <td><img src="./assets/image190.gif" width="177" height="100"></td>
131
- <td><img src="./assets/image189.gif" width="177" height="100"></td>
132
- </tr>
133
- <tr>
134
- <td><img src="./assets/1709657635807.jpg" width="177" height="100"></td>
135
- <td><img src="./assets/image187 (1).gif" width="177" height="100"></td>
136
- <td><img src="./assets/image186.gif" width="177" height="100"></td>
137
- <td><img src="./assets/image185.gif" width="177" height="100"></td>
138
- </tr>
139
- <tr>
140
- <td><img src="./assets/1709658516913.jpg" width="177" height="100"></td>
141
- <td><img src="./assets/image158.gif" width="177" height="100"></td>
142
- <td><img src="./assets/image159.gif" width="177" height="100"></td>
143
- <td><img src="./assets/image160.gif" width="177" height="100"></td>
144
- </tr>
145
- <tr>
146
- <td><img src="./assets/1709658781935.jpg" width="177" height="100"></td>
147
- <td><img src="./assets/image163.gif" width="177" height="100"></td>
148
- <td><img src="./assets/image161.gif" width="177" height="100"></td>
149
- <td><img src="./assets/image162.gif" width="177" height="100"></td>
150
- </tr>
151
- <tr>
152
- <td><img src="./assets/1709659276722.jpg" width="177" height="100"></td>
153
- <td><img src="./assets/image165.gif" width="177" height="100"></td>
154
- <td><img src="./assets/image167.gif" width="177" height="100"></td>
155
- <td><img src="./assets/image166.gif" width="177" height="100"></td>
156
- </tr>
157
- <tr>
158
- <td><img src="./assets/1709659787625.jpg" width="177" height="100"></td>
159
- <td><img src="./assets/image172.gif" width="177" height="100"></td>
160
- <td><img src="./assets/Our_Motorbike_cloud_floor.gif" width="177" height="100"></td>
161
- <td><img src="./assets/image171.gif" width="177" height="100"></td>
162
- </tr>
163
-
164
-
165
- </table>
166
-
167
-
168
- ## Various Motion Control
169
- <table class="center">
170
- <tr>
171
- <td style="text-align:center;"><b>Drag point with SAM</b></td>
172
- <td style="text-align:center;"><b>2D Gaussian</b></td>
173
- <td style="text-align:center;"><b>Generated Video</b></td>
174
- <td style="text-align:center;"><b>Visualization for Pixel Motion</b></td>
175
- </tr>
176
-
177
- <tr>
178
- <td><img src="./assets/1709663429471.jpg" width="177" height="100"></td>
179
- <td><img src="./assets/image265.gif" width="177" height="100"></td>
180
- <td><img src="./assets/image265 (1).gif" width="177" height="100"></td>
181
- <td><img src="./assets/image268.gif" width="177" height="100"></td>
182
- </tr>
183
- <tr>
184
- <td><img src="./assets/1709663831581.jpg" width="177" height="100"></td>
185
- <td><img src="./assets/image274.gif" width="177" height="100"></td>
186
- <td><img src="./assets/image274 (1).gif" width="177" height="100"></td>
187
- <td><img src="./assets/image276.gif" width="177" height="100"></td>
188
- </tr>
189
- <tr>
190
- <td style="text-align:center;" colspan="4"><b>(a) Motion Control for Foreground</b></td>
191
- </tr>
192
- <tr>
193
- <td><img src="./assets/1709664593048.jpg" width="177" height="100"></td>
194
- <td><img src="./assets/image270.gif" width="177" height="100"></td>
195
- <td><img src="./assets/image270 (1).gif" width="177" height="100"></td>
196
- <td><img src="./assets/image269.gif" width="177" height="100"></td>
197
- </tr>
198
- <tr>
199
- <td><img src="./assets/1709664834397.jpg" width="177" height="100"></td>
200
- <td><img src="./assets/image271.gif" width="177" height="100"></td>
201
- <td><img src="./assets/image271 (1).gif" width="177" height="100"></td>
202
- <td><img src="./assets/image272.gif" width="177" height="100"></td>
203
- </tr>
204
- <tr>
205
- <td style="text-align:center;" colspan="4"><b>(b) Motion Control for Background</b></td>
206
- </tr>
207
- <tr>
208
- <td><img src="./assets/1709665073460.jpg" width="177" height="100"></td>
209
- <td><img src="./assets/image279.gif" width="177" height="100"></td>
210
- <td><img src="./assets/image278.gif" width="177" height="100"></td>
211
- <td><img src="./assets/image277.gif" width="177" height="100"></td>
212
- </tr>
213
- <tr>
214
- <td><img src="./assets/1709665252573.jpg" width="177" height="100"></td>
215
- <td><img src="./assets/image282.gif" width="177" height="100"></td>
216
- <td><img src="./assets/image280.gif" width="177" height="100"></td>
217
- <td><img src="./assets/image281.gif" width="177" height="100"></td>
218
- </tr>
219
- <tr>
220
- <td style="text-align:center;" colspan="4"><b>(c) Simultaneous Motion Control for Foreground and Background
221
- </b></td>
222
- </tr>
223
- <tr>
224
- <td><img src="./assets/1709665505339.jpg" width="177" height="100"></td>
225
- <td><img src="./assets/image283.gif" width="177" height="100"></td>
226
- <td><img src="./assets/image283 (1).gif" width="177" height="100"></td>
227
- <td><img src="./assets/image285.gif" width="177" height="100"></td>
228
- </tr>
229
- <tr>
230
- <td><img src="./assets/1709666205795.jpg" width="177" height="100"></td>
231
- <td><img src="./assets/image286.gif" width="177" height="100"></td>
232
- <td><img src="./assets/image288.gif" width="177" height="100"></td>
233
- <td><img src="./assets/image287.gif" width="177" height="100"></td>
234
- </tr>
235
- <tr>
236
- <td><img src="./assets/1709666401284.jpg" width="177" height="100"></td>
237
- <td><img src="./assets/image289.gif" width="177" height="100"></td>
238
- <td><img src="./assets/image290.gif" width="177" height="100"></td>
239
- <td><img src="./assets/image291.gif" width="177" height="100"></td>
240
- </tr>
241
- <tr>
242
- <td><img src="./assets/1709666772216.jpg" width="177" height="100"></td>
243
- <td><img src="./assets/image294.gif" width="177" height="100"></td>
244
- <td><img src="./assets/image293.gif" width="177" height="100"></td>
245
- <td><img src="./assets/image292.gif" width="177" height="100"></td>
246
- </tr>
247
- <tr>
248
- <td style="text-align:center;" colspan="4"><b>(d) Motion Control for Camera Motion
249
- </b></td>
250
- </tr>
251
-
252
- </table>
253
-
254
- ## 🔧 Dependencies and Dataset Preparation
255
-
256
- ### Dependencies
257
- Python >= 3.8 (we recommend [Anaconda](https://www.anaconda.com/download/#linux) or [Miniconda](https://docs.conda.io/en/latest/miniconda.html))
258
- - [PyTorch >= 1.13.0+cu11.7](https://pytorch.org/)
259
-
260
- ```Shell
261
- git clone https://github.com/Showlab/DragAnything.git
262
- cd DragAnything
263
-
264
- conda create -n DragAnything python=3.8
265
- conda activate DragAnything
266
- pip install -r environment.txt
267
- ```
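As a quick, optional sanity check (not part of the original instructions), you can confirm that PyTorch imports and sees the GPU inside the activated environment:

```Shell
# Optional sanity check: run inside the activated DragAnything environment.
# Prints the installed PyTorch version and whether a CUDA device is visible.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```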
268
-
269
- ### Dataset Preparation
270
-
271
- Download [VIPSeg](https://github.com/VIPSeg-Dataset/VIPSeg-Dataset) and [Youtube-VOS](https://youtube-vos.org/) to the ```./data``` directory.
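The layout below is only a sketch of where the two datasets might live; the sub-directory names are illustrative assumptions, so check the preprocessing and training scripts for the exact paths they expect.

```Shell
# Illustrative layout only - directory names are assumptions:
mkdir -p ./data
# ./data/VIPSeg/        <- extracted VIPSeg release
# ./data/Youtube-VOS/   <- extracted YouTube-VOS release
```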
272
-
273
- ### Motion Trajectory Annotation Preparation
274
- You can use our preprocessed annotation files, or generate your own motion trajectory annotations with [CoTracker](https://github.com/facebookresearch/co-tracker?tab=readme-ov-file#installation-instructions).
275
-
276
-
277
- If you generate the motion trajectory annotations yourself, follow the processing steps outlined in [CoTracker](https://github.com/facebookresearch/co-tracker?tab=readme-ov-file#installation-instructions):
278
-
279
- ```Shell
280
- cd ./utils/co-tracker
281
- pip install -e .
282
- pip install matplotlib flow_vis tqdm tensorboard
283
-
284
- mkdir -p checkpoints
285
- cd checkpoints
286
- wget https://huggingface.co/facebook/cotracker/resolve/main/cotracker2.pth
287
- cd ..
288
-
289
- ```
290
- Then modify ```video_path```, ```ann_path```, and ```save_path``` in ```Generate_Trajectory_for_VIPSeg.sh``` and run the script (an illustrative example of these variables follows the command below). The trajectory annotations will be saved as ```.json``` files in the ```save_path``` directory.
291
-
292
- ```Shell
293
- sh Generate_Trajectory_for_VIPSeg.sh
294
-
295
- ```
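For reference, the three variables inside the script might be set as follows; every path here is an illustrative assumption and should be replaced with the locations of your own frames, mask annotations, and desired output directory.

```Shell
# Illustrative values for Generate_Trajectory_for_VIPSeg.sh (paths are assumptions):
video_path=./data/VIPSeg/videos          # input video frames
ann_path=./data/VIPSeg/annotations       # per-frame mask annotations
save_path=./data/VIPSeg/trajectories     # output .json trajectory files
```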
296
-
297
- ### Trajectory visualization
298
- You can run the following command for visualization.
299
-
300
- ```Shell
301
- cd ./utils/
302
- python vis_trajectory.py
303
- ```
304
-
305
- ### Pretrained Model Preparation
306
-
307
- We adopt [ChilloutMix](https://civitai.com/models/6424/chilloutmix) as the pretrained model for extracting the entity representation; please download the Diffusers version:
308
-
309
- ```bash 
310
- mkdir -p utils/pretrained_models
311
- cd utils/pretrained_models
312
-
313
- # Diffusers-version ChilloutMix to utils/pretrained_models
314
- git-lfs clone https://huggingface.co/windwhinny/chilloutmix.git
315
- ```
316
-
317
- You can also download our pretrained ControlNet model:
318
- ```bash 
319
- mkdir -p model_out/DragAnything
320
- cd model_out/DragAnything
321
-
322
- # Diffusers-version DragAnything to model_out/DragAnything
323
- git-lfs clone https://huggingface.co/weijiawu/DragAnything
324
- ```
325
-
326
-
327
-
328
- ## :paintbrush: Train (Awaiting release) <!-- omit in toc -->
329
-
330
- ### 1) Semantic Embedding Extraction
331
-
332
- ```Shell
333
- cd ./utils/
334
- python extract_semantic_point.py
335
- ```
336
-
337
- ### 2) Train DragAnything
338
-
339
- For VIPSeg
340
- ```Shell
341
- sh ./script/train_VIPSeg.sh
342
- ```
343
-
344
- For YouTube VOS
345
- ```Shell
346
- sh ./script/train_youtube_vos.sh
347
- ```
348
-
349
- ## :paintbrush: Evaluation <!-- omit in toc -->
350
-
351
- ### Evaluation for [FID](https://github.com/mseitzer/pytorch-fid)
352
-
353
- ```Shell
354
- cd utils
355
- sh Evaluation_FID.sh
356
- ```
357
-
358
- ### Evaluation for [Fréchet Video Distance (FVD)](https://github.com/hyenal/relate/blob/main/extras/README.md)
359
-
360
- ```Shell
361
- cd utils/Eval_FVD
362
- sh compute_fvd.sh
363
- ```
364
-
365
- ### Evaluation for ObjMC
366
-
367
- ```Shell
368
- cd utils/Eval_ObjMC
369
- python ./ObjMC.py
370
- ```
371
-
372
-
373
-
374
- ## :paintbrush: Inference for a single video <!-- omit in toc -->
375
-
376
-
377
- ```Shell
378
- python demo.py
379
- ```
380
-
381
- Alternatively, run the interactive inference with Gradio (install ```gradio==3.50.2```):
382
- ```Shell
383
- cd ./script
384
- ```
385
- Download the ```sam_vit_h_4b8939.pth``` checkpoint from [SAM](https://github.com/facebookresearch/segment-anything?tab=readme-ov-file#model-checkpoints); a download sketch is shown below.
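One way to fetch it is sketched here; the URL is the public ViT-H link from the SAM model-checkpoint table, while the destination directory is an assumption, so place the file wherever ```gradio_run.py``` expects it.

```Shell
# Download the SAM ViT-H checkpoint (URL taken from the official SAM README).
# The -P destination is an assumption; adjust to where gradio_run.py looks for it.
wget https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth -P ./script/
```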
386
-
387
- ```Shell
388
- python gradio_run.py
389
- ```
390
-
391
-
392
- ### :paintbrush: Visualization of pixel motion for the generated video <!-- omit in toc -->
393
-
394
- ```Shell
395
- cd utils/co-tracker
396
- python demo.py
397
- ```
398
-
399
-
400
-
401
- ## 📖 BibTeX
402
- @misc{wu2024draganything,
403
- title={DragAnything: Motion Control for Anything using Entity Representation},
404
- author={Weijia Wu and Zhuang Li and Yuchao Gu and Rui Zhao and Yefei He and David Junhao Zhang and Mike Zheng Shou and Yan Li and Tingting Gao and Di Zhang},
405
- year={2024},
406
- eprint={2403.07420},
407
- archivePrefix={arXiv},
408
- primaryClass={cs.CV}
409
- }
410
-
411
-
412
- ## 🤗 Acknowledgements
413
- - Thanks to [Diffusers](https://github.com/huggingface/diffusers) for the wonderful work and codebase.
414
- - Thanks to [svd-temporal-controlnet](https://github.com/CiaraStrawberry/svd-temporal-controlnet) for the wonderful work and codebase.
415
- - Thanks to chaojie for building [ComfyUI-DragAnything](https://github.com/chaojie/ComfyUI-DragAnything).
416
-
417
1
  ---
2
+ title: Dragreal
3
+ emoji: 🐠
4
+ colorFrom: indigo
5
+ colorTo: blue
6
+ sdk: gradio
7
+ sdk_version: 4.23.0
8
+ app_file: app.py
9
+ pinned: false
10
+ license: mit
11
  ---
12
 
13
+ Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference