Commit ba0ed2f
Parent: 5da8c14

Update README.md

Files changed (1): README.md (+34 −6)
@@ -1,9 +1,37 @@
1
  ---
2
- tags:
3
- - pytorch_model_hub_mixin
4
- - model_hub_mixin
5
  ---
 
6
 
7
- This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
8
- - Library: https://huggingface.co/robotics-diffusion-transformer/rdt-1b
9
- - Docs: [More Information Needed]
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
  ---
2
+ license: mit
 
 
3
  ---
4
+ # RDT-1B
5
 
6
+ RDT-1B is a 1B-parameter imitation learning Diffusion Transformer pre-trained on 1M+ multi-robot episodes. Given a language instruction and 3-view RGB image observations, RDT can predict the next
7
+ 64 robot actions. RDT is inherently compatible with almost all kinds of modern mobile manipulators, from single-arm to dual-arm, joint to EEF, pos. to vel., and even with a mobile chassis.
8
+
9
+ All the code and model weights are licensed under MIT license.
10
+
11
+ Please refer to our [project page](), [github repository]() and [paper]() for more information.
12
+
13
+ ## Model Details
14
+
15
+ - **Developed by** Thu-ml team
16
+ - **License:** MIT
17
+ - **Pretrain dataset:** [More Information Needed]
18
+ - **Finetune dataset:** [More Information Needed]
19
+
20
+ - **Repository:** [More Information Needed]
21
+ - **Paper :** [More Information Needed]
22
+ - **Project Page:** https://rdt-robotics.github.io/rdt-robotics/
23
+
24
+ ## Uses
25
+
26
+ RDT-1B supports finetuning and pre-training on custom dataset, as well as deploying and inferencing on real-robots.
27
+
28
+ Please refer to [our repository](https://github.com/GeneralEmbodiedSystem/RoboticsDiffusionTransformer/blob/main/docs/pretrain.md) for all the above guides.
29
+
30
+
31
+ ## Citation
32
+
33
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
34
+
35
+ **BibTeX:**
36
+
37
+ [More Information Needed]