Update README.md

README.md CHANGED

@@ -23,60 +23,5 @@ Commonsense Object Affordance Task [COAT]
<a href="https://openreview.net/pdf?id=xYkdmEGhIM">OpenReview</a> | <a href="https://drive.google.com/drive/u/4/folders/1reH0JHhPM_tFzDMcAaJF0PycFMixfIbo">Datasets</a>
</p>

-<img src="https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/utility-intro(1).png" alt="Paper Summary Flowchart">
-<em>A 3-level framework outlining human commonsense-style reasoning for estimating object affordance for various tasks</em>
-</p>
-
-### Experimental Setup:
-- Task List: [tasks](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/tasks.json)
-- Object List: [objects](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/concepts.json)
-- Utility List[^1]: [utilities](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/concepts.json)
-- Variables Used: ```temperature```, ```mass```, ```material```, ```already-in-use```, ```condition```
-
-### Utility Level Pruning:
-This gives us ```Utility```-to-```Object``` mappings, also called ```utility objects```.
-- GT Object-Utility Mappings: [utility-mappings](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/objects.json)
-
-### Task-u (Utility Level):
-Here we evaluate models on their ability to prune out appropriate objects on the basis of utility.
-- GT (Utility)-(Object) Mappings: [utility-objects](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/objects.json)
-- Task-u Dataset: [4 Variations](https://drive.google.com/drive/folders/1JJSIicKGp0a7ThsenKl0XWKsTtPL_b5z?usp=sharing)
-
-### Task-0 (Context Level):
-Here we evaluate models on their ability to prune out appropriate objects on the basis of context. This gives us ```(Task, Utility)```-to-```Object``` mappings, also called ```context objects```.
-- GT (Task-Utility)-(Object) Mappings: [context-objects](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/oracle.json)
-- Task-0 Dataset: [4 Variations](https://drive.google.com/drive/folders/1reH0JHhPM_tFzDMcAaJF0PycFMixfIbo?usp=sharing)
-
-### Task-1 (Physical State Level):
-Here we evaluate models on their ability to pick out the ```ideal``` configuration when presented with a number of ```context object``` configurations (something that is fairly obvious to humans).
-- All Possible Common Configurations: [possible configurations](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/task-1/possible_configurations_v1.json)
-- Ideal Configurations: [ideal configurations](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/task-1/pouch_config_oracle.json)
-- Commonsense Common Occurrence Variables: [common variable values](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/task-1/common_var_responses.json)
-- Task-1 Dataset: [12 Variations](https://drive.google.com/drive/folders/1reH0JHhPM_tFzDMcAaJF0PycFMixfIbo?usp=sharing)
-
-### Task-2 (Physical State Level):
-Here we evaluate models on their ability to pick out the most appropriate ```sub-optimal``` configuration when presented with a number of sub-optimal configurations of ```context objects``` (again, fairly obvious to humans).
-- Suboptimal Configurations: [suboptimal configurations](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/task-2/pouch_suboptimal.json)
-- Human Preference Material Order: [material preference](https://huggingface.co/datasets/Ayush8120/commonsense-embodied-affordance/blob/main/task-2/material_preference.json)
-- Task-2 Dataset: [14 Variations](https://drive.google.com/drive/folders/1reH0JHhPM_tFzDMcAaJF0PycFMixfIbo?usp=sharing)
----------------------------------------------------------------------------------------------------------------
-
-### Finetuning Datasets
-
-Please refer to [Appendix F.1](https://openreview.net/pdf?id=xYkdmEGhIM) for dataset details.
-
-- Finetuning Dataset for Object Level Selection: [Google Drive Link](https://drive.google.com/drive/folders/1GtrGQxTTtYEczYK1ytB71Y2HGxM1TEu5?usp=drive_link)
-- Finetuning Dataset for Physical State Level Selection: [Google Drive Link](https://drive.google.com/drive/folders/1FiZc8u_G8wUrN4NroZmIgmcTe0jor72T?usp=drive_link)
-
-### Full Pipeline Evaluation Datasets
-
-Please refer to [Appendix F.2](https://openreview.net/pdf?id=xYkdmEGhIM) for dataset details.
-
-- Ideal Object Choice Datasets: [Google Drive Link](https://drive.google.com/drive/folders/1SMM2TU1BKH32oKtfmW0gS3QfyUA68IZ0?usp=drive_link)
-- Moderate Object Choice Datasets: [Google Drive Link](https://drive.google.com/drive/folders/1SlZQBp4Iao3VHnmOFZMKfzn_LWOctnVE?usp=drive_link)
-
-[^1]: For the purpose of the datasets, we've used `concept` and `utility` interchangeably.
----------------------------------------------------------------------------------------------------------------
+Please refer [here](https://github.com/Ayush8120/COAT) for the documentation
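The removed README section above describes a three-level pruning pipeline: utility-level pruning to ```utility objects```, context-level pruning to ```context objects```, and physical-state-level selection over configuration variables. A minimal sketch of that flow, using entirely made-up toy mappings (the names, objects, and scoring rule below are illustrative assumptions, not the actual dataset contents):

```python
# Illustrative sketch of COAT-style three-level object pruning.
# All mappings and the scoring rule are toy examples, NOT dataset contents.

# Level 1 (Utility): utility -> candidate objects ("utility objects").
utility_objects = {
    "carry-water": ["mug", "bucket", "paper-bag"],
}

# Level 2 (Context): prune candidates that don't fit the (task, utility)
# pair, yielding "context objects"; e.g. a paper bag can't carry water.
def context_prune(task, utility, candidates):
    unsuitable = {("water the garden", "carry-water"): {"paper-bag"}}
    bad = unsuitable.get((task, utility), set())
    return [obj for obj in candidates if obj not in bad]

# Level 3 (Physical state): rank configurations described by variables such
# as already-in-use and condition, and pick the best-scoring ("ideal") one.
def pick_ideal(configurations):
    def score(cfg):
        # Prefer intact objects that are not already in use.
        return (cfg["condition"] == "intact") + (not cfg["already-in-use"])
    return max(configurations, key=score)

candidates = context_prune("water the garden", "carry-water",
                           utility_objects["carry-water"])
ideal = pick_ideal([
    {"object": "mug", "already-in-use": True, "condition": "intact"},
    {"object": "bucket", "already-in-use": False, "condition": "intact"},
])
print(candidates, ideal["object"])  # ['mug', 'bucket'] bucket
```

The actual benchmark evaluates language models on each of these levels separately (Task-u, Task-0, Task-1/2); the sketch only mirrors the data flow between them.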