Commit 28941d1 by JamesZhutheThird (1 parent: 4b299f1)

Update README.md

Files changed (1): README.md (+88, −83)
---
license: cc-by-nc-sa-4.0
language:
- en
- zh
tags:
- GUI
- Agent
size_categories:
- n<1K
configs:
- config_name: default
  data_files:
  - split: MobBench
    path: "MobBench_tasks_release_v1.0.json"
---

<div align="center">
<img src="./title.png" />

<img src="./overview.png" width="500" />

**🎮 MobA manipulates mobile phones just like how you would.**

🌐 [Website](https://github.com/OpenDFM/MobA) | 📃 [Paper](https://arxiv.org/abs/2410.13757/) | 🤗 [MobBench](https://huggingface.co/datasets/OpenDFM/MobA-MobBench) | 🗃️ [Code](https://github.com/OpenDFM/MobA)

[简体中文](./README_zh.md) | English

</div>


## 🔥 News

- **[2024.10.18]** We open-sourced MobA on [GitHub](https://github.com/OpenDFM/MobA), and our paper is available on [arXiv](https://arxiv.org/abs/2410.13757).

## 📖 Introduction

Current mobile assistants are either limited by their dependence on system APIs or struggle with complex user instructions and diverse interfaces due to restricted comprehension and decision-making abilities. To address these challenges, we propose MobA, a novel Mobile phone Agent powered by multimodal large language models that enhances comprehension and planning capabilities through a sophisticated two-level agent architecture. The high-level Global Agent (GA) is responsible for understanding user commands, tracking interaction history, and planning tasks. The low-level Local Agent (LA) predicts detailed actions in the form of function calls, guided by sub-tasks and memory from the GA. Integrating a Reflection Module allows for efficient task completion and enables the system to handle previously unseen complex tasks. MobA demonstrates significant improvements in task execution efficiency and completion rate in real-life evaluations, underscoring the potential of MLLM-empowered mobile assistants.

## 🔧 Deployment

> MobA is still under development, and we are continually updating the code. Please stay tuned!

### System Requirements

Make sure you have installed [Android Debug Bridge (ADB)](https://developer.android.google.cn/tools/adb) and connected your Android device to your computer. You should be able to see your devices with the command `adb devices`.
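The device check above can also be scripted. Below is a minimal sketch that runs `adb devices` and parses its output; the `parse_adb_devices` helper is illustrative and not part of the MobA codebase, and `adb` must be on your PATH for the live check to succeed.

```python
# Minimal sketch: confirm ADB sees your device before launching MobA.
# The parse helper is illustrative and not part of the MobA codebase.
import subprocess

def parse_adb_devices(output: str) -> list[str]:
    """Return serials of devices in the 'device' state from `adb devices` output."""
    lines = output.strip().splitlines()
    # First line is the header "List of devices attached";
    # each following line looks like "<serial>\t<state>".
    return [line.split("\t")[0] for line in lines[1:] if line.endswith("\tdevice")]

try:
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True, check=True).stdout
    print("Connected devices:", parse_adb_devices(out) or "none")
except (FileNotFoundError, subprocess.CalledProcessError):
    print("adb not found or failed -- install Android platform-tools and retry.")
```

A device listed as `unauthorized` has not yet accepted the USB-debugging prompt on screen, so the parser only counts entries in the `device` state.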

### Environment Setup

```shell
conda create -n moba python=3.12
conda activate moba
pip install numpy opencv-python openai google-generativeai pillow colorama
```

You may also install from `requirements.txt`, although this is not recommended because it contains many unused packages.
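To verify the environment before launching MobA, a quick stdlib-only check can report which dependencies are importable. This checker is a convenience sketch, not part of MobA; note that some import names differ from the pip package names (`opencv-python` installs `cv2`, `pillow` installs `PIL`).

```python
# Sketch: report which of MobA's dependencies are importable in this environment.
# Import names differ from pip names for some packages:
# opencv-python -> cv2, pillow -> PIL, google-generativeai -> google.generativeai.
import importlib.util

def is_importable(module: str) -> bool:
    try:
        return importlib.util.find_spec(module) is not None
    except ModuleNotFoundError:  # raised when a parent package is absent
        return False

required = ["numpy", "cv2", "openai", "google.generativeai", "PIL", "colorama"]
missing = [m for m in required if not is_importable(m)]
print("Missing modules:", ", ".join(missing) if missing else "none")
```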

### Run MobA

Before running MobA, edit the configuration file `config.yaml`, which you can find in the `moba` folder.

```bash
vim ./moba/config.yaml
cd ./moba/agent
python executor.py
```

You should now be able to run MobA smoothly on Windows. You can find MobBench, the fifty tasks we evaluated in the paper, on [Hugging Face](https://huggingface.co/datasets/OpenDFM/MobA-MobBench).

## 📑 Citation

If you find our work useful, please cite us!

```bibtex
@misc{zhu2024moba,
      title={MobA: A Two-Level Agent System for Efficient Mobile Task Automation},
      author={Zichen Zhu and Hao Tang and Yansi Li and Kunyao Lan and Yixuan Jiang and Hao Zhou and Yixiao Wang and Situo Zhang and Liangtai Sun and Lu Chen and Kai Yu},
      year={2024},
      eprint={2410.13757},
      archivePrefix={arXiv},
      primaryClass={cs.MA},
      url={https://arxiv.org/abs/2410.13757},
}
```

## 📧 Contact Us

If you have any questions, please feel free to contact me via email `[email protected]`.