---
language:
- en
metrics:
- accuracy
---
# GraphGPT
GraphGPT is a graph-oriented Large Language Model tuned with the Graph Instruction Tuning paradigm.

## Model Details
GraphGPT is a graph-oriented Large Language Model tuned with the Graph Instruction Tuning paradigm, based on the [Vicuna-7B-v1.5 model](https://huggingface.co/lmsys/vicuna-7b-v1.5).
* Developed by: [Data Intelligence Lab](https://sites.google.com/view/chaoh/group-join-us)@HKU
* Model type: An auto-regressive language model based on the transformer architecture.
* Finetuned from model: [Vicuna-7B-v1.5 model](https://huggingface.co/lmsys/vicuna-7b-v1.5).
## Model Sources
* Repository: [https://github.com/HKUDS/GraphGPT](https://github.com/HKUDS/GraphGPT)
* Paper: []()
* Project: [https://graphgpt.github.io/](https://graphgpt.github.io/)
## Uses
This version of GraphGPT is tuned on mixed instruction data, so it can handle both node classification and link prediction across different graph datasets.
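For illustration only, a node-classification query in such an instruction format might pair a natural-language question with a graph-token placeholder. The prompt below is a hypothetical sketch; the actual templates and graph-token handling are defined in the official repository.

```python
# Hypothetical sketch of a node-classification instruction prompt.
# The real template and the <graph> placeholder handling are defined
# in the official GraphGPT repository; this is for illustration only.
prompt = (
    "Given a citation graph where the target paper and its neighbors are "
    "encoded as graph tokens: <graph>\n"
    "Which of the following categories does the target paper belong to: "
    "cs.AI, cs.CL, cs.CV, cs.LG? Answer with the category name."
)
```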
## How to Get Started with the Model
* Command line interface: Please refer to [https://github.com/HKUDS/GraphGPT](https://github.com/HKUDS/GraphGPT) to evaluate our GraphGPT.
* A Gradio demo is under development.
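The full evaluation pipeline (including the graph encoder and projector) lives in the GitHub repository. As a minimal sketch, and assuming only that the released checkpoint contains standard Vicuna-style causal-LM weights loadable with the Hugging Face `transformers` API, the language-model backbone could be loaded as below; graph-conditioned inference still requires the code from the repository. The `model_id` here is a placeholder, not the exact Hub identifier.

```python
# Minimal sketch: load the language-model backbone with Hugging Face transformers.
# Assumes the checkpoint exposes standard causal-LM weights; the graph encoder,
# projector, and graph-token handling require the official GraphGPT repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GraphGPT-7B-mix-all"  # placeholder id; substitute the actual model id

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",
)

inputs = tokenizer("Hello, GraphGPT!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```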