---
datasets:
- McGill-NLP/WebLINX
- McGill-NLP/WebLINX-full
language:
- en
metrics:
- f1
- iou
- chrf
library_name: transformers
pipeline_tag: text-generation
tags:
- weblinx
- text-generation-inference
- web-agents
- agents
license: llama2
---
<div align="center">
<h1 style="margin-bottom: 0.5em;">WebLINX: Real-World Website Navigation with Multi-Turn Dialogue</h1>
<em>Xing Han Lù*, Zdeněk Kasner*, Siva Reddy</em>
</div>
<div style="margin-bottom: 2em"></div>
<div style="display: flex; justify-content: space-around; align-items: center; font-size: 120%;">
<div><a href="https://arxiv.org/abs/2402.05930">📄Paper</a></div>
<div><a href="https://mcgill-nlp.github.io/weblinx">🌐Website</a></div>
<div><a href="https://huggingface.co/spaces/McGill-NLP/weblinx-explorer">💻Explorer</a></div>
<div><a href="https://huggingface.co/datasets/McGill-NLP/WebLINX">🤗Dataset</a></div>
<div><a href="https://github.com/McGill-NLP/weblinx">💾Code</a></div>
</div>
## Quickstart
```python
from datasets import load_dataset
from huggingface_hub import snapshot_download
from transformers import pipeline

# Load the validation split
valid = load_dataset("McGill-NLP/weblinx", split="validation")

# Download and load the prompt templates
snapshot_download(
    "McGill-NLP/WebLINX", repo_type="dataset", allow_patterns="templates/*.txt", local_dir="./"
)
with open("templates/llama.txt") as f:
    template = f.read()

# Format the first validation turn with the template
turn = valid[0]
turn_text = template.format(**turn)

# Load the action model and run it on the formatted text to get a prediction
action_model = pipeline(
    model="McGill-NLP/Llama-2-13b-chat-weblinx", device=0, torch_dtype="auto"
)
out = action_model(turn_text, return_full_text=False, max_new_tokens=64, truncation=True)
pred = out[0]["generated_text"]

print("Ref:", turn["action"])
print("Pred:", pred)
```
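The pipeline returns the predicted action as a plain string (e.g. `click(uid="...")`). Below is a minimal sketch of splitting such a string into an action name and keyword arguments, assuming the `name(key="value", ...)` call-style format shown above; the `parse_action` helper and the example string are illustrative assumptions, not part of the official WebLINX tooling.

```python
import re

def parse_action(pred: str):
    """Split a predicted action like 'click(uid="123")' into (name, args).

    Returns (None, {}) if the string does not look like a call-style action.
    """
    match = re.match(r'\s*(\w+)\((.*)\)\s*$', pred, flags=re.DOTALL)
    if match is None:
        return None, {}
    name, arg_str = match.group(1), match.group(2)
    # Collect key="value" pairs, allowing escaped characters inside the quotes
    args = dict(re.findall(r'(\w+)\s*=\s*"((?:[^"\\]|\\.)*)"', arg_str))
    return name, args

name, args = parse_action('click(uid="baf79046-bd85-4867")')
print(name, args)  # click {'uid': 'baf79046-bd85-4867'}
```

For downstream evaluation you would compare the parsed prediction against the reference in `turn["action"]` rather than matching raw strings, since whitespace and argument order may differ.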
## Original Model
This model was finetuned on WebLINX, starting from a checkpoint previously published on the Hugging Face Hub.\
[Click here to access the original model.](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)
## License
This model is derived from Llama 2, which may only be used under the [Llama 2 Community License Agreement](https://github.com/facebookresearch/llama/blob/main/LICENSE). By using or distributing any portion or element of this model, you agree to be bound by this Agreement.