elliesleightholm committed
Commit • 6c5db7f
1 Parent(s): a57d90b
Update README.md
README.md CHANGED
@@ -44,20 +44,28 @@ size_categories:
 - 1M<n<10M
 ---
 
+<div style="display: flex; align-items: center; gap: 10px;">
+  <a href="https://www.marqo.ai/blog/generalized-contrastive-learning-for-multi-modal-retrieval-and-ranking">
+    <img src="https://img.shields.io/badge/Marqo-Blog-blue?logo=font-awesome&logoColor=white&style=flat&logo=pencil-alt" alt="Blog">
+  </a>
+  <a href="https://arxiv.org/pdf/2404.08535.pdf">
+    <img src="https://img.shields.io/badge/arXiv-Paper-red?logo=arxiv" alt="arXiv Paper">
+  </a>
+  <a href="https://github.com/marqo-ai/GCL">
+    <img src="https://img.shields.io/badge/GitHub-Repo-lightgrey?logo=github" alt="GitHub Repo">
+  </a>
+</div>
 
 # Marqo-GS-10M
-This dataset is our multimodal, fine-grained, ranking dataset, **Marqo-GS-10M**
-
-Blog post: https://www.marqo.ai/blog/generalized-contrastive-learning-for-multi-modal-retrieval-and-ranking
-
-Paper: https://arxiv.org/pdf/2404.08535.pdf
+This dataset is our multimodal, fine-grained ranking Google Shopping dataset, **Marqo-GS-10M**, accompanied by our novel training framework, Generalized Contrastive Learning (GCL). GCL aims to improve and measure the **ranking** performance of information retrieval models,
+especially for retrieving relevant **products** given a search query.
 
-
+```python
+from datasets import load_dataset
 
-
-
+ds = load_dataset("Marqo/marqo-GS-10M")
+```
 
-**Release WIP**: GCL Training Framework.
 ## Table of Contents
 1. Motivation
 2. Dataset and Benchmarks

@@ -124,6 +132,16 @@ a set of validation ground truth and a set of test ground truth.
 ### Dataset Downloads
 The Marqo-GS-10M dataset is available for direct download. This dataset is pivotal for training and benchmarking in Generalized Contrastive Learning (GCL) frameworks and other multi-modal fine-grained ranking tasks.
 
+You can load the dataset with Hugging Face's `datasets` library:
+
+```python
+from datasets import load_dataset
+
+ds = load_dataset("Marqo/marqo-GS-10M")
+```
+
+Alternatively, download the data directly:
+
 - **Full Dataset**: [Download](https://marqo-gcl-public.s3.amazonaws.com/v1/marqo-gs-dataset.tar) - Link contains the entire Marqo-GS-10M dataset except for the images.
 - **Full Images**: [Download](https://marqo-gcl-public.s3.amazonaws.com/v1/images_archive.tar) - Link contains the images of the entire Marqo-GS-10M dataset.
 - **Sample Images**: [Download](https://marqo-gcl-public.s3.amazonaws.com/v1/images_wfash.tar) - Link contains the images for the women's fashion category; it corresponds to the women's fashion sub-dataset.
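
For anyone trying the `load_dataset` snippet added in the diff above, the following is a minimal sketch of how the returned object might be inspected before building a pipeline. The repo id `Marqo/marqo-GS-10M` is taken from the README; the split and column names are whatever the dataset card defines, so none are assumed here.

```python
from datasets import load_dataset

# Load the dataset from the Hugging Face Hub (repo id taken from the README above).
ds = load_dataset("Marqo/marqo-GS-10M")

# List each split, its size, and its column names before wiring the data into
# a training or evaluation pipeline.
for split_name, split in ds.items():
    print(split_name, len(split), split.column_names)
```

For a dataset of this size, `load_dataset` also accepts `streaming=True`, which avoids downloading everything before the first example is read.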
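
The download links in the README point to plain tar archives. Below is a minimal sketch, using only the Python standard library and the sample-images URL from the list above, of how one such archive might be fetched and unpacked; the local filename and extraction directory are arbitrary placeholders.

```python
import tarfile
import urllib.request

# Sample-images archive URL taken from the download list above.
ARCHIVE_URL = "https://marqo-gcl-public.s3.amazonaws.com/v1/images_wfash.tar"
LOCAL_TAR = "images_wfash.tar"        # arbitrary local filename
OUTPUT_DIR = "marqo_gs_images_wfash"  # arbitrary extraction directory

# Download the archive, then unpack the images for local use.
urllib.request.urlretrieve(ARCHIVE_URL, LOCAL_TAR)
with tarfile.open(LOCAL_TAR) as tar:
    tar.extractall(path=OUTPUT_DIR)
```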
|