arXiv:2410.16153

Pangea: A Fully Open Multilingual Multimodal LLM for 39 Languages

Published on Oct 21
Submitted by yuexiang96 on Oct 22

Abstract

Despite recent advances in multimodal large language models (MLLMs), their development has predominantly focused on English- and Western-centric datasets and tasks, leaving most of the world's languages and diverse cultural contexts underrepresented. This paper introduces Pangea, a multilingual multimodal LLM trained on PangeaIns, a diverse 6M-instruction dataset spanning 39 languages. PangeaIns features: 1) high-quality English instructions, 2) carefully machine-translated instructions, and 3) culturally relevant multimodal tasks to ensure cross-cultural coverage. To rigorously assess models' capabilities, we introduce PangeaBench, a holistic evaluation suite encompassing 14 datasets covering 47 languages. Results show that Pangea significantly outperforms existing open-source models in multilingual settings and diverse cultural contexts. Ablation studies further reveal the impact of the English data proportion, language popularity, and the number of multimodal training samples on overall performance. We fully open-source our data, code, and trained checkpoints to facilitate the development of inclusive and robust multilingual MLLMs, promoting equity and accessibility across a broader linguistic and cultural spectrum.

Community

Paper author · Paper submitter

Pangea: A Fully Open Multilingual Multimodal LLM for 39 Languages
Homepage, Demo

  • Pangea-7B: a state-of-the-art open multilingual multimodal LLM supporting 39 languages.
  • PangeaIns: a 6M multilingual multimodal instruction tuning dataset spanning 39 languages.
  • PangeaBench: a holistic evaluation benchmark spanning 14 datasets in 47 languages.

  • Models citing this paper: 2
  • Datasets citing this paper: 1
  • Spaces citing this paper: 3
  • Collections including this paper: 9