---
license: other
license_name: chameleon-research-license
license_link: https://ai.meta.com/resources/models-and-libraries/chameleon-license/
extra_gated_prompt: '### META CHAMELEON RESEARCH LICENSE AGREEMENT'
extra_gated_fields:
  First Name: text
  Last Name: text
  Date of birth: date_picker
  Country: country
  Affiliation: text
  I accept the terms and conditions: checkbox
  geo: ip_location
extra_gated_description: Meta Chameleon Research License and Acceptable Use Policy
extra_gated_button_content: I Accept Meta Chameleon Research License and AUP
pipeline_tag: image-text-to-text
---

# Meta Chameleon 7B

This is the repository for Meta Chameleon, a mixed-modal, early-fusion foundation model from FAIR. See the [Chameleon paper](https://arxiv.org/abs/2405.09818) for more information.

The Chameleon collection on Hugging Face contains 7-billion- and 30-billion-parameter model checkpoints.

[more details and usage examples coming soon]

## Citation

To cite the paper, model, or software, please use the following BibTeX entry:

```bibtex
@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author  = {Chameleon Team},
  doi     = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title   = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url     = {https://github.com/facebookresearch/chameleon},
  year    = {2024}
}
```

## License

Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.