librarian-bot committed on
Commit
3d8faf4
1 Parent(s): 73b6b3a

Librarian Bot: Add moe tag to model


This pull request aims to enrich your model's metadata by adding a `moe` (Mixture of Experts) tag to the YAML block of your model's `README.md`.

How did we find this information? We inferred that this is a `moe` model because it meets at least one of the following criteria (sketched in code after this list):

- The model's name contains the string `moe`.
- The model indicates that it uses a `moe` architecture.
- The model's base model is a `moe` model.
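
A minimal sketch of what such a heuristic might look like, using the real `huggingface_hub` client; `looks_like_moe` is a hypothetical helper, not Librarian Bot's actual implementation:

```python
from huggingface_hub import HfApi

api = HfApi()

def looks_like_moe(model_id: str) -> bool:
    """Hypothetical heuristic: treat a model as MoE if any criterion matches."""
    info = api.model_info(model_id)
    # Criterion 1: the repo name contains the string "moe".
    if "moe" in model_id.lower():
        return True
    # Criterion 2: the model config declares an MoE-style architecture.
    architectures = (getattr(info, "config", None) or {}).get("architectures", [])
    if any("moe" in arch.lower() or "mixtral" in arch.lower() for arch in architectures):
        return True
    # Criterion 3: the declared base model is itself tagged "moe".
    base_model = getattr(info.card_data, "base_model", None) if info.card_data else None
    if isinstance(base_model, str) and "moe" in (api.model_info(base_model).tags or []):
        return True
    return False
```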


**Why add this?** Enhancing your model's metadata in this way:
- **Boosts discoverability** - it becomes easier to find mixture of experts models on the Hub (see the sketch after this list).
- **Clarifies the ecosystem** - it becomes easier to understand which mixture of experts models exist on the Hub and how they are used.
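
As a concrete example of that discoverability, tagged models can be retrieved with the `filter` parameter of `huggingface_hub`'s `list_models` (a real API; the particular query below is just illustrative):

```python
from huggingface_hub import HfApi

api = HfApi()
# List the five most-downloaded models on the Hub carrying the "moe" tag.
for model in api.list_models(filter="moe", sort="downloads", limit=5):
    print(model.id)
```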


This PR comes courtesy of [Librarian Bot](https://huggingface.co/librarian-bot). If you have any feedback, queries, or need assistance, please don't hesitate to reach out to [@davanstrien](https://huggingface.co/davanstrien).

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
```diff
@@ -1,12 +1,13 @@
-
 ---
+library_name: transformers
+tags:
+- moe
 pipeline_tag: text-generation
 inference: true
 widget:
-- text: 'Hello!'
+- text: Hello!
   example_title: Hello world
   group: Python
-library_name: transformers
 ---
 
 This model is randomly initialized, using the config from [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) but with smaller size.
```