---
license: apache-2.0
language:
- en
pipeline_tag: text-classification
---

# Monarch Mixer-BERT

The 80M checkpoint for M2-BERT-base, from the paper [Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture](https://arxiv.org/abs/2310.12109).

This model has been pretrained with a sequence length of 32K.

Note (11/3 evening): this is a partial checkpoint; it had not finished training at the time of upload.

This model was trained by Dan Fu, Jon Saad-Falcon, and Simran Arora.

Check out our [GitHub](https://github.com/HazyResearch/m2/tree/main) for instructions on how to download and fine-tune it!
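
If the checkpoint is hosted on the Hugging Face Hub, loading it typically looks like the sketch below. This is a minimal illustration under assumptions, not the official recipe: the repo ID is a placeholder, and we assume the custom Monarch Mixer layers ship with the checkpoint via `trust_remote_code`. Follow the GitHub instructions above for the supported workflow.

```python
# Minimal loading sketch. Assumptions: the checkpoint lives on the Hugging
# Face Hub with M2's custom modeling code bundled alongside it, and the repo
# ID below is a placeholder -- substitute this checkpoint's actual ID.
from transformers import AutoModel, AutoTokenizer

model_id = "your-org/m2-bert-80M-32k"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)

# trust_remote_code is needed because Monarch Mixer layers are not part of
# the core transformers library. AutoModel returns raw hidden states; swap
# in a task head (e.g., for classification) when fine-tuning.
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

# The checkpoint was pretrained at a 32K (32768-token) sequence length,
# so long inputs can be tokenized without aggressive truncation.
inputs = tokenizer(
    "Monarch Mixer is a sub-quadratic, GEMM-based architecture.",
    return_tensors="pt",
    truncation=True,
    max_length=32768,
)
outputs = model(**inputs)
```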