---
license: apache-2.0
language:
- en
pipeline_tag: text-classification
---

# Monarch Mixer-BERT

This is the 80M-parameter checkpoint for M2-BERT-base from the paper [Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture](https://arxiv.org/abs/2310.12109), pretrained with a sequence length of 32K.

Note (11/3 evening): this is a partial checkpoint; it had not finished training at the time of upload.

This model was trained by Dan Fu, Jon Saad-Falcon, and Simran Arora.

Check out our [GitHub](https://github.com/HazyResearch/m2/tree/main) for instructions on how to download and fine-tune it!
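
If you want to load the checkpoint directly with Hugging Face Transformers, a minimal sketch might look like the following. The model ID is a placeholder (substitute this repo's actual name), and `trust_remote_code=True` is assumed to be required since M2-BERT ships custom modeling code:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical model ID -- replace with this repository's actual name.
model = AutoModelForSequenceClassification.from_pretrained(
    "hazyresearch/M2-BERT-32K",
    trust_remote_code=True,  # M2-BERT uses custom modeling code
)

# M2-BERT-base uses a standard BERT tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

inputs = tokenizer("Monarch Mixer is sub-quadratic.", return_tensors="pt")
outputs = model(**inputs)
```

See the GitHub repo linked above for the authors' supported download and fine-tuning workflow.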