---
license: apache-2.0
datasets:
- cerebras/SlimPajama-627B
language:
- en
---
Model of the paper [MoM: Linear Sequence Modeling with Mixture-of-Memories](https://arxiv.org/abs/2502.13685).

The model was trained on a 15B-token sample of SlimPajama. We use Gated DeltaNet as the memory update mechanism.
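For intuition, a single Gated DeltaNet memory update follows the gated delta rule, S_t = S_{t-1} α_t (I − β_t k_t k_tᵀ) + β_t v_t k_tᵀ: the matrix-valued memory is decayed by a gate α_t, then corrected toward the new key–value pair with a delta-rule write of strength β_t. A minimal NumPy sketch (function name, dimensions, and values are illustrative, not taken from this repository's code):

```python
import numpy as np

def gated_delta_update(S, k, v, alpha, beta):
    """One gated delta rule step (illustrative sketch):
    S <- alpha * S + beta * (v - alpha * S @ k) k^T."""
    S = alpha * S                           # gated decay of the whole memory
    v_old = S @ k                           # value currently read out for key k
    S = S + beta * np.outer(v - v_old, k)   # delta-rule correction toward v
    return S

d_k, d_v = 4, 4
S = np.zeros((d_v, d_k))                    # empty matrix-valued memory
k = np.zeros(d_k); k[0] = 1.0               # unit-norm key
v = np.ones(d_v)
S = gated_delta_update(S, k, v, alpha=1.0, beta=1.0)
# With alpha = beta = 1 and an empty memory, reading key k returns v exactly.
print(np.allclose(S @ k, v))  # → True
```

In MoM, several such memories are maintained in parallel and a router mixes their outputs; the sketch above shows only the per-memory update.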