# Model Card
- Source: [https://arxiv.org/abs/2509.02046](https://arxiv.org/abs/2509.02046)
- Optimizer: `soape`
- Model size: `130m`
- Data size: `42B`
## Best configuration
| Hyperparameter | Value |
|---|---|
| beta1 | `0.95` |
| beta2 | `0.99` |
| block_size | `512` |
| epsilon | `1e-10` |
| learning_rate | `0.008` |
| max_grad_norm | `1` |
| min_lr_ratio | `0` |
| partition_grads_into_blocks | `True` |
| precondition_frequency | `10` |
| shampoo_beta | `0.98` |
| train_batch_size | `256` |
| warmup | `1000` |
| weight_decay | `0.1` |
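As a sketch, the hyperparameters above can be collected into a plain Python dict. The key names simply mirror the table and are not tied to any specific library's API; the learning-rate helper below assumes a common linear-warmup-plus-cosine-decay schedule consistent with the `warmup` and `min_lr_ratio` fields (the exact schedule used in the paper is not stated here), so treat it as illustrative.

```python
import math

# Best configuration from the table above, as a plain dict.
best_config = {
    "beta1": 0.95,
    "beta2": 0.99,
    "block_size": 512,
    "epsilon": 1e-10,
    "learning_rate": 0.008,
    "max_grad_norm": 1.0,
    "min_lr_ratio": 0.0,
    "partition_grads_into_blocks": True,
    "precondition_frequency": 10,
    "shampoo_beta": 0.98,
    "train_batch_size": 256,
    "warmup": 1000,
    "weight_decay": 0.1,
}

def lr_at(step, total_steps, cfg=best_config):
    """Hypothetical schedule: linear warmup to `learning_rate`,
    then cosine decay down to `min_lr_ratio * learning_rate`."""
    peak = cfg["learning_rate"]
    floor = cfg["min_lr_ratio"] * peak
    if step < cfg["warmup"]:
        return peak * step / cfg["warmup"]
    progress = (step - cfg["warmup"]) / max(1, total_steps - cfg["warmup"])
    return floor + 0.5 * (peak - floor) * (1.0 + math.cos(math.pi * progress))
```

For example, `lr_at(500, 10_000)` is halfway through warmup (`0.004`), `lr_at(1000, 10_000)` hits the peak `0.008`, and the rate decays toward `0.0` at the final step since `min_lr_ratio` is `0`.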