---
license: mit
---
Arabic ModernBERT model, partially trained (13% of one epoch) on a pretokenized, filtered [subset](https://huggingface.co/datasets/akhooli/afw2_f98_tok) of
FineWeb2 (texts of 250-25,000 characters with 98% or more Arabic words).
The filtered dataset itself (text column only) is available [here](https://huggingface.co/datasets/akhooli/afw2_f98) and contains a little over 30M records.
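
As a quick sanity check, the filtered text dataset can be streamed with the `datasets` library. This is a minimal sketch; streaming avoids downloading all 30M+ records at once, and the `train` split name is an assumption.

```python
from datasets import load_dataset

# Stream the filtered FineWeb2 subset (text column only) without
# downloading the full 30M+ records locally. Split name is assumed.
ds = load_dataset("akhooli/afw2_f98", split="train", streaming=True)

# Peek at a few examples to verify the filter
# (250-25,000 characters, 98%+ Arabic words).
for i, example in enumerate(ds):
    print(len(example["text"]), example["text"][:80])
    if i == 2:
        break
```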
The model folder contains a checkpoint trained for 60,000 iterations with a batch size of 64 on a single GPU.
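
A minimal usage sketch, assuming the checkpoint is saved in Transformers format and loadable with `AutoModelForMaskedLM`; the model id below is a placeholder, so point it at this repo or the local checkpoint folder. Since the checkpoint covers only about 13% of one epoch, fill-mask predictions are expected to be rough.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Placeholder id: replace with this model's hub id or local checkpoint path.
model_id = "path/to/arabic-modernbert-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Masked-token prediction as a quick smoke test of the partially trained
# checkpoint. Arabic example: "The capital of France is [MASK]."
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
text = f"عاصمة فرنسا هي {tokenizer.mask_token}."
print(fill(text))
```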