Tokenized fineweb2 pol-Latn split, produced with the APT4 tokenizer: around 109B tokens. The data is split into parts; just cat them together into a single file before training.
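A minimal sketch of the concatenation step in Python (equivalent to `cat`), assuming the parts are raw binary token shards with names like `fineweb2-pol-Latn.part*`; the actual filenames and output path are placeholders, adjust them to match the files in this repo:

```python
# Join the tokenized shards into one file before training.
# The glob pattern and output name below are assumptions, not the repo's real filenames.
import glob
import shutil

parts = sorted(glob.glob("fineweb2-pol-Latn.part*"))  # assumed shard naming
with open("fineweb2-pol-Latn.bin", "wb") as out:
    for part in parts:
        with open(part, "rb") as f:
            shutil.copyfileobj(f, out)  # stream copy; avoids loading whole shards into RAM
```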