---
language:
- en
---

# SmolLM3 Training Configs

**[IMPORTANT NOTE]**: for the latest configs go to this repo: https://github.com/huggingface/smollm/tree/main/text/pretraining/smollm3

Here you can find the training configs for [SmolLM3-3B-Base](https://huggingface.co/HuggingFaceTB/SmolLM3-3B-Base) using [nanotron](https://github.com/huggingface/nanotron/), with exact training details and data mixtures.

The model was trained on 11.2T tokens in 3 stages at a 4k context length:

- stage 1 [config](https://huggingface.co/datasets/HuggingFaceTB/smollm3-configs/blob/main/stage1_8T.yaml)
- stage 2 [config](https://huggingface.co/datasets/HuggingFaceTB/smollm3-configs/blob/main/stage2_8T_9T.yaml)
- stage 3 [config](https://huggingface.co/datasets/HuggingFaceTB/smollm3-configs/blob/main/stage3_9T_11T.yaml)
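The config filenames suggest the approximate token boundary of each stage; a quick sketch of the per-stage budgets those names imply (the 11.2T total quoted above indicates the boundaries are rounded):

```python
# Approximate per-stage token budgets inferred from the config filenames
# (stage1_8T, stage2_8T_9T, stage3_9T_11T); the 11.2T total quoted above
# suggests these boundaries are rounded.
stage_boundaries = {
    "stage1": (0.0, 8.0),   # 0 -> ~8T tokens
    "stage2": (8.0, 9.0),   # ~8T -> ~9T tokens
    "stage3": (9.0, 11.0),  # ~9T -> ~11T tokens
}

# Tokens seen during each stage, in trillions.
budgets = {name: end - start for name, (start, end) in stage_boundaries.items()}
for name, tokens in budgets.items():
    print(f"{name}: ~{tokens:.0f}T tokens")
```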
We then trained for an additional 2 stages to extend the context length to 64k:

- stage 4 [config](https://huggingface.co/datasets/HuggingFaceTB/smollm3-configs/blob/main/long_context_4k_to_32k.yaml)
- stage 5 [config](https://huggingface.co/datasets/HuggingFaceTB/smollm3-configs/blob/main/long_context_32k_to_64.yaml)
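To work with all five config files locally, one option is to pull them from the Hub programmatically. A minimal sketch, assuming `huggingface_hub` is installed (`pip install huggingface_hub`); the `download_configs` helper and the `local_dir` name are illustrative, not part of this repo:

```python
CONFIG_REPO = "HuggingFaceTB/smollm3-configs"
CONFIG_FILES = [
    "stage1_8T.yaml",
    "stage2_8T_9T.yaml",
    "stage3_9T_11T.yaml",
    "long_context_4k_to_32k.yaml",
    "long_context_32k_to_64.yaml",
]

def download_configs(local_dir="smollm3_configs"):
    """Download every training config listed on this card."""
    # Imported here so the file list above is usable without huggingface_hub.
    from huggingface_hub import hf_hub_download

    return [
        hf_hub_download(
            repo_id=CONFIG_REPO,
            repo_type="dataset",  # the configs live in a dataset repo
            filename=name,
            local_dir=local_dir,
        )
        for name in CONFIG_FILES
    ]
```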