- GroveMoE (collection): an open-source family of large language models developed by the AGI Center, Ant Research Institute. 4 items.
- Pass@k Training for Adaptively Balancing Exploration and Exploitation of Large Reasoning Models (paper, arXiv:2508.10751, published Aug 14).
- NVIDIA Nemotron V2 (collection): open, production-ready enterprise models released under the NVIDIA Open Model License. 9 items.
- Tfree-HAT-7b-pretrained (collection): tokenizer-free models based on the Hierarchical Autoregressive Transformer (https://arxiv.org/abs/2501.10322), trained from scratch. 2 items.