The AI community is celebrating the rapid expansion of the model library on Ollama, which now makes high-quality models available with a single click, including for code completion via the VSCode extension from CodeGPT AI. The InternLM and OpenMMLab teams' new 'internlm2' model is also drawing praise: its 7B version is recognized as the best model under 13B parameters, while its 20B version is on par with Yi-34B. Meanwhile, NeuralBeagle14, a 7B-parameter model released by Maxime Labonne and built on open-source collaboration among Teknium, Intel, and Argilla.io, is making waves for its performance: it is reported to rank at the top of every benchmark, is the best-performing 7B-parameter model on the Open LLM Leaderboard, and holds 10th place there overall. This underscores how smaller open-source models continue to challenge the notion that larger models are necessary for top performance in AI.
I was eager to test the new NeuralBeagle14 from @maximelabonne, so I created an @ollama-ready version here. Give it a try; it's fast and powerful! Have fun! https://t.co/LTIcaKzUMy
NeuralBeagle14-7B is the best-performing 7B parameter model on the Open LLM Leaderboard. It also ranks as the 10th best-performing model overall. In just 7B parameters! Tiny open-source models continue to defy the experts who say we need big models. https://t.co/Hmyf5PCBIC
Excellent thread on the open-source collaboration among @Teknium1, @intel, and @argilla_io that led to NeuralBeagle14. This model ranks at the top of every benchmark I've seen, with just 7B parameters! https://t.co/2jbnV9q4ZA
🔥 Open source, open datasets & open collaboration go a long way 🍿The story behind NeuralBeagle14, a top performing 7B model released by @maximelabonne https://t.co/XVDHJSMeuX
There is a new top pretrained model on the leaderboard 👀 Congrats to @intern_lm and the @OpenMMLab team on their new internlm2, with the 7b version being the best model under 13b and the 20b on par with Yi-34B! https://t.co/mQIuzmjD4b
This is great news. 👌🏼 The model library on @ollama is growing very fast, with quality models available in one click. Using this model with the VSCode extension @codegptAI for code completion should make our lives easier 😊 https://t.co/O4Tqyk8J9Y