Mistral AI has released Codestral, a new open-weights code model covering 80 programming languages with impressive speed and performance. Codestral handles both instruction following and fill-in-the-middle tasks efficiently, and reportedly outperforms the DeepSeek Coder 33B model. Users praise its speed and its inclusion of Swift, comparing it favorably with Mistral's previous models. Codestral is available on various platforms and is lauded for its fast autocomplete capabilities.
⚡️ Calling all coders! ⚡️ Exciting update: #Codestral by @MistralAI is now part of @CodeGPTAI. Experience: 🟠 32K Context Window 🟠 Superior Performance 🟠 Fill-in-the-Middle Level up your coding in #VSCode with Codestral and @CodeGPTAI https://t.co/XYMem1Wkdb
You can use the latest code model Codestral from @MistralAI in BoltAI. Go to Settings > Models > Mistral AI and click Refresh (screenshot in comment). First impression: it's quite fast! https://t.co/O7cYoBfgEz
Codestral @MistralAILabs first impression: 1. 80 languages is crazy. Finally someone included Swift, which a lot of open-source models skip 2. Really fucking fast. wtf. It’s a 22B model and it’s significantly faster than Mistral 7B. Are they using Groq to serve it?? Comparison: https://t.co/dAFaporiRd
Meanwhile Mistral AI drops a model for coding “in 80 programming languages” with a 32K context window. https://t.co/BgtHLdU98B https://t.co/O3ezkCDzrz https://t.co/Dn00jl1nzY
Codestral is the fastest good autocomplete model we've seen! Very impressive. https://t.co/d8EYpLG6RO
Codestral is quite exciting. It's the first time I've seen a code LLM handle both instructions and fill-in-the-middle. It looks like it outperforms the excellent DeepSeek Coder 33B (current SOTA open-source code LLM), which is 50% bigger. Thanks @MistralAI! 📝 Blog:… https://t.co/ov4EO0G3MB
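The fill-in-the-middle capability mentioned above can be exercised through Mistral's FIM completion endpoint: you send the code before the cursor as the prompt and the code after it as the suffix, and the model generates the middle. A minimal sketch, assuming the `https://api.mistral.ai/v1/fim/completions` endpoint, a `MISTRAL_API_KEY` environment variable, and a chat-completion-style response shape — verify all of these against Mistral's current API reference:

```python
import json
import os
import urllib.request

# Assumed endpoint for fill-in-the-middle completions.
FIM_URL = "https://api.mistral.ai/v1/fim/completions"


def build_fim_payload(prefix: str, suffix: str,
                      model: str = "codestral-latest") -> dict:
    """Build a fill-in-the-middle request body: the model is asked to
    generate the code that belongs between `prefix` and `suffix`."""
    return {
        "model": model,
        "prompt": prefix,      # code before the cursor
        "suffix": suffix,      # code after the cursor
        "max_tokens": 64,
        "temperature": 0.0,    # deterministic output for autocomplete
    }


def fim_complete(prefix: str, suffix: str) -> str:
    """POST the payload and return the generated middle section.
    Requires MISTRAL_API_KEY to be set; response shape is assumed."""
    req = urllib.request.Request(
        FIM_URL,
        data=json.dumps(build_fim_payload(prefix, suffix)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Ask the model to fill in a function body between the signature
    # and the final return statement.
    middle = fim_complete(
        prefix="def fibonacci(n: int) -> int:\n",
        suffix="\n    return result\n",
    )
    print(middle)
```

The same prefix/suffix pair is what editor integrations assemble from the text around the cursor, which is why a model trained for both instructions and FIM can back chat and autocomplete with one deployment.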
try out our latest open weights model Codestral — super fast and available now on le Chat and @jetbrains @LangChainAI @llama_index @continuedev @tabnine @sourcegraph ❤️ https://t.co/UQfCldTnMj