Fireworks AI has unveiled Firefunction-v2, an open-weights function-calling model that delivers performance on par with GPT-4o while running at 2.5 times the speed and roughly 10% of the cost. The model is designed to enhance AI agents by letting them use external tools more effectively, and it also supports secure code interpreting. Alongside this, a new short course on function calling and data extraction with LLMs has been launched in collaboration with Nexusflow, DeepLearning.AI, and Andrew Ng. Taught by Jiantao Jiao and Venkat K. Srinivasan, the course covers applying function calling to expand LLM and agent application capabilities. The model is available on Hugging Face, and a model playground lets users explore its functionality.
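The function-calling workflow behind models like Firefunction-v2 can be sketched roughly as follows: the application advertises tools as JSON schemas, the model emits a structured call, and the application executes it and returns the result. A minimal, provider-agnostic sketch (the `get_weather` tool and its schema are hypothetical illustrations, not from any Fireworks API documentation; the spec shape follows the common OpenAI-style tool format, whose exact fields vary by provider):

```python
import json

# Hypothetical tool implementation: stands in for a real external API call.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

# Registry mapping tool names the model may emit to Python callables.
TOOLS = {"get_weather": get_weather}

# JSON schema advertised to the model alongside the user prompt.
TOOL_SPECS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def dispatch(model_output: str) -> str:
    """Parse a model-emitted tool call and execute the matching function."""
    call = json.loads(model_output)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Simulated model response; a real model would emit this structured call
# after seeing TOOL_SPECS in the request.
print(dispatch('{"name": "get_weather", "arguments": {"city": "Paris"}}'))  # Sunny in Paris
```

The result string would then be fed back to the model so it can compose a final natural-language answer.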
Have you checked out Function-Calling and Data Extraction with LLMs, our new short course made in collaboration with @NexusflowX yet? Join in and: 👉 Learn to extend LLMs with function-calling, enabling them to form calls to external functions. 👉 Extract structured data from… https://t.co/RQgk2g0MKQ
Check out our portco @NexusflowX's collab with @AndrewYNg's @DeepLearningAI on the course "Function-calling and Data Extraction with LLMs." Learn from Nexusflow's founding engineers and explore the NexusRavenV2-13B model. https://t.co/1PVC7fh2qE #AI #DeepLearning #Nexusflow https://t.co/ACqprbFvls
💥 Catch up on the MAX 24.4 ⚡️ release! Watch the latest Modular community livestream, where we covered MAX on macOS, gave a live demo of fast quantized Llama 3 🦙, dug into Mojo 🔥 updates, shared the latest from our Mojo 🔥 community meetings, and more! https://t.co/5PfgeouqNp
Good morning. At some point this summer, perhaps quite soon, @AIatMeta will be releasing a LLaMA-3 model with 400B parameters. It will likely be the strongest open-source LLM ever released by a wide margin. This is a thread about how to run it locally. 🧵
🌟 Function Calling is now live on Workers AI! 🌟 Upgrade your AI applications by leveraging LLMs to make external API calls - bringing you one step closer to creating autonomous AI agents. https://t.co/mAogsmNOXz
📢 New function calling and GenAI agent course launched in collaboration with @AndrewYNg and @DeepLearningAI. Come and try out the tutorial built by @VenkatKSrini and @NexusflowX team! Hope you enjoy it. https://t.co/CiA8LiVEHw
Super excited to help initiate the short course on function calling, in collaboration with the amazing @AndrewYNg! https://t.co/RxTBAE0ei1
🚀 Exciting News! Check out our brand new short course on function calling, the foundation for AI agents, created by our CEO @JiantaoJ and founding engineer @VenkatKSrini, in collaboration with @DeepLearningAI and the legendary @AndrewYNg 🌟 #AI #GenAI #NexusRaven… https://t.co/N0r5WdxKFt
Fireworks AI Releases Firefunction-v2: An Open Weights Function Calling Model with Function Calling Capability on Par with GPT4o at 2.5x the Speed and 10% of the Cost Full read: https://t.co/XWL7bnO4WV Hugging Face model page: https://t.co/ZgZ6uCAmi6 Model playground:… https://t.co/IYBKwmeCgq
Super excited to launch this course with @AndrewYNg! Function calling lies at the heart of AI agents, but it is tricky to produce highly accurate and reliable results. Come and learn from @VenkatKSrini! https://t.co/xJXks13jI0
Introducing Function-Calling and Data Extraction with LLMs, a short course made in collaboration with @NexusflowX, taught by @JiantaoJ and @VenkatKSrini. Learn to apply function-calling to expand your LLM and agent application capabilities. Join now: https://t.co/31v3MjgcIK
Function calling is a powerful way to extend the capabilities of LLMs and AI agents by letting them use external tools. Our new short course, Function-Calling and Data Extraction with LLMs, created with @NexusflowX and taught by @JiantaoJ and @VenkatKSrini, demonstrates how to… https://t.co/VkXYqDPZcu
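Beyond calling real tools, the same mechanism is commonly used for structured data extraction: the model is asked to "call" a function whose parameters are the fields you want pulled from free text, and the application validates the arguments. A small sketch of that pattern (the `record_contact` schema and the simulated model output are hypothetical, not taken from the course materials):

```python
import json

# Hypothetical extraction schema: the model is told to call record_contact
# with fields it pulls out of unstructured text.
EXTRACTION_SPEC = {
    "name": "record_contact",
    "description": "Record a contact mentioned in the text",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "email": {"type": "string"},
        },
        "required": ["name"],
    },
}

def validate_extraction(model_output: str) -> dict:
    """Check a (simulated) model tool call against the schema's required fields."""
    call = json.loads(model_output)
    args = call["arguments"]
    required = EXTRACTION_SPEC["parameters"]["required"]
    missing = [field for field in required if field not in args]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    return args

# Simulated model output for the text: "Reach Ada at ada@example.com"
record = validate_extraction(
    '{"name": "record_contact", "arguments": {"name": "Ada", "email": "ada@example.com"}}'
)
print(record["name"])  # Ada
```

Because the model's "arguments" are just JSON, the same validation step works regardless of which provider or model produced the call.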
Fully-local function calling w/ llamacpp 🤖 Function calling allows an LLM to connect with external tools, which is central to building agents. But, tool calling with local LLMs has been a challenge. Improved open weights LLMs, such as Llama 3, and fine-tuning efforts are now… https://t.co/0zM8ULYwj0
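The challenge the thread alludes to is that a local model served via llama.cpp typically returns its tool call as free text, so the application must locate and decode the JSON itself. A best-effort parser might look like the following sketch (it assumes the model emits an OpenAI-style object with "name" and "arguments" keys; the exact output format depends on the model's chat template):

```python
import json

def extract_tool_call(completion: str):
    """Best-effort parse of a tool call from raw local-model text.

    Local models without constrained decoding may wrap the JSON in prose
    or markdown fences, so scan for the first parseable object that looks
    like a tool call.
    """
    decoder = json.JSONDecoder()
    for i, ch in enumerate(completion):
        if ch != "{":
            continue
        try:
            # raw_decode parses one JSON value starting at index i,
            # ignoring whatever trailing text follows it.
            obj, _end = decoder.raw_decode(completion, i)
        except json.JSONDecodeError:
            continue
        if isinstance(obj, dict) and "name" in obj and "arguments" in obj:
            return obj
    return None

# Typical messy local-model output: prose plus a fenced JSON block.
raw = 'Sure, calling a tool:\n```json\n{"name": "get_weather", "arguments": {"city": "Paris"}}\n```'
print(extract_tool_call(raw)["name"])  # get_weather
```

Grammar-constrained decoding (which llama.cpp supports) can force valid JSON at generation time, making this kind of salvage parsing unnecessary.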
The new Firefunction V2 by @FireworksAI_HQ with tool calling is mind-bogglingly fast https://t.co/RYLVlO2skR
Firefunction-v2 + Code Interpreter I built a simple AI agent with a new LLM that is very fast (actually 2x faster than GPT-4o 💨). In the example, it has code-interpreting capabilities. Featuring: 🎆 @FireworksAI_HQ's new function calling model ✴️ @e2b_dev for the secure code… https://t.co/IMlYyO4a03
FireFunction-V2 is an open-weights model from @FireworksAI_HQ - It's built on Llama 3-70B, offering strong instruction-following capabilities - FireFunction-V2 provides tool-calling accuracy on par with GPT-4o - It offers faster inference speeds, enhancing real-world use cases https://t.co/sws6QihFHF
🔥 Firefunction-v2 🔥: Llama 3 fine-tuned for tool calling / agents Firefunction-v2 is a new open weights model from @FireworksAI_HQ fine-tuned for tool calling. Built on Llama 3-70b, Firefunction-v2 marries strong instruction following capabilities with tool calling that is on… https://t.co/1mt1P2T6tP