Hugging Face has announced a significant initiative, committing $10 million in free shared GPUs to support developers in creating new AI technologies. The program, called ZeroGPU, aims to democratize access to computational resources, helping small developers, academics, and others in the open-source community. It is part of Hugging Face's broader effort to support the AI ecosystem, which also includes hundreds of GPU grants via its Spaces platform. CEO Clement Delangue discussed the initiative with The Verge; one early beneficiary estimates the program grants access to roughly $28.5K/month worth of compute.
Our portfolio company @huggingface is committing $10 million in free shared GPUs to help developers create new AI technologies. CEO @ClementDelangue talks with @theverge about how his team is investing in the startup community. https://t.co/SYhyxVr30I
➡️ Compute sharing initiative! Hugging Face pledges $10 million worth of compute resources to aid in competing against major AI companies. https://t.co/gO4e7To0ey
Hugging Face Is Sharing $10 Million Worth of Compute To Help Beat the Big AI Companies https://t.co/SNlglsIC1e
The Guardian @guardian: Hugging Face is sharing $10 million worth of compute to help beat the big AI companies. #industry40 #ArtificialIntelligence #AI https://t.co/5GPbBbbD1K
Awesome models and datasets released by our friends at @DbrxMosaicAI 🤗 Not only is this a large openly licensed image-text pair dataset (~70M images), it is also the largest dataset hosted on the Hugging Face Hub 📏📚 https://t.co/jjp69HZJjd
I commend the @huggingface leadership team, their investors, and sponsors for initiating and supporting this effort. It's inspiring to see such commitment from the private industry, and I hope it encourages others to support open-source initiatives and making AI more sustainable. https://t.co/IGR3Hhmebd
I benefitted from the ZeroGPU program. I did the math: it's $28.5K/month worth of compute you get access to! If you're not on Hugging Face, you're missing out! https://t.co/JJ3ELn7grz
ZeroGPU is a fantastic example of great AI infrastructure developed by @HuggingFace. At @SakanaAILabs we’ve already been using it to showcase models like EvoVLM and EvoSDXL and it works like a charm. 🎊 https://t.co/KExEfWBaMb
in 5 years huggingface models and code will be part of all software, in devices, robots, data pipelines, in browsers, games
Hugging Face, one of the biggest names in machine learning, is committing $10 million in free, shared GPUs to help developers create new AI technologies. I sat down with Hugging Face CEO @ClementDelangue to chat about this new program called ZeroGPU. https://t.co/AuoITuYxpC
HuggingChat can now speak Chinese with Yi-34B-Chat e.g.🈷️ = "Moon" Yi open-source models on @huggingface are Chinese-English bilingual for you to build with the two most spoken languages in the world!! Try by changing current model setting to Yi on https://t.co/RCZhuEkfJd https://t.co/l9rCgOVvHi
GPU-Poor no more: super excited to officially release ZeroGPU in beta today. Congrats @victormustar & team for the release! In the past few months, the open-source AI community has been thriving. Not only Meta but also Apple, NVIDIA, Bytedance, Snowflake, Databricks, Microsoft,… https://t.co/6UzWvYhmpw
Yi-1.5-34B from @01AI_Yi becomes the FIRST Chinese model on Hugging Chat🎉🥳 https://t.co/3er124ixww https://t.co/FwlmxTk9wA
We're announcing over $10 million worth of compute to be distributed to the ecosystem of researchers, practitioners, and builders doing amazing open ML!🤗🚀 Apart from this, Hugging Face has supported the OS ecosystem by providing hundreds of GPU grants via Spaces, supporting…
Hugging Face, which is "profitable, or close to profitable", commits $10M in free shared GPUs to help small developers, academics, and others create AI apps (@kyliebytes / The Verge) https://t.co/pByPoRKAqD 📫 Subscribe: https://t.co/OyWeKSRpIM https://t.co/mZsfKVlBIW
Hugging Chat can now speak Chinese :-) Welcome to the first Chinese-speaking model Yi-1.5-34B by @01AI_Yi on https://t.co/9Q0UsOndCB! Let me know which assistant you built with this model :-D My Markdown translator powered by Yi https://t.co/MI9XOTXlH6 Feedback welcome! https://t.co/dPFLmBIrF4 https://t.co/EKtnYgsgr7
Big news from @huggingface: We're committing $10M via our ZeroGPU initiative to put GPUs in the hands of AI builders.🥳🥳🥳 Great article from @kyliebytes @verge interviewing our CEO @ClementDelangue: https://t.co/gXY2YoQC3d
Access to computational resources is key for democratizing AI. We're taking action by committing $10 million in free GPUs to help developers create new AI technologies. Exclusive by @kyliebytes https://t.co/Tkkx0BgOCJ
Hugging Face is sharing $10 million worth of compute to help beat the big AI companies https://t.co/wfTcc5SMC4
What a year for open ML! Trending models on Hugging Face include models from Meta, Google (TimesFM, PaliGemma), Tencent, NVIDIA, DeepSeek, RefuelAI, TII, Salesforce, 01-ai, Apple, Fugaku, Hugging Face, Microsoft, Stability, NousResearch, Gradient, Mistral, ByteDance 🤯 https://t.co/xUXWbdurSY
Me and @chargoddard collabed to make something pretty unique here, Hermes 2 Θ (Theta) - a Hermes 2 Pro + Llama-3 Instruct merge that takes Hermes to the next level (and gets to meme on gpt4"o" at the same time). Check it out on HF here: https://t.co/IVvn5d9dyh We added some… https://t.co/nHZPasahKt https://t.co/v7wR53LDai
Today we are releasing an experimental new model in collaboration with @chargoddard and @arcee_ai, Hermes 2 Θ, our first model merge, combining Hermes 2 Pro, and Llama-3 Instruct, and then further RLHF'ed from there. Available on HuggingFace: https://t.co/3ZWLVfcFeg This model… https://t.co/uDgAWZgQkB
Hugging Face is an awesome AI community 🎶 It's a worldwide platform for open-source technnnnnology 💛 https://t.co/zXHuuGJBpp
Seriously, the @huggingface team's ability to deliver on big collabs so quickly is amazing. Kudos to @mervenoyann @pcuenq @m_olbap for their major sprint in supporting the Google team to deliver PaliGemma to the community.🙌 https://t.co/EShEOZghMm
🤗 Hugging Face x LangChain 🦜 Excited to share langchain-huggingface: a new, jointly maintained partner package for the open-source community! This Python library offers a frictionless integration between the latest open models from @huggingface and your familiar… https://t.co/lbCMtsOfmk
Our Hugging Face integration is one of our most popular, so we're excited to work with them on a joint partner package. It should help with the stability and robustness of the integration. It's a goal of ours to make it easy to build complex chains and agents powered by OSS models 🤗 https://t.co/lji862SJdi
We are excited to announce langchain-huggingface🚀 A new open-source package to seamlessly integrate the latest open models from @huggingface into @LangChainAI, supporting local and hosted models! 🤗🦜 TL;DR: 🛠️ Easy Installation: Install with a simple `pip install… https://t.co/jhsQLKsAEC
🤝Hugging Face x LangChain partner package We're excited to announce the launch of langchain-huggingface, a partner package in LangChain jointly maintained with @huggingface. LangChain users can now reliably connect to and access Hugging Face features. These include chat, text…
Yi-1.5 🏅️ from the Chinese community is now on the @huggingface Hub 🔥🚀 👉https://t.co/xawOcUFiC1 ✨ 6B, 9B and 34B Base and Chat ✨ Base pre-trained on 500B tokens ✨ Chat fine-tuned on 3M samples ✨ 4K token context ✨ Apache 2.0