Google's Gemini Nano, an on-device AI model, now runs locally inside the Chrome browser with minimal resource usage. Users are excited about the potential for fast, private AI processing in Chrome, with no internet connection required.
Have been experimenting with Chrome AI (Gemini Nano). Built a notebook to try it out. https://t.co/NgFTbt6Xvi
Got access to on-device LLM in Chrome (gemini-nano). I think it will change the world forever. Imagine: free, unmetered LLM for any task on any website. https://t.co/UcSuTTbN8V
Our first look at Google Gemini running natively in Chrome, and it's lightning-quick https://t.co/8MF8lFbQW5
In-browser AI models are here 🤯 This is Gemini Nano running on my M1 MacBook Pro using the latest Chrome Canary version. Most of the text generations complete in under 500ms, entirely offline too! Adding this to a Vue app is only ~15 lines of code. Pretty insane! https://t.co/iHUWrlNXKp
We’ve got Yagil Burowski from @LMStudioAI here for running local LLMs. I’ve been using LM Studio since last year, and it’s really great. It runs on the CPU at 15-20 tokens/s right now on Snapdragon X Elite and will hopefully run on the NPU soon! https://t.co/RifiuC0KLK
guide to run gemini nano on chrome just dropped🔥🔥: https://t.co/MKC0oeNoD6
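For context, enabling Gemini Nano in Chrome at the time these posts were made roughly meant the following steps. The flag and component names below are from the mid-2024 experimental Canary builds and may have changed since; treat this as a sketch, not current documentation:

```text
1. Install Chrome Canary (or Dev).
2. chrome://flags/#optimization-guide-on-device-model → "Enabled BypassPerfRequirement"
3. chrome://flags/#prompt-api-for-gemini-nano → "Enabled"
4. Restart Chrome, then open chrome://components, find "Optimization Guide
   On Device Model", and click "Check for update" to download the model.
5. In DevTools, await window.ai.createTextSession() should now succeed.
```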
Gemini Nano running locally without internet on Chrome. Local compute ⚡️ https://t.co/33ky3mjZos
Google Gemini Nano, right there in your Chrome browser, personal and private, with just 2 lines of code. https://t.co/3QIEc2QVoj
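The "2 lines of code" refers to Chrome's experimental Prompt API. A minimal sketch, assuming the early `window.ai` surface from the mid-2024 Canary builds (the names `createTextSession` and `prompt` come from that experimental API and have since been revised, so check the current docs before relying on them):

```javascript
// Hedged sketch of Chrome's experimental Prompt API (window.ai).
// Outside a flagged Chrome Canary build, window.ai is undefined,
// so we guard and return null instead of throwing.
async function promptNano(text) {
  if (typeof window === 'undefined' || !window.ai) {
    console.log('window.ai not available (requires Chrome Canary with the Prompt API flags enabled)');
    return null;
  }
  // The "2 lines": create a session, then prompt it.
  const session = await window.ai.createTextSession();
  return session.prompt(text);
}

promptNano('Summarize this page in one sentence.').then((out) => {
  if (out !== null) console.log(out);
});
```

Because the model runs on-device, no prompt text leaves the machine, which is what the "personal and private" framing in these posts is pointing at.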
A 3.25B-param quantized Gemini running locally in an upcoming Google Chrome build, with less than 100ms latency while using less than 2GB of RAM. That's less RAM than many of my current Chrome pages already use (my Slack is using 4.8GB as I type this). No doubt LLMs will be integrated… https://t.co/Muy8o4qPVV