Japanese startup Sakana AI Labs, founded by former Google researchers, is taking a novel approach to building AI: evolving new foundation models by merging the many open-source models already available, efficiently producing powerful models with little training from scratch. Meanwhile, Gemini 1.5 Pro, with its 1M-token context window now available via API, is generating excitement for enabling multi-modal input and analysis over very long contexts.
Tokyo-based @SakanaAILabs, founded by Transformer paper co-author @YesThisIsLion and fellow Google Brain researcher @hardmaru, is using evolution-inspired techniques to “breed” multiple generations of LLMs: https://t.co/CuwfnYU02r
They could use Gemini 1.5 https://t.co/jJYknB5hiz
Gemini 1.5 Pro is a very good model, by the way. It appears to beat Gemini 1.0 Ultra, which was itself GPT-4 class on the stats. Also impressive that Google can release massive context windows this widely. Google seems to be moving quickly.
If you want a hint about the future of AI, it is worth trying Gemini 1.5 with the 1M token context window, now available to everyone, apparently. Some of my experiments: having it figure out a recipe from a video, execute instructions, watch my screen, and summarize work https://t.co/ojVdxmZMic
today in AI: 1/ Gemini 1.5 Pro is now open to all in Google’s AI Studio, and it’s coming soon to the API as well. This is @Google’s model with 1M context length. 2/ Sakana AI releases its report on merging foundation models in a process similar to natural evolution. Sakana AI was founded by two… https://t.co/kHriDWwgxX
Gemini Pro 1.5 has the potential to be THE game-changer for these 3 reasons: 1) it is multi-modal (especially the video input), so you can add recordings of your work to the prompt and reason about it 2) it has 1M context, which itself would be amazing... 3) ...but it can also… https://t.co/sko0yw9P6b
Open-source AI models released by Tokyo lab Sakana, founded by former Google researchers https://t.co/GFfvOdnOK7 https://t.co/f4ueu7LbXQ
Gemini 1.5 1M Context Size API going out. https://t.co/vJYxSUBBYS
While many labs in Japan (and globally) are trying to catch up by training foundation models with the same techniques as everyone else, @SakanaAILabs is showing the path to innovation.
Innovation in AI keeps going. Excited for this evolution from @SakanaAILabs. Their new approach of *evolving* new foundational AI models, which leverages the vast ocean of open-source AI models out there, enables them to efficiently, and with little effort, produce powerful… https://t.co/uSo996FZGt
Japanese startup generates AI models from 'evolutionary' process https://t.co/gYUX5SW57f
I'm getting a lot more useful real work done with Gemini 1.5 and 1M tokens by jamming in massive amounts of context than I've ever gotten done with RAG and GPT-4.
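As a rough sketch of what "jamming in massive amounts of context" means in practice, versus a RAG pipeline: instead of chunking, embedding, and retrieving, you simply concatenate whole documents into one prompt until the context budget is full. The helper names and the 4-characters-per-token heuristic below are assumptions for illustration, not part of any official SDK; real token counts should come from the model's own tokenizer.

```python
# Sketch: build one giant prompt from whole documents instead of a RAG pipeline.
# Assumes ~4 characters per token as a crude heuristic (varies by language).

TOKEN_BUDGET = 1_000_000      # Gemini 1.5 Pro's advertised context window
CHARS_PER_TOKEN = 4           # rough estimate for illustration only

def estimate_tokens(text: str) -> int:
    """Very rough token estimate from character count."""
    return len(text) // CHARS_PER_TOKEN + 1

def build_long_context_prompt(question: str, documents: list[str]) -> str:
    """Concatenate entire documents into a single prompt, stopping
    before the estimated token budget is exceeded. Note there is no
    retrieval or ranking step: everything goes in until the window is full."""
    parts = [question]
    used = estimate_tokens(question)
    for doc in documents:
        cost = estimate_tokens(doc)
        if used + cost > TOKEN_BUDGET:
            break
        parts.append(doc)
        used += cost
    return "\n\n---\n\n".join(parts)

docs = ["alpha report " * 100, "beta transcript " * 100]
prompt = build_long_context_prompt("Summarize these reports.", docs)
```

The resulting `prompt` string would then be sent to the model in a single request; the trade-off versus RAG is simplicity and full-document visibility at the cost of per-request token usage.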