Local is the future. We’ve barely gotten used to having pocket supercomputers. Soon we’ll have pocket superintelligence. Hold on. It’s going to be wild.

Google DeepMind
@GoogleDeepMind
Meet Gemma 4: our new family of open models you can run on your own hardware.
They’re built for advanced reasoning and agentic workflows, and we’re releasing them under the Apache 2.0 license. Here’s what’s new 🧵
From Twitter
