🌟 You can now run the 7B-parameter version of Gemma entirely locally in the browser, using the MediaPipe LLM Inference API. Simply download the model from @kaggle and try it on → goo.gle/4ajmrYh Learn more → goo.gle/3UKhwcX
@googledevs @kaggle #WebAI for the win! If anyone has any questions, feel free to connect with me.
@googledevs @kaggle *runs on high-end laptops* - what minimum specs are we talking here? Is 16 GB of RAM enough?