it begins: apps and sites are going to start bundling local models or letting users download them. free, on-device, private
@jsngr Maybe a silly Q but what's the core value prop of running locally? Speed? Privacy?
@jsngr this is what i was strongly hoping we would get with Apple Intelligence at last WWDC. how disappointed i was…
@jsngr Kinda reminds me of how apps used to let us download dictionaries. The OS then abstracted this, in some cases.
@jsngr I hope apps are eventually allowed to share local models. Otherwise every app will be 1 GB 😬
@jsngr any way we can reuse models already on device? seems rather inefficient to have each app download the same model
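One way apps could dedupe downloads even without OS support is a content-addressed cache in a shared container: the model is stored under a digest of its bytes, so a second app asking for the same weights finds them already on disk. A minimal Swift sketch under that assumption; `SharedModelCache` and the shared container location are hypothetical, not an existing system feature.

```swift
import CryptoKit
import Foundation

enum CacheError: Error { case digestMismatch }

/// Hypothetical content-addressed model cache. Weights are stored under the
/// SHA-256 of their contents, so any app pointing at the same shared
/// container reuses a model another app already downloaded.
struct SharedModelCache {
    /// e.g. an app-group container that cooperating apps can all read
    let root: URL

    /// Returns the local URL for a model, downloading only on a cache miss.
    func fetchModel(from remote: URL, expectedSHA256: String) async throws -> URL {
        let local = root.appendingPathComponent(expectedSHA256)
        if FileManager.default.fileExists(atPath: local.path) {
            return local // already fetched by this app or a sibling
        }
        let (tempFile, _) = try await URLSession.shared.download(from: remote)
        // Verify the digest before publishing into the shared cache.
        // (A real implementation would stream the hash instead of loading
        // multi-GB weights into memory at once.)
        let data = try Data(contentsOf: tempFile)
        let digest = SHA256.hash(data: data)
            .map { String(format: "%02x", $0) }
            .joined()
        guard digest == expectedSHA256 else { throw CacheError.digestMismatch }
        try FileManager.default.moveItem(at: tempFile, to: local)
        return local
    }
}
```

In practice app-group containers are only shared among apps from the same developer, which is why several replies here want the OS itself to own the cache.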
@jsngr local models could change how we engage online.
@jsngr Interesting. Do you think this will eventually be supported at the OS level rather than by individual apps?
@jsngr Apple needs to bundle a system LLM or offer it as a system-level service that can wrap other providers. It's unsustainable for every app to download its own flavor.
@jsngr A trend I have been a strong advocate for. We're averaging out the user base of every product and building for lower-powered machines.
@jsngr Will the app size increase even if the user opts not to download the model?
@jsngr Seems like this is moving towards operating systems and browsers shipping built-in LLMs, and in Apple's case allowing apps to provide adapters as a way to specialize the shared model without shipping a full fine-tune.
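That adapter split would mean one shared base model on device, with each app shipping only a small set of tuned weights applied on top at load time. A structural sketch of the separation, using hypothetical types (`BaseModel`, `Adapter`); this is not a shipping API.

```swift
import Foundation

enum AdapterError: Error { case versionMismatch }

/// The multi-GB weights every app shares; owned by the OS in this scheme.
struct BaseModel {
    let weightsURL: URL
    let version: String
}

/// The few MB of app-specific tuned weights (e.g. LoRA-style low-rank
/// deltas) that an app actually ships in its bundle.
struct Adapter {
    let deltasURL: URL
    let baseModelVersion: String // must match the base it was trained against
}

struct SpecializedModel {
    let base: BaseModel
    let adapter: Adapter

    /// Pairs an app's adapter with the system's shared base. A real runtime
    /// would merge or apply the deltas here; this sketch only validates the
    /// version contract that makes sharing safe.
    static func load(adapter: Adapter, sharedBase: BaseModel) throws -> SpecializedModel {
        guard adapter.baseModelVersion == sharedBase.version else {
            throw AdapterError.versionMismatch
        }
        return SpecializedModel(base: sharedBase, adapter: adapter)
    }
}
```

The version check is the interesting design constraint: adapters are tied to the exact base weights they were trained against, so an OS that updates the shared model has to keep old versions around or coordinate adapter updates.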
@jsngr so the next step is obviously leveraging cloud servers to host local LLMs… What do you think about this approach for overcoming hardware limitations on our devices? 🫣
@jsngr This is not sustainable for users because of limited storage and memory. I think it'd be better if Apple provided developer access to local models at the OS level through a framework, rather than every app making users download its own on-device model.
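For a sense of what that OS-level framework could look like from an app's point of view: the system owns the weights and apps just open a session, so nothing is downloaded per app. The protocol and names below are a hypothetical sketch, not an announced Apple API.

```swift
import Foundation

/// Hypothetical OS-vended model service: the system owns and manages the
/// weights; apps only hold a session handle.
protocol OnDeviceModelSession {
    func respond(to prompt: String) async throws -> String
}

/// Stub conformance so the sketch is self-contained; a real session would be
/// handed out by the OS, not constructed in-process like this.
struct StubSession: OnDeviceModelSession {
    func respond(to prompt: String) async throws -> String {
        "(model output for: \(prompt))"
    }
}

/// Example app-side call: no bundled weights, no per-app download.
func summarize(_ text: String, using session: some OnDeviceModelSession) async throws -> String {
    try await session.respond(to: "Summarize in one sentence: \(text)")
}
```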