Access. If it can truly run fully locally, then companies can have their LLM in everyone's pocket, basically.

As far as desirability goes, it depends on the person. And it's more about what people don't want than what they want. Google has been promising various AI features that people will be able to run on-device, but so far most of them send the data to Google's servers and then return the result.