Quote: “all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!”
So you can download it, set the device to airplane mode, and never go online again; they won't be able to monitor anything, even if there's code for that included.
Sounds counter-intuitive on a smartphone, where you most likely want to be online again at some point.
So trust them. If you don't and still want to use this, buy a separate device for it, or use a VM.
Can’t? This is not for you.
I'm not gonna use my smartphone as a local LLM machine.
That is exactly what Ollama does too.
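For what it's worth, once a model is pulled, Ollama answers entirely over localhost. A minimal sketch, assuming the default port 11434, a running `ollama serve`, and a model you already pulled before going offline (the model name "llama3" here is just an example):

```python
# Minimal sketch: query a locally running Ollama server over localhost only.
# Assumes `ollama serve` is running and the model was pulled before going
# offline; no external network access is needed after that.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    # Ollama's generate endpoint listens on localhost:11434 by default.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("Explain what 'running locally' means in one sentence."))
```

Nothing in that round trip leaves the machine, which is the same property the quoted app is claiming.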