Pro to Technology@lemmy.world · English · 5 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
41 comments · ↑240 / ↓0
Greg Clarke · English · 3 points · 5 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU
@Euphoma@lemmy.ml · English · 7 points · 4 days ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but I don't know for sure.
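For anyone wanting to try this, a minimal sketch of the setup described above — assuming Ollama is packaged under the name `ollama` in the Termux repos (as the comment says) and using a small model tag purely as an illustration:

```shell
# Inside Termux on Android: install Ollama from the Termux package repos
pkg install ollama

# Start the Ollama server in the background (or run it in a second session)
ollama serve &

# Pull and chat with a small model; the model name here is just an example,
# pick something small enough for phone RAM
ollama run gemma3:1b
```

On a phone this runs on the CPU, which matches the generation speed observed above.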