Sudo Sodium @lemdro.id to Linux@lemmy.ml, English · 2 months ago
What to try in a linux distro?
Sudo Sodium @lemdro.id (OP) · 2 months ago
I'm indeed open to the idea if it's locally hosted, but Ollama isn't available in my country… I'll look for an LLM runner that isn't an Ollama fork.
Possibly linux · 2 months ago
Ollama runs locally, so it works in any country.
Sudo Sodium @lemdro.id (OP) · 2 months ago
I know, but won't I need to download the models in the app in order to run it locally?
Possibly linux · 2 months ago
Yes, but that's pretty minor. You can just run `ollama pull <model name>`.
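For anyone following along, the workflow above can be sketched with Ollama's standard CLI subcommands (`pull`, `run`, `list`). This is a hedged sketch: `llama3.2` is just an example model name, not one mentioned in the thread, and it assumes `ollama` is already installed and its server is running.

```shell
#!/bin/sh
# Example model name (an assumption; pick any model from the Ollama library)
MODEL="llama3.2"

if command -v ollama >/dev/null 2>&1; then
  # One-time download of the model weights to local disk
  ollama pull "$MODEL"
  # After the pull, inference runs fully offline on your machine
  ollama run "$MODEL" "Say hello in one sentence."
  # Show which models are already downloaded locally
  ollama list
else
  echo "ollama not installed; see ollama.com for install instructions"
fi
```

Once the weights are pulled, no network access is needed for inference, which is why regional availability only matters for the initial download.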