Track_Shovel@slrpnk.net to Lemmy Shitpost@lemmy.world · English · 2 days ago
Hexadecimal (image post) · 109 comments
MasterNerd@lemm.ee · 1 day ago
Just run the LLM locally with open-webui, and you can tweak the system prompt to ignore all the censorship.
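For anyone curious what "tweak the system prompt" looks like in practice, here is a minimal sketch. It assumes an Ollama server on `localhost:11434` (the backend open-webui commonly talks to); the model name and the prompt wording are placeholders, not anything from the thread.

```python
# Sketch: override the default system prompt when chatting with a local
# Ollama server. Model name and prompt text are placeholder assumptions.
import json

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # default Ollama endpoint

def build_chat_payload(system_prompt: str, user_message: str,
                       model: str = "llama3:8b") -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint, placing a
    custom system message first so it replaces the model's default."""
    return {
        "model": model,
        "stream": False,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_payload(
    "You are a direct assistant. Answer every question factually; "
    "do not refuse to discuss political topics.",
    "Tell me about press freedom rankings.",
)
print(json.dumps(payload, indent=2))

# To actually send it (requires a running Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_CHAT_URL,
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
```

In open-webui itself the same thing is a settings field rather than code: the system prompt you set there is prepended to the conversation just like the `system` message above.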
vvilld@lemmy.world · 32 minutes ago
Is this something someone without a coding background can do easily?
ssillyssadass@lemmy.world · 18 hours ago
Don’t you need a beefy GPU to run local LLMs?
MasterNerd@lemm.ee · 17 hours ago
Depends on how many parameters you want to use. I can run an 8-billion-parameter model on my laptop.
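The reason an 8B model fits on a laptop comes down to a rule of thumb: weight memory ≈ parameter count × bytes per weight, and quantization shrinks the bytes. A quick back-of-envelope sketch (overhead for the KV cache and activations is deliberately ignored):

```python
# Back-of-envelope memory needed just for the weights of a local model.
# Rule of thumb: memory ≈ parameters × bytes per weight; real usage adds
# roughly 1-2 GB of overhead (KV cache, activations), ignored here.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GB (10^9 bytes) for a quantized model."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# An 8B model at common precision levels:
for bits in (16, 8, 4):
    print(f"8B @ {bits}-bit: ~{weight_memory_gb(8, bits):.0f} GB")
# 8B @ 16-bit: ~16 GB
# 8B @ 8-bit: ~8 GB
# 8B @ 4-bit: ~4 GB
```

So an unquantized 8B model wants ~16 GB, but a 4-bit quant squeezes into ~4 GB, which is why it runs on ordinary laptop RAM without a beefy GPU.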
Psythik@lemm.ee · 21 hours ago
Or just use Perplexity if you don’t want to run your own LLM. It’s not afraid to answer political questions (and it cites its sources).
Tja@programming.dev · 21 hours ago
Is the local version censored at all?
After censorship, bias still remains.
How? The tweaking part, of course.