potentiallynotfelix@lemmy.fish to Firefox@lemmy.ml · English · 13 hours ago
Firefox introduces AI as experimental feature (lemmy.fish)
Lojcs@lemm.ee · 2 hours ago
Last time I tried using a local LLM (about a year ago), it generated only a couple of words per second and the answers were barely relevant. Also, I don’t see how a local LLM can fulfill the glorified-search-engine role that people use LLMs for.
TheDorkfromYork@lemm.ee · English · 1 hour ago
They’re fast and high quality now. ChatGPT is the best, but local LLMs are great, even with 10 GB of VRAM.