beebarfbadger@lemmy.world to Showerthoughts@lemmy.world · 8 months ago

There was a time when the entirety of the internet would have fit onto the device you're currently browsing on.
Possibly linux · 8 months ago:
The Mistral language model is 3.8 GB and has a crazy amount of knowledge.
Possibly linux · 8 months ago:
I wouldn't say a lot. Llama2 is way worse in my experience. Mistral gives fairly factual information. Regardless, it is still wild that 3.8 GB can go so far.
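The 3.8 GB figure roughly checks out on the back of an envelope. This is a sketch, not anything stated in the thread: it assumes "Mistral" here means the ~7.3-billion-parameter Mistral-7B model stored with 4-bit weight quantization (common for local downloads), and ignores per-block quantization overhead.

```python
# Rough size estimate for a 4-bit quantized Mistral-7B download.
# Assumptions (not from the thread): ~7.3e9 parameters, 4 bits per weight.
params = 7.3e9           # approximate parameter count of Mistral-7B
bits_per_weight = 4      # plain 4-bit quantization, overhead ignored
size_gb = params * bits_per_weight / 8 / 1e9  # bits -> bytes -> decimal GB
print(f"{size_gb:.2f} GB")  # prints "3.65 GB"
```

3.65 GB from the raw weights alone, so with quantization scales and metadata a download in the 3.8–4.1 GB range is exactly what you'd expect.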
trolololol@lemmy.world · 8 months ago:
Then it will pass better on the Turing test. It's a feature.
[deleted by creator]