beebarfbadger@lemmy.world to Showerthoughts@lemmy.world · 3 months ago (90 comments)
There was a time when the entirety of the internet would have fit onto the device you're currently browsing on.

Possibly linux · 3 months ago:
The Mistral language model is 3.8 GB and has a crazy amount of knowledge.

(reply) It lies a lot.

trolololol@lemmy.world · 3 months ago:
Then it will pass better on the Turing test. It's a feature.

Possibly linux · 3 months ago:
I wouldn't say a lot. Llama2 is way worse in my experience. Mistral gives fairly factual information. Regardless, it is still wild that 3.8 GB can go so far.
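The 3.8 GB figure is plausible from first principles: a ~7-billion-parameter model quantized to about 4 bits per weight lands in that range. A rough sketch (the parameter count and bit widths below are illustrative assumptions, not official Mistral specs, and real file sizes vary by quantization format):

```python
# Back-of-envelope estimate of an LLM's on-disk size from its
# parameter count and bits per weight. Illustrative figures only.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate size in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

# Mistral 7B has roughly 7.2e9 parameters (assumed round figure).
fp16_gb = model_size_gb(7.2e9, 16)  # unquantized half precision: ~14.4 GB
q4_gb = model_size_gb(7.2e9, 4)     # 4-bit quantized: ~3.6 GB

print(f"fp16: {fp16_gb:.1f} GB, 4-bit: {q4_gb:.1f} GB")
```

So the few-GB download is mostly a consequence of aggressive quantization; the unquantized weights would be roughly four times larger.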