One time I asked an AI for all the video games where health is illustrated by hearts. It told me The Legend of Zelda, which is true, but then it said Kingdom Hearts, which is definitely a green bar. If I can stumble across blatant lies just because my dumbass was asking it about something from a dating app conversation, how the fuck could someone invest millions upon millions of dollars gambling on whether an LLM is going to give you accurate information without asking it something first?