• LanternEverywhere@kbin.social · 10 months ago

    I use ChatGPT for any topic I'm curious about, and about half the time when I double-check the answers, they turn out to be wrong.

    For example, I asked for a list of phones with screens that don't use PWM. When I looked up the specs of the phones it recommended, it turned out they all had PWM, even though ChatGPT's answer explicitly stated that each of those phones doesn't use PWM. Why does it straight-up lie?!