• abhibeckert@lemmy.world · 8 months ago

    If I ask an LLM something like “is there a git project that does <something I’d describe in natural language but not keywords>?” or “is there a Windows program that does X?”, it may make up the answers

    Obviously it depends on the LLM, but ChatGPT Plus doesn’t hallucinate on your example. What it does is provide a list of git projects / Windows programs, each with a short summary and a link to the official website.

    And the summary doesn’t come from the website; it’s a short description of how each result matches your list of requirements.
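
    For anyone who wants to try this kind of query against the API rather than the ChatGPT Plus UI, here’s a minimal sketch using the official openai Python client. The model name and the example prompt are placeholder assumptions on my part, not anything specific to the products discussed above:

    ```python
    # Rough sketch: ask a chat model for tool recommendations described in
    # natural language rather than keywords. Assumes the `openai` package
    # (v1+) is installed and OPENAI_API_KEY is set in the environment.
    from openai import OpenAI

    client = OpenAI()

    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name; substitute whatever you use
        messages=[{
            "role": "user",
            "content": (
                "Is there a git project that syncs a folder between two "
                "machines without a central server? For each suggestion, "
                "give a one-line summary of how it matches my requirements "
                "and a link to the official website."
            ),
        }],
    )

    print(resp.choices[0].message.content)
    ```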

    I’ve also noticed Bing has started showing LLM summaries in search results. For example, I’ve typed a question into DuckDuckGo (which uses Bing internally) and seen links to Reddit summarized as “a user answered your question stating X, and another user disagreed saying Y”.

    I’m encountering hallucinations far less often now than I used to, at least with OpenAI-based products.