• PerogiBoi@lemmy.ca · 9 months ago

      For real. I’ve been using Google since 2000, and it now costs me substantially more cognitive load to find things than if I just ask an LLM.

      • TORFdot0@lemmy.world · 9 months ago

        I can’t trust the output of an LLM, but at least you can ask it to cite its sources, so you can check the page that led it to that conclusion.
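
        A minimal sketch of that “cite your sources” pattern, assuming the OpenAI Python SDK; the model name, prompts, and example question are illustrative, not from the thread:

        ```python
        # Sketch: ask the model to answer AND list the pages it relied on,
        # so the links can be checked afterwards.
        # Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set.
        from openai import OpenAI

        client = OpenAI()

        reply = client.chat.completions.create(
            model="gpt-4-turbo",  # any chat model works; this name is an assumption
            messages=[
                {"role": "system",
                 "content": "Answer the question, then list the URLs of the sources you used."},
                {"role": "user",
                 "content": "Is there a Windows program that batch-renames photos by EXIF date?"},
            ],
        )

        print(reply.choices[0].message.content)
        # The listed URLs still have to be checked by hand; the model can invent them.
        ```

        The point of the pattern is that the returned links are checkable in a search engine, even if the summary itself is not trustworthy.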

        • rebelsimile@sh.itjust.works · 9 months ago

          If I ask an LLM something like “is there a git project that does <something I’d describe in natural language but not keywords>” or “is there a Windows program that does X”, it may make up the answers, but at least I can verify those via a search engine. If I try to Google that (that’s theoretical; I don’t use Google), I end up on a page full of ads, filled with trap links that lead to malware, top-10 lists with the same repeated content, and all the other shit the internet has become. I kind of don’t mind the hallucinations relative to the ads. What a time to be alive, though.

          • abhibeckert@lemmy.world · 9 months ago

            > If I ask an LLM something like “is there a git project that does <something I’d describe in natural language but not keywords>” or is there a Windows program that does X, it may make up the answers

            Obviously it depends on the LLM, but ChatGPT Plus doesn’t hallucinate with your example. What it does is provide a list of git projects / Windows programs, each with a short summary and a link to the official website.

            And the summary doesn’t come from the website; it’s a short description of how each one matches your requirements.

            I’ve also noticed Bing has started showing LLM summaries for search results. For example, I’ve typed a question into DuckDuckGo (which uses Bing internally) and seen links to Reddit where the summary reads “a user answered your question stating X, and another user disagreed saying Y”.

            I’m encountering hallucinations far less often now than I used to, at least with OpenAI-based products.

          • TORFdot0@lemmy.world · 9 months ago

            That’s a really good use case that I’ll need to start adopting. Where paid ads make search engines unreliable, the LLM is at least on the same footing, if not better.

          • Aniki 🌱🌿@lemm.ee · 9 months ago

            You immediately start from a better place using an LLM vs. searching raw. If I just need a link, I use DDG. When I need to research something, I ask the latest GPT-4 Turbo model.

  • afraid_of_zombies@lemmy.world · 9 months ago

    In no particular order:

    • Learning how to bake
    • Search engine
    • Re-writing documentation for work
    • Reading up on random topics
    • Fucking around
    • Looking up old-timey technical words that some of my customer’s specs are written in
    • Homework help for my kids
    • Getting it to attack my ideas before I announce them at work so I can anticipate what I am up against
  • Optional@lemmy.world · 9 months ago

    TL;DR: fucking around with it.

    Yes, it sure is a transformative paradigmatic shift in, uh, throwing money down a well.