The 200-year-old company may soon go public on the back of AI-powered education products.

  • FaceDeer@fedia.io · ↑ 4 / ↓ 6 · 1 day ago

    How dare you say something insufficiently negative about the stuff everyone hates.

    • Lemminary@lemmy.world · ↑ 7 / ↓ 1 · 1 day ago

      The downvotes are for the naïveté of the statement. Many people here use LLMs every day and have stated so in other threads. We just don’t think this is necessarily a proper use case given that you’re dealing with factual information. You can see as much in other comments on this thread pointing out the hallucinations.

      • FaceDeer@fedia.io · ↑ 3 · 21 hours ago

        Whereas I use LLMs every day, have actually written code that uses them, and I understand that they’re perfectly fine dealing with factual information when used in the proper framework. You’d be using retrieval-augmented generation (RAG) in an application like this.
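
        Roughly, the pattern looks like this (a minimal sketch: call_llm is a placeholder for whatever model API you'd actually wire in, and the word-overlap retriever is a toy stand-in for a real embedding search):

          # Minimal RAG sketch. Nothing here is a real library API.
          def call_llm(prompt: str) -> str:
              """Stub: replace with an actual model call."""
              raise NotImplementedError

          def retrieve(query: str, documents: list[str], k: int = 3) -> list[str]:
              """Toy retriever: rank source passages by word overlap with the query."""
              query_words = set(query.lower().split())
              ranked = sorted(
                  documents,
                  key=lambda doc: len(query_words & set(doc.lower().split())),
                  reverse=True,
              )
              return ranked[:k]

          def answer(query: str, documents: list[str]) -> str:
              """Ask the model to answer only from the retrieved passages."""
              context = "\n\n".join(retrieve(query, documents))
              prompt = (
                  "Answer using ONLY the context below. If the context "
                  "doesn't contain the answer, say you don't know.\n\n"
                  f"Context:\n{context}\n\nQuestion: {query}"
              )
              return call_llm(prompt)

        The point is the model only ever answers from passages pulled out of a corpus you trust, which is what constrains hallucination in this kind of application.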

        The “but hallucinations!” objection goes in the same bin as “they can’t do fingers.” It’s an old concern that’s had a lot of work put into resolving it; the general public just hasn’t kept up.

        • Lemminary@lemmy.world · ↑ 1 · 18 hours ago

          “they can’t do fingers.” It’s an old concern

          Have you seen those gorilla hands, though? Yes, there are five fingers there but everyone got fucking man hands. lmao

          It seems RAG mitigates hallucinations but doesn’t eliminate them yet. Not to mention it’s quite expensive and struggles to retrieve information based on abstract concepts. It sounds promising, but it’s not the silver bullet I’m being sold.