• sp3ctr4l · 5 hours ago

    I just tried out Gemini.

    I asked it several questions of the form ‘are there any things in category x which are also in category y?’

    It would often confidently reply ‘No, here’s a summary of things that meet all your conditions to fall into category x, but sadly none also fall into category y’.

    Then I would reply, ‘wait, you don’t know about thing gamma, which does fall into both x and y?’

    To which it would reply ‘Wow, you’re right! It turns out gamma does fall into x and y’ and then give a bit of a description of how/why that is the case.

    After that, I would say ‘… so you… lied to me. ok. well anyway, please further describe thing gamma that you previously said you did not know about, but now say that you do know about.’

    And that is where it gets … fun?

    It always starts with an apology template.

    Then, if it’s some kind of topic it has almost certainly been manually dissuaded from discussing, it lies again and says ‘actually, I do not know about thing gamma, even though I just told you I did’.

    If it is not a topic it has been manually dissuaded from discussing, it gives the apology template and then also further summarizes thing gamma.

    I asked it ‘do you write code?’ and it gave a moderately lengthy explanation of how it is composed of code, but does not write its own code.

    Cool, not really what I asked. So then I commanded: ‘write an implementation of bogo sort in python 3.’

    … and then it does that.
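    (For anyone unfamiliar: bogosort just shuffles the list repeatedly until it happens to come out sorted. A minimal Python 3 sketch of the idea, not Gemini’s actual output:)

    ```python
    import random

    def is_sorted(items):
        # True when every element is <= its successor.
        return all(items[i] <= items[i + 1] for i in range(len(items) - 1))

    def bogosort(items):
        # Keep shuffling until the list happens to be in order.
        while not is_sorted(items):
            random.shuffle(items)
        return items

    print(bogosort([3, 1, 2]))  # eventually prints [1, 2, 3]
    ```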

    Awesome. Hooray. Billions and billions of dollars for a shitty way to reformat web search results into conversational form, which is very often confidently wrong and misleading.

    • pyre@lemmy.world · 14 minutes ago

      copilot did the same with basic math. just to test it I said “let’s say I have a 10x6 rectangle. what number would I have to divide width and height by, in order to end up with a rectangle that’s half the area?”

      it said “in order to make it half, you should divide them by 2. so [pointlessly lengthy steps explaining the divisions]”

      I said “but that would make the area 5x3 = 15 units which is not half the area of 60”

      it said “you’re right! in order to … [fixing the answer to √2 using approximation]”

      I don’t know if I said it then, or after some other fucking nonsense, but when I said “you’re useless” it had the fucking audacity to take offense and end the conversation!

      like fuck off, you don’t get to have fake pride when you can’t even manage the basic fake intelligence you put in your own description.
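      (For the record, the right factor follows from area scaling: dividing width and height by k scales the area by 1/k², so you need k² = 2, i.e. k = √2 ≈ 1.414. A quick Python check of the arithmetic copilot fumbled:)

      ```python
      import math

      w, h = 10, 6
      k = math.sqrt(2)              # divide both sides by sqrt(2)
      new_area = (w / k) * (h / k)  # area scales by 1/k**2
      print(new_area)               # ~30.0, i.e. half of 60
      ```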

    • archomrade [he/him]@midwest.social · 12 minutes ago

      Idk why we have to keep re-hashing this debate about whether AI is a trustworthy source or summarizer of information when it’s clear that it isn’t - at least not often enough to justify this level of attention.

      It’s not as valuable as the marketing suggests, but it does have some applications where it may be helpful, especially when a conscious effort is made to direct it well. It’s better understood as a mild curiosity and a proof of concept for transformer-based machine learning that might eventually lead to something more profound down the road, but certainly not as it exists now.

      What is really uncompelling, though, is the constant stream of anecdotes about how easy it is to fool into errors. It’s like listening to an adult brag about tricking a kid into thinking chocolate milk comes from brown cows. It makes it seem like there’s some marketing battle being fought over public perception of its value as a product, a battle completely detached from how anyone actually uses or understands it as a novel piece of software.

    • taladar@sh.itjust.works · 4 hours ago

      And then more money spent on adding that additional garbage filter to the beginning and end of the process, which certainly won’t improve the results.