Meta has released Llama 3.1. It seems to be a significant improvement over an already quite good model. It is now multilingual, has a 128k context window, has some form of tool-calling support and, overall, performs better on benchmarks than its predecessor.

With this new version, they also released a 405B-parameter model, along with updated 70B and 8B versions.

I’ve been using the 3.0 version and was already satisfied, so I’m excited to try this.

    • chayleaf@lemmy.ml · 8 points · 5 months ago

      The code is FOSS; the weights aren't. This is pretty common with e.g. FOSS games; the only difference here is that the weights are much costlier to remake from scratch than game assets.

      • Possibly linux · +6 / −1 · 5 months ago

        The license has limitations and isn't a standard one like Apache.

        • Pennomi@lemmy.world · 5 points · 5 months ago

          True, but it hardly matters for the source, since the architecture is implemented in open-source projects like transformers (Apache) and llama.cpp (MIT). The weights remain under the dubious Llama Community License, so I would only call the model "available" rather than "open".