I have a GTX 1660 Super (6 GB).

Right now I have ollama with:

  • deepseek-r1:8b
  • qwen2.5-coder:7b

Do you recommend any other local models to play with my GPU?

  • Possibly linux · 8 days ago

    Mistral

    I personally run models on my laptop. I have 48 GB of RAM and an i5-12500U. It runs a little slowly, but it's usable.

      • Possibly linux · 8 days ago

        The biggest bottleneck is going to be memory. I would just stick with GPU only since your GPU memory has the most bandwidth.
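        As a rough back-of-envelope for what fits in 6 GB of VRAM (a sketch, not an exact figure: `approx_weight_gb` is a hypothetical helper, the ~4.5 effective bits per weight for a Q4 quant is an assumption, and KV cache plus runtime overhead add on top of the weights):

        ```python
        def approx_weight_gb(params_billions: float, bits_per_weight: float) -> float:
            """Approximate size of a model's weights in GB (weights only,
            ignoring KV cache and runtime overhead)."""
            return params_billions * 1e9 * bits_per_weight / 8 / 1e9

        # A 7B model at a Q4-style quant (~4.5 bits/weight effective) is roughly
        # 3.9 GB of weights, leaving some headroom for context on a 6 GB card.
        print(f"7B @ Q4: {approx_weight_gb(7, 4.5):.1f} GB")
        # An 8B model at the same quant is ~4.5 GB, which is tighter but can
        # still fit fully on the GPU with a modest context size.
        print(f"8B @ Q4: {approx_weight_gb(8, 4.5):.1f} GB")
        ```

        The takeaway matches the comment above: pick a quant small enough that the whole model stays in VRAM, since splitting layers to system RAM drops you to much slower memory bandwidth.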