I installed Ollama but I don’t have any ideas of what to do with it.

Do you have any fun/original use cases for it? I’m a programmer so it doesn’t have to exist already.

  • The Hobbyist

Ollama is very useful but also rather barebones. I recommend installing Open WebUI to manage models and conversations. It's also useful if you want to tweak more advanced settings like the system prompt, seed, temperature, and others.
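For reference, settings like seed and temperature can also be passed straight to Ollama's REST API (default port 11434). A minimal sketch using only the standard library; the model name "llama3" is an assumption, use whatever you've pulled:

```python
import json
import urllib.request

# Per-request options map to the "options" field of Ollama's /api/generate
# endpoint; "system" sets a per-request system prompt. Model name "llama3"
# is an assumption here.
payload = {
    "model": "llama3",
    "prompt": "Explain what a seed does in LLM sampling, in one sentence.",
    "system": "You are a concise assistant.",
    "stream": False,  # return one JSON object instead of a token stream
    "options": {"temperature": 0.2, "seed": 42},
}

def ask(payload, url="http://localhost:11434/api/generate"):
    """Send the request and return the model's text response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With an Ollama server running locally, call: print(ask(payload))
print(json.dumps(payload, indent=2))
```

With a fixed seed and low temperature, repeated calls with the same prompt should give near-identical answers, which is handy for testing.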

You can install Open WebUI using Docker or just pip; pip is enough if you only care about serving yourself.
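For example (command shapes from memory and may drift between releases; the Open WebUI README has the current incantations):

```shell
# Option 1: plain pip (the project recommends Python 3.11)
pip install open-webui
open-webui serve          # web UI on http://localhost:8080 by default

# Option 2: Docker, with a named volume so chats persist across restarts
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```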

Edit: Open WebUI also renders Markdown, which makes responses much easier to read.

Edit2: you can also plug Ollama into continue.dev, a VS Code extension that brings LLM capabilities into your IDE.
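Hooking the two together is mostly configuration. A sketch of what a model entry pointing at a local Ollama instance looks like in Continue's legacy JSON config (field names from memory; newer versions of the extension use a YAML config, so check its docs for the authoritative format):

```json
{
  "models": [
    {
      "title": "Local Llama 3 via Ollama",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```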