You can run a local AI assistant with Ollama.

https://ollama.com/

Ollama isn’t a complete piece of software on its own; it is a backend, and you will need a front end such as oterm or Open WebUI.
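Front ends like these talk to the local HTTP API that Ollama exposes (by default on port 11434). As a minimal sketch of what that looks like, here is a small Python client for the `/api/generate` endpoint; it assumes `ollama serve` is running locally and that the model you name has already been pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # "stream": False asks for a single JSON reply instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming reply carries the full answer in "response".
        return json.loads(resp.read())["response"]
```

Usage would be something like `generate("mistral", "Say hello in one sentence.")` after running `ollama pull mistral`. A real front end does essentially this, plus conversation history and a chat UI.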

Keep in mind that the models Ollama offers come under various licenses. I would use models under a free license, such as Mistral, Mixtral, Llama 2 and LLaVA. (LLaVA-NeXT is based on Llama 3, which isn’t under a free license.)

Keep away from models such as Llama 3 and Gemma, as they are under non-free licenses.

You can also fine-tune a model, but that requires a significant amount of compute.