You can run a local AI assistant with Ollama.
Ollama on its own is just the backend, so you will need a front end such as oterm or Open WebUI.
Keep in mind that the models Ollama offers are under various licenses. I would stick to models under a free license, such as Mistral, Mixtral, Llama 2, and LLaVA. (LLaVA-NeXT uses Llama 3, which isn't under a free license.)
Keep away from models such as Llama 3 and Gemma, as they are under non-free licenses.
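For example, getting started with one of the freely licensed models is just a couple of commands (this assumes Ollama is already installed and the daemon is running):

```shell
# Download a model that is under a free license
ollama pull mistral

# Chat with it right from the terminal (no front end needed for a quick test)
ollama run mistral "Explain what a local AI assistant is in one sentence."
```

Once that works, a front end like oterm or Open WebUI just talks to the same local Ollama server.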
You can also fine-tune a model, but that requires a significant amount of compute.