I want to run it locally. What are the system requirements?
I think they’re pretty low. The requirements are low for all of the models, but the better your hardware, the faster your results.

I’m using LM Studio to run it. I ran previous models with Oobabooga, but it needs an update that it hasn’t gotten yet to run the DeepSeek models. It’s all pretty easy to get started. You can try it out, and worst case you just delete it afterward if your system isn’t powerful enough to be useful.
Yea true! I’ll try it out tonight.
Maybe this article will help you: GPU System Requirements for Running DeepSeek-R1 https://apxml.com/posts/gpu-requirements-deepseek-r1