3 Ways to Run DeepSeek r1 Locally on Your Laptop - No Coding

DeepSeek has stunned the AI industry with a model that rivals closed-source models. At the heart of its leading r1 model lies a novel Reinforcement Learning (RL) approach to training.
Though the model is currently free to use on the DeepSeek platform, some users are concerned about privacy when sharing their data with it.
Fret not! Here are three platforms available for running the DeepSeek r1 model locally, even on a laptop.
LM Studio

- Hit the LM Studio site and download the right release for your operating system. I have a Mac, so I downloaded the Mac release.
- The download is a .dmg file that can be installed like any other app on a Mac.
- If we open the app and click “Discover” in the left menu, the “Mission Control” page pops up. Under the “Model Search” tab, searching for “deepseek” lists all the available DeepSeek models. We can install a model based on our needs and the computing power available.

- Once the download is complete, the button changes to a purple “Use in New Chat” button.
- Clicking it opens a new chat window with the DeepSeek model running fully locally on our machine.
- Check out the response to the question, “What is the speciality of the DeepSeek r1 model?” below:

PS: Download a model that is compatible with your hardware and that meets your requirements.
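Although no coding is needed for any of the steps above, LM Studio can also serve the downloaded model through a local, OpenAI-compatible API that you enable from within the app. Here is a minimal sketch of querying it from Python, assuming the server is running on its default port 1234; the model identifier below is a placeholder, so copy the one LM Studio shows for your downloaded model.

```python
# A minimal sketch: querying LM Studio's local OpenAI-compatible server
# (pip install openai). Assumes the local server is enabled in the app
# and listening on the default port 1234. The model identifier is a
# placeholder; copy the one LM Studio shows for your downloaded model.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local endpoint
    api_key="lm-studio",                  # any non-empty string works locally
)

response = client.chat.completions.create(
    model="deepseek-r1-distill-qwen-7b",  # placeholder identifier
    messages=[
        {"role": "user", "content": "What is the speciality of the DeepSeek r1 model?"},
    ],
)
print(response.choices[0].message.content)
```

Because the endpoint speaks the OpenAI API shape, any tool that accepts a custom base URL should work against it the same way.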
GPT4All

GPT4All is a platform developed by Nomic AI. To get started:
- Hit this link and download the version that is compatible with your operating system. For me, it was the Mac version. The download is a .dmg file which can be installed like any other app on a Mac.
- Open the downloaded app and go to “Models” → “Add Model”. On the models page, select “Reasoning”.

- It lists the models that are available to download. If a model is not compatible with your hardware, a warning appears in red below the download button.
- After downloading a compatible DeepSeek model, you can always remove it by clicking the “Remove” button that appears in place of the “Download” button.
- Click “Chat” in the left menu bar to start chatting. The app asks us to load a model; we can choose the DeepSeek model from the dropdown at the top and start chatting.
- Check out the response to the question, “What is the speciality of the DeepSeek r1 model?” below:
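For anyone who does want to script things, Nomic also publishes a gpt4all Python package that can load the same GGUF models outside the app. A minimal sketch, assuming the package is installed (pip install gpt4all) and using a placeholder filename for the downloaded model:

```python
# A minimal sketch using the gpt4all Python package (pip install gpt4all).
# The filename below is a placeholder; use the exact filename of the model
# you downloaded in the app, or let the package download one on first use.
from gpt4all import GPT4All

model = GPT4All("DeepSeek-R1-Distill-Qwen-1.5B-Q4_0.gguf")  # placeholder filename

# chat_session keeps the conversation context across follow-up prompts
with model.chat_session():
    reply = model.generate(
        "What is the speciality of the DeepSeek r1 model?",
        max_tokens=512,
    )
    print(reply)
```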

Ollama
Unlike the other two apps, Ollama suits someone comfortable with the basics of the command line, since a few commands need to be run in a terminal. Below are the steps to follow:
- Go to the Ollama site and download Ollama for your operating system.

- On a Mac, once the .dmg file is downloaded, we can install it like any other app.
- To use Ollama, first open the Terminal app (on a Mac) and verify the installation by running the command ollama --version. If the installation went through fine, we will see the installed version as shown below:

- After verifying, check out the models page on the Ollama site and navigate to deepseek-r1 to find the tag of the DeepSeek model we wish to use.
- For example, one of the distilled, quantized models that can run on a MacBook is DeepSeek-R1-Distill-Qwen-1.5B, which the Ollama library tags as deepseek-r1:1.5b. So to get the model, all we need to do is run the command ollama run deepseek-r1:1.5b. The command downloads the model and opens a prompt for us to chat with, as shown in the screenshot below:

- We can then chat with the model at the prompt. For example, I asked, “What is the speciality of the DeepSeek r1 model?” and got the response shown in the screenshot below:
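Ollama also runs a local HTTP server in the background (on port 11434 by default), so the same model can be reached programmatically. A minimal sketch, assuming the ollama Python package is installed (pip install ollama), the Ollama app is running, and the deepseek-r1:1.5b tag was pulled as above:

```python
# A minimal sketch using the ollama Python package (pip install ollama),
# which talks to the local Ollama server at http://localhost:11434.
import ollama

response = ollama.chat(
    model="deepseek-r1:1.5b",  # the tag pulled earlier with `ollama run`
    messages=[
        {"role": "user", "content": "What is the speciality of the DeepSeek r1 model?"},
    ],
)
print(response["message"]["content"])
```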

Conclusion
We are now at a stage where not only are the models getting more sophisticated, but so are the platforms for running them. We have just scratched the surface of what is possible with these platforms.
Let me know if you can think of any other platforms that enable us to run DeepSeek (or any open-source LLM for that matter) on compute-constrained hardware like a laptop.