DeepSeek is the new Chinese artificial intelligence that everyone is talking about and that has caused such a stir. You have probably already tried it from its website or by installing the app on your mobile device. But would you like to use DeepSeek locally? In this post we show you the easiest way to do it on a Windows 11 computer.
Using DeepSeek locally has its advantages. The most important of all is that you don't need an internet connection to interact with the chatbot. This prevents the information you enter from being shared with external servers, giving you greater privacy. Interested? Let's take a look at how to install and launch DeepSeek locally on Windows 11.
What you need to use DeepSeek locally with Windows 11

Let's first review the requirements to use DeepSeek locally with Windows 11. There are several ways to do it, but the one we describe below is the simplest of all. The first thing you need is a Windows 10 or Windows 11 computer, although the procedure also works on computers running Linux and macOS.
Second, you will need about 5 GB of free space on your storage drive. That's the approximate size of the distilled 7b (4.7 GB) and 8b (4.9 GB) versions of DeepSeek, which can be installed on almost any computer. There are also larger versions that require more power to run locally, such as 671b, which weighs in at 404 GB. As a rule, the larger the model, the better its answers, but the more disk space, memory and processing power it demands.
And third, you will need to install a program on your computer that lets you run DeepSeek and chat with the model. One of the most popular is Ollama, designed to run multiple AI models locally. Another is LM Studio, which performs essentially the same function but has the advantage of a more user-friendly graphical interface for interacting with the AI.
Step by step to use DeepSeek locally with Windows 11

That said, let's get to the step by step to use DeepSeek locally with Windows 11. First we will see how to do it with Ollama, and then with the LM Studio program. Remember that you can also follow these steps to use DeepSeek locally with Windows 10, macOS and GNU/Linux.
The first thing to do is visit Ollama's website and download the version for Windows. On the ollama.com home page, click the Download button to go to the download page and grab the Ollama .exe file. Then open your Downloads folder and run the file to install the application.
The next step is to launch the Ollama application. You can find its icon in the Windows Start menu; click on it. You will notice that the program does not open a window like other apps; instead, it runs in the background. You can confirm this in the system tray area of the taskbar (you will see the Ollama icon next to the other applications that are running).
Download and install DeepSeek from Command Prompt
With Ollama running in the background, open the Command Prompt (CMD). You can search for it in the Start menu, or press the Windows+R shortcut, type cmd and hit Enter. Once it is open, type the following command to download the 7b version of DeepSeek R1: ollama pull deepseek-r1:7b. If you want to install another version, replace 7b with the tag you prefer.
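The download step can also be scripted. A minimal sketch (it assumes Ollama is already installed and on your PATH, and uses the 7b tag from this guide):

```shell
# Pick the DeepSeek R1 tag to download; swap "7b" for 8b, etc.
MODEL="deepseek-r1:7b"

if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"    # downloads about 4.7 GB for the 7b tag
else
  echo "ollama not found - install it from ollama.com first"
fi
```

Changing the MODEL variable is all it takes to fetch a different size of the model.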
Once the download is complete, type the following command to load and launch DeepSeek: ollama run deepseek-r1:7b. If you downloaded another version, remember to replace 7b with the correct tag. How long the model takes to load depends on your hardware and on the size of the version you chose. Once it has finished, you will be able to use DeepSeek locally on your Windows 11 computer.
At this point you can ask the AI a question by typing a prompt at the command prompt. Before the answer appears, you will see the reasoning the AI uses to generate it, wrapped between <think> and </think> tags. This is similar to what happens in the DeepSeek mobile app. Whenever you want to use the AI locally on your computer, remember to run the Ollama app first.
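Besides the interactive chat, ollama run also accepts a one-off prompt as an argument, which is handy for quick questions or for use in scripts. A sketch, assuming the 7b model from the steps above is already downloaded:

```shell
# Ask a single question and print the answer without opening a chat session.
PROMPT="Explain in one sentence what running an LLM locally means."

if command -v ollama >/dev/null 2>&1; then
  ollama run deepseek-r1:7b "$PROMPT"
  ollama list    # shows every model you have downloaded so far
else
  echo "start the Ollama app first, then re-run this script"
fi
```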
Using DeepSeek locally with LM Studio

Using DeepSeek locally with Ollama has the advantage that the application consumes few resources, but it is not very intuitive for the average user. If that is your case, you have the option of running DeepSeek with a graphical interface using the LM Studio program. To use it, first download the version compatible with Windows 11 from its official website, lmstudio.ai.
Once it is installed, open the LM Studio application and type DeepSeek into the search bar at the top. You will see the DeepSeek models available for download. Choose the one you want and click the Download button. Remember that the larger the model, the longer it will take to download and the more resources it will demand to run.
Once you have downloaded the DeepSeek model, click on the folder icon in the vertical menu on the left side of LM Studio. There you will find all the AI models you have downloaded. Select DeepSeek and click the Load Model button to start it and interact with the AI.
Using LM Studio to run DeepSeek locally on Windows 11 is much simpler. The interface looks pretty much the same as the one in the DeepSeek mobile app or in the browser version of the AI. There's a text field for typing your prompt, and you can even add documents and other files as part of your query. Whether you prefer Ollama or stick with the LM Studio graphical interface, you're using DeepSeek locally and offline. Take advantage of all the benefits this mode offers!
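For more advanced use, LM Studio can also expose the loaded model through a local, OpenAI-compatible server (enabled from its server/developer view, by default on port 1234). A hedged sketch of querying it with curl; the model identifier shown is hypothetical and depends on the exact DeepSeek build you downloaded:

```shell
# Query LM Studio's local server; fall back to a message if it is not running.
RESPONSE=$(curl -s --max-time 5 http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1-distill-qwen-7b",
       "messages": [{"role": "user", "content": "Hello from the local API"}]}') \
  || RESPONSE="LM Studio's local server is not running"
echo "$RESPONSE"
```

This lets other programs on your machine talk to DeepSeek while everything still stays offline.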
Since I was very young I have been very curious about everything related to scientific and technological advances, especially those that make our lives easier and more entertaining. I love staying up to date with the latest news and trends, and sharing my experiences, opinions and advice about the equipment and gadgets I use. This led me to become a web writer a little over five years ago, primarily focused on Android devices and Windows operating systems. I have learned to explain in simple words what is complicated so that my readers can understand it easily.