Last mod: 2025.02.19

DeepSeek local - "Hello world" example

Running DeepSeek on local servers gives full control over data and improves security, since nothing is sent to an external provider. It also allows the model to be customized to specific organizational needs and unique requirements, including the use of proprietary data. Finally, independence from cloud services brings greater operational stability, no exposure to provider policy changes, and potentially lower long-term costs.

Hardware and software

The example is run on:

  • CPU i7-4770K
  • 32GB RAM
  • Ubuntu 24.04 LTS Server
  • Python 3.12

It is a rather old and slow configuration, but it can nevertheless run even the deepseek-r1:32b model.

Install and run

Install Ollama:

curl -fsSL https://ollama.com/install.sh | sh
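
If the script finishes without errors, you can confirm the installation by printing the installed version and checking the background service the installer sets up on Ubuntu (service name assumed here to be ollama):

ollama --version

systemctl status ollama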

Run the deepseek-r1:7b model:

ollama run deepseek-r1:7b
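
The first run downloads the model weights automatically; after that the command drops into an interactive prompt. To check which models are already present locally, or to send a single prompt without staying in the interactive session:

ollama list

ollama run deepseek-r1:7b "Say hello in one short sentence."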

To list the commands available inside the interactive session started by ollama run deepseek-r1:7b, type the following at its prompt:

/?
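
Two of the listed session commands that are useful right away (assuming the standard Ollama session commands) are /show info, which prints details about the loaded model, and /bye, which ends the session:

/show info

/bye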

Let us ask a simple question at the prompt:

What is two multiplied by five?
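
The same question can also be sent programmatically. By default Ollama exposes an HTTP API on localhost, port 11434; below is a minimal sketch using curl against that default endpoint, assuming the deepseek-r1:7b model pulled above:

curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "What is two multiplied by five?",
  "stream": false
}'

With "stream": false the reply is a single JSON object whose "response" field contains the model's answer.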

Links

https://github.com/ollama/ollama