Hi, is it worth trying to install and run DeepSeek AI on a regular Windows 11 laptop with a 12th Gen Intel Core i5-1235U and 16GB RAM?
I just tested whether DeepSeek R1 can run locally on a regular everyday laptop without a dedicated graphics card. For the test I used Ollama with the small DeepSeek-R1-Distill-Qwen-1.5B model and the heavier R1-Distill-Llama-8B and R1-Distill-Qwen-14B models on my old laptop with an Intel Core i5-8250U CPU and its Intel UHD 620 integrated graphics. It works, but keep in mind that the R1 1.5B, 8B, and 14B distills are far smaller models than the full-blown R1 671B.
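For reference, here is the kind of Ollama session I mean. This is a sketch assuming Ollama is already installed from ollama.com; the model tags are the same ones used in the test:

```shell
# Download one of the DeepSeek R1 distill models
ollama pull deepseek-r1:1.5b

# Run it interactively, or pass a prompt directly
ollama run deepseek-r1:1.5b "write a song about dogs"

# List downloaded models and their sizes
ollama list
```

The same commands work with the `deepseek-r1:8b` and `deepseek-r1:14b` tags, just with a much larger download.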
When no dedicated video card is available, Ollama doesn't use the graphics processor at all. Instead, it falls back to the CPU cores and system memory, as you can see in the screenshots showing CPU load of up to 100% and no load on the integrated graphics.
True, speed won’t be as great as on computers with dedicated GPUs.
For instance, it took DeepSeek 37 seconds to complete the prompt "write a song about dogs" using deepseek-r1:1.5b, 4 minutes and 2 seconds using the larger deepseek-r1:8b, and 7 minutes and 28 seconds with deepseek-r1:14b.
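If you want to measure this on your own machine rather than with a stopwatch, Ollama's `run` command accepts a `--verbose` flag that prints timing statistics (total duration and the evaluation rate in tokens per second) after the response:

```shell
# Print timing stats after the model finishes responding
ollama run deepseek-r1:1.5b --verbose "write a song about dogs"
```

The eval rate it reports is the most useful number for comparing CPU-only performance across models and machines.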
System memory size is also a limiting factor: I wasn't able to run R1 models larger than 14B due to insufficient RAM.
Anyway, being able to run any AI locally on a regular laptop with an Intel Core i5 and integrated graphics is certainly good news for a lot of people.
Yes, it works locally with Ollama on Windows, but you can also use DeepSeek online, which reportedly runs a larger model than the 1.5B distill (I'm not sure which one). I'm getting server overload errors when using it online, though.