Running ollama on RX 5600M

I followed the instructions below. 


Force Ollama to Use Your AMD GPU (even if it’s not officially supported) – YouTube


Issues Running Ollama on Windows 10 with AMD Radeon RX 6700 XT and ROCm 6.1 : r/ollama


Running ollama with an AMD Radeon 6600 XT · Major Hayden


likelovewant/ollama-for-amd: Get up and running with Llama 3, Mistral, Gemma, and other large language models, by adding more AMD GPU support.


It was quite promising, but it failed in the end. Maybe my graphics card was too old. 🙁
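
If it helps anyone, the core trick in several of those guides is to make ROCm treat an unsupported card as a supported one by overriding the reported GPU architecture. A minimal sketch of that idea, assuming the HSA_OVERRIDE_GFX_VERSION override and using Python only to set the variable and start the server (the 10.3.0 value is the one typically suggested for RDNA cards, and may not fit every GPU, including the RX 5600M):

```python
import os
import subprocess

# Sketch of the override the guides above describe: spoof the GPU
# architecture so ROCm treats an unsupported card as a supported one.
# 10.3.0 is an assumption here (the value commonly suggested for RDNA
# cards); other cards may need a different value, or none works at all.
env = os.environ.copy()
env["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"

# Start the Ollama server with the override applied to its environment.
subprocess.run(["ollama", "serve"], env=env, check=True)
```

The guides mostly have you set this as a user/system environment variable before launching Ollama; the script above just does the equivalent in one place.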


And then I found a way to utilize my old GPU: using LM Studio instead of Ollama.

This just worked. 

However, the performance improvement was not that impressive: about 5.x tokens/s with the GPU versus 3.x tokens/s with CPU only.

The GPU did its part quite fast, but generating output still heavily used the CPU, which slowed down the total response time.
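
As a rough back-of-the-envelope check (5.0 and 3.0 below are just placeholders for the 5.x and 3.x figures above):

```python
# Back-of-the-envelope speedup; 5.0 and 3.0 stand in for the
# measured "5.x" and "3.x" tokens/s figures above.
gpu_tokens_per_s = 5.0
cpu_tokens_per_s = 3.0
speedup = gpu_tokens_per_s / cpu_tokens_per_s
print(f"GPU speedup over CPU: ~{speedup:.2f}x")  # roughly 1.7x
```

Whatever the exact decimals, that is somewhere under 2x, which matches the “not that impressive” impression.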
