AMD GPU for local inference

I’m considering buying a graphics card, and besides using it for games, I want to use it for some local AI experiments, specifically with the transformers and diffusers libraries and Hugging Face models. I currently have an AMD RX 5600 XT, which does not seem to be supported by ROCm. I was mainly considering AMD cards because of their compatibility with games and Linux.

If I were considering another AMD card like a 7900 XTX 24 GB or a 9070 16 GB, what would that make available to me in terms of AI experiments? Which models would I be able to use, even if it’s a bit slow, and which would still be off limits?

Thanks for any advice!


a 7900 XTX 24 GB or a 9070 16 GB

Both GPUs seem to have decent ROCm support. For generative AI, compute speed matters, but insufficient VRAM means you can’t run a model at all, so VRAM is generally the more critical spec. The 24 GB model would likely be easier to work with.
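As a rough rule of thumb (my own back-of-envelope numbers, not exact): model weights alone take about 2 bytes per parameter at fp16/bf16, so you can estimate the floor like this:

```python
# Back-of-envelope VRAM estimate for model weights only
# (ignores activations, KV cache, and framework overhead).
def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    # 1e9 params * bytes-per-param / 1e9 bytes-per-GB
    return params_billion * bytes_per_param

for n, dtype, bpp in [(7, "fp16", 2.0), (13, "fp16", 2.0), (13, "int4", 0.5)]:
    print(f"{n}B @ {dtype}: ~{weight_vram_gb(n, bpp):.1f} GB")

# 7B  @ fp16: ~14.0 GB -> tight on 16 GB, fine on 24 GB
# 13B @ fp16: ~26.0 GB -> needs quantization even on 24 GB
# 13B @ int4: ~6.5 GB  -> comfortable on either card
```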

On Linux with a ROCm-compatible GPU, installing PyTorch with ROCm support is usually enough: most Hugging Face libraries work as long as PyTorch works, and existing CUDA-style code tends to run with minimal changes.
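For example, a minimal sketch (the ROCm wheel index URL below is an example, so check pytorch.org for the current version; the model IDs are just small placeholders):

```python
# Install the ROCm build of PyTorch first, e.g. (ROCm version suffix varies):
#   pip install torch --index-url https://download.pytorch.org/whl/rocm6.2
import torch
from transformers import pipeline
from diffusers import StableDiffusionPipeline

# ROCm builds expose the GPU through the same torch.cuda API,
# so CUDA-style code usually runs unchanged.
print(torch.cuda.is_available())      # True if ROCm sees the GPU
print(torch.cuda.get_device_name(0))  # e.g. "AMD Radeon RX 7900 XTX"

# transformers: tiny text-generation model as a smoke test
generator = pipeline("text-generation", model="gpt2", device=0)
print(generator("Hello, my GPU is", max_new_tokens=20)[0]["generated_text"])

# diffusers: same story, .to("cuda") targets the AMD GPU under ROCm
pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
).to("cuda")
image = pipe("a red bicycle, watercolor").images[0]
image.save("test.png")
```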

For non-Hugging Face tools, vLLM and Ollama for LLMs, or ComfyUI and the A1111 WebUI for image generation, should also work without issues.
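For instance, a minimal vLLM sketch (assuming a ROCm-enabled vLLM build; the model ID here is just a placeholder, so pick whatever fits your VRAM):

```python
from vllm import LLM, SamplingParams

# Placeholder model; an 8B model at fp16 fits in 24 GB with room for KV cache.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")
params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["Explain ROCm in one sentence."], params)
print(outputs[0].outputs[0].text)
```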
Fine-tuning LLMs should also be possible, for example with LoRA adapters:
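A sketch of the LoRA setup with the peft library (not a full training loop; dataset and trainer wiring omitted, and the model is a placeholder):

```python
# LoRA fine-tuning skeleton with peft; pair with transformers' Trainer
# or your own loop. Works the same on ROCm as on CUDA builds.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # placeholder; swap in the model you actually want to tune
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

lora_config = LoraConfig(
    r=8,                        # low-rank dimension
    lora_alpha=16,
    target_modules=["c_attn"],  # attention projection in GPT-2; varies per model
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small LoRA adapters train
```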

However, I currently don’t own any AMD GPUs other than an APU, so please take this as general advice… Also, while it’s not much of an issue on Linux, Windows often has various incompatibilities; this is true even with NVIDIA. Since you’re using Linux, you’ll probably be fine.

For beginners, I think it’s almost always a better option to go with NVIDIA. Yes, technically you can run local AI on AMD GPUs, but your options are more limited and almost every workflow will require more troubleshooting.

I’d consider looking at a used RTX 3060 12GB or RTX 4060 Ti 16GB. These are very affordable on eBay at the moment and are fantastic starter GPUs. I’d be careful buying anything less powerful or with less VRAM. Yes, the AMD GPUs do have more VRAM, but again, the software tooling is a nightmare.

I’ve seen llamabuilds.ai referenced here before - they have some great example builds that use second-hand hardware and show what can be done with GPUs like the 3060 / 4060 Ti.

Let us know which GPU you choose to go with!