I've been using it with a 6800 for a few months now; all it needs is a few env vars.
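For anyone wondering what "a few env vars" usually looks like in practice, here's a sketch of the kind of thing people commonly set for ROCm on RDNA2 cards. The exact values are an assumption on my part, not necessarily the commenter's config:

```sh
# Hypothetical example of typical ROCm env vars for an RDNA2 card like the
# RX 6800 (gfx1030) -- a sketch, not the commenter's exact setup.

# Report the GPU as a supported gfx1030 part. The 6800 generally identifies
# as 10.3.0 already; this override mainly matters for near-miss chips
# like the 6700 XT or 6600.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Restrict HIP/ROCm to the discrete GPU if an iGPU is also present.
export HIP_VISIBLE_DEVICES=0
export ROCR_VISIBLE_DEVICES=0
```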
That's cool. I've just recently gotten hold of an interesting Ampere system and it's got an AMD card in it. I must give it a spin.
I was sadly stymied by the fact that the ROCm driver install is very much x86-only.
It's improving very fast. Give it a little time.