I've been using Ollama + ROCm with my fairly underpowered RX 580 (8 GB), and have had a lot of success with different models. I'm surprised at how well everything works, and I can see myself building a home server dedicated to AI workloads in the relatively near future.