Description
Describe the feature you'd like
I was wondering whether there is any way to extend the app's GPU support to AMD GPUs. This would make the program accessible to those who don't have, or don't want to use, NVIDIA chips for accelerating inference. Perhaps this could be done in the Docker container by providing a commented-out option that binds the `/dev/dri` directory (e.g. the `/dev/dri/renderD128` render node) into the container, so the GPU is used automatically when present.
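As a rough sketch of what I mean (the service name and image are placeholders, not the project's actual compose file), the option could look something like this in `docker-compose.yml`:

```yaml
services:
  app:
    image: example/app:latest   # placeholder image
    # Uncomment to pass the AMD GPU through via the kernel DRM interface:
    # devices:
    #   - /dev/dri:/dev/dri
    # group_add:
    #   - video   # grants the container access to the GPU device nodes
```

Users without an AMD GPU would simply leave the lines commented out.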
Describe alternatives you've considered
Ostensibly one could just settle for CPU or NVIDIA usage, but for the sake of enabling greater compatibility, and given that I have an AMD GPU, it would make sense to add support, especially for people running homelabs on that hardware.
Additional context
I run a number of Docker services that already support AMD GPUs; Ollama, for example, supports it.
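For reference, Ollama's ROCm image is started by passing the AMD compute and render device nodes through to the container, along the lines of:

```shell
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama:rocm
```

Something analogous here would cover the same hardware.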