Hello, I need support for GPUs in general, both AMD and Intel Graphics.
While an Intel GPU may not provide as much acceleration, it would still be useful to be able to utilize the GPUs I have access to. It has been difficult to find good Python frameworks that meet this requirement, so I'm hoping Gorgonia can help. My model can be trained on the CPU, but a GPU would greatly accelerate training and iteration. Is it possible to add this support?
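For context, below is a minimal sketch of the kind of CPU-only workflow I have in mind, based on Gorgonia's basic graph and tape-machine API. The hope is that an AMD/Intel GPU backend (for example via OpenCL) could plug in at the VM/engine level without changing graph-construction code like this; that backend is hypothetical here, and only the CPU path shown is what exists today.

```go
// Minimal CPU-only Gorgonia example: build a tiny expression graph,
// bind values, and run it on the tape machine.
package main

import (
	"fmt"
	"log"

	"gorgonia.org/gorgonia"
)

func main() {
	g := gorgonia.NewGraph()

	// Declare two scalar inputs on the expression graph.
	x := gorgonia.NewScalar(g, gorgonia.Float64, gorgonia.WithName("x"))
	y := gorgonia.NewScalar(g, gorgonia.Float64, gorgonia.WithName("y"))

	// z = x + y
	z, err := gorgonia.Add(x, y)
	if err != nil {
		log.Fatal(err)
	}

	// Bind concrete values and execute on the CPU tape machine.
	// A GPU backend (CUDA today; hypothetically OpenCL for AMD/Intel)
	// would ideally be selected here instead, leaving the rest untouched.
	gorgonia.Let(x, 2.0)
	gorgonia.Let(y, 2.5)

	machine := gorgonia.NewTapeMachine(g)
	defer machine.Close()

	if err := machine.RunAll(); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("z = %v\n", z.Value()) // z = 4.5
}
```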