Support for Copilot+ PCs #92
Comments
I wonder if this is related to onnxruntime-genai still awaiting QNN support.
The docs list this as supporting Copilot+ PCs, but it doesn't work for me: my NPU activity is 0%. So how do I use this?
I don't see any reference yet to CoPilot+ PC in the AI Toolkit docs, at least not here. Because it relies on onnxruntime-genai, I believe QNN support must land there first before AI Toolkit can take full advantage of it. You might be able to take some advantage of the NPU now, indirectly, by using DirectML with a model like
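For reference, here is a minimal sketch (assuming Python and the `onnxruntime` package; the model path is a placeholder) of how one might check which execution providers are available and prefer DirectML, falling back to CPU:

```python
# Sketch: list the ONNX Runtime execution providers available on this machine.
# "QNNExecutionProvider" would indicate NPU support; "DmlExecutionProvider"
# indicates DirectML. The model path below is hypothetical.
import onnxruntime as ort

print(ort.get_available_providers())

# If DirectML is listed, a session can prefer it and fall back to CPU.
session = ort.InferenceSession(
    "model.onnx",  # hypothetical path to an ONNX model
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
```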
Hi @sirredbeard - I saw the mention of support in the release notes shown when installing the VS Code extension. But I agree, it seems many frameworks are dependent on the QNN runtimes/SDKs being released.
It seems like
Me neither - what is the course of action to enable models to show up on Snapdragon machines?
It would be great to see AI Toolkit leverage the NPU in Copilot+ PCs.
Currently this uses the CPU; it's nice and quick on the Snapdragon processors, but it isn't using the AI processor (NPU) when running models.
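As a quick way to confirm the CPU-only behavior, a small sketch (again assuming Python, the `onnxruntime` package, and a hypothetical model path) can report which providers a loaded session actually uses:

```python
# Sketch: check which execution providers a session is actually running with.
# If only "CPUExecutionProvider" is reported, inference is CPU-bound and the
# NPU will show 0% activity. The model path is hypothetical.
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")  # hypothetical ONNX model
print(session.get_providers())  # e.g. ['CPUExecutionProvider']
```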