Following the Microsoft demo (https://github.com/microsoft/Phi3-Chat-WinUI3-Sample), I can run the Phi-3 model on the GPU via ONNX Runtime + DirectML. Using C#, I want to run the same program on the NPU (ONNX Runtime + DirectML + NPU), but it fails.
I have already installed Microsoft.AI.DirectML 1.13.1 and Microsoft.ML.OnnxRuntime.DirectML 1.17.1 via NuGet in VS 2022.
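For reference, the corresponding package references in the .csproj look roughly like this (a sketch based only on the two package names and versions above; other project file details may differ):

<ItemGroup>
  <!-- DirectML runtime plus the DirectML-enabled ONNX Runtime bindings -->
  <PackageReference Include="Microsoft.AI.DirectML" Version="1.13.1" />
  <PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.17.1" />
</ItemGroup>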
I wrote a small C# test in the demo that probes DirectML device indices.
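A minimal sketch of that test (not the exact code; the class name, model path, and loop bound are placeholders, and it assumes the standard Microsoft.ML.OnnxRuntime C# API):

using System;
using Microsoft.ML.OnnxRuntime;

class DmlIndexProbe
{
    static void Main()
    {
        // Placeholder path -- point this at the Phi-3 ONNX model from the demo.
        string modelPath = @"models\phi3-mini-4k-instruct.onnx";

        // Execution providers exposed by this ONNX Runtime (DirectML) build.
        Console.WriteLine("[" + string.Join(", ", OrtEnv.Instance().GetAvailableProviders()) + "]");

        for (int index = 0; index < 8; index++)
        {
            try
            {
                using SessionOptions options = new SessionOptions();
                options.AppendExecutionProvider_DML(index);   // DirectML device index under test
                using var session = new InferenceSession(modelPath, options);
                Console.WriteLine($"Index {index}: session created");
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Index {index}: {ex.Message}");
            }
        }
    }
}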
When 4 > Index >= 0, the output is [DmlExecutionProvider, CPUExecutionProvider].
When I set Index = 4, the following error occurs:
When Index >= 5, the following error occurs:
However, the loading of the Phi-3 model in the Microsoft demo is also not configured with code like the following:
SessionOptions options = new SessionOptions();
options.AppendExecutionProvider_DML(Index);
So how can Phi-3 inference be executed on the NPU?
Is there currently a demo that runs Phi-3 inference via ONNX Runtime + DirectML on an NPU? (C#, C++, or Python are all fine.)
My PC configuration is as follows:
Processor: Intel(R) Core(TM) Ultra 7 165H 1.40GHz
GPU: Intel(R) Arc(TM) Graphics
NPU: Intel(R) AI Boost
Installed RAM: 32.0GB
System Type: 64-bit operating system, x64-based processor
Windows Edition: Windows 11 Pro
Version: 24H2
Hello, I also checked the Microsoft blog (https://blogs.windows.com/windowsdeveloper/2024/02/01/introducing-neural-processor-unit-npu-support-in-directml-developer-preview/) and re-ran the same test described above.