
onnxRuntimeGenAI.QNN Long-Term Inference Output Anomaly #1831

@suki-lqh

Describe the Bug

After sustained, repeated inference with onnxRuntimeGenAI.QNN, the model may start emitting nonsensical output or runs of repeated punctuation.
EnableCaching was disabled during initialization:

var options = new OnnxRuntimeGenAIChatClientOptions
{
    StopSequences = Array.Empty<string>(),
    PromptFormatter = TestPromptFormatter,
    EnableCaching = false
};

To Reproduce

Steps to reproduce the issue:

  1. Initialize the model
  2. Run inference repeatedly (start inference → wait for it to finish → start the next one); a minimal loop sketch is shown below
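
For reference, a minimal sketch of such a loop (this is not code from the report). It assumes the Microsoft.Extensions.AI IChatClient surface of the GenAI C# bindings, a constructor of the form OnnxRuntimeGenAIChatClient(modelPath, options), and a PromptFormatter delegate that receives the chat history and ChatOptions; the model path and the TestPromptFormatter body are hypothetical stand-ins.

// .NET top-level program; implicit usings (System, System.Linq, ...) assumed.
using Microsoft.Extensions.AI;
using Microsoft.ML.OnnxRuntimeGenAI;

// Hypothetical model path; point this at a local QNN model folder.
const string modelPath = @"C:\models\qnn-model";

// Placeholder for the reporter's TestPromptFormatter (its body is not in the
// report); assumed delegate shape: flatten chat history into a prompt string.
static string TestPromptFormatter(IEnumerable<ChatMessage> messages, ChatOptions? chatOptions) =>
    string.Join("\n", messages.Select(m => $"{m.Role}: {m.Text}"));

var options = new OnnxRuntimeGenAIChatClientOptions
{
    StopSequences = Array.Empty<string>(),
    PromptFormatter = TestPromptFormatter,
    EnableCaching = false
};

// Assumed constructor shape: model path plus the options above.
using var client = new OnnxRuntimeGenAIChatClient(modelPath, options);

// Drive many back-to-back inferences; the anomaly reportedly appears only
// after a long run, hence the high iteration count.
for (int i = 0; i < 1000; i++)
{
    var response = await client.GetResponseAsync("Describe the weather today.");
    Console.WriteLine(response.Text);
}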

Expected behavior

Inference output remains coherent, regardless of how many consecutive inferences have been run.

Desktop (please complete the following information)

  • OS: Windows 11 Home 25H2 26200.6588
  • OnnxRuntimeGenAI.QNN: 0.10.0
  • NPU Driver: 30.0.140.1000/30.0.145.1000
