Bug: OnPostChatPromptAsync doesn't create an async yield array for internal back-end exceptions #301

@Coruscate5

Description

Please provide us with the following information:

This issue is for a: (mark with an x)

- [X] bug report -> please search issues before submitting
- [ ] feature request
- [ ] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)

Minimal steps to reproduce

Exceed the context window, manually throw an exception inside OnPostChatPromptAsync, or trigger any other OpenAI API call failure. Response generation then never completes, because the action does not produce an async enumerable that the front end can consume.

Any log messages given by the failure

Expected/desired behavior

The API call error should be displayed to the user. Proposed fix: move the side effect (the openai.GetChatStreamingCompletions call) into a try-catch block, then use a bool flag to yield either an error-based ChunkResponse (e.g. "Error: " + ex.ToString()) or the actual completed response.
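
A minimal sketch of that flag-based pattern, using placeholder types (ChatChunkResponse, IStreamingChatClient) and an assumed shape for the streaming call rather than the repository's actual handler. The exception is caught outside of any yield, and the handler then emits either an error chunk or the real chunks, so the front end always receives a consumable async stream:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

// Placeholder shapes: the real project's ChatChunkResponse and OpenAI client differ.
public record ChatChunkResponse(int Length, string Text);

public interface IStreamingChatClient
{
    // Stands in for openai.GetChatStreamingCompletions; signature is assumed.
    IAsyncEnumerable<string> GetChatStreamingCompletionsAsync(string prompt, CancellationToken ct);
}

public class ChatEndpoint
{
    private readonly IStreamingChatClient _openAi;

    public ChatEndpoint(IStreamingChatClient openAi) => _openAi = openAi;

    public async IAsyncEnumerable<ChatChunkResponse> OnPostChatPromptAsync(
        string prompt,
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        await using var source = _openAi
            .GetChatStreamingCompletionsAsync(prompt, ct)
            .GetAsyncEnumerator(ct);

        while (true)
        {
            string? text = null;
            string? error = null;

            try
            {
                // Any back-end failure (context window exceeded, bad key,
                // service outage, ...) surfaces from MoveNextAsync here.
                if (await source.MoveNextAsync())
                {
                    text = source.Current;
                }
            }
            catch (Exception ex)
            {
                // C# forbids `yield return` inside a try block that has a
                // catch clause, so record the error and yield after it.
                error = ex.ToString();
            }

            if (error is not null)
            {
                // The front end receives a well-formed error chunk instead
                // of a response that never completes.
                yield return new ChatChunkResponse(error.Length, $"Error: {error}");
                yield break;
            }

            if (text is null)
            {
                yield break; // stream finished normally
            }

            yield return new ChatChunkResponse(text.Length, text);
        }
    }
}
```

Pulling chunks with MoveNextAsync instead of await foreach keeps every pull inside its own try-catch, so mid-stream failures also surface to the user as an error chunk rather than leaving the response hanging.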

OS and Version?

Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)

Versions

Mention any other details that might be useful


Thanks! We'll be in touch soon.

Metadata

Assignees

No one assigned

Labels

No labels

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests