
Open Question: Add async/streaming interface? #26

@yuhonglin

Description


Currently, the interface is synchronous. That is, for every input "x", the client gets some output "y = f(x)".

But for some applications, the input/output may be asynchronous. Take speech recognition as an example; a typical usage might be:

  1. The client starts the recognition.
  2. The client keeps feeding audio data to the model, without receiving any output.
  3. When the model decides it has enough audio data to make a reasonable prediction, it actively tells the client the result.
  4. The client stops the recognition.

So the client will need to provide an "OnPredictionResult" callback to the model.
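A minimal sketch of what that push-style interface could look like, written here in C++ for illustration; names like `StreamingRecognizer`, `Start`, `Feed`, and `Stop` are hypothetical and not part of the current interface:

```cpp
#include <functional>
#include <string>
#include <vector>

// Sketch of the push-style interface described above (hypothetical names).
// The client starts recognition with an OnPredictionResult callback, keeps
// feeding audio, and the model pushes a result back once it has enough data.
class StreamingRecognizer {
 public:
  using OnPredictionResult = std::function<void(const std::string& result)>;

  // Step 1: start the recognition and register the callback.
  void Start(OnPredictionResult on_result) { on_result_ = std::move(on_result); }

  // Step 2: keep feeding audio data; nothing is returned here.
  void Feed(const std::vector<float>& audio_chunk) {
    buffer_.insert(buffer_.end(), audio_chunk.begin(), audio_chunk.end());
    // Step 3: once the model decides it has enough data, it actively
    // reports the result to the client through the callback.
    if (HasEnoughData() && on_result_) {
      on_result_(Decode());
      buffer_.clear();
    }
  }

  // Step 4: stop the recognition.
  void Stop() { buffer_.clear(); }

 private:
  // Placeholder heuristics; a real model would decide this itself.
  bool HasEnoughData() const { return buffer_.size() >= 16000; }
  std::string Decode() const { return "<partial transcript>"; }

  OnPredictionResult on_result_;
  std::vector<float> buffer_;
};
```

The client only supplies the callback up front; results arrive whenever the model chooses to emit them, so there is no one-to-one pairing of inputs and outputs.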

In some cases, it is the model that actively asks for input (e.g. when the model thinks it is ready). Then there will be no Step 2 and the client needs to provide an "OnGetInput" callback to the model.
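A corresponding sketch of the pull-style variant, again with hypothetical names (`PullingRecognizer`, `OnGetInput`, `RunOnce`), assuming the model drives the loop itself:

```cpp
#include <functional>
#include <string>
#include <vector>

// Sketch of the pull-style variant: the model asks for input through an
// OnGetInput callback whenever it is ready, so the client never pushes
// audio itself. All names here are hypothetical.
class PullingRecognizer {
 public:
  using OnGetInput = std::function<std::vector<float>()>;
  using OnPredictionResult = std::function<void(const std::string& result)>;

  void Start(OnGetInput on_get_input, OnPredictionResult on_result) {
    on_get_input_ = std::move(on_get_input);
    on_result_ = std::move(on_result);
  }

  // Driven by the model (e.g. from its own worker thread or event loop):
  // pull input when ready, push a result once enough data has arrived.
  void RunOnce() {
    std::vector<float> chunk = on_get_input_();
    buffer_.insert(buffer_.end(), chunk.begin(), chunk.end());
    if (buffer_.size() >= 16000 && on_result_) {
      on_result_("<partial transcript>");
      buffer_.clear();
    }
  }

 private:
  OnGetInput on_get_input_;
  OnPredictionResult on_result_;
  std::vector<float> buffer_;
};
```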

This is not a blocker for now, just an interesting issue to think about.
