
[BUG] Flowise Streaming Mode - Multiple Concurrent Requests Response Mixing #3474

Open
vijaykammili opened this issue Nov 6, 2024 · 2 comments

@vijaykammili

@HenryHengZJ - When multiple HTTP requests are sent to Flowise concurrently in streaming mode with the same session, the responses are not separated by connection. Instead, all responses are combined and streamed back over a single HTTP connection, making it impossible to parse individual responses. Can this be fixed?

1st Request:

{
  "question": "tell me about Elon musk in 1000 words..",
  "overrideConfig": {
    "sessionId": "123"
  },
  "streaming": true
}

2nd Request:

{
  "question": "tell me about Steve Jobs in 1000 words..",
  "overrideConfig": {
    "sessionId": "123"
  },
  "streaming": true
}

Response (both answers arrive combined in a single stream; this needs to be two separate streams):

Elon Musk is a renowned entrepreneur...1000 words here..

Steve Jobs, born on February 24, 1955, in San Francisco....

Version: 2.1.3
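For reference, a minimal reproduction sketch in Python (the URL and chatflow ID are placeholders for our setup, a local Flowise instance at http://localhost:3000; adjust for your deployment):

import threading
import requests

FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<chatflow-id>"

def ask(question: str) -> None:
    payload = {
        "question": question,
        "overrideConfig": {"sessionId": "123"},
        "streaming": True,
    }
    # stream=True keeps the connection open so chunks arrive as they are sent
    with requests.post(FLOWISE_URL, json=payload, stream=True) as resp:
        for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
            print(f"[{question[:25]}] {chunk}")

# Start both requests at (nearly) the same time to mimic concurrency
threads = [
    threading.Thread(target=ask, args=("tell me about Elon musk in 1000 words..",)),
    threading.Thread(target=ask, args=("tell me about Steve Jobs in 1000 words..",)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()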

Also, is there a way to stop the stream for an existing request? We tried aborting the HTTP connection, but it doesn't kill the streaming process.
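For context, this is roughly the client-side abort we tried (a sketch only; should_stop() is a hypothetical cancellation check). Closing the response tears down the client side of the connection, but the server-side generation appears to keep running:

import requests

def consume_with_abort(url: str, payload: dict, should_stop) -> None:
    with requests.post(url, json=payload, stream=True) as resp:
        for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
            if should_stop():
                resp.close()  # closes only the client side of the connection
                break
            print(chunk, end="")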

@HenryHengZJ (Contributor)

How do you simulate concurrent streaming requests?

And does the same thing happen when you use the OpenAI (or another LLM) streaming API directly, outside of Flowise?

HenryHengZJ added the question (Further information is requested) label on Nov 9, 2024
@vijaykammili (Author) commented Nov 9, 2024

@HenryHengZJ We simulated this using the Flowise SDK, an HTTP requests library, and Postman, sending requests back-to-back to mimic concurrent streaming. The streams get mixed when the payload I posted above is sent. We tested this with both Azure OpenAI and OpenAI.

However, we found yesterday that streams are correctly segmented when a chatId is passed in the payload, as sketched below. I think a unique chatId is created automatically when requests are sent via the UI or ChatEmbed, but not when the field is omitted from the API payload.
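A sketch of that workaround (the top-level chatId field is an assumption based on what the UI/ChatEmbed appear to send; FLOWISE_URL is the same placeholder as in the reproduction sketch above):

import uuid
import requests

FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<chatflow-id>"

payload = {
    "question": "tell me about Elon musk in 1000 words..",
    "chatId": str(uuid.uuid4()),  # unique per request, so streams stay separate
    "overrideConfig": {"sessionId": "123"},
    "streaming": True,
}
resp = requests.post(FLOWISE_URL, json=payload, stream=True)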

One more (minor) observation: when a chatId is sent in the payload, the Messages section of the UI displays each chat individually under the same session ID rather than grouping them as multiple chats within a single session.
