
Add broadcast for stream open in process_stream_chunk #308

Open
TheRealNeil wants to merge 1 commit into activeagents:main from qlarity:fix-ollama-provider-stream-completion

Conversation

@TheRealNeil
Contributor

Fixes #289

Summary

Add a broadcast_stream_open call in OpenAI::ChatProvider#process_stream_chunk to establish the streaming lifecycle.

Root cause

The OpenAI ChatProvider#process_stream_chunk never called broadcast_stream_open, so the streaming flag was never set to true. When streaming finished, broadcast_stream_close checked the flag (return unless streaming) and returned early, so on_stream_close callbacks never fired. This affected both OpenAI and Ollama (which inherits from the OpenAI provider).
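
Roughly, the lifecycle methods behave like this (a minimal sketch based on the description above; the attribute and callback names are assumptions, not the gem's exact source):

```ruby
# Illustrative sketch only, not the actual ActiveAgent code.

# broadcast_stream_open is what flips the streaming flag:
def broadcast_stream_open
  return if streaming                       # already open: safe no-op (idempotent)
  self.streaming = true
  on_stream_open_callbacks.each(&:call)     # hypothetical callback list
end

# Because the OpenAI provider never called broadcast_stream_open,
# the flag stayed false and this guard always returned early:
def broadcast_stream_close
  return unless streaming                   # never true, so on_stream_close never fired
  self.streaming = false
  on_stream_close_callbacks.each(&:call)    # hypothetical callback list
end
```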

The Anthropic and mock providers already called broadcast_stream_open correctly; the OpenAI provider was the odd one out.

Fix

Add broadcast_stream_open at the top of process_stream_chunk, mirroring the pattern in the Anthropic and mock providers. This ensures the full stream lifecycle (open → update → close) fires correctly.
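
A sketch of the resulting shape, assuming a simplified method signature (the parameter name, the update call, and the finish-detection helper are illustrative assumptions, not the literal diff):

```ruby
# Illustrative sketch of the change, not the exact method body.
def process_stream_chunk(chunk)
  broadcast_stream_open                           # new: opens the stream on the first chunk,
                                                  # a no-op on every later chunk
  broadcast_stream_update(chunk)                  # existing chunk handling (deltas, tool calls, ...)
  broadcast_stream_close if finish_chunk?(chunk)  # hypothetical finish check
end
```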

Test plan

  • Verify Ollama streaming now triggers on_stream_close callbacks
  • Verify OpenAI streaming open/close lifecycle works end-to-end
  • Confirm Anthropic and mock provider streaming is unaffected
  • Confirm broadcast_stream_open is idempotent when called on every chunk (see the spec sketch below)
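
A hypothetical spec along these lines would cover the idempotency point; the callback registration API, provider setup, and sample_chunk fixture are assumptions rather than the gem's actual test helpers:

```ruby
# Hypothetical RSpec sketch, names are placeholders.
it "broadcasts stream open only once across many chunks" do
  opened = 0
  provider.on_stream_open { opened += 1 }
  3.times { provider.send(:process_stream_chunk, sample_chunk) }
  expect(opened).to eq(1)
end
```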



Development

Successfully merging this pull request may close these issues.

Ollama Provider stream completion callback or done message not being sent
