Describe the bug
weave wraps OpenAI's Stream in a way that breaks class inheritance. As a result, our isinstance(response, Stream) check fails, which leads to an error for anyone who wants to use streaming in OpenAIChatGenerator or OpenAIGenerator together with weave. We ran into this in deepset Studio.
Although it is generally bad practice for tracing-lib providers to wrap well-known API types such as OpenAI's Stream, we've learned that more than one of these libs uses this pattern anyway. I'd therefore suggest dropping the type checks in question from Haystack altogether.
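To illustrate the failure mode, here is a minimal, self-contained sketch (Stream and TracedStream are hypothetical stand-ins, not OpenAI's or weave's actual classes) of how a non-inheriting wrapper breaks an isinstance check even though the stream itself keeps working:

```python
class Stream:
    """Stand-in for openai.Stream."""
    def __iter__(self):
        yield from ("chunk1", "chunk2")


class TracedStream:
    """Stand-in for a tracing-lib wrapper; note it does NOT subclass Stream."""
    def __init__(self, inner):
        self._inner = inner

    def __iter__(self):
        # A real tracer would record each chunk here before yielding it.
        yield from self._inner


response = TracedStream(Stream())
print(isinstance(response, Stream))  # False, even though it streams fine
print(list(response))                # ['chunk1', 'chunk2']
```

Any code path that gates streaming behavior on that isinstance result will then take the non-streaming branch, which is exactly what trips the assertion in the stacktrace below.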
Exception has occurred: AssertionError (note: full exception trace is shown but execution is paused at: <module>)
exception: no description
File "/home/haystackd/.local/lib/python3.12/site-packages/haystack/components/generators/chat/openai.py", line 251, in run
assert is_streaming or streaming_callback is None
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haystackd/.local/lib/python3.12/site-packages/haystack/core/pipeline/pipeline.py", line 80, in _run_component
component_output = instance.run(**component_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haystackd/.local/lib/python3.12/site-packages/haystack/core/pipeline/pipeline.py", line 248, in run
component_outputs = self._run_component(component, inputs, component_visits, parent_span=span)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/haystackd/debug/debug_pipeline.py", line 17, in <module> (Current frame)
result = pipeline.run(data={"query": query, "text": query, "question": query, "streaming_callback": lambda x: print(x.content, end="")})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/runpy.py", line 88, in _run_code
exec(code, run_globals)
File "/usr/local/lib/python3.12/runpy.py", line 198, in _run_module_as_main
return _run_code(code, main_globals, None,
AssertionError:
(Stacktrace produced with Haystack 2.10.3, but current main has the same issue)
Expected behavior
Streaming of the OpenAI generators works with tracing libs such as weave.
Responsible Code
haystack/haystack/components/generators/chat/openai.py, lines 258 to 259 in 195d403
haystack/haystack/components/generators/openai.py, line 220 in 195d403
Additional context
We've seen issues with type checks on tracing-lib-wrapped responses before (see deepset-ai/haystack-core-integrations#1454).
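If the isinstance check is dropped, one possible alternative is to duck-type on iterator behavior. This is a sketch under my own assumptions (FakeStream, FakeCompletion, and is_streaming_response are all hypothetical names, and real response objects may have their own iteration behavior), not Haystack's actual implementation:

```python
from collections.abc import Iterator


class FakeStream:
    """Stand-in for a (possibly tracer-wrapped) streaming response."""
    def __init__(self, chunks):
        self._chunks = iter(chunks)

    def __iter__(self):
        return self

    def __next__(self):
        return next(self._chunks)


class FakeCompletion:
    """Stand-in for a plain, non-streaming response object."""


def is_streaming_response(response) -> bool:
    # Duck-typed: anything that behaves like an iterator is treated as a
    # stream, regardless of which wrapper class a tracing lib put around it.
    return isinstance(response, Iterator)


print(is_streaming_response(FakeStream(["a", "b"])))  # True
print(is_streaming_response(FakeCompletion()))        # False
```

The design trade-off: this accepts any iterator-like wrapper (solving the weave case), at the cost of no longer guaranteeing the object exposes the rest of the Stream API.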
To Reproduce
weave.init(project_name)
using default settingsOpenAIChatGenerator
withstreaming_callback
set.FAQ Check
System: