For example, the `OpenAIChatGenerator` handles the streamed response in `_handle_stream_response`, shown below:
```python
def _handle_stream_response(self, chat_completion: Stream, callback: SyncStreamingCallbackT) -> List[ChatMessage]:
    chunks: List[StreamingChunk] = []
    chunk = None
    chunk_delta: StreamingChunk
    for chunk in chat_completion:  # pylint: disable=not-an-iterable
        assert len(chunk.choices) <= 1, "Streaming responses should have at most one choice."
        # NOTE: We convert both Tool Call and Text chunks into a chunk_delta here
        chunk_delta = self._convert_chat_completion_chunk_to_streaming_chunk(chunk)
        chunks.append(chunk_delta)
        # NOTE: Here we always pass that chunk_delta to the callback
        callback(chunk_delta)
    return [self._convert_streaming_chunks_to_chat_message(chunk, chunks)]
```
As you can see, every chunk delta, whether it contains text or a tool call, is passed to the callback, which allows the tool call itself to be streamed.
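For context, a consumer-side streaming callback could look like the minimal sketch below. The exact place where tool-call deltas show up on a `StreamingChunk` is an assumption here (illustrated as `chunk.meta["tool_calls"]`); only `content` and `meta` being the chunk's fields is taken from the library.

```python
from haystack.dataclasses import StreamingChunk


def print_streaming_chunk(chunk: StreamingChunk) -> None:
    # Text deltas arrive in chunk.content and can be printed as they come in.
    if chunk.content:
        print(chunk.content, end="", flush=True)
    # Assumption for illustration: tool-call deltas are surfaced via chunk.meta
    # under a "tool_calls" key; the actual key depends on the generator.
    tool_calls = chunk.meta.get("tool_calls")
    if tool_calls:
        print(f"\n[tool-call delta] {tool_calls}", flush=True)
```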
We should update the `AmazonBedrockChatGenerator` to allow streaming of tool calls as well, by passing the tool-call chunks to the streaming callback.
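A minimal sketch of what this could look like for the Bedrock stream handler is below. This is not the actual implementation: the function name `stream_bedrock_deltas`, the exact event shape (`contentBlockDelta` events carrying either a `text` or a `toolUse` delta, as in the Converse streaming API), and the `meta` keys are assumptions for illustration.

```python
from typing import Any, Callable, Dict, Iterable, List

from haystack.dataclasses import StreamingChunk


def stream_bedrock_deltas(
    response_stream: Iterable[Dict[str, Any]],
    callback: Callable[[StreamingChunk], None],
) -> List[StreamingChunk]:
    """Forward both text and tool-call deltas from a Bedrock stream to the callback."""
    chunks: List[StreamingChunk] = []
    for event in response_stream:
        # Assumed Converse-style event shape: {"contentBlockDelta": {"delta": {...}}}
        delta = event.get("contentBlockDelta", {}).get("delta", {})
        if "text" in delta:
            # Text deltas are streamed exactly as today.
            chunk_delta = StreamingChunk(content=delta["text"], meta={"event": event})
        elif "toolUse" in delta:
            # NEW: tool-call deltas (partial JSON arguments) are also turned into
            # chunks and forwarded, instead of only being accumulated internally.
            chunk_delta = StreamingChunk(content="", meta={"tool_use": delta["toolUse"], "event": event})
        else:
            continue
        chunks.append(chunk_delta)
        # Mirror the OpenAIChatGenerator: every chunk delta reaches the callback.
        callback(chunk_delta)
    return chunks
```

The collected chunks would then be assembled into the final `ChatMessage` by the generator's existing conversion logic, just as the OpenAI implementation does with `_convert_streaming_chunks_to_chat_message`.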