
[BUG]: Phoenix Playground Run Error Checking #8177

Open
@brandonbiggs

Description


Where do you use Phoenix

Local (self-hosted)

What version of Phoenix are you using?

10.13.2

What operating system are you seeing the problem on?

Linux

What version of Python are you running Phoenix with?

NA

What version of Python or Node are you using instrumentation with?

NA

What instrumentation are you using?

No instrumentation; this concerns the playground interface.

What happened?

I am testing the playground interface with a very simple prompt against vLLM-hosted models and I am getting an error. With streaming on, I can see the message for a brief second before it disappears with the error:
peer closed connection without sending complete message body (incomplete chunked read)

Gif of the example: (animated gif attachment omitted)

Full traceback:

Traceback (most recent call last):
  File "/phoenix/env/phoenix/server/api/subscriptions.py", line 150, in chat_completion
    async for chunk in llm_client.chat_completion_create(
  File "/phoenix/env/phoenix/server/api/helpers/playground_clients.py", line 352, in chat_completion_create
    async for chunk in await throttled_create(
  File "/phoenix/env/openai/_streaming.py", line 147, in __aiter__
    async for item in self._iterator:
  File "/phoenix/env/openai/_streaming.py", line 202, in __stream__
    async for _sse in iterator:
  File "/phoenix/env/openai/_streaming.py", line 151, in _iter_events
    async for sse in self._decoder.aiter_bytes(self.response.aiter_bytes()):
  File "/phoenix/env/openai/_streaming.py", line 302, in aiter_bytes
    async for chunk in self._aiter_chunks(iterator):
  File "/phoenix/env/openai/_streaming.py", line 313, in _aiter_chunks
    async for chunk in iterator:
  File "/phoenix/env/httpx/_models.py", line 997, in aiter_bytes
    async for raw_bytes in self.aiter_raw():
  File "/phoenix/env/httpx/_models.py", line 1055, in aiter_raw
    async for raw_stream_bytes in self.stream:
  File "/phoenix/env/httpx/_client.py", line 176, in __aiter__
    async for chunk in self._stream:
  File "/phoenix/env/httpx/_transports/default.py", line 270, in __aiter__
    with map_httpcore_exceptions():
  File "/usr/lib/python3.11/contextlib.py", line 155, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/phoenix/env/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

What did you expect to happen?

Streaming appears to work correctly, but something at the very end of the stream isn't being caught, and that is what throws this error.

How can we reproduce the bug?

I'm not totally sure. It may be a vLLM issue; I have seen this error reported in multiple places. The error originates on the AI provider's end, but I think Phoenix could do a better job of catching it.

Additional information

Here is the line where the error can be caught:

async for chunk in await throttled_create(

If I wrap the async for chunk in await throttled_create(...) loop on line 352 in a try/except:

import httpx
....
try:
    ....
except httpx.RemoteProtocolError:
    pass

I get the following:

(screenshot attachment omitted)

Metadata

Labels

bug (Something isn't working), c/playground (prompt playground and llm provider support), needs information (Needs more info from the issuer)
