Add support for LlamaIndex streaming #1769
Conversation
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage

llm = OpenAI(model="gpt-3.5-turbo-0125")
It's better not to pin the exact model version in the test; let the provider give you the latest. Later they will reduce computation resources for old versions, and the tests will start failing due to server overload.
Done, but I doubt that OpenAI will introduce new gpt-3.5 models.
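The reviewer's point is that pinned, dated model names eventually get deprecated, while the un-versioned alias keeps resolving to the current snapshot. A minimal sketch of that idea (the `model_family` helper and the naming convention it assumes, a trailing `-MMDD`-style digit suffix, are my own illustration, not code from this PR):

```python
def model_family(model_name: str) -> str:
    """Strip a trailing all-digit version suffix, e.g. '-0125', if present.

    Assumes the provider's dated snapshots append '-<digits>' to the
    un-versioned alias; anything else is returned unchanged.
    """
    base, sep, suffix = model_name.rpartition("-")
    if sep and suffix.isdigit():
        return base
    return model_name

# Pinned snapshot vs. the alias a test would rather depend on.
pinned = "gpt-3.5-turbo-0125"
alias = model_family(pinned)
print(alias)  # the un-versioned family name
```

Configuring the test's LLM with the alias (`OpenAI(model="gpt-3.5-turbo")`) rather than the dated snapshot is what the review settled on.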
@@ -4,9 +4,13 @@
from llama_index.core import Settings
from llama_index.core.base.llms.types import ChatResponse
from llama_index.core.callbacks import schema as llama_index_schema
from llama_index.core.callbacks import (
Please import modules rather than individual names (I know this was already violated in this file).
Was it only for the llama_index imports, or also for opik?
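The convention the reviewer is asking for can be illustrated with a standard-library module (used here only so the sketch is self-contained; the same pattern applies to the `llama_index` imports in the diff):

```python
# Style the review asks for: import the module and keep call sites
# qualified, so the origin of each name is obvious where it is used.
import json

# Style being discouraged: importing an individual name drops the namespace.
from json import dumps

# Both resolve to the same function; only the call-site readability differs.
print(json.dumps({"a": 1}))
print(dumps({"a": 1}))
```

The diff's `from llama_index.core.callbacks import schema as llama_index_schema` already follows this style; the new parenthesized import of individual names is what prompted the comment.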
self._opik_client.span(**span_data.__dict__)

del self._map_event_id_to_span_data[event_id]
# # Orphaned events where the output was received after end_trace
Do we need this comment, or can it be deleted?
I've deleted it
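The deleted comment referred to events whose output arrives after `end_trace` has already removed their span. One way to tolerate that race (a sketch with hypothetical names, not the handler's actual code, which uses `del` on a guaranteed-present key) is `dict.pop` with a default:

```python
# Hypothetical event-id -> span-data map, mirroring
# self._map_event_id_to_span_data in the callback handler.
map_event_id_to_span_data = {"evt-1": {"name": "llm-call"}}

# Normal path: the span data is present and gets removed.
span_data = map_event_id_to_span_data.pop("evt-1", None)

# Orphaned path: the same event's output arrives again after cleanup;
# pop() with a default returns None instead of raising KeyError.
orphan = map_event_id_to_span_data.pop("evt-1", None)

print(span_data)  # the recorded span data
print(orphan)     # None: the event was already cleaned up
```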
Details
Issues
Fix #1743
Testing
Documentation