[Bug]: Streams not logged to Opik when using LLamaIndex integration #1743
Comments
@Lothiraldan Can you take a look?
Hi @Pratham271, I was able to reproduce the bug and I'm working on a fix. Thank you for opening an issue!
@Lothiraldan Any update on this?
@Pratham271 Yes, I've identified the root cause of the issue. Solving it turned out to be a bit more complicated than anticipated, but we have a clear plan now. I'm hoping to have it finished by tomorrow.
@Lothiraldan Thank you.
@Pratham271 I just merged a PR that adds support for it; can you give it a try? You will need to install the Python SDK from a Git checkout or wait for the next release. Let me know if you need help with instructions.
@Lothiraldan I will try it and let you know.
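For reference, installing the Python SDK directly from the repository typically looks something like the following. This is a sketch under assumptions: the repository path (`comet-ml/opik`) and the SDK subdirectory (`sdks/python`) reflect the public repo layout at the time of writing, so verify them before running.

```shell
# Install the Opik Python SDK from the current main branch instead of PyPI
# (repo URL and subdirectory are assumptions; check the repository layout).
pip install "git+https://github.com/comet-ml/opik.git#subdirectory=sdks/python"
```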
What component(s) are affected?
Opik version
Describe the problem
I am using the following code to log traces from my LlamaIndex app. Traces are logged for the `.chat` method, but for `stream_chat` nothing shows up in the output.
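A likely reason streaming calls are harder to trace (this is a general sketch of the mechanism, not Opik's actual implementation): a `stream_chat`-style API returns a generator immediately, before any tokens exist, so a logger that only inspects the return value records nothing. The integration has to wrap the generator and finalize the trace once the caller exhausts it. The names `wrap_stream_for_logging` and `log` below are hypothetical, for illustration only:

```python
def wrap_stream_for_logging(token_gen, log):
    """Wrap a token generator so the full response is logged once consumed."""
    def wrapped():
        chunks = []
        for token in token_gen:
            chunks.append(token)
            yield token
        # Only here, after the caller has iterated to the end, is the
        # complete response text known and loggable.
        log.append("".join(chunks))
    return wrapped()

log = []
stream = wrap_stream_for_logging(iter(["Hel", "lo"]), log)
# Nothing is logged yet: the generator has not been consumed.
assert log == []
text = "".join(stream)
assert text == "Hello"
assert log == ["Hello"]
```

This is why a synchronous `.chat` call can be logged trivially while a streamed one silently produces no trace if the generator is never instrumented.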
Reproduction steps and code snippets
This is how I am setting up the configuration for LlamaIndex and Opik:
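The original snippet is not included above. A typical setup, based on my understanding of the documented integration pattern (treat the exact import paths and class names as assumptions and check them against the Opik docs), looks like this:

```python
# Hedged sketch of a typical Opik + LlamaIndex setup; import paths and
# class names are assumptions based on the documented integration pattern.
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from opik.integrations.llama_index import LlamaIndexCallbackHandler

# Register Opik's callback handler globally so LlamaIndex events
# (LLM calls, retrievals, etc.) are forwarded to Opik as traces.
opik_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([opik_callback_handler])
```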
Error logs or stack trace
No response
Healthcheck results
No response