Adding streaming responses #3

Merged

jonigl merged 2 commits into main from feature/add-streaming-responses on May 19, 2025

Conversation

jonigl (Owner) commented on May 19, 2025

v0.5.0 – Add Support for Ollama Streaming Responses

🚀 Highlights

This PR introduces response streaming in the MCP Client for Ollama. When stream=True is set, the client streams the response from the server as it is generated, enabling more responsive and efficient interactions.

🔧 Changes

  • Implemented response streaming support via the ollama chat stream=True flag (a minimal sketch follows the note below)
  • Used rich's Live display to render the streamed response as Markdown

⚠️ Note

  • Known issue: Streaming does not function correctly in Ollama when tools are enabled. See ollama/ollama-python#463 for details.
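For illustration, here is a minimal sketch of the streaming approach described above, assuming the ollama Python client and the rich library. The model name and prompt are placeholders, and this is not the client's actual implementation.

```python
# Minimal sketch: stream a chat response from Ollama and render the
# accumulating Markdown with rich's Live display.
import ollama
from rich.console import Console
from rich.live import Live
from rich.markdown import Markdown

console = Console()
messages = [{"role": "user", "content": "Explain MCP in two sentences."}]  # placeholder prompt

# stream=True makes ollama.chat return an iterator of partial chunks
# instead of a single completed response.
stream = ollama.chat(model="llama3.2", messages=messages, stream=True)  # placeholder model

accumulated = ""
with Live(Markdown(accumulated), console=console, refresh_per_second=10) as live:
    for chunk in stream:
        # Each chunk carries the next piece of the assistant message.
        accumulated += chunk["message"]["content"]
        live.update(Markdown(accumulated))
```

Re-rendering the full accumulated text as Markdown on each chunk keeps formatting (headings, lists, code blocks) correct even when a Markdown construct spans several chunks.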

@jonigl merged commit 0e36bc6 into main on May 19, 2025
3 checks passed
@jonigl deleted the feature/add-streaming-responses branch on May 19, 2025 at 21:27