chore: llama.cpp - gently handle the removal of ChatMessage.from_function #1298

Merged · 1 commit · Jan 16, 2025

Conversation

anakin87 (Member)

Related Issues

Nightly tests with Haystack main are failing due to the removal of ChatMessage.from_function in deepset-ai/haystack#8725

failing test: https://github.com/deepset-ai/haystack-core-integrations/actions/runs/12799444834/job/35685509040

Proposed Changes:

  • Avoid using ChatMessage.from_function in tests when it is not available
  • Partially unrelated: pin haystack-ai>=2.9.0 so that a previously introduced compatibility check can be removed
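The first change amounts to a feature-detection guard: tests that rely on the removed classmethod are skipped when it is absent. A minimal sketch of the pattern, using a hypothetical stand-in class rather than the real haystack-ai ChatMessage:

```python
# Hypothetical stand-in for haystack's ChatMessage after the removal of
# from_function (the real class lives in haystack.dataclasses).
class ChatMessage:
    @classmethod
    def from_user(cls, text: str) -> "ChatMessage":
        msg = cls()
        msg.text = text
        return msg


def supports_from_function(msg_cls: type) -> bool:
    """Feature-detect the (possibly removed) classmethod at runtime."""
    return hasattr(msg_cls, "from_function")


# In a pytest suite the guard would typically be applied as a skip marker:
#
# @pytest.mark.skipif(
#     not supports_from_function(ChatMessage),
#     reason="ChatMessage.from_function was removed in Haystack 2.9",
# )
# def test_function_role_message():
#     ...

print(supports_from_function(ChatMessage))  # → False for this stand-in
```

The same hasattr check works against the real class, so the test suite stays green on both older and newer Haystack versions.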

How did you test it?

CI; local tests with Haystack main branch

Notes for the reviewer

You may notice that this PR uses some workarounds to keep tests from failing.
I'll soon create an issue to track a proper implementation of Tool support in llama.cpp.

Checklist

@anakin87 anakin87 requested a review from a team as a code owner January 16, 2025 12:04
@anakin87 anakin87 requested review from vblagoje and removed request for a team January 16, 2025 12:04
@anakin87 anakin87 merged commit ee543c1 into main Jan 16, 2025
11 checks passed
@anakin87 anakin87 deleted the fix-llamacpp-removed-chatmessagefromfunction branch January 16, 2025 14:26