Can you add Firecrawl for fetching markdown? #13
Comments
Hi, just seeing this. I have been experimenting with Firecrawl both locally and through its API, and yes, I think its performance is more stable.
Some suggestions for reference: consider directly configuring an MCP server to handle search and markdown fetching. There are already many such tools, so you could switch between them at any time. This project should focus on optimizing the deep research process.
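A minimal sketch of what that could look like with the official MCP Python SDK; the server command and tool name below are placeholders, and any search/fetch MCP server would slot in the same way:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder: launch whichever fetch/search MCP server you prefer.
server_params = StdioServerParameters(command="npx", args=["-y", "some-fetch-mcp-server"])

async def fetch_markdown(url: str) -> str:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # The tool name depends on the server; "fetch" is a placeholder.
            result = await session.call_tool("fetch", arguments={"url": url})
            return result.content[0].text

if __name__ == "__main__":
    print(asyncio.run(fetch_markdown("https://example.com")))
```

Because the research loop would only ever see a `fetch_markdown(url)` function, swapping one MCP server for another would not touch the rest of the pipeline.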
Hi, I agree, and adding MCP is on my roadmap. But since this repo also covers some local models, and most local models cannot call tools reliably as a multi-turn agent, in the end I will still hardcode some functions for them and add MCP support for online models.
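For local models without reliable native tool calling, one possible hardcoded pattern (purely illustrative, not this repo's actual code) is to ask the model for a small JSON action in its text output and dispatch it yourself:

```python
import json
import re

def fetch_markdown(url: str) -> str:
    """Hardcoded fetch tool; the real implementation would call Firecrawl, r.jina.ai, etc."""
    raise NotImplementedError("wire this to the fetcher of your choice")

TOOLS = {"fetch_markdown": fetch_markdown}

def dispatch(model_output: str) -> str:
    """Parse an action like {"tool": "fetch_markdown", "args": {"url": "..."}} out of the model's text."""
    match = re.search(r"\{.*\}", model_output, re.DOTALL)
    if not match:
        return "No tool call found."
    try:
        call = json.loads(match.group(0))
        return TOOLS[call["tool"]](**call.get("args", {}))
    except (json.JSONDecodeError, KeyError, TypeError) as err:
        return f"Bad tool call: {err}"
```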
Can't wait, looking forward to it!
I believe local models will become more reliable at making tool calls; you should consider a clear, simple plan.
Also consider exposing this project as an MCP server, since there are already tools that convert an MCP server into an OpenAI-like endpoint.
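Going the other direction is also fairly small with the MCP Python SDK's FastMCP helper; the `deep_research` function below is only a placeholder for the project's actual pipeline:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("deep-research")

@mcp.tool()
def deep_research(query: str) -> str:
    """Run a deep research pass and return a markdown report (placeholder)."""
    raise NotImplementedError("call into the project's research pipeline here")

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```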
Very cool project! I’ve tested many deep research tools, and this one is truly impressive.
However, the embedded browser seems quite unstable, and I couldn’t get Chrome remote debugging to work properly.
Additionally, r.jina.ai has rate limits, which makes it less suitable for local research.
I think Firecrawl could be a great solution for local research since it’s self-hosted and doesn’t rely on external rate limits.
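For reference, fetching markdown from a self-hosted Firecrawl instance is roughly the sketch below; the base URL/port and the v1 scrape endpoint shape are assumptions based on Firecrawl's documented API, so check them against your own deployment:

```python
import requests

FIRECRAWL_BASE = "http://localhost:3002"  # assumed default port of a self-hosted instance

def fetch_markdown(url: str) -> str:
    """Scrape a page via Firecrawl and return its markdown."""
    resp = requests.post(
        f"{FIRECRAWL_BASE}/v1/scrape",
        json={"url": url, "formats": ["markdown"]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["data"]["markdown"]

if __name__ == "__main__":
    print(fetch_markdown("https://example.com")[:500])
```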