
(llm): Support OpenHands LM #7598

Merged 17 commits into main from xw/openhands-lm on Mar 31, 2025

Conversation

xingyaoww (Collaborator) commented Mar 31, 2025

  • This change is worth documenting at https://docs.all-hands.dev/
  • Include this change in the Release Notes. If checked, you must provide an end-user friendly description for your change below

End-user friendly description of the problem this fixes or functionality that this introduces.

Add documentation and LLM support for OpenHands LM


Give a summary of what the PR does, explaining any non-trivial design decisions.

We are gonna release this model today :)

Blog: https://www.all-hands.dev/blog/introducing-openhands-lm-32b----a-strong-open-coding-agent-model


Link of any specific issues this addresses.


To run this PR locally, use the following command:

docker run -it --rm \
  -p 3000:3000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --add-host host.docker.internal:host-gateway \
  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:29c0bc1-nikolaik \
  --name openhands-app-29c0bc1 \
  docker.all-hands.dev/all-hands-ai/openhands:29c0bc1

xingyaoww marked this pull request as draft March 31, 2025 14:32
- the API key is optional, you can use any string, such as `ollama`.
- the model to `openai/openhands-lm-32b-v0.1` (`openai/`, followed by the `served-model-name` you set above)
- the base url to `http://host.docker.internal:8000`
- the API key is optional; you can use any string, such as the `mykey` you set above.
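As an illustrative aside (not part of the PR diff): these settings assume the model is served behind an OpenAI-compatible endpoint on port 8000. A minimal sketch of what that could look like with vLLM, where the model path and the `mykey` value are placeholders rather than values taken from this PR:

# Serve OpenHands LM behind an OpenAI-compatible API (vLLM is one option).
# <path-or-hf-id-of-openhands-lm-32b> is a placeholder for the actual checkpoint.
vllm serve <path-or-hf-id-of-openhands-lm-32b> \
  --port 8000 \
  --served-model-name openhands-lm-32b-v0.1 \
  --api-key mykey

# Sanity-check the endpoint from the host; the Bearer token must be the exact
# string passed to --api-key above.
curl http://localhost:8000/v1/models -H "Authorization: Bearer mykey"

The `openai/` prefix on the model name only tells OpenHands (via LiteLLM) to use the OpenAI-compatible provider; the part after the prefix is what gets sent to the server, so it needs to match the `served-model-name`.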
Collaborator

Does the API Key need to be set to the exact string you set above? Or can it be set to anything, regardless of what you set it to above?

Collaborator Author

No -- it has to be the exact string you set above. Will modify this.


In the API Key field, enter `ollama` or any value, since you don't need a particular key.
In the API Key field, enter `my` or any value you set.
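Purely as an illustration (reusing the placeholder `mykey` and port 8000 from the sketch above, not values from this PR), a quick way to confirm that the key and served model name are accepted by an OpenAI-compatible endpoint before entering them in the UI:

# Send a minimal chat completion request; a 200 response means the key and
# model name are valid on the server side.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer mykey" \
  -d '{"model": "openhands-lm-32b-v0.1", "messages": [{"role": "user", "content": "Say hello."}]}'

Note that when talking to the server directly, the model name is used without the `openai/` prefix.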
Collaborator

I can fix this, but just to understand: this section is for running it in development mode, and the one above with the same information is for when you run OpenHands via the docker command?

Collaborator Author

yes!

enyst (Collaborator) commented Mar 31, 2025

This is very interesting and great to see!

Just a small detail, I'd suggest we could create a different page than local_llms, and let that one continue to document Ollama and LMStudio. The current page is not even linked from the left panel, it's just there to be linked by us for people who want to try it with their Ollama or LMStudio models. Or we can make another page, I suppose. I'm just thinking that these models advance at an unbelievable pace, and people are asking about the newest [local model here], so this way we have some docs to help them along. Never know what tomorrow may bring, too! 😅

Not sure if the PR does this yet, but we may want to link the page documenting this new model from the left side panel. 🚀

mamoodi self-requested a review March 31, 2025 16:46
mamoodi (Collaborator) commented Mar 31, 2025

> This is very interesting and great to see!
>
> Just a small detail, I'd suggest we could create a different page than local_llms, and let that one continue to document Ollama and LMStudio. The current page is not even linked from the left panel, it's just there to be linked by us for people who want to try it with their Ollama or LMStudio models. Or we can make another page, I suppose. I'm just thinking that these models advance at an unbelievable pace, and people are asking about the newest [local model here], so this way we have some docs to help them along. Never know what tomorrow may bring, too! 😅
>
> Not sure if the PR did yet, but we may want to link from the left side panel, the page documenting this new model. 🚀

I reset my vote so this comment gets looked at. I think the consensus was that the ollama one didn't really work as is and one needs to be written for it from scratch. @enyst

xingyaoww marked this pull request as ready for review March 31, 2025 16:59
xingyaoww enabled auto-merge (squash) March 31, 2025 16:59
enyst (Collaborator) commented Mar 31, 2025

Oh, it's not a big deal. I think we can also add another page and maybe I'll convince you Mamoodi. 😅

The thing is that yes, currently it seems that 32B models might still be too small to work reasonably well as they are. So post-training one on openhands trajectories, and whatever else these research folks have done there, is very good and perhaps the thing that works well today!

That doesn't mean things aren't changing. On the contrary, it seems like they are, at a crazy pace, and people will always try to push boundaries. I know I'm happy for that!

The way the page was, not linked, not in anyone's face, does mean IMHO that we had to do the extra step to find it for people and show it to them. I think that can continue: have some doc and let people try a new coding LLM etc, and see. Otherwise it seems like we will have to copy paste our responses everywhere. 😅 🤷‍♂️

neubig (Contributor) left a comment

I think that the ollama and lmstudio instructions weren't tested and working properly, so I'd be in favor of removing them for now and then adding them back when we know they're working.

enyst (Collaborator) commented Mar 31, 2025

> I think that the ollama and lmstudio instructions weren't tested and working properly, so I'd be in favor of removing them for now and then adding them back when we know they're working.

@neubig I have tested them on my machine the last time I updated them. Is there something in particular that wasn't working?

xingyaoww merged commit 648c8ff into main Mar 31, 2025
16 checks passed
xingyaoww deleted the xw/openhands-lm branch March 31, 2025 17:29
mamoodi (Collaborator) commented Mar 31, 2025

>> I think that the ollama and lmstudio instructions weren't tested and working properly, so I'd be in favor of removing them for now and then adding them back when we know they're working.
>
> @neubig I have tested them on my machine the last time I updated them. Is there something in particular that wasn't working?

@enyst there were a lot of Discord users that could not get it working using the docs. To the point where I stopped sending it over.

enyst (Collaborator) commented Mar 31, 2025

OK, I'll revisit it all.

doew pushed a commit to doew/OpenHands that referenced this pull request Apr 2, 2025