[Bug]: llm completion exception breaks CodeActAgent #3575

Closed
2 tasks done
tobitege opened this issue Aug 25, 2024 · 3 comments
Labels: bug (Something isn't working), severity:medium (Affecting multiple users)

Comments

tobitege (Collaborator) commented Aug 25, 2024

Is there an existing issue for the same bug?

Describe the bug

Any exception raised by the completion call within the CodeActAgent breaks the agent and is not recoverable; the page has to be reloaded to start a new session.

response = self.llm.completion(
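
For illustration only, here is a minimal sketch of how that unguarded call could be wrapped so an LLM failure surfaces as a single well-known exception the controller can recover from. The names below (LLMCompletionError, guarded_completion) are assumptions made for this sketch; they are not the actual OpenHands API, nor the fix that was later merged in #3678.

# Sketch only: LLMCompletionError and guarded_completion are illustrative
# names, not the real OpenHands code or the change merged in #3678.
class LLMCompletionError(Exception):
    """Raised when the underlying llm.completion call fails."""

def guarded_completion(llm, messages, **kwargs):
    """Call llm.completion and normalize provider exceptions."""
    try:
        return llm.completion(messages=messages, **kwargs)
    except Exception as exc:  # litellm raises provider-specific errors
        # Re-raise as one well-known type so the agent/controller can
        # report the failure and keep the session usable.
        raise LLMCompletionError(f'LLM completion failed: {exc}') from exc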

Current OpenHands version

0.9.0

Installation and Configuration

any

Model and Agent

No response

Operating System

any

Reproduction Steps

No response

Logs, Errors, Screenshots, and Additional Context

Example exception log excerpt:

CodeActAgent LEVEL 0 LOCAL STEP 12 GLOBAL STEP 12


Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "/home/tobias/.cache/pypoetry/virtualenvs/openhands-oz9p9M-K-py3.11/lib/python3.11/site-packages/litellm/llms/openai.py", line 1033, in completion
    raise e
  File "/home/tobias/.cache/pypoetry/virtualenvs/openhands-oz9p9M-K-py3.11/lib/python3.11/site-packages/litellm/llms/openai.py", line 966, in completion
    return convert_to_model_response_object(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tobias/.cache/pypoetry/virtualenvs/openhands-oz9p9M-K-py3.11/lib/python3.11/site-packages/litellm/utils.py", line 5972, in convert_to_model_response_object
    raise raised_exception
Exception

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tobias/.cache/pypoetry/virtualenvs/openhands-oz9p9M-K-py3.11/lib/python3.11/site-packages/litellm/main.py", line 1923, in completion
    response = openai_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tobias/.cache/pypoetry/virtualenvs/openhands-oz9p9M-K-py3.11/lib/python3.11/site-packages/litellm/llms/openai.py", line 1039, in completion
    raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError
During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/d/github/OpenDevin/openhands/controller/agent_controller.py", line 153, in _start_step_loop
    await self._step()
  File "/mnt/d/github/OpenDevin/openhands/controller/agent_controller.py", line 462, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/d/github/OpenDevin/agenthub/codeact_agent/codeact_agent.py", line 175, in step
    response = self.llm.completion(
               ^^^^^^^^^^^^^^^^^^^^
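
The traceback shows the litellm exception escaping agent.step() inside the controller's step loop, which is what ends the session. Purely to illustrate where a guard could sit, a controller-side catch might look roughly like the sketch below; the names (safe_step, report_error, handle_action) are assumptions for this illustration and do not describe the real OpenHands AgentController or the fix in #3678.

# Sketch only: wrap the agent step so an exception becomes a reported error
# rather than a crash of the whole step loop. All names are assumptions.
async def safe_step(controller) -> None:
    try:
        action = controller.agent.step(controller.state)
    except Exception as exc:
        # Report the failure to the user and keep the session alive,
        # instead of letting the exception tear down the step loop.
        await controller.report_error(f'Agent step failed: {exc}')
        return
    await controller.handle_action(action)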
tobitege added labels bug (Something isn't working), agent framework, severity:medium (Affecting multiple users) on Aug 25, 2024
shubhamofbce (Contributor) commented

@tobitege I will look into this today.

tobitege (Collaborator, Author) replied

> @tobitege I will look into this today.

Wonderful, thank you!

tobitege (Collaborator, Author) commented Sep 4, 2024

Resolved by #3678 thanks to @shubhamofbce 🎉

tobitege closed this as completed Sep 4, 2024