
Malformed Request during Tiny Agent Initialization #1614


Description

@debasisdwivedy

Hi,

While initializing an agent with the OpenAI provider, the request adds a signal: {} field to the body of the request.

Below is the agent.json file:

{
    "model": "openai/gpt-4o",
    "provider": "openai",
    "apiKey": "<YOUR TOKEN>",
    "servers": [
        {
            "type": "stdio",
            "command": "npx",
            "args": ["mcp-remote",
                "http://127.0.0.1:7860/gradio_api/mcp/sse"
            ],
            "env":{},
            "cwd": "."
        }
    ]
}

I have used OpenAI as the example here, but the same issue occurs with any provider of your choice.
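For example, an equivalent config using a Hugging Face Inference Provider instead of OpenAI might look like the following (illustrative only; adjust the model and provider to whatever your account has access to):

{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "apiKey": "<YOUR TOKEN>",
    "servers": [
        {
            "type": "stdio",
            "command": "npx",
            "args": ["mcp-remote",
                "http://127.0.0.1:7860/gradio_api/mcp/sse"
            ],
            "env": {},
            "cwd": "."
        }
    ]
}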

REPRODUCTION

Create an agent.json file as below:

{
    "model": "openai/gpt-4o",
    "provider": "openai",
    "apiKey": "<YOUR TOKEN>",
    "servers": [
        {
            "type": "stdio",
            "command": "npx",
            "args": ["mcp-remote",
                "http://127.0.0.1:7860/gradio_api/mcp/sse"
            ],
            "env":{},
            "cwd": "."
        }
    ]
}

Run the following command from the terminal:

npx tiny-agents run ./agent.json

LOGS

./node_modules/@huggingface/tiny-agents/dist/cli.js:115
    throw err;
    ^

InferenceClientProviderApiError: Failed to perform inference: Unrecognized request argument supplied: signal
    at innerStreamingRequest (./node_modules/@huggingface/inference/dist/commonjs/utils/request.js:110:23)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async chatCompletionStream (./node_modules/@huggingface/inference/dist/commonjs/tasks/nlp/chatCompletionStream.js:13:5)
    at async Agent.processSingleTurnWithTools (./node_modules/@huggingface/mcp-client/dist/src/index.js:198:22)
    at async Agent.run (./node_modules/@huggingface/mcp-client/dist/src/index.js:345:9)
    at async mainCliLoop (./node_modules/@huggingface/tiny-agents/dist/cli.js:125:22) {
  httpRequest: {
    url: 'https://api.openai.com/v1/chat/completions',
    method: 'POST',
    headers: {
      Authorization: 'Bearer [redacted]',
      'Content-Type': 'application/json',
      'User-Agent': '@huggingface/inference/4.4.0 Node.js/23'
    },
    body: {
      messages: [
        {
          role: 'system',
          content: 'You are an agent - please keep going until the user’s query is completely resolved, before ending your turn and yielding back to the user. Only terminate your turn when you are sure that the problem is solved, or if you need more info from the user to solve the problem.\n' +
            '\n' +
            'If you are not sure about anything pertaining to the user’s request, use your tools to read files and gather the relevant information: do NOT guess or make up an answer.\n' +
            '\n' +
            'You MUST plan extensively before each function call, and reflect extensively on the outcomes of the previous function calls. DO NOT do this entire process by making function calls only, as this can impair your ability to solve the problem and think insightfully.'
        },
        {
          role: 'user',
          content: "Analyze the sentiment of the following text 'This is awesome'"
        }
      ],
      tools: [
        {
          type: 'function',
          function: {
            name: 'task_complete',
            description: 'Call this tool when the task given by the user is complete'
          }
        },
        {
          type: 'function',
          function: {
            name: 'ask_question',
            description: 'Ask a question to the user to get more info required to solve or clarify their problem.'
          }
        },
        {
          type: 'function',
          function: {
            name: 'sentiment_analysis',
            description: 'Analyze the sentiment of the given text. Returns: A JSON string containing polarity, subjectivity, and assessment',
            parameters: [Object]
          }
        }
      ],
      tool_choice: 'auto',
      signal: {},
      stream: true,
      model: 'gpt-4o'
    }
  },
  httpResponse: {
    requestId: 'req_0d0e4c96cbca6ee9ace96b39ef96b594',
    status: 400,
    body: {
      error: {
        message: 'Unrecognized request argument supplied: signal',
        type: 'invalid_request_error',
        param: null,
        code: null
      }
    }
  }
}

Node.js v23.11.0
[46427] 
Shutting down...

The issue lies in the body of the request:

body: {
    messages: [],
    tools: [],
    signal: {},
    stream: true,
    model: 'gpt-4o'
}
signal is not a valid request parameter, and I am not sure why we are adding it. If it is model-specific, it should be checked and only added where it is supported.
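A possible direction for a fix, as a rough sketch only: keep the abort signal as a fetch option and strip it out of the serialized body, so the provider never sees it. The function and type names below are made up for illustration and are not the actual @huggingface/inference API:

// Hypothetical sketch, assuming the request is ultimately sent with fetch.
interface ChatCompletionArgs {
    model: string;
    messages: Array<{ role: string; content: string }>;
    stream?: boolean;
    signal?: AbortSignal; // fetch option, must not go into the JSON body
    [key: string]: unknown;
}

async function sendChatCompletion(url: string, apiKey: string, args: ChatCompletionArgs) {
    // Separate the abort signal from the provider-facing payload.
    const { signal, ...body } = args;

    return fetch(url, {
        method: "POST",
        headers: {
            Authorization: `Bearer ${apiKey}`,
            "Content-Type": "application/json",
        },
        body: JSON.stringify(body), // no "signal" key here
        signal, // abort support stays at the transport level
    });
}

Handled this way, the behavior would not need to be model- or provider-specific at all.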
