
[Bug]: When using the .NET version of AutoGen, if an agent makes parallel function calls using OpenAI, the next response throws an exception about invalid message ordering with tool calls #2722

Closed
arelath opened this issue May 19, 2024 · 3 comments

@arelath

arelath commented May 19, 2024

Describe the bug

When using AutoGen.NET, ChatGPT will occasionally make multiple function calls in the same message. This causes an exception when the history containing these messages is sent back to the server, since it expects a single message with an array of results rather than one message per result. Function calls work fine when there is only one per message.

Steps to reproduce

In the example file dotnet\sample\AutoGen.BasicSamples\Example03_Agent_FunctionCall.cs change the tax calculator to this code:

```csharp
var history = new List<IMessage>();
var requestTaxRate = new TextMessage(Role.User, "calculate tax of both: 100, 0.1 and 100, 0.2");
//var requestTaxRate = new TextMessage(Role.User, "calculate tax of 100, 0.1");
var calculateTax2 = await agent.SendAsync(requestTaxRate, history);

history.Add(requestTaxRate);
history.Add(calculateTax2);

calculateTax2.Should().BeOfType<AggregateMessage<ToolCallMessage, ToolCallResultMessage>>();
var goodbye = await agent.SendAsync(new TextMessage(Role.User, "Say Goodbye"), history);
```

When the second message is sent, an exception is thrown: `Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'. Status: 400 (Bad Request)`

Debugging the code, there is a single tool call request message with two functions, but this generates two response messages, one for each function. As far as I can tell, OpenAI wants all tool responses in the same message as an array, not in the content field (based on this example code: https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb?short_path=ff3f9f2).
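To make the expected shape concrete, below is a rough sketch (not AutoGen code; the CalculateTax name, call ids, and values are placeholders) of a chat completions request body the server accepts after a parallel tool call: the assistant message carries one tool_calls entry per call, each with a unique id, and every result goes back as its own role "tool" message referencing one of those ids.

```csharp
using System;
using System.Text.Json;

// Sketch only: placeholder ids, names, and values; not the AutoGen.Net connector code.
var messages = new object[]
{
    new { role = "user", content = "calculate tax of both: 100, 0.1 and 100, 0.2" },
    new
    {
        role = "assistant",
        // One entry per parallel call, each with a unique id.
        tool_calls = new object[]
        {
            new
            {
                id = "call_1",
                type = "function",
                function = new { name = "CalculateTax", arguments = "{\"price\":100,\"taxRate\":0.1}" },
            },
            new
            {
                id = "call_2",
                type = "function",
                function = new { name = "CalculateTax", arguments = "{\"price\":100,\"taxRate\":0.2}" },
            },
        },
    },
    // One role:"tool" message per tool_call_id; a missing or duplicated id is what
    // triggers the 400 "must be a response to a preceeding message with 'tool_calls'"
    // error quoted above.
    new { role = "tool", tool_call_id = "call_1", content = "tax is 10" },
    new { role = "tool", tool_call_id = "call_2", content = "tax is 20" },
};

Console.WriteLine(JsonSerializer.Serialize(messages, new JsonSerializerOptions { WriteIndented = true }));
```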

Model Used

ChatGPT, both GPT-3.5 and GPT-4o.

Expected Behavior

It should format the messages in the way OpenAI's server expects.

Screenshots and logs

No response

Additional Information

I noticed the function id sent in the response is the function name, not the id from the function call request. This seems to be different from the curl examples in the documentation, but I don't know if it matters because, other than this case, function calls work just fine.

@arelath arelath added the bug Something isn't working label May 19, 2024
@LittleLittleCloud LittleLittleCloud self-assigned this May 19, 2024
@LittleLittleCloud LittleLittleCloud added this to the AutoGen.Net 0.0.14 milestone May 19, 2024
@LittleLittleCloud
Collaborator

LittleLittleCloud commented May 19, 2024

@arelath Thanks for creating this issue. This is a bug in the OpenAIChatRequestMessageConnector, where it puts the functionName as the tool call id when converting ToolCall and ToolCallResult messages to ChatRequestToolMessage and ChatRequestAssistantMessage. When there is only one function call the conversion works fine, because the single tool result still matches the single tool call id. However, in a parallel function call there will be two identical tool call ids right after the original function call, and that causes a bad request on the OpenAI end.

The fix is to add a ToolCallId to the ToolCall class and default it to functionName_{i} when the LLM returns a null Id, or use the Id from the LLM when one is provided.
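A minimal sketch of that defaulting logic (the names below are illustrative stand-ins, not the exact AutoGen.Net types or the merged implementation):

```csharp
using System;

// Illustrative only, not the actual connector code: keep the id the LLM returned when
// it is present, otherwise fall back to a functionName_{i} id that is unique within
// the batch. The same id must be reused on the matching tool-result message.
string ResolveToolCallId(string? idFromLlm, string functionName, int index)
    => string.IsNullOrEmpty(idFromLlm) ? $"{functionName}_{index}" : idFromLlm;

// With the old behavior both parallel CalculateTax calls got the same id ("CalculateTax");
// with the fallback they become CalculateTax_0 and CalculateTax_1.
string[] functionNames = { "CalculateTax", "CalculateTax" };
for (var i = 0; i < functionNames.Length; i++)
{
    string? idFromLlm = null; // simulate an LLM response that did not include tool call ids
    Console.WriteLine($"{functionNames[i]} -> {ResolveToolCallId(idFromLlm, functionNames[i], i)}");
}
```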

github-merge-queue bot pushed a commit that referenced this issue May 21, 2024
* fix bug and add tests

* update
@LittleLittleCloud
Collaborator

@arelath The fix is in the nightly build now and will go out in 0.0.14.

@arelath
Author

arelath commented May 28, 2024

@LittleLittleCloud - Thank you for the incredibly fast turnaround. Verified the fix myself and it works perfectly!

jayralencar pushed a commit to jayralencar/autogen that referenced this issue May 28, 2024
* fix bug and add tests

* update