
UserProxy in RoundRobinGroupChat not acting as expected. #5990

Open
Wei-Cheng881221 opened this issue Mar 18, 2025 · 2 comments
Wei-Cheng881221 commented Mar 18, 2025

What happened?

I was trying to set up a simple RoundRobinGroupChat with a UserProxy as part of the team, but ran into an issue where the UserProxy's input prompt appeared too early.
Use case: the round-robin sequence is [user_agent, assistant_agent, code_executor_agent], and the user_agent (UserProxy) input prompt interrupts the output message from code_executor_agent.

Therefore, I followed the user guide Providing Feedback During a Run and reproduced the same result with its code. To make the issue easier to understand, I use the user-guide code here instead of my own.
For reference, the expected output from the user guide is:

---------- user ----------
Write a 4-line poem about the ocean.
---------- assistant ----------
In endless blue where whispers play,  
The ocean's waves dance night and day.  
A world of depths, both calm and wild,  
Nature's heart, forever beguiled.  
TERMINATE
---------- user_proxy ----------
APPROVE

but my result looks like this after running:

---------- user ----------
Write a 4-line poem about the ocean.
---------- assistant ----------
Enter your response: The ocean's waves crash on the shore,
A soothing sound that calms once more.
Its depths are dark, its beauty bright,
A treasure trove of wonder in sight.

TERMINATE

After typing APPROVE, it shows the following and then ends:

APPROVE
---------- user_proxy ----------
APPROVE
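For context, the turn-taking the issue expects can be sketched without autogen at all. The following is a minimal pure-Python simulation (the `round_robin_run` helper and canned agents are hypothetical, not the AgentChat API): each agent's full message completes before the next agent is invoked, so the user proxy's input prompt should appear only after the assistant's TERMINATE, not in the middle of the assistant's output.

```python
# Minimal sketch of the expected round-robin turn order. No autogen
# dependency; the helper and agents below are illustrative stand-ins.
from typing import Callable, List, Tuple

Agent = Tuple[str, Callable[[str], str]]

def round_robin_run(
    agents: List[Agent],
    task: str,
    stop_word: str = "APPROVE",
    max_turns: int = 10,
) -> List[Tuple[str, str]]:
    """Run agents in fixed order; stop once a reply contains stop_word."""
    transcript = [("user", task)]
    last = task
    for turn in range(max_turns):
        name, reply_fn = agents[turn % len(agents)]
        # Key property: the previous message is fully delivered before the
        # next agent acts -- the behavior the issue reports is violated.
        last = reply_fn(last)
        transcript.append((name, last))
        if stop_word in last:
            break
    return transcript

# Canned stand-ins; a real user proxy would call input() here.
assistant: Agent = ("assistant", lambda msg: "A 4-line ocean poem...\nTERMINATE")
user_proxy: Agent = ("user_proxy", lambda msg: "APPROVE")

transcript = round_robin_run(
    [assistant, user_proxy], "Write a 4-line poem about the ocean."
)
for speaker, text in transcript:
    print(f"---------- {speaker} ----------\n{text}")
```

Running this prints the messages in the reference order (user, then the complete assistant message, then the user_proxy turn), which is what the user-guide output shows and what the observed run above does not do.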

I am using autogen 0.4.9.1 and Python 3.12.2 on a Red Hat 8 machine, with llama 3.3 70B through Ollama.

Which packages was the bug in?

Python AgentChat (autogen-agentchat>=0.4.0)

AutoGen library version.

Python dev (main branch)

Other library version.

No response

Model used

llama_3.3_70B

Model provider

Ollama

Other model provider

No response

Python version

3.12

.NET version

None

Operating system

Other

ekzhu (Collaborator) commented Mar 21, 2025

@jackgerrits thoughts?

@caserzer commented:

Same issue on Ubuntu 24 with Python 3.12/3.10, but it works on macOS with Python 3.10.
