MultimodalWebSurfer with Gemini fails "Using cl100k_base encoding." when used in MagenticOneGroupChat & MagenticOne #5966
Thanks for reporting this issue. I tried to replicate this and got a different error.
This warning message indicates that tiktoken doesn't have a built-in mapping for the model name.
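For context, a minimal sketch of why the warning fires. This is illustrative only: the abbreviated table below is an assumption standing in for tiktoken's real model-to-encoding mapping (kept in `tiktoken.model`), and `encoding_name_for` is a hypothetical helper, not tiktoken's API.

```python
# Illustrative sketch of tiktoken's model lookup. The table is abbreviated;
# the real mapping lives inside the tiktoken package and has no entry for
# any "gemini-*" model, which is why the fallback path is taken.
MODEL_TO_ENCODING = {
    "gpt-4o": "o200k_base",
    "gpt-4": "cl100k_base",
}

def encoding_name_for(model: str) -> str:
    """Return the tokenizer encoding for a model, falling back like tiktoken."""
    try:
        return MODEL_TO_ENCODING[model]
    except KeyError:
        # Unknown model name -> the "Model ... not found. Using cl100k_base
        # encoding." warning, then cl100k_base is used for token counting.
        return "cl100k_base"

print(encoding_name_for("gemini-2.0-flash"))  # cl100k_base
```

So the message itself is a token-counting fallback, not necessarily the cause of the agent handover failure described below.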
What happened?
Describe the bug
Running the example autogen.4 using the Gemini-2.0-Flash model produces the error "Model gemini-2.0-flash not found. Using cl100k_base encoding." It doesn't ALWAYS fail; reading the console stream, the agent-to-agent handover of data fails at some point.
Setup
vscode 1.98.1
nodejs 20.18.2
python 3.11.9
autogen-agentchat (0.4.9.2)
autogen-ext[magentic-one,openai] (0.4.9.2)
autogen-core==0.4.9.2
To Reproduce
Using the example with Gemini.
The Surfer agent visits the webpage; MagenticOneOrchestrator asks it to scroll down, back and forth a few times, until the error occurs.
Call stack of model_client
Expected behavior
Should return the UV Index as the example shows.
Terminal output
Which packages was the bug in?
Python AgentChat (autogen-agentchat>=0.4.0), Python Extensions (autogen-ext)
AutoGen library version.
Python dev (main branch)
Other library version.
No response
Model used
gemini-2.0-flash
Model provider
Google Gemini
Other model provider
No response
Python version
3.11
.NET version
.NET 8
Operating system
Windows