Introduce OpenAIAgent backed by the Response API in Extensions. #6032

Open
ekzhu opened this issue Mar 20, 2025 · 1 comment

ekzhu (Collaborator) commented Mar 20, 2025

Confirmation

  • I confirm that I am a maintainer and so can use this template. If I am not, I understand this issue will be closed and I will be asked to use a different template.

Issue body

The Assistant API will be deprecated in 2026 and replaced by the Responses API: https://platform.openai.com/docs/guides/responses-vs-chat-completions.

We should start supporting the Responses API by introducing an OpenAIAgent backed by the Responses API, which can be stateful.

The new agent should conform to the behavior of AssistantAgent in AgentChat, while backed directly by the openai library.
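As a rough, hypothetical sketch of the idea (class and method names here are placeholders, not the final autogen_ext surface; the real agent would follow AgentChat's base agent interface):

```python
# Hypothetical sketch only -- names and signatures are assumptions, not the final design.
from openai import AsyncOpenAI


class OpenAIAgent:
    """Generates replies with the OpenAI Responses API, keeping conversation state server-side."""

    def __init__(self, name: str, model: str, instructions: str) -> None:
        self.name = name
        self._model = model
        self._instructions = instructions
        self._client = AsyncOpenAI()
        # Id of the last response; chaining on it keeps the conversation stateful.
        self._previous_response_id: str | None = None

    async def on_messages(self, user_text: str) -> str:
        response = await self._client.responses.create(
            model=self._model,
            instructions=self._instructions,
            input=user_text,
            previous_response_id=self._previous_response_id,
        )
        self._previous_response_id = response.id
        return response.output_text

    async def on_reset(self) -> None:
        # Forget the server-side thread; the next call starts a fresh conversation.
        self._previous_response_id = None
```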

Thoughts, feedback welcome!

ekzhu added the help wanted and proj-extensions labels on Mar 20, 2025
ekzhu added this to the 0.4.x-python milestone on Mar 20, 2025
federicovilla55 (Contributor) commented

Should OpenAIAgent be a concrete implementation that calls the Responses API to generate replies, or should it provide a common abstraction layer that unifies the Assistant API and the Responses API under a single interface, with the Responses API implementation defined in its own class?
Since the Assistant API will be deprecated in 2026, only the Responses API will remain in the long run, but a common interface could simplify the transition: developers could select the appropriate adapter from configuration or at runtime, which makes switching easy. A rough sketch of such an interface follows.
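For example (every name here is hypothetical, and both adapters are only stubbed to show the shape of the abstraction):

```python
# Hypothetical shape for a common abstraction layer; all names are illustrative.
from typing import Protocol


class ResponseBackend(Protocol):
    """Interface the agent programs against, regardless of the underlying OpenAI API."""

    async def generate(self, text: str) -> str: ...


class ResponsesApiBackend:
    """Would wrap client.responses.create(), chaining calls via previous_response_id."""

    async def generate(self, text: str) -> str:
        raise NotImplementedError("see the Responses API sketch above")


class AssistantApiBackend:
    """Would wrap the Assistant API's threads and runs until its deprecation."""

    async def generate(self, text: str) -> str:
        raise NotImplementedError


def make_backend(kind: str) -> ResponseBackend:
    # Configuration picks the adapter, so migrating off the Assistant API later is a one-line change.
    return ResponsesApiBackend() if kind == "responses" else AssistantApiBackend()
```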

Regarding an agent implementation that uses the Responses API to generate model responses:

Context in the Responses API can be maintained through a previous_response_id that is updated after every response, since the API supports stateful interactions. Chaining calls this way also ensures that instructions from previous responses are not unintentionally carried over to subsequent ones.
Since the agent's state should be saved and reloaded between sessions (e.g. to resume a session after a restart), the state model should be serializable, for example via Pydantic's BaseModel. The same should hold for the IDs of files uploaded through OpenAI's API. A class along the lines of OpenAIResponseAgentState(BaseModel) could be introduced for this purpose; a sketch is below.
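A minimal sketch of what that serializable state could look like; the field names are assumptions about what needs to survive a restart, only the class name follows the proposal above:

```python
# Sketch of a serializable agent state for the Responses API backed agent.
from pydantic import BaseModel, Field


class OpenAIResponseAgentState(BaseModel):
    type: str = "OpenAIResponseAgentState"
    previous_response_id: str | None = None  # resume the server-side conversation
    uploaded_file_ids: list[str] = Field(default_factory=list)  # files uploaded via the OpenAI API


# The agent's save_state / load_state would round-trip through this model:
saved = OpenAIResponseAgentState(previous_response_id="resp_abc123").model_dump_json()
restored = OpenAIResponseAgentState.model_validate_json(saved)
assert restored.previous_response_id == "resp_abc123"
```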

Additionally, most of the logic in the current Assistant API implementation can be adapted to the Responses API, with modifications to accommodate its different response format; much of this functionality could therefore be reused from the existing OpenAIAssistantAgent class in the autogen_ext.agents.openai module.
