
OperationalError: no such column: "size" - should this be a string literal in single-quotes? #6009

Open
thangcn1943 opened this issue Mar 19, 2025 · 3 comments
Labels: 0.2 (issues which are related to the pre-0.4 codebase), needs-triage

Comments

@thangcn1943

What happened?

I encountered the following error when running the code:
OperationalError Traceback (most recent call last)
Cell In[3], line 9
6 user_proxy = UserProxyAgent("user_proxy", code_execution_config=False)
8 # Start the chat
----> 9 user_proxy.initiate_chat(
10 assistant,
11 message="Tell me a joke about NVDA and TESLA stock prices.",
12 )

File ~/miniconda3/envs/thangcn/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py:1503, in ConversableAgent.initiate_chat(self, recipient, clear_history, silent, cache, max_turns, summary_method, summary_args, message, **kwargs)
1501 else:
1502 msg2send = self.generate_init_message(message, **kwargs)
-> 1503 self.send(msg2send, recipient, silent=silent)
1504 summary = self._summarize_chat(
1505 summary_method,
1506 summary_args,
1507 recipient,
1508 cache=cache,
1509 )
1510 for agent in [self, recipient]:

File ~/miniconda3/envs/thangcn/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py:1187, in ConversableAgent.send(self, message, recipient, request_reply, silent)
1185 valid = self._append_oai_message(message, "assistant", recipient, is_sending=True)
1186 if valid:
...
873 value,
874 ),
875 )
OperationalError: no such column: "size" - should this be a string literal in single-quotes?

My code:

import os
from autogen import AssistantAgent, UserProxyAgent

llm_config = { "config_list": [{ "model": "gpt-4o", "api_key": os.environ.get("OPENAI_API_KEY") }] }
assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent("user_proxy", code_execution_config=False)

# Start the chat
user_proxy.initiate_chat(
    assistant,
    message="Tell me a joke about NVDA and TESLA stock prices.",
)
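The error text ("no such column: size") comes from SQLite, which suggests the on-disk cache AutoGen 0.2 keeps in its legacy .cache/ directory was created by an older schema. A minimal sketch of how one could inspect such a file's columns using only the standard library (the stale-schema table below is simulated for illustration; it is not AutoGen's actual cache layout):

```python
import os
import sqlite3
import tempfile

def cache_columns(db_path: str) -> dict:
    """Return {table_name: [column, ...]} for every table in a SQLite file."""
    with sqlite3.connect(db_path) as conn:
        tables = [r[0] for r in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        return {t: [c[1] for c in conn.execute(f"PRAGMA table_info({t})")]
                for t in tables}

# Simulate an outdated cache file whose table predates a "size" column.
db_path = os.path.join(tempfile.mkdtemp(), "cache.db")
with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE Cache (key TEXT PRIMARY KEY, value BLOB)")

schema = cache_columns(db_path)
print(schema)                     # {'Cache': ['key', 'value']}
print("size" in schema["Cache"])  # False -> schema is older than the code expects
```

If a column the installed library queries for is missing, the cache file predates the current code, which matches the "corrupted or outdated .cache/ schema" diagnosis in the comments below.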

Which package was the bug in?

V0.2 (autogen-agentchat==0.2.*)

AutoGen library version.

Python dev (main branch)

Other library version.

No response

Model used

No response

Model provider

None

Other model provider

No response

Python version

3.10

.NET version

None

Operating system

Ubuntu

@Trongnhat191

I don't know, bro :)

@ekzhu ekzhu added the 0.2 Issues which are related to the pre 0.4 codebase label Mar 21, 2025
@yogyagit

yogyagit commented Mar 22, 2025

I encountered the same issue due to a corrupted or outdated .cache/ schema (e.g., no such column: "size"). As a temporary fix, I disabled AutoGen's caching using a dummy no-op cache:

from autogen import oai
from autogen.cache import Cache

# Dummy no-op cache class
class DummyCache:
    def get(self, key, default=None): return None
    def set(self, key, value): pass
    def close(self): pass
    def __enter__(self): return self
    def __exit__(self, *args): pass

# Patch AutoGen's cache system
Cache.cache = DummyCache()  # disables global instance
Cache.disk = lambda *args, **kwargs: DummyCache()  # disables disk instantiation
oai.client.LEGACY_CACHE_DIR = ""  # disables fallback .cache/ directory

# Now safe to use LLMConfig with cache_seed=None
from autogen import LLMConfig, config_list_from_dotenv
import os

env_path = os.path.join(os.path.dirname(__file__), '..', '..', '.env')
config_list = config_list_from_dotenv(dotenv_file_path=env_path)

llm_config = LLMConfig(
    config_list=config_list,
    cache_seed=None,  # disable caching entirely
)

This lets you continue using LLMConfig(cache_seed=None) without triggering fallback to a broken disk cache. The proper fix would be deleting the .cache/ directory, but this workaround helps avoid cache-related crashes in the meantime.
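The "deleting the .cache/ directory" fix mentioned above can be scripted so the cache is simply rebuilt with a fresh schema on the next run. A minimal sketch, assuming the default legacy location is a .cache directory next to where the script runs (adjust the path for your setup; the demo directory here is a throwaway stand-in, not a real cache):

```python
import os
import shutil
import tempfile

def reset_legacy_cache(cache_dir: str = ".cache") -> bool:
    """Delete the on-disk cache directory so it is recreated from scratch.

    Returns True if a directory existed and was removed, False otherwise.
    """
    if os.path.isdir(cache_dir):
        shutil.rmtree(cache_dir)
        return True
    return False

# Demonstrate on a throwaway directory standing in for a corrupted cache.
demo = os.path.join(tempfile.mkdtemp(), ".cache")
os.makedirs(os.path.join(demo, "41"))  # mimic a cache_seed subdirectory
removed = reset_legacy_cache(demo)
print(removed, os.path.isdir(demo))    # True False
```

Unlike the DummyCache patch, this keeps caching enabled; it just discards the database whose schema no longer matches the installed library version.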

@ekzhu
Collaborator

ekzhu commented Mar 23, 2025

This is a 0.2 issue; consider upgrading to the latest version, v0.4. Let us know about any feature gaps.

See migration guide: https://microsoft.github.io/autogen/dev/user-guide/agentchat-user-guide/migration-guide.html
