# mcpx-py

A Python library for interacting with LLMs using mcp.run tools. Supports all models supported by PydanticAI.

## Dependencies

- `uv`
- `npm`
- `ollama` (optional)
## mcp.run Setup

You will need to get an mcp.run session ID by running:

```sh
npx --yes -p @dylibso/mcpx gen-session --write
```

This will generate a new session and write the session ID to a configuration file that can be used by mcpx-py.

If you need to store the session ID in an environment variable instead, run `gen-session` without the `--write` flag:

```sh
npx --yes -p @dylibso/mcpx gen-session
```

which should output something like:

```
Login successful!
Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
```

Then set the `MCP_RUN_SESSION_ID` environment variable:

```sh
export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
```
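As a sanity check, you can confirm the variable is visible to Python before using the library. This is a minimal sketch using only the standard library; `get_session_id` is a hypothetical helper for illustration, not part of mcpx-py:

```python
import os

# Hypothetical helper: fail early with a useful hint if the session ID is missing.
def get_session_id() -> str:
    session_id = os.environ.get("MCP_RUN_SESSION_ID")
    if not session_id:
        raise RuntimeError(
            "MCP_RUN_SESSION_ID is not set; run "
            "`npx --yes -p @dylibso/mcpx gen-session` to create a session"
        )
    return session_id

# For demonstration only; normally your shell exports the real value.
os.environ["MCP_RUN_SESSION_ID"] = "example-session-id"
print(get_session_id())  # → example-session-id
```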
## Installation

Using `uv`:

```sh
uv add mcpx-py
```

Or `pip`:

```sh
pip install mcpx-py
```
## Usage

```python
from mcpx_py import Chat

llm = Chat("claude-3-5-sonnet-latest")

# Or OpenAI
# llm = Chat("gpt-4o")

# Or Ollama
# llm = Chat("ollama:qwen2.5")

# Or Gemini
# llm = Chat("gemini-2.0-flash")

response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
```
It's also possible to get structured output by setting `result_type`:

```python
from typing import List

from mcpx_py import Chat, BaseModel, Field


class Summary(BaseModel):
    """
    A summary of some longer text
    """

    source: str = Field(description="The source of the original text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")


llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
```

Note that `Field` descriptions must be passed as the `description` keyword argument; a positional argument would be interpreted as the field's default value.
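The `Summary` schema can be exercised locally with plain Pydantic, without calling an LLM. The sample data below is made up for illustration; it stands in for the structured response an LLM would return:

```python
from typing import List

from pydantic import BaseModel, Field


class Summary(BaseModel):
    """A summary of some longer text"""

    source: str = Field(description="The source of the original text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")


# Validate a dict shaped like a structured LLM response against the schema.
data = {
    "source": "example.com",
    "original_text": "Example Domain. This domain is for use in illustrative examples.",
    "items": ["example.com is a reserved domain for documentation examples."],
}
summary = Summary.model_validate(data)
print(summary.source)  # → example.com
```

Validation raises a `pydantic.ValidationError` if the response is missing a field or has the wrong types, which is what makes `result_type` useful for downstream code.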
More examples can be found in the `examples/` directory.
## mcpx-client

### Installation

```sh
uv tool install mcpx-py
```

From git:

```sh
uv tool install git+https://github.com/dylibso/mcpx-py
```

Or from the root of the repo:

```sh
uv tool install .
```

### Running

`mcpx-client` can also be executed without being installed, using `uvx`:

```sh
uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client
```

### Usage

Get help:

```sh
mcpx-client --help
```

Start a chat:

```sh
mcpx-client chat
```

List tools:

```sh
mcpx-client list
```

Call a tool:

```sh
mcpx-client tool eval-js '{"code": "2+2"}'
```
## LLM Provider Setup

### Anthropic

- Sign up for an Anthropic API account at https://console.anthropic.com
- Get your API key from the console
- Set the environment variable:

```sh
export ANTHROPIC_API_KEY=your_key_here
```

### OpenAI

- Create an OpenAI account at https://platform.openai.com
- Generate an API key in your account settings
- Set the environment variable:

```sh
export OPENAI_API_KEY=your_key_here
```

### Gemini

- Create a Gemini account at https://aistudio.google.com
- Generate an API key in your account settings
- Set the environment variable:

```sh
export GEMINI_API_KEY=your_key_here
```

### Ollama

- Install Ollama from https://ollama.ai
- Pull your desired model:

```sh
ollama pull llama3.2
```

- No API key needed; Ollama runs locally

### Llamafile

- Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
- Make the file executable:

```sh
chmod +x your-model.llamafile
```

- Run in JSON API mode:

```sh
./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
```

- Use with the OpenAI provider pointing to http://localhost:8080
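For reference, llamafile's JSON API serves an OpenAI-compatible chat completions route on the host and port given above. The sketch below only constructs such a request with the standard library, it does not send it; the model name is an arbitrary placeholder (llamafile serves whichever model it was started with):

```python
import json

# Build a request for llamafile's OpenAI-compatible endpoint.
BASE_URL = "http://localhost:8080"
url = f"{BASE_URL}/v1/chat/completions"
payload = {
    "model": "llamafile",  # placeholder; the served model is used regardless
    "messages": [{"role": "user", "content": "Summarize example.com"}],
}
body = json.dumps(payload)
print(url)  # → http://localhost:8080/v1/chat/completions
```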