Collmbo

A Slack bot that lets you choose your preferred LLM using LiteLLM. Pronounced the same as "Colombo".

Quick Start

Collmbo supports multiple LLMs, but let's begin with OpenAI for a quick setup.

1. Create a Slack App

Create a Slack app and obtain the required tokens:

  • App-level token (xapp-1-...)
  • Bot token (xoxb-...)

2. Create a .env File

Save your credentials in a .env file:

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=gpt-4o
OPENAI_API_KEY=sk-...

3. Run Collmbo Container

Start the bot using Docker:

docker run -it --env-file .env ghcr.io/iwamot/collmbo:latest-slim

4. Say Hello!

Mention the bot in Slack and start chatting:

@Collmbo hello!

Collmbo should respond in channels, threads, and direct messages (DMs).

Want to Use a Different LLM?

First, pick your favorite LLM from the list of LiteLLM supported providers.

To use it, update the relevant environment variables in your .env file and restart the container.

Here are some examples:

Azure OpenAI (gpt-4-0613)

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=azure/<your_deployment_name>
LITELLM_MODEL_TYPE=azure/gpt-4-0613
AZURE_API_KEY=...
AZURE_API_BASE=...
AZURE_API_VERSION=...

Gemini - Google AI Studio (Gemini 1.5 Flash)

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=gemini/gemini-1.5-flash
GEMINI_API_KEY=...

Amazon Bedrock (Claude 3.5 Sonnet v2)

SLACK_APP_TOKEN=...
SLACK_BOT_TOKEN=...
LITELLM_MODEL=bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0

# You can specify a Bedrock region if it's different from your default AWS region
AWS_REGION_NAME=us-west-2

# You can use your access key for authentication, but IAM roles are recommended
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...

When using Amazon Bedrock, use the full-flavor image instead of the slim one, as it includes boto3, which Bedrock requires:

docker run -it --env-file .env ghcr.io/iwamot/collmbo:latest-full
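
Whichever provider you choose, recreate the container after editing .env so the new variables take effect. A minimal sketch with the standard docker CLI (stop the attached container with Ctrl+C, or docker stop if it runs detached):

docker run -it --env-file .env ghcr.io/iwamot/collmbo:latest-slim  # use latest-full for Bedrock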

Deployment

Collmbo does not serve any endpoints; it connects to Slack via Socket Mode (which is why the app-level token is needed), so it can run in any environment with outbound internet access.
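
For example, on any Docker-capable host you could keep it running as a long-lived background container. This is only a sketch using standard docker flags, not a project-specific requirement:

docker run -d --restart unless-stopped --env-file .env ghcr.io/iwamot/collmbo:latest-slim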

Features

Configuration

Collmbo runs with default settings, but you can customize its behavior by setting optional environment variables.
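
Optional variables go in the same .env file, or can be passed ad hoc with docker's -e flag. The variable name below is a hypothetical placeholder for illustration only; see the project's configuration reference for the real settings:

docker run -it --env-file .env -e EXAMPLE_OPTION=value ghcr.io/iwamot/collmbo:latest-slim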

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request.

Before opening a PR, please run:

./validate.sh

This helps maintain code quality.

Related Projects

License

The code in this repository is licensed under the MIT License.

The Collmbo icon (assets/icon.png) is licensed under CC BY-NC-SA 4.0. For example, you may use it as a Slack profile icon.
