The most intuitive all-in-one AI chat interface.
- Multiple LLM Providers: Supports various language models, including Ollama.
- Plugins Library: Enhance functionality with an expandable plugin system, including function-calling capabilities.
- Web Search Plugin: Allows the AI to fetch and use real-time web data.
- Custom Assistants: Create and tailor AI assistants for specific tasks or domains.
- Text-to-Speech: Converts AI-generated text responses to speech using Whisper.
- Speech-to-Text: (Coming soon) Enables voice input for more natural interaction.
- Local Storage: Securely store data locally using in-browser IndexedDB for faster access and privacy (see the sketch after this list).
- Data Portability: Easily import or export chat data for backup and migration.
- Knowledge Spaces: (Coming soon) Build custom knowledge bases for specialized topics.
- Prompt Library: Use pre-defined prompts to guide AI conversations efficiently.
- Personalization: A memory plugin provides more contextual and personalized responses.
- Progressive Web App (PWA): Installable on various devices for a native-like app experience.
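Local storage relies on the browser's IndexedDB, so chat data never has to leave the user's device. As a minimal sketch of the general approach (the database name, store name, and record shape below are illustrative assumptions, not the app's actual schema or code):

```ts
// Minimal sketch of persisting chat messages in IndexedDB.
// "llmchat-demo", "messages", and the ChatMessage shape are hypothetical,
// chosen only to illustrate the pattern.

interface ChatMessage {
  id: string;
  sessionId: string;
  role: "user" | "assistant";
  content: string;
  createdAt: number;
}

function openChatDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open("llmchat-demo", 1);
    // Create the object store on first open / version upgrade.
    request.onupgradeneeded = () => {
      request.result.createObjectStore("messages", { keyPath: "id" });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function saveMessage(message: ChatMessage): Promise<void> {
  const db = await openChatDb();
  await new Promise<void>((resolve, reject) => {
    const tx = db.transaction("messages", "readwrite");
    tx.objectStore("messages").put(message);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```

Because everything is stored client-side, conversations stay private unless the user explicitly exports them.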
- Next.js
- TypeScript
- PGlite
- LangChain
- Zustand (see the store sketch after this list)
- React Query
- Supabase
- Tailwind CSS
- Framer Motion
- Shadcn
- Tiptap
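Client state is managed with Zustand. As a rough, hedged sketch of what a chat store might look like (the state shape and action names are assumptions for demonstration, not the project's real store):

```ts
// Illustrative Zustand store for chat messages.
// The ChatState shape and action names are hypothetical.
import { create } from "zustand";

interface ChatState {
  messages: { role: "user" | "assistant"; content: string }[];
  addMessage: (role: "user" | "assistant", content: string) => void;
  clear: () => void;
}

export const useChatStore = create<ChatState>((set) => ({
  messages: [],
  // Append immutably so subscribed components re-render.
  addMessage: (role, content) =>
    set((state) => ({ messages: [...state.messages, { role, content }] })),
  clear: () => set({ messages: [] }),
}));
```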
- Speech-to-Text: Coming soon.
- Knowledge Spaces: Coming soon.
To get the project running locally:
- Ensure you have `yarn` or `bun` installed.
- Clone the repository:

  ```bash
  git clone https://github.com/your-repo/llmchat.git
  cd llmchat
  ```

- Install dependencies:

  ```bash
  yarn install
  # or
  bun install
  ```

- Start the development server:

  ```bash
  yarn dev
  # or
  bun dev
  ```

- Open your browser and navigate to `http://localhost:3000`.
Instructions for deploying the project will be added soon.