LLMChat logo

The most intuitive all-in-one AI chat interface.

Key Features

  • 🧠 Multiple LLM Providers: Supports a range of language model providers, including local models via Ollama.
  • πŸ”Œ Plugins Library: Enhance functionality with an expandable plugin system, including function calling capabilities.
  • 🌐 Web Search Plugin: Allows AI to fetch and utilize real-time web data.
  • πŸ€– Custom Assistants: Create and tailor AI assistants for specific tasks or domains.
  • πŸ—£οΈ Text-to-Speech: Converts AI-generated text responses to speech using Whisper.
  • πŸŽ™οΈ Speech-to-Text: (Coming soon) Enables voice input for more natural interaction.
  • πŸ’Ύ Local Storage: Securely store data locally in the browser with IndexedDB for faster access and privacy (see the storage sketch after this list).
  • πŸ“€πŸ“₯ Data Portability: Easily import or export chat data for backup and migration.
  • πŸ“š Knowledge Spaces: (Coming soon) Build custom knowledge bases for specialized topics.
  • πŸ“ Prompt Library: Use pre-defined prompts to guide AI conversations efficiently.
  • πŸ‘€ Personalization: Memory plugin ensures more contextual and personalized responses.
  • πŸ“± Progressive Web App (PWA): Installable on various devices for a native-like app experience.
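
As a rough illustration of the local-storage feature above, the sketch below persists chat messages in the browser with PGlite backed by IndexedDB. The database name, table layout, and helper names (`messages`, `thread_id`, `saveMessage`, and so on) are assumptions made for this example, not the app's actual schema or storage code.

    import { PGlite } from "@electric-sql/pglite";

    // A Postgres database persisted in the browser's IndexedDB.
    // "llmchat-local" is an assumed database name for this example.
    const db = new PGlite("idb://llmchat-local");

    // Assumed schema: one row per chat message, grouped by thread.
    export async function initStorage(): Promise<void> {
      await db.exec(`
        CREATE TABLE IF NOT EXISTS messages (
          id SERIAL PRIMARY KEY,
          thread_id TEXT NOT NULL,
          role TEXT NOT NULL,
          content TEXT NOT NULL,
          created_at TIMESTAMPTZ DEFAULT now()
        );
      `);
    }

    // Save a message locally; nothing is sent to a server.
    export async function saveMessage(threadId: string, role: string, content: string): Promise<void> {
      await db.query(
        "INSERT INTO messages (thread_id, role, content) VALUES ($1, $2, $3)",
        [threadId, role, content],
      );
    }

    // Load a thread's history, oldest message first.
    export async function loadThread(threadId: string) {
      const result = await db.query(
        "SELECT role, content FROM messages WHERE thread_id = $1 ORDER BY created_at",
        [threadId],
      );
      return result.rows;
    }

Because PGlite is a Postgres engine running in WebAssembly on top of IndexedDB, plain SQL like this could also back the data-portability (export/import) feature, though the project's real implementation may differ.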

Tech Stack

  • 🌍 Next.js
  • πŸ”€ TypeScript
  • πŸ—‚οΈ Pglite
  • 🧩 LangChain
  • πŸ“¦ Zustand
  • πŸ”„ React Query
  • πŸ—„οΈ Supabase
  • 🎨 Tailwind CSS
  • ✨ Framer Motion
  • πŸ–ŒοΈ Shadcn
  • πŸ“ Tiptap

Roadmap

  • πŸŽ™οΈ Speech-to-Text: Coming soon.
  • πŸ“š Knowledge Spaces: Coming soon.

Quick Start

To get the project running locally:

Prerequisites

  • Ensure you have yarn or bun installed.

Installation

  1. Clone the repository:

    git clone https://github.com/trendy-design/llmchat.git
    cd llmchat
  2. Install dependencies:

    yarn install
    # or
    bun install
  3. Start the development server:

    yarn dev
    # or
    bun dev
  4. Open your browser and navigate to http://localhost:3000.


Deployment

Instructions for deploying the project will be added soon.