YTM (YouTube Transcripts Machine)

Deploy with Vercel

Overview

YTM (YouTube Transcripts Machine) is a web application that automatically extracts timestamps and transcripts from any YouTube video. It uses browser automation with Stagehand and BrowserBase to navigate to YouTube videos, extract transcript data, and present it in a clean, user-friendly interface.

Features

  • Easy URL Input: Simply paste any YouTube video URL to extract its transcript
  • Timestamped Transcripts: View the complete transcript with accurate timestamps
  • Interactive Timestamps: Click on any timestamp to jump to that exact point in the video (see the sketch after this list)
  • Export Options: Copy the entire transcript to clipboard or download as a text file
  • Responsive Design: Works seamlessly on desktop and mobile devices
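
The interactive timestamps come down to YouTube's standard t URL parameter. A minimal sketch of that conversion, using hypothetical helper names that are not part of this repo:

// Convert an "HH:MM:SS" or "MM:SS" transcript timestamp into whole seconds.
export function timestampToSeconds(timestamp: string): number {
  return timestamp.split(":").map(Number).reduce((total, part) => total * 60 + part, 0);
}

// Build a YouTube URL that starts playback at the given timestamp.
export function buildSeekUrl(videoId: string, timestamp: string): string {
  return `https://www.youtube.com/watch?v=${videoId}&t=${timestampToSeconds(timestamp)}s`;
}

// buildSeekUrl("dQw4w9WgXcQ", "1:02:03") -> "https://www.youtube.com/watch?v=dQw4w9WgXcQ&t=3723s"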

How It Works

  1. Input: User enters a YouTube video URL
  2. Processing:
    • The app uses Stagehand to automate a browser session
    • It navigates to the YouTube video
    • Opens the transcript panel
    • Extracts all transcript entries with their timestamps (sketched in code after this list)
  3. Output: Displays the formatted transcript with clickable timestamps
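
A minimal sketch of that processing step as a Stagehand flow, assuming BrowserBase execution. The function name, act instructions, and extraction schema below are illustrative, not the repo's actual implementation:

import { Stagehand } from "@browserbasehq/stagehand";
import { z } from "zod";

async function extractTranscript(videoUrl: string) {
  // Run the browser session on BrowserBase rather than locally.
  const stagehand = new Stagehand({ env: "BROWSERBASE" });
  await stagehand.init();
  const page = stagehand.page;

  await page.goto(videoUrl);
  // Natural-language actions; the exact phrasing the app uses may differ.
  await page.act("expand the video description");
  await page.act("click the 'Show transcript' button");

  // Ask the model for structured output: one entry per transcript line.
  const { entries } = await page.extract({
    instruction: "extract every transcript entry with its timestamp and text",
    schema: z.object({
      entries: z.array(z.object({ timestamp: z.string(), text: z.string() })),
    }),
  });

  await stagehand.close();
  return entries;
}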

Technology Stack

  • Frontend: Next.js with React
  • Browser Automation: Stagehand SDK
  • Cloud Execution: BrowserBase
  • AI Processing: OpenAI's GPT models for transcript extraction

Getting Started

Prerequisites

  • Node.js and npm
  • OpenAI API key
  • BrowserBase API key and project ID (for cloud execution)

Installation

# Clone the repository
git clone https://github.com/zaidmukaddam/youtube-transcripts-machine.git
cd youtube-transcripts-machine

# Install dependencies
npm install

# Set up environment variables
cp .example.env .env.local
# Add your API keys to .env.local
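
With the default OpenAI setup, .env.local ends up holding values like the ones below. The BrowserBase variable names follow the usual Stagehand conventions and are an assumption here; check .example.env for the exact names this repo expects.

# .env.local (example values)
OPENAI_API_KEY=your_openai_api_key
BROWSERBASE_API_KEY=your_browserbase_api_key
BROWSERBASE_PROJECT_ID=your_browserbase_project_id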

Configuration

This project can be configured to use different LLM providers:

Using OpenAI (Default)

# In .env
OPENAI_API_KEY=your_openai_api_key

Using Anthropic Claude

  1. Add your API key to .env
ANTHROPIC_API_KEY=your_anthropic_api_key
  2. Update stagehand.config.ts (see the sketch after this list):
modelName: "claude-3-5-sonnet-latest"
modelClientOptions: { apiKey: process.env.ANTHROPIC_API_KEY }
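
For reference, the model-related fields of stagehand.config.ts look roughly like this. This is an abridged sketch based on Stagehand's standard config shape, not a copy of the repo's file:

import type { ConstructorParams } from "@browserbasehq/stagehand";

const StagehandConfig: ConstructorParams = {
  env: "BROWSERBASE",
  apiKey: process.env.BROWSERBASE_API_KEY,
  projectId: process.env.BROWSERBASE_PROJECT_ID,
  // Swap these two fields to switch between OpenAI and Anthropic.
  modelName: "claude-3-5-sonnet-latest",
  modelClientOptions: { apiKey: process.env.ANTHROPIC_API_KEY },
};

export default StagehandConfig;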

Running Locally

npm run dev
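
Then open the URL printed by the Next.js dev server (typically http://localhost:3000) and paste a YouTube URL to test the extraction.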

Deploying to Production

The easiest way to deploy is using Vercel:

  1. Click the "Deploy with Vercel" button above
  2. Configure your environment variables
  3. Deploy!

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is open source and available under the MIT License.

Acknowledgements

  • Built with Stagehand
  • Powered by BrowserBase
  • Uses OpenAI's GPT-4o mini model for transcript processing