This is a snapshot release of the final Gradio PoC build.
This version was the first iteration of the application and is no longer supported.
If you are looking for a similar local-first/open-source replacement, please see either tldw_chatbook: https://github.com/rmusser01/tldw_chatbook or the Web Front-end for tldw_server:
Install instructions:
tldw_server – Installation Guide (Win/Mac/Linux)
Requirements
- Python 3.9+
- ffmpeg (media processing)
- 8GB+ RAM (3–4GB for server, rest for models)
- 10GB+ disk space
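A quick way to confirm the Python requirement is met before you start (the interpreter may be named python or python3 on your system):
# Should report Python 3.9 or newer
python3 --version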
1. Install System Dependencies
Linux
# Debian/Ubuntu
sudo apt install ffmpeg portaudio19-dev gcc build-essential python3-dev
# Fedora
sudo dnf install ffmpeg portaudio-devel gcc gcc-c++ python3-devel
macOS
brew install ffmpeg portaudio
Windows
- Install Python and ffmpeg from official sources.
- For CUDA transcription without full CUDA install:
Download [Faster-Whisper-XXL](https://github.com/Purfview/whisper-standalone-win/releases/download/Faster-Whisper-XXL/Faster-Whisper-XXL_r192.3.4_windows.7z), then extract cudnn_ops_infer64_8.dll and cudnn_cnn_infer64_8.dll into the tldw_server directory.
- Download/install ffmpeg from https://www.gyan.dev/ffmpeg/builds/. If you don't install it system-wide, move the ffmpeg/ffprobe binaries into the ./bin folder; this lets tldw use ffmpeg for transcription and file conversion.
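Whichever platform you are on, a quick sanity check that ffmpeg is reachable (assuming it is on your PATH; if you placed the binary in ./bin instead, invoke it as ./bin/ffmpeg):
# Should print the ffmpeg version banner
ffmpeg -version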
2. Download Software & Create Virtual Environment
# Download and extract this package to a location of your choosing, then navigate to that folder in a terminal.
cd tldw_server
python3 -m venv .venv
# Activate:
# Linux/macOS:
source .venv/bin/activate
# Windows:
.venv\Scripts\activate
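To confirm the virtual environment is active before installing anything, you can print the interpreter prefix; on any platform it should point inside the .venv folder:
# Prints the active environment's root; the path should end in .venv
python -c "import sys; print(sys.prefix)"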
3. Install Python Dependencies
pip install -r requirements.txt
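If the install fails with an older pip, upgrading pip inside the active virtual environment usually helps:
# Upgrade pip within the virtual environment
python -m pip install --upgrade pip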
4. Configure API Keys
cp config.txt.example config.txt
# Edit config.txt with your keys & settings
Alternatively, use environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.).
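For example, a minimal way to set keys via environment variables instead of config.txt (Linux/macOS shell syntax; the values shown are placeholders):
# Export keys for the current shell session
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
# Windows (cmd): set OPENAI_API_KEY=your-openai-key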
5. Run the Server
python summarize.py -gui
Optional – Docker Install
CPU
docker build -f Helper_Scripts/Dockerfiles/tldw_cpu_Dockerfile -t tldw-cpu .
docker run -p 8000:8000 tldw-cpu
GPU
docker build -f Helper_Scripts/Dockerfiles/tldw_nvidia_Dockerfile -t tldw-gpu .
docker run --gpus all -p 8000:8000 tldw-gpu
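If you rely on environment variables for API keys (step 4), you can pass them into the container at run time with standard docker -e flags; the key name below is one of those mentioned above and the value is a placeholder:
# Pass API keys into the container as environment variables
docker run -p 8000:8000 -e OPENAI_API_KEY="your-openai-key" tldw-cpu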