github Lexus2016/claude-flow v1.1.0
v1.1.0 — Bridge any AI model to Claude Code CLI


What's New

Rewritten Documentation

The README has been completely rewritten to properly explain the real problem claude-flow solves:

The Three Walls

1. Response Format Incompatibility
Every AI provider (DeepSeek, OpenAI, Gemini, GLM, Qwen, Llama) returns responses in its own format. Claude Code expects Anthropic's specific streaming events, content blocks, tool_use structures, and stop reasons. claude-flow routes requests through Anthropic-compatible proxy endpoints that translate between formats.

2. Authentication Trap
Claude Code has an undocumented quirk: ANTHROPIC_API_KEY must be set to an empty string ("") for proxy providers — not absent, not unset. claude-flow handles this automatically.
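A minimal sketch of the quirk. Only the empty-string ANTHROPIC_API_KEY requirement comes from the release notes; the ANTHROPIC_BASE_URL value is a hypothetical example of a proxy setup:

```shell
# Wrong: unsetting the key makes Claude Code fall back to its own auth flow.
unset ANTHROPIC_API_KEY

# Right: the variable must exist but be empty, so the proxy's own auth is used.
export ANTHROPIC_API_KEY=""
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"   # hypothetical proxy URL

# Verify: the variable is set, and set to the empty string.
[ -n "${ANTHROPIC_API_KEY+x}" ] && [ -z "$ANTHROPIC_API_KEY" ] && echo "key is set-but-empty"
```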

3. Configuration Complexity
Six environment variables, no official documentation, and a different combination for each provider. One wrong setting means a crash or a silent failure. claude-flow gets it right with one command.
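As an illustration only, here is roughly the kind of environment `eval $(claude-flow env)` produces for an OpenRouter setup. The variable names (apart from ANTHROPIC_API_KEY) and all values are assumptions about a typical Claude-Code-to-proxy configuration, not claude-flow's actual output:

```shell
# Hypothetical sketch — not claude-flow's real output.
export ANTHROPIC_API_KEY=""                            # must be "" (see wall #2)
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"  # proxy endpoint (example)
export ANTHROPIC_MODEL="deepseek/deepseek-chat"        # main model (example)
export ANTHROPIC_SMALL_FAST_MODEL="meta-llama/llama-3.1-8b-instruct"  # background model (example)
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"      # key the proxy checks (assumption)
export API_TIMEOUT_MS="600000"                         # generous timeout for slow models (example)
```

Getting any one of these wrong by hand is exactly the failure mode described above, which is why a single setup command is the safer path.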

Architecture Diagram

Added clear architecture diagrams in all three README languages showing the flow:

Claude Code CLI → Provider's Anthropic-compatible endpoint → Any AI Model
                  (translates response format)
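The flow above can be sketched as a single request: Claude Code speaks the standard Anthropic Messages protocol, just pointed at a provider's base URL. The DeepSeek URL and model name here are assumptions; substitute your provider's documented values:

```shell
# Sketch of the Anthropic-style Messages request Claude Code emits,
# redirected to a provider's Anthropic-compatible endpoint (assumed URL).
BASE_URL="https://api.deepseek.com/anthropic"
BODY='{"model": "deepseek-chat", "max_tokens": 64,
       "messages": [{"role": "user", "content": "Hello"}]}'

# The provider answers in Anthropic's shape (content blocks, stop_reason),
# so Claude Code can consume the response unchanged.
curl -s "$BASE_URL/v1/messages" \
  -H "x-api-key: $DEEPSEEK_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d "$BODY"
```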

Supported Providers

  • OpenRouter — 200+ models (GPT, DeepSeek, Gemini, GLM, Llama, Qwen, Mistral...)
  • DeepSeek — V3, R1 (reasoning) via native Anthropic-compatible API
  • OpenAI — GPT-4.1, o3
  • Gemini — 2.5 Pro, 2.5 Flash
  • Custom — Any Anthropic-compatible endpoint

Documentation

  • 🇬🇧 English
  • 🇺🇦 Українська (Ukrainian)
  • 🇷🇺 Русский (Russian)

Install

npm install -g claude-flow
claude-flow setup openrouter
eval $(claude-flow env)
claude -p "Hello from any AI model!"

Zero dependencies. Node.js >= 18.
