github dtzp555-max/ocp v2.4.0
v2.4.0 — Stable local API bridge for Claude, with real streaming


v2.4.0: stop treating Claude like a chat tab

openclaw-claude-proxy is not trying to imitate Claude's native UI.
It is the stable local API bridge for the rest of your tool stack.

If the Claude Code channel is the native in-app experience, OCP is the bridge that makes Claude usable from everything else: OpenClaw, Cursor, Continue, Open WebUI, LangChain, custom routers, and any OpenAI-compatible client.
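As a sketch of what "OpenAI-compatible" means in practice: any of those clients speaks the standard chat-completions request shape, so pointing them at OCP is just a base-URL change. The host, port, and route below are illustrative assumptions, not OCP's documented defaults.

```python
# Hypothetical endpoint — OCP's actual host/port/route depend on your local config.
OCP_CHAT_URL = "http://localhost:8765/v1/chat/completions"

def chat_request(model: str, prompt: str, stream: bool = True) -> dict:
    """Build the OpenAI-style chat body that any compatible client sends."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # as of v2.4.0, streamed responses are real, not buffered
    }

body = chat_request("claude-sonnet", "Summarize this diff.")
```

Any OpenAI-compatible SDK that lets you override the base URL can produce this same payload against OCP unchanged.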

What's actually better in v2

No more warm-pool fragility

  • v2 uses on-demand spawning
  • no pre-warmed worker pool
  • no stale workers
  • no degraded warm state to babysit

Real-world local workflow performance

  • real streaming instead of buffered upstream responses
  • adaptive timeout tiers by model class
  • faster and cleaner failover behavior
  • in practice, local tool and agent workflows are now competitive with Claude channel latency, while remaining easier to route and diagnose
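"Adaptive timeout tiers by model class" can be pictured as a lookup from model name to a per-class budget. The tier names below match the release notes; the specific budgets and the fallback are illustrative assumptions, not OCP's actual values.

```python
# Illustrative budgets (seconds) — OCP's real tiers are not enumerated here.
TIMEOUT_TIERS = {
    "opus": 300.0,    # heavier models get larger timeout budgets
    "sonnet": 120.0,
    "haiku": 60.0,
}
DEFAULT_TIMEOUT = 120.0  # assumed fallback for unrecognized models

def timeout_for(model: str) -> float:
    """Pick a timeout budget based on the model's class."""
    name = model.lower()
    for model_class, budget in TIMEOUT_TIERS.items():
        if model_class in name:
            return budget
    return DEFAULT_TIMEOUT
```

The point is that a slow-but-healthy Opus call is no longer judged by a one-size-fits-all deadline tuned for lighter models.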

More stable where channel-style workflows get brittle

  • better fit for headless automation
  • better fit for OpenAI-compatible tools that know nothing about the Claude channel
  • explicit /health, /sessions, diagnostics, auth state, and local observability
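A headless automation loop can gate on `/health` before routing traffic. The endpoint path comes from the release notes; the JSON response shape below (a `status` field) is an assumed example, not a documented schema.

```python
import json

def is_healthy(raw: bytes) -> bool:
    """Decide readiness from a /health response body.

    Assumes a JSON payload with a 'status' field — a plausible shape,
    not OCP's documented schema. In a live check you would fetch the
    body with your HTTP client of choice before calling this.
    """
    try:
        payload = json.loads(raw)
    except ValueError:
        return False  # malformed body counts as unhealthy
    return payload.get("status") == "ok"
```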

Security hardening

  • stricter request validation
  • stronger sanitization on the proxy boundary
  • safer behavior under malformed input / timeout edges
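Stricter validation at the proxy boundary amounts to rejecting malformed bodies before they reach a worker. The checks below are an illustrative minimum, not OCP's actual rule set.

```python
def validate_chat_request(body: dict) -> list:
    """Return a list of validation errors; an empty list means accept.

    Illustrative boundary checks only — OCP's real validation rules
    are not enumerated in these release notes.
    """
    errors = []
    model = body.get("model")
    if not isinstance(model, str) or not model:
        errors.append("model must be a non-empty string")
    messages = body.get("messages")
    if not isinstance(messages, list) or not messages:
        errors.append("messages must be a non-empty list")
    else:
        for i, msg in enumerate(messages):
            if not isinstance(msg, dict):
                errors.append(f"messages[{i}] must be an object")
            elif msg.get("role") not in {"system", "user", "assistant"}:
                errors.append(f"messages[{i}]: invalid role")
    return errors
```

Rejecting early like this is also what "safer behavior under malformed input" looks like: bad requests fail fast at the boundary instead of wedging a worker.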

Included in this release

  • Per-model circuit breaker — one model failing no longer poisons the whole proxy
  • Adaptive timeouts — Opus and other heavier models get saner timeout budgets
  • Real streaming — upstream output is actually streamed
  • Security hardening — tightened request handling and validation
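A per-model circuit breaker keeps failure state keyed by model, so tripping one model's circuit never blocks the others. This is a minimal sketch of the pattern; the threshold, cooldown, and half-open probe behavior are illustrative assumptions, not OCP's implementation.

```python
import time

class ModelCircuitBreaker:
    """Per-model breaker: one failing model trips only its own circuit."""

    def __init__(self, threshold: int = 3, cooldown: float = 30.0):
        self.threshold = threshold        # consecutive failures before opening
        self.cooldown = cooldown          # seconds before a probe is allowed
        self._failures = {}               # model -> consecutive failure count
        self._opened_at = {}              # model -> time the circuit opened

    def allow(self, model: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        opened = self._opened_at.get(model)
        if opened is None:
            return True
        if now - opened >= self.cooldown:
            # Half-open: cooldown elapsed, let one probe request through.
            del self._opened_at[model]
            self._failures[model] = 0
            return True
        return False

    def record_failure(self, model: str, now: float = None) -> None:
        now = time.monotonic() if now is None else now
        self._failures[model] = self._failures.get(model, 0) + 1
        if self._failures[model] >= self.threshold:
            self._opened_at[model] = now

    def record_success(self, model: str) -> None:
        self._failures[model] = 0
        self._opened_at.pop(model, None)
```

With failure state scoped per model, a wedged Opus upstream sheds only Opus traffic while Sonnet and Haiku requests keep flowing.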

Bottom line

Claude channel is great inside Claude Code. OCP is better as the stable local API bridge for everything else.

Full Changelog: v2.3.0...v2.4.0
