github rohitg00/agentmemory v0.9.17
v0.9.17 — OpenAI-compat provider + telemetry id + Compare polish

OpenAI-compatible LLM provider lands the universal-adapter shape — one config (OPENAI_API_KEY + OPENAI_BASE_URL + OPENAI_MODEL) covers OpenAI, Azure OpenAI (auto-detected from hostname), DeepSeek, SiliconFlow, vLLM, LM Studio, Ollama (via /v1), and any future endpoint mirroring POST /v1/chat/completions. Worker telemetry now pins a stable project_name so engine metrics + traces attribute cleanly. agent-memory.dev Compare section no longer wraps awkwardly.

Added

  • OpenAI-compatible LLM provider (#307, @fatinghenji). Closes #185, #232 (Ollama works via OPENAI_BASE_URL=http://localhost:11434/v1), #312, supersedes #240.

  • Azure OpenAI auto-detection. An .openai.azure.com hostname swaps the Authorization: Bearer header for api-key, drops the /v1 path prefix, and appends an api-version=<version> query parameter (default 2024-08-01-preview, override via OPENAI_API_VERSION).

  • OPENAI_TIMEOUT_MS env var. The fetch is bounded by an AbortController (default 60s), and a timeout raises a clear error that names the env var. Bounding the other raw-fetch providers is tracked in #373.

  • OPENAI_REASONING_EFFORT passthrough. Forwarded as reasoning_effort in the request body for OpenAI reasoning models (o1, o3, gpt-*-reasoning) and providers that mirror that schema. Standard chat models reject the field with a 400; the README documents the caveat. When message.content comes back empty, the provider falls back to message.reasoning (the Ollama Cloud thinking-model shape).
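Putting the knobs above together, the request shaping can be sketched roughly like this. shapeRequest and its internals are illustrative names, not agentmemory's actual code:

```typescript
// Rough sketch of the universal-adapter request shape: one set of env vars,
// with Azure auto-detected from the hostname and reasoning_effort passed through.
type ShapedRequest = {
  url: string;
  headers: Record<string, string>;
  body: Record<string, unknown>;
};

function shapeRequest(
  env: Record<string, string | undefined>,
  messages: unknown[],
): ShapedRequest {
  const base = (env.OPENAI_BASE_URL ?? "https://api.openai.com/v1").replace(/\/+$/, "");
  const isAzure = new URL(base).hostname.endsWith(".openai.azure.com");

  const headers: Record<string, string> = { "content-type": "application/json" };
  let url: string;
  if (isAzure) {
    // Azure: api-key header instead of Bearer, api-version query parameter,
    // and the base URL already carries the deployment path (no /v1 prefix).
    headers["api-key"] = env.OPENAI_API_KEY ?? "";
    const version = env.OPENAI_API_VERSION ?? "2024-08-01-preview";
    url = `${base}/chat/completions?api-version=${version}`;
  } else {
    headers["authorization"] = `Bearer ${env.OPENAI_API_KEY ?? ""}`;
    url = `${base}/chat/completions`;
  }

  const body: Record<string, unknown> = { model: env.OPENAI_MODEL, messages };
  // Only reasoning-capable endpoints accept this; standard chat models 400 on it.
  if (env.OPENAI_REASONING_EFFORT) body.reasoning_effort = env.OPENAI_REASONING_EFFORT;
  return { url, headers, body };
}
```

The same function covers DeepSeek, vLLM, LM Studio, and Ollama unchanged, since they all sit behind OPENAI_BASE_URL and mirror POST /v1/chat/completions.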

Changed

  • telemetry.project_name pinned to "agentmemory" (#426). iii-sdk auto-detection produces inconsistent identifiers per host (agentmemory, node, npm, occasionally the user's home dir basename via npx). Pinning gives every install the same stable identifier in engine metrics + traces. Also pins language and framework.

  • OPENAI_API_KEY_FOR_LLM=false opt-out. detectLlmProviderKind now mirrors detectProvider's existing gate — users who set OPENAI_API_KEY only for embeddings won't see the LLM auto-activate. README leads with an explicit shared-use callout.

  • Compare section (#427). Title "AGENTMEMORY VS. THE FIELD" → "VS. THE FIELD" (eyebrow already says "VS."). text-wrap: balance now applies globally to .section-title. NATIVE PLUGINS cell "6 (Claude/Codex/OpenClaw/Hermes/pi/OpenHuman)" → "6" (the names are already shown in the Agents grid). Row grid rebalanced, plus word-break: break-word and 24px padding, so cells like "YES (APACHE-2.0)" have breathing room.
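The shared-key opt-out above is a small gate; a sketch of the logic, with a hypothetical function name standing in for detectLlmProviderKind's check:

```typescript
// Sketch: the LLM provider only auto-activates when OPENAI_API_KEY is set
// AND the user has not opted out via OPENAI_API_KEY_FOR_LLM=false
// (e.g. because the key is only meant for embeddings).
function llmProviderEnabled(env: Record<string, string | undefined>): boolean {
  if (!env.OPENAI_API_KEY) return false; // no key, nothing to activate
  return env.OPENAI_API_KEY_FOR_LLM !== "false"; // explicit opt-out wins
}
```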

Install

npm install -g @agentmemory/agentmemory@0.9.17
agentmemory

Try with any OpenAI-compatible endpoint:

# Standard OpenAI
OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-4o-mini

# DeepSeek
OPENAI_API_KEY=sk-...
OPENAI_BASE_URL=https://api.deepseek.com
OPENAI_MODEL=deepseek-chat

# Local Ollama
OPENAI_API_KEY=ollama
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.1:8b

# Azure (auto-detected by hostname)
OPENAI_API_KEY=...
OPENAI_BASE_URL=https://my-resource.openai.azure.com/openai/deployments/gpt-4o
OPENAI_API_VERSION=2024-08-01-preview
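Whichever endpoint you pick, the call site looks the same. A rough sketch of a timeout-bounded request plus the content → reasoning fallback, with illustrative names rather than the provider's actual internals:

```typescript
// Sketch: bounded fetch (OPENAI_TIMEOUT_MS, default 60s) and the fallback to
// message.reasoning when message.content is empty (thinking-model shape).
type Choice = { message: { content?: string | null; reasoning?: string } };

function extractText(choice: Choice): string {
  const { content, reasoning } = choice.message;
  return content && content.length > 0 ? content : (reasoning ?? "");
}

async function chat(
  url: string,
  headers: Record<string, string>,
  body: unknown,
): Promise<string> {
  const timeoutMs = Number(process.env.OPENAI_TIMEOUT_MS ?? 60_000);
  const res = await fetch(url, {
    method: "POST",
    headers,
    body: JSON.stringify(body),
    // AbortSignal.timeout gives the same bound as an AbortController + setTimeout.
    signal: AbortSignal.timeout(timeoutMs),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
  const json = (await res.json()) as { choices: Choice[] };
  return extractText(json.choices[0]);
}
```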

Full changelog: https://github.com/rohitg00/agentmemory/blob/main/CHANGELOG.md#0917--2026-05-16
