## 1.10.0 (2025-12-04)

### Features
- add global filters negation (#888) (a332ded)
- add message deletion to prompt playground chat (#855) (2a820ee)
- event sourcing 2, electric boogaloo (#860) (19eeecf)
- improve trace message loading with retry logic and caching (#852) (4a76815)
### Bug Fixes
- add cap and no decimal (#851) (e0efafd)
- add robust FormErrorDisplay component for form validation errors (#870) (c3d50a5)
- bump litellm and other dependencies to the latest version to fix bedrock embedding issues (b522e89)
- hotfix: fix api_key being set breaking bedrock on newer versions of litellm, fix has_key being stored instead of the actual value, and allow empty bedrock values to fall back to env vars for on-prem deployments (ade0a84)
- hotfix: parse input and output for mastra (cdb8766)
- hotfix: remove eval usage limits (7a48498)
- intelligent min/max for model settings so that users aren't surprised by a failure (#859) (0b6b498)
- upgrade dependencies in langwatch/package.json & langwatch/package-lock.json to reduce vulnerabilities (a9eeb12)
- managed llm providers (bddfa70)
- otel trace/span id parsing was not decoding base64 span/trace ids (#861) (19abbac)
- prompts: make maxTokens and temperature optional in form schema (#913) (73a2705), closes #912
- prompts: make prompts.get throw error instead of returning null/undefined (#867) (9705201)
- python-sdk: add httpx.ReadTimeout to transient error skip list (#910) (dbdae14), closes #909
- resolve LLM config modal value reversion by adding proper format… (#874) (85daeb2)
- security: upgrade next from 15.5.4 to 15.5.7 (#914) (a9eeb12)
- type errors (#872) (f1f3333)