
RubyLLM 1.11: xAI Provider & Grok Models πŸš€πŸ€–βš‘

This release welcomes xAI as a first-class provider, brings Grok models into the registry, and polishes docs around configuration and thinking. Plug in your xAI API key and start chatting with Grok in seconds.

πŸš€ xAI Provider (Hello, Grok!)

Use xAI’s OpenAI-compatible API via a dedicated provider and jump straight into chat:

RubyLLM.configure do |config|
  config.xai_api_key = ENV["XAI_API_KEY"]
end

chat = RubyLLM.chat(model: "grok-4-fast-non-reasoning")
response = chat.ask("What's the fastest way to parse a CSV in Ruby?")
response.content

  • xAI is now a first-class provider (:xai) with OpenAI-compatible endpoints under the hood.
  • Grok models are included in the registry, so you can pick one by name without extra wiring.
  • Streaming, tool calls, and structured output work the same way as with other OpenAI-compatible providers (see the tool sketch after the streaming example below).

Stream responses just like you’re used to:

chat = RubyLLM.chat(model: "grok-3-mini")

chat.ask("Summarize this PR in 3 bullets") do |chunk|
  print chunk.content
end
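
Tool calls follow the same flow against Grok. A minimal sketch, assuming the existing RubyLLM::Tool DSL; the Weather tool, its parameters, and the hard-coded result are purely illustrative:

class Weather < RubyLLM::Tool
  description "Looks up the current temperature for a pair of coordinates"
  param :latitude, desc: "Latitude, e.g. 52.5200"
  param :longitude, desc: "Longitude, e.g. 13.4050"

  def execute(latitude:, longitude:)
    # Illustrative only: call whatever weather service you like here.
    { temperature_c: 21.5, latitude: latitude, longitude: longitude }
  end
end

chat = RubyLLM.chat(model: "grok-4-fast-non-reasoning")
chat.with_tool(Weather).ask("What's the weather in Berlin (52.5200, 13.4050)?")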

🧩 Model Registry Refresh

Model metadata and the public models list were refreshed to include Grok models and related updates.
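If you want to see what landed, you can query the registry directly. A quick sketch, assuming the existing RubyLLM.models interface; treat the exact method and attribute names as illustrative:

# List the model IDs the xAI provider now exposes.
RubyLLM.models.by_provider("xai").each { |model| puts model.id }

# Or look a model up by name before starting a chat.
RubyLLM.models.find("grok-3-mini")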

πŸ“š Docs Polishes

  • Configuration docs now include xAI setup examples.
  • The thinking guide got a tighter flow and clearer examples.

πŸ› οΈ Provider Fixes

  • Resolved an error in the OpenAI, Bedrock, and Anthropic providers introduced by the new URI interface.

Installation

gem "ruby_llm", "1.11.0"

Upgrading from 1.10.x

bundle update ruby_llm

Merged PRs

Full Changelog: 1.10.0...1.11.0
