RubyLLM 1.11: xAI Provider & Grok Models 🎉🤖⚡
This release welcomes xAI as a first-class provider, brings Grok models into the registry, and polishes docs around configuration and thinking. Plug in your xAI API key and start chatting with Grok in seconds.
🚀 xAI Provider (Hello, Grok!)
Use xAI's OpenAI-compatible API via a dedicated provider and jump straight into chat:
```ruby
RubyLLM.configure do |config|
  config.xai_api_key = ENV["XAI_API_KEY"]
end

chat = RubyLLM.chat(model: "grok-4-fast-non-reasoning")
response = chat.ask("What's the fastest way to parse a CSV in Ruby?")
response.content
```

- xAI is now a first-class provider (`:xai`) with OpenAI-compatible endpoints under the hood.
- Grok models are included in the registry, so you can pick them by name without extra wiring.
- Streaming, tool calls, and structured output work the same way you already use them with OpenAI-compatible providers (see the sketch after this list).
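Tool calls, for instance, use the familiar `RubyLLM::Tool` pattern. Here's a minimal sketch, assuming a hypothetical `Weather` tool and that Grok handles tools like the other OpenAI-compatible providers do:

```ruby
# Hypothetical tool for illustration; any RubyLLM::Tool works the same way.
class Weather < RubyLLM::Tool
  description "Returns the current weather for a pair of coordinates"
  param :latitude, desc: "Latitude, e.g. 52.5200"
  param :longitude, desc: "Longitude, e.g. 13.4050"

  def execute(latitude:, longitude:)
    # A real implementation would call a weather API here.
    "15°C and partly cloudy at #{latitude}, #{longitude}"
  end
end

chat = RubyLLM.chat(model: "grok-3-mini").with_tool(Weather)
chat.ask("What's the weather in Berlin? (52.52, 13.405)")
```

Structured output goes through the usual `with_schema` call in the same way.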
Stream responses just like you're used to:

```ruby
chat = RubyLLM.chat(model: "grok-3-mini")
chat.ask("Summarize this PR in 3 bullets") do |chunk|
  print chunk.content
end
```

🧩 Model Registry Refresh
Model metadata and the public models list were refreshed to include Grok models and related updates.
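Curious what landed? You can query the refreshed registry the same way as for any other provider. A quick sketch, assuming the standard registry helpers (`RubyLLM.models`) cover the new provider:

```ruby
# List the Grok models registered under the new xAI provider.
RubyLLM.models.by_provider("xai").each { |model| puts model.id }

# Or pull the latest metadata from the providers' APIs first.
RubyLLM.models.refresh!
```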
📚 Docs Polishes
- Configuration docs now include xAI setup examples.
- The thinking guide got a tighter flow and clearer examples.
🛠️ Provider Fixes
- Fixed an error in the OpenAI, Bedrock, and Anthropic providers that was introduced by the new URI interface.
Installation

```ruby
gem "ruby_llm", "1.11.0"
```

Upgrading from 1.10.x

```bash
bundle update ruby_llm
```

Merged PRs
- Add xAI Provider by @infinityrobot and @crmne in #373
Full Changelog: 1.10.0...1.11.0