RubyLLM 1.3.0.rc1: Isolation, Configuration & Model Expansion 🚀
We're excited to introduce RubyLLM 1.3.0.rc1 with isolated configuration contexts, expanded provider support, and major improvements to attachments and Rails integration.
🔄 Isolation with Contexts
Introducing scoped configuration with Contexts - perfect for multi-tenant applications, environment targeting, and testing:
```ruby
# Create an isolated configuration context
context = RubyLLM.context do |config|
  config.openai_api_key = ENV['TENANT_A_OPENAI_KEY']
  config.anthropic_api_key = ENV['TENANT_A_ANTHROPIC_KEY']
  config.request_timeout = 180 # Longer timeout just for this context
end

# Use the isolated context for specific operations
embedding = context.embed("Text for tenant A")
response = context.chat.ask("Question from tenant A")

# Global configuration remains untouched
RubyLLM.embed("This uses the default configuration")
```
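In a multi-tenant app you typically want exactly one context per tenant, built lazily and reused across requests. Here is a minimal sketch of such a registry; `TenantContexts` is a hypothetical helper (not part of RubyLLM), and the factory block is injectable so the caching logic can be exercised without real API keys:

```ruby
# Hypothetical helper: lazily build and cache one configuration context
# per tenant. The factory block decides how a context is created, so in
# production it would call RubyLLM.context with tenant-specific keys.
class TenantContexts
  def initialize(&build)
    @build = build
    @contexts = {}
  end

  # Returns the cached context for `tenant`, creating it on first use.
  def for(tenant)
    @contexts[tenant] ||= @build.call(tenant)
  end
end

# Possible usage with RubyLLM (assumes per-tenant keys live in ENV):
#
#   contexts = TenantContexts.new do |tenant|
#     RubyLLM.context do |config|
#       config.openai_api_key = ENV["#{tenant.upcase}_OPENAI_KEY"]
#     end
#   end
#
#   contexts.for("tenant_a").chat.ask("Question from tenant A")
```

Because contexts are isolated, each tenant's requests use only its own credentials while the global configuration stays untouched.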
📎 Smart Attachments with Auto Type Detection
The new attachment system works everywhere - not just Rails! RubyLLM automatically detects file types and handles the appropriate API formatting:
```ruby
# Send a single file - auto-detects whether it's an image, PDF, or audio
chat.ask("What's in this file?", with: "diagram.png")

# Send multiple files - all types are detected automatically
chat.ask("Analyze these files", with: [
  "report.pdf",
  "chart.jpg",
  "recording.mp3"
])

# Manual categorization still works if you prefer it
chat.ask("What's in this?", with: {
  image: "diagram.png",
  audio: "recording.mp3"
})
```
🚂 Enhanced Rails Integration
The Rails integration now supports attachments with ActiveStorage, making it easy to work with uploaded files:
```ruby
# After setting up ActiveStorage
class Message < ApplicationRecord
  acts_as_message
  has_many_attached :attachments # Add this for attachment support
end

# Works seamlessly with Rails uploads
chat_record.ask("Analyze this", with: params[:uploaded_file])

# Works with existing ActiveStorage attachments
chat_record.ask("What's in my document?", with: user.profile_document)

# Send multiple files from a form
chat_record.ask("Review these documents", with: params[:files])
```
The persistence flow has also been enhanced with better error handling and support for validation-focused approaches.
💻 Ollama Integration
Local LLMs are now first-class citizens with Ollama support:
```ruby
RubyLLM.configure do |config|
  config.ollama_api_base = 'http://localhost:11434/api'
end

# Use local models
chat = RubyLLM.chat(model: 'mistral')
```
🔀 OpenRouter Support
Access hundreds of models through a single API:
```ruby
RubyLLM.configure do |config|
  config.openrouter_api_key = ENV['OPENROUTER_API_KEY']
end

# Access models from many providers through OpenRouter
chat = RubyLLM.chat(model: 'anthropic/claude-3.5-sonnet', provider: 'openrouter')
```
🏢 OpenAI Organization Support
Support for OpenAI organization and project IDs:
```ruby
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
  config.openai_organization_id = ENV['OPENAI_ORG_ID']
  config.openai_project_id = ENV['OPENAI_PROJECT_ID']
end
```
🌐 Better Model Information via Parsera API
We now use the Parsera API to fetch accurate, up-to-date model information when you run `RubyLLM.models.refresh!`. Parsera scrapes model details, pricing, and capabilities directly from provider websites, so you always get the most current data.
Check out the model browser built by Parsera for more information.
🔢 Custom Embedding Dimensions
Control the dimensionality of your embeddings:
```ruby
# Generate compact embeddings for specific use cases
embedding = RubyLLM.embed(
  "Ruby is a programmer's best friend",
  model: "text-embedding-3-small",
  dimensions: 512
)
```
🪵 Configurable Logging
Control where and how RubyLLM logs:
```ruby
RubyLLM.configure do |config|
  config.log_file = '/path/to/ruby_llm.log'
  config.log_level = :debug
end
```
🔧 Core Improvements
- Completely refactored Content implementation for simplified provider integration
- Fixed duplicate messages when calling `chat.to_llm` multiple times
- Enhanced media handling across providers (including PDF support)
- Fixed embedding when using default model
- Configuration.inspect now properly filters sensitive data
- Rails-style foreign keys for custom class names in ActiveRecord integration
- Fixed empty message cleanup on API failures
- Enhanced test coverage for streaming vs non-streaming consistency
This is a release candidate. Please test thoroughly, especially the new context and provider features!
```ruby
gem 'ruby_llm', '1.3.0.rc1'
```
PRs Merged
- Support Embedding Dimensions by @papgmez in #73
- Updated acts_as_* helpers to use canonical 'rails-style' foreign keys by @timaro in #151
- Add support for logging to file via configuration by @rhys117 in #148
- Use foreign_key instead of to_s for acts_as methods by @bborn in #157
- Fix inflector by @bborn in #159
- Handle OpenAI organization and project IDs by @xymbol in #162
- Don't call blob on a blob by @bborn in #164
New Contributors
- @papgmez made their first contribution in #73
- @timaro made their first contribution in #151
- @rhys117 made their first contribution in #148
- @bborn made their first contribution in #157
- @xymbol made their first contribution in #162
Full Changelog: 1.2.0...1.3.0rc1