# RubyLLM 1.1.0.rc1: Cloud Expansion & Better Conversations 🎉
We're excited to introduce RubyLLM 1.1.0.rc1, expanding cloud provider support and enhancing conversation management. This release candidate brings AWS Bedrock support, improved system prompts, and much more.
## ☁️ AWS Bedrock Support
Access Claude models through your existing AWS infrastructure:
```ruby
RubyLLM.configure do |config|
  config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID')
  config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY')
  config.bedrock_region = ENV.fetch('AWS_REGION')
  config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN') # optional
end

# Using Claude through Bedrock
chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')
```
Great for teams who want to keep everything within their existing AWS environment.
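As a quick sketch of what that enables (assuming the Bedrock credentials above are configured), a Bedrock chat behaves like any other RubyLLM chat, including block-based streaming; the prompt here is purely illustrative:

```ruby
# Bedrock chats use the same API as other providers, including
# streaming response chunks to a block passed to `ask`.
chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')

chat.ask("What can you tell me about Amazon Bedrock?") do |chunk|
  print chunk.content
end
```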
## 🧠 Simpler System Instructions
We've introduced `with_instructions` for a cleaner way to set system prompts:
```ruby
chat = RubyLLM.chat
  .with_instructions("You are a helpful coding assistant that specializes in Ruby")
  .ask("How would I implement a binary search tree?")
```
You can also explicitly replace previous instructions instead of accumulating them:
```ruby
chat.with_instructions("New instructions", replace: true)
```
This works across all providers, with smart handling of differences between OpenAI, Anthropic, and others.
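A short sketch of the two behaviors side by side (the `replace:` keyword is from this release; the instruction strings are illustrative):

```ruby
chat = RubyLLM.chat

# Default: each call adds another system message
chat.with_instructions("You are a helpful assistant")
chat.with_instructions("Always answer in one sentence")

# With replace: true, previous instructions are discarded first
chat.with_instructions("You are a terse code reviewer", replace: true)
```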
## 🔍 Smarter Model Resolution
Model identification is now much smarter:
- Model Aliases: Use simple names like `gpt-4o` instead of `gpt-4o-2024-11-20`
- Provider-Specific Matching: Target specific providers with the same base model
- Exact Match Priority: Exact matches are prioritized over aliases
```ruby
# These all work now
chat = RubyLLM.chat(model: 'claude-3-5-sonnet')
chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'anthropic')
chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')
```
## 🛠️ Tool Improvements
Tools get even better with support for parameterless tools:
```ruby
class RandomNumber < RubyLLM::Tool
  description "Generates a random number between 1 and 100"

  def execute
    rand(1..100)
  end
end
```
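A hedged usage sketch: registering the tool with `with_tool` (the existing RubyLLM tool API) lets the model call it even though it takes no arguments; the prompt is illustrative:

```ruby
# Register the parameterless tool defined above and let the model call it.
chat = RubyLLM.chat
chat.with_tool(RandomNumber)

response = chat.ask("Pick a random number for me")
puts response.content
```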
## 🚄 Rails Enhancements
ActiveRecord integration improvements make Rails usage even better:
```ruby
# Now all these methods return self for chainability
chat = Chat.create!(model_id: 'gpt-4o-mini')
  .with_instructions("You are a helpful assistant")
  .with_tool(Calculator)
  .ask("What's 123 * 456?")
```
## 🔧 Technical Improvements
- Fixed multimodal inputs for Bedrock Anthropic
- Improved system prompt handling across all providers
- Enhanced streaming functionality with better error parsing
- Fixed model refreshing for read-only filesystems in production
- Improvements to DeepSeek and Anthropic providers
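For the read-only filesystem fix, a minimal sketch of refreshing the model registry at runtime instead of relying on a writable models file (`RubyLLM.models.refresh!` is the registry refresh call; the listing details may vary by version):

```ruby
# Refresh the in-memory model registry from provider APIs at boot,
# e.g. from a Rails initializer, rather than writing to disk.
RubyLLM.models.refresh!

puts "#{RubyLLM.models.all.size} models available"
```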
## 📚 Documentation Updates
We've updated our documentation to cover:
- Working with multiple providers
- Using the model registry effectively
- Deploying in production environments
This is a release candidate. Please try it out and report any issues before the final release!
```ruby
gem 'ruby_llm', '1.1.0.rc1'
```
## Contributors
Thanks to everyone who contributed to this release, especially @tpaulshippy for the AWS Bedrock implementation, @kieranklaassen for ActiveRecord improvements, and @seuros and @redox for their contributions.
## What's Changed
- Support tools without params by @redox in #62
- Switched from git ls-files to Ruby's Dir.glob for defining spec.file by @seuros in #84
- 16: Add support for Claude through Bedrock by @tpaulshippy in #50
- Fix refresh of Bedrock models and remove some expired credentials from cassettes by @tpaulshippy in #89
- fix acts_as delegation to return self instead of RubyLLM by @kieranklaassen in #82
## New Contributors
- @redox made their first contribution in #62
- @seuros made their first contribution in #84
- @tpaulshippy made their first contribution in #50
- @kieranklaassen made their first contribution in #82
Full Changelog: 1.0.1...1.1.0rc1