RubyLLM 1.12.1: Agent API Delegation + Rails add_message Persistence + Dependency Compatibility 🎉🤖🛠️
This is a focused patch release.
RubyLLM 1.12.1 tightens Agent behavior, fixes Rails chat persistence in add_message, and relaxes dependency constraints for better compatibility.
🤖 Agent API: Full Chat Delegation via Forwardable
Agents now delegate the full RubyLLM::Chat instance API to the wrapped chat object using Ruby’s Forwardable.
This also fixes the `undefined method 'delegate' for class RubyLLM::Agent` error seen in plain Ruby (PORO) setups, where ActiveSupport's `delegate` macro isn't available.
Delegated methods now include core accessors and fluent config methods like:
`model`, `messages`, `tools`, `params`, `headers`, `schema`, `ask`, `say`, `complete`, `add_message`, `reset_messages!`, `with_model`, `with_tools`, `with_params`, `with_headers`, `with_schema`, etc.
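Under the hood this relies on Ruby's standard library Forwardable rather than ActiveSupport's `delegate`, which is why it works for plain Ruby objects. As a rough illustration only (not the actual RubyLLM::Agent source; the class name and delegated method list are assumptions based on the API above), the pattern looks like this:

```ruby
require "forwardable"

# Illustrative sketch of Forwardable-based delegation in a PORO.
# Not the actual RubyLLM::Agent implementation.
class PlainAgent
  extend Forwardable

  # Forward chat methods straight to the wrapped chat instance.
  def_delegators :@chat,
                 :model, :messages, :ask, :say, :complete, :add_message,
                 :with_model, :with_tools, :with_params, :with_headers, :with_schema

  def initialize(chat)
    @chat = chat
  end
end
```

With the delegation in place, an agent can be configured and driven like a chat: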
```ruby
agent = WorkAssistant.new
agent.with_model("gpt-5-nano")
agent.add_message(role: :user, content: "Summarize this thread")
response = agent.complete
```

Rails: Chat#add_message Now Persists Properly
Rails-backed chats now persist messages correctly when using add_message (not just ask/legacy flows).
```ruby
chat = Chat.find(params[:chat_id])
chat.add_message(role: :user, content: params[:content]) # now persisted
```

Also included in this fix:
- tool-call linkage persistence for added messages
- attachment/content persistence handling improvements
`create_user_message` remains as a compatibility wrapper (legacy/deprecated path).
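As a quick illustration (a sketch assuming the standard RubyLLM Rails integration with its generated messages table; the content below is hypothetical), messages added this way now show up as persisted rows:

```ruby
# Sketch only: assumes a Chat model using RubyLLM's Rails integration and
# the generated messages table (role/content columns).
chat = Chat.find(params[:chat_id])
chat.add_message(role: :system, content: "Answer in one sentence.")
chat.add_message(role: :user, content: "What changed in 1.12.1?")

chat.messages.count           # includes the rows added above
chat.messages.last.persisted? # => true
```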
📎 Attachment Robustness for Rails Multipart Inputs
RubyLLM::Content now ignores blank/nil attachment placeholder entries (common in Rails multipart arrays), preventing noisy failures when attachments include empty values.
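For example (a hypothetical controller sketch; the form field name is made up, and passing the raw multipart array via the chat API's `with:` keyword is an assumption to adapt to your setup), a blank placeholder entry alongside a real upload no longer causes a failure:

```ruby
# Hypothetical Rails controller sketch. Multiple-file inputs often submit a
# blank "" placeholder alongside real uploads; RubyLLM::Content now skips it.
class ChatMessagesController < ApplicationController
  def create
    chat = Chat.find(params[:chat_id])

    # params[:files] may look like ["", #<ActionDispatch::Http::UploadedFile ...>]
    chat.ask(params[:content], with: params[:files])

    redirect_to chat
  end
end
```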
📦 Dependency Compatibility Update
Dependency constraints were updated to reduce unnecessary pinning friction:
- ruby_llm-schema: `~> 0.2.1` → `~> 0`
- marcel: `~> 1.0` → `~> 1`
Installation
gem "ruby_llm", "1.12.1"Upgrading from 1.12.0
```bash
bundle update ruby_llm
```

Full Changelog: 1.12.0...1.12.1