RubyLLM 1.8.0: Video Support & Content Moderation 🎥🛡️

A major feature release that adds video file support for multimodal models and a content moderation API to help keep AI interactions safe.

🎥 Video File Support

Full video file support for models with video capabilities:

# Local video files
chat = RubyLLM.chat(model: "gemini-2.5-flash")
response = chat.ask("What happens in this video?", with: "video.mp4")

# Remote video URLs (with or without extensions)
response = chat.ask("Describe this video", with: "https://example.com/video")

# Multiple attachments including video
response = chat.ask("Compare these", with: ["image.jpg", "video.mp4"])

Features:

  • Automatic MIME type detection for video formats
  • Support for remote videos without file extensions
  • Seamless integration with existing attachment system
  • Full support for Gemini and VertexAI video-capable models

🛡️ Content Moderation

New content moderation API to identify potentially harmful content before it is sent to an LLM:

# Basic moderation
result = RubyLLM.moderate("User input text")
puts result.flagged?  # => true/false
puts result.flagged_categories  # => ["harassment", "hate"]

# Integration pattern - screen before chat
def safe_chat(user_input)
  moderation = RubyLLM.moderate(user_input)
  return "Content not allowed" if moderation.flagged?

  RubyLLM.chat.ask(user_input)
end

# Check specific categories
result = RubyLLM.moderate("Some text")
puts result.category_scores["harassment"]  # => 0.0234
puts result.category_scores["violence"]    # => 0.0012

Features:

  • Detects sexual, hate, harassment, violence, self-harm, and other harmful content
  • Convenience methods: flagged?, flagged_categories, category_scores
  • Currently supports OpenAI's moderation API
  • Extensible architecture for future providers
  • Configurable default model (defaults to omni-moderation-latest); a configuration sketch follows this list
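
A configuration sketch for the default moderation model. The setting name default_moderation_model is an assumption based on RubyLLM's usual configure block and may differ in the actual API:

RubyLLM.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  # Assumed setting name; the gem defaults to omni-moderation-latest regardless
  config.default_moderation_model = "omni-moderation-latest"
end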

🐛 Bug Fixes

Rails Inflection Issue

  • Fixed a critical bug where Rails apps defining their own Llm module or namespace would break due to inflection conflicts
  • RubyLLM now properly isolates its inflections

Migration Foreign Key Errors

  • Fixed the install generator creating migrations with foreign key references to tables that did not exist yet
  • Migrations now create tables first, then add references in the correct order
  • Prevents "relation does not exist" errors in PostgreSQL and other databases

Model Registry Improvements

  • Fixed Models.resolve instance method delegation
  • Fixed helper methods to return all models supporting specific modalities
  • image_models, audio_models, and embedding_models now correctly include all capable models (see the usage sketch below)
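
For example, assuming the RubyLLM.models registry and that these helpers return enumerable model records with an id attribute:

# List models by supported modality via the registry helpers fixed in this release
RubyLLM.models.embedding_models.each { |model| puts model.id }
puts RubyLLM.models.image_models.count
puts RubyLLM.models.audio_models.count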

📚 Documentation

  • Added comprehensive moderation guide with Rails integration examples
  • Updated video support documentation with examples
  • Clarified version requirements in documentation

Installation

gem 'ruby_llm', '1.8.0'

Upgrading from 1.7.x

bundle update ruby_llm

All changes are backward compatible. New features are opt-in.

Full Changelog: 1.7.1...1.8.0
