Incorrect capture of LLM Input with RubyLLM #71

@TL-David-Bauske

Description

I just tried integrating Braintrust into our RubyLLM-powered backend. In general it works fine, but in the UI I only see "#<RubyLLM::Content:0x000079b4abac44f8>" as the input instead of the actual prompt.

[Screenshot: Braintrust UI showing the Content object's inspect string as the input]

I use attachments, and I think that's the problem. From the SDK code:

    # Handle content
    if msg.respond_to?(:content) && msg.content
      # Convert Ruby hash notation to JSON string for tool results
      content = msg.content
      if msg.role.to_s == "tool" && content.is_a?(String) && content.start_with?("{:")
        # Ruby hash string like "{:location=>...}" - try to parse and re-serialize as JSON
        begin
          # Simple conversion: replace Ruby hash syntax with JSON
          content = content.gsub(/(?<=\{|, ):(\w+)=>/, '"\1":').gsub("=>", ":")
        rescue
          # Keep original if conversion fails
        end
      end
      formatted["content"] = content
    end
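For context, the tool-result branch above works as advertised for string content. A minimal sketch of the same conversion, run on a made-up sample tool result (the string literal is my own example, not from the SDK):

```ruby
require "json"

# Sample Ruby-hash-style tool result (hypothetical example data)
ruby_hash_string = '{:location=>"Berlin", :unit=>"celsius"}'

# Same two-step conversion the SDK snippet uses:
# 1. turn symbol keys (":key=>" after "{" or ", ") into JSON keys
# 2. replace any remaining "=>" with ":"
json_string = ruby_hash_string
  .gsub(/(?<=\{|, ):(\w+)=>/, '"\1":')
  .gsub("=>", ":")

puts json_string                 # {"location":"Berlin", "unit":"celsius"}
JSON.parse(json_string)          # now parses as valid JSON
```

So the problem is not this branch; it's the implicit assumption that content is already a String (or stringifies usefully) in the first place.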

It assumes that msg.content is a string or can be converted to a meaningful string via to_s. That's not true when attachments are present. From RubyLLM's code (v1.9.1), in the Message class:

    def content
      if @content.is_a?(Content) && @content.text && @content.attachments.empty?
        @content.text
      else
        @content
      end
    end

So, if there is an attachment, the full Content object is returned, not a string. And Content does not include its text in its to_s method.
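To illustrate (a minimal sketch using a hypothetical stand-in class, since RubyLLM itself isn't loaded here): a plain Ruby object's default to_s is just the inspect-style "#<ClassName:0x...>", which is exactly what ends up in the UI when the serializer falls back to stringification.

```ruby
# Stand-in for a Content-like object that does not override #to_s
class FakeContent
  attr_reader :text, :attachments

  def initialize(text, attachments)
    @text = text
    @attachments = attachments
  end
end

content = FakeContent.new("What is in this image?", [:some_attachment])

# Default to_s yields the object's class and address, not the prompt text
puts content.to_s                         # e.g. "#<FakeContent:0x000079b4...>"
puts content.to_s.include?(content.text)  # false
```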

I think the fix is to do something like this:

    content = msg.content
    content = content.text if content.respond_to?(:text) # or content.is_a?(RubyLLM::Content)
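Self-contained sketch of that fix, again using a stand-in class rather than the real RubyLLM::Content (the helper name extract_content is mine, purely for illustration):

```ruby
# Hypothetical stand-in for RubyLLM::Content
FakeContent = Struct.new(:text, :attachments)

def extract_content(content)
  # RubyLLM::Message#content returns the raw Content object when
  # attachments are present; prefer its #text in that case
  content.respond_to?(:text) ? content.text : content
end

puts extract_content("plain prompt")                       # "plain prompt"
puts extract_content(FakeContent.new("What is this?", [])) # "What is this?"
```

Note this captures only the text; whether the SDK should also record something about the attachments is a separate design question.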

Labels: bug (Something isn't working)