Conversation

Contributor

@martosaur martosaur commented Nov 8, 2025

What's inside:

  1. PostHog.LLMAnalytics, a module with an interface that allows ergonomic span capture.
  2. PostHog.Integrations.LLMAnalytics.Req, a Req plugin that captures $ai_generation events when HTTP requests happen. It tries to collect as many special properties as possible; currently it mostly knows about the OpenAI Responses and Completions endpoints.
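A hypothetical usage sketch of the Req plugin. The `attach/1` name follows the common Req plugin convention and is an assumption, as are the endpoint and options shown:

```elixir
# Hypothetical sketch: attach/1 follows the usual Req plugin convention;
# the exact function name and options are assumptions, not the real API.
req =
  Req.new(base_url: "https://api.openai.com/v1")
  |> PostHog.Integrations.LLMAnalytics.Req.attach()

# Requests made with this struct would emit an $ai_generation event,
# with model, token usage, and similar properties pulled from the response:
Req.post!(req, url: "/responses", json: %{model: "gpt-4o", input: "Hello"})
```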

The biggest decision here is how to manage spans. Spans and traces are familiar concepts in the tracing world, so I drew some inspiration from the OpenTelemetry interface. The resulting toolbox is:

  • set_session, set_trace, and get_trace for setting a global trace id for the current process
  • set_root_span and get_root_span for setting a parent span id for the current process. All top-level spans captured in this process will have this span as their parent.
  • capture_span for capturing a span of any LLM Analytics type.
  • start_span and capture_current_span. All started spans are kept on a stack in the process dictionary and consumed by calling capture_current_span.

I tried it in a real application and the interface seems ok.
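A sketch of how the toolbox above fits together. Only the function names come from the list; the argument shapes and exact arities are assumptions:

```elixir
alias PostHog.LLMAnalytics

# Assumed argument shapes; only the function names are from the interface above.
LLMAnalytics.set_trace("trace-123")      # all spans in this process share this trace id
LLMAnalytics.set_root_span("span-root")  # parent for top-level spans in this process

# One-shot capture:
LLMAnalytics.capture_span("$ai_span", %{"$ai_span_name" => "summarize"})

# Stacked capture: start_span pushes onto the process-dictionary stack,
# capture_current_span pops and captures the most recent one:
LLMAnalytics.start_span("$ai_span", %{"$ai_span_name" => "retrieve"})
# ... do work; nested start_span calls become children ...
LLMAnalytics.capture_current_span()
```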

TO DO

  • PR description
  • Docs
  • Test in real project
  • Update to have new $ai_session_id lol

@martosaur martosaur marked this pull request as ready for review November 15, 2025 18:33
@rafaeelaudibert rafaeelaudibert requested a review from a team November 17, 2025 13:08
@rafaeelaudibert
Member

@martosaur FYI the LLM team will be reviewing this shortly :). This implementation is a little bit different from the one we have in other libraries, but I let them know how Elixir-y this feels and how good your skills are :)

Member

@rafaeelaudibert rafaeelaudibert left a comment


This looks good! We'll start using sampo for our release management. Do you mind installing it (`cargo install sampo`) and running `sampo add` to add a changeset for this change? It should be a minor release.

You might need to rebase on master to get the config we've just merged.

Thank you!

Comment on lines +166 to +167
@typedoc "One of LLM Analytics events: `$ai_generation`, `$ai_trace`, `$ai_span`, `$ai_embedding`"
@type llm_event() :: PostHog.event()
Member


Could we possibly type this as one of those strings?

Contributor Author


Hmm, no, not with typespecs anyway. Maybe if we expanded event names to be either strings or atoms 🤔 I might look into it at some point. I wonder if this limitation would even be relevant with the new type system.
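To illustrate the limitation being discussed: Erlang/Elixir typespecs support singleton atom types but not singleton binary (string) types, so a literal union of the four event names is only expressible if they were atoms. A hypothetical sketch (module and type names are illustrative):

```elixir
defmodule LLMEventTypes do
  # Hypothetical illustration of the typespec limitation discussed above.
  # Singleton atoms are valid typespec members, so atom event names could
  # be typed as a literal union:
  @type llm_event_atom() ::
          :"$ai_generation" | :"$ai_trace" | :"$ai_span" | :"$ai_embedding"

  # With string event names, the closest expressible type stays general,
  # since typespecs cannot constrain a binary's contents:
  @type llm_event() :: String.t()
end
```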

@rafaeelaudibert
Member

BTW, I've talked to the LLM Analytics team here: this doesn't follow their standards, but they're happy to have it merged so that people can start using it. We'll mark it as experimental in the posthog.com docs and make it clear that it doesn't follow most of the LLM Analytics patterns in favor of a very Elixir-like approach.

@martosaur martosaur force-pushed the am-llm-analytics-spans branch from 1ce6271 to 4069a8e Compare January 20, 2026 05:31
@martosaur
Contributor Author

Not sure what this sampo thing is, but it added something!

> BTW, i've talked to the LLM Analytics team here and this doesn't follow their standards but they're happy to have this merged so that people can start using it. We'll note on posthog.com docs as experimental and make it clear that it doesn't follow most of the LLM Analytics patterns in favor of following a very Elixir-like approach.

This works for me! I'll need to revisit the docs to see what the current LLM Analytics approach is, but we've been using this for months now and so far so good. We could also add Experimental to the module name so that people know not to rely on this too much. Something like PostHog.Experimental.LLMAnalytics.
