LLM Analytics proposal #67
base: master
Conversation
@martosaur FYI the LLM team will be reviewing this shortly :). This implementation is a little different from the one we have in other libraries, but I let them know this feels very Elixir-y and how good your skills are :)
rafaeelaudibert left a comment
This looks good! We'll start using sampo for our release management. Do you mind installing it (`cargo install sampo`) and running `sampo add` to add a changeset for this change? It should be a minor release.
You might need to rebase on master to get the config we've just merged.
Thank you!
@typedoc "One of LLM Analytics events: `$ai_generation`, `$ai_trace`, `$ai_span`, `$ai_embedding`"
@type llm_event() :: PostHog.event()
Could we possibly type this as one of those strings?
Hmm, no, not with typespecs anyway. Maybe if we expanded event names to be either strings or atoms 🤔 I might look into it at some point. I wonder if this limitation would even be relevant with the new type system.
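For context, a sketch of the limitation being discussed: Elixir typespecs can enumerate atom literals in a union, but not string literals, so a string-valued event name can only be typed as `String.t()`. The module and type names below are illustrative, not from the PR:

```elixir
defmodule LLMEventTypes do
  # Typespecs cannot enumerate string literals, so a string-based event
  # name can only be given the broad type String.t() (i.e. binary()).
  @type llm_event_string() :: String.t()

  # Atom-based event names, by contrast, can be narrowed to an exact union.
  @type llm_event_atom() ::
          :"$ai_generation" | :"$ai_trace" | :"$ai_span" | :"$ai_embedding"
end
```

This is why narrowing the type would require expanding event names to accept atoms, as the comment suggests.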
BTW, I've talked to the LLM Analytics team here and this doesn't follow their standards, but they're happy to have this merged so that people can start using it. We'll note on the posthog.com docs that it's experimental and make it clear that it doesn't follow most of the LLM Analytics patterns in favor of following a very Elixir-like approach.
Force-pushed 1ce6271 to 4069a8e
Not sure what this sampo thing is, but it added something!
This works for me! I'll need to revisit the docs to see what the current LLM Analytics approach is, but we've been using this for months now and so far so good. We can also add
What's inside:

- `PostHog.LLMAnalytics` module with an interface that allows ergonomic span capture.
- `PostHog.Integrations.LLMAnalytics.Req` Req plugin that captures `$ai_generation` events when HTTP requests happen. It tries to collect as many special properties as possible. Currently it mostly knows about the OpenAI Responses and Completions endpoints.

The biggest decision here is how to manage spans. Spans and traces are usual guests in the tracing world, so I drew some inspiration from the OpenTelemetry interface. The resulting toolbox is:
- `set_session`, `set_trace` and `get_trace` for setting a global trace id for the current process
- `set_root_span` and `get_root_span` for setting a parent span id for the current process. All top-level spans captured in this process will have this span as their parent.
- `capture_span` for capturing a span of any of the LLM Analytics types.
- `start_span` and `capture_current_span`. All started spans are kept in a stack in the process dictionary and consumed by calling `capture_current_span`.

I tried it in a real application and the interface seems OK.
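To make the toolbox above concrete, here is a hypothetical usage sketch. The function names come from the list above, but the argument shapes and property keys are assumptions for illustration, not the actual signatures:

```elixir
# Hypothetical usage of the span toolbox described above.
# Argument shapes are assumptions; consult the module docs for real signatures.
alias PostHog.LLMAnalytics

# Set a trace id for the current process; top-level spans join this trace.
LLMAnalytics.set_trace("trace-123")

# Start a span; it is pushed onto a stack in the process dictionary.
LLMAnalytics.start_span("$ai_span", %{"$ai_span_name" => "summarize"})

# ... make the LLM call here ...

# Pop the span started above and capture it.
LLMAnalytics.capture_current_span()

# Or capture a one-off span directly, without the stack:
LLMAnalytics.capture_span("$ai_generation", %{"$ai_model" => "gpt-4o"})
```

The stack-in-process-dictionary design means nested `start_span` calls naturally produce parent/child spans within a single process, mirroring how OpenTelemetry scopes the "current span".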
TO DO

- `$ai_session_id` lol