fix: export thoughts_token_count to OpenTelemetry trace spans#4835
Open
brucearctor wants to merge 1 commit into google:main from
Conversation
Add `thoughts_token_count` as `gen_ai.usage.experimental.reasoning_tokens` span attribute in `trace_generate_content_result()` and `trace_inference_result()`, matching the existing pattern in `trace_call_llm()`. Fixes google#4829
Description
Fixes #4829
ADK's OpenTelemetry tracing does not export `thoughts_token_count` to span attributes. When using Gemini models with `ThinkingConfig`, the `usage_metadata` in `LlmResponse` correctly contains `thoughts_token_count`, but this field is never written to spans by `trace_generate_content_result()` or `trace_inference_result()`.

Interestingly, `trace_call_llm()` already exports this field (as `gen_ai.usage.experimental.reasoning_tokens`). This PR adds the same export to the two remaining functions that were missing it.

Changes
`src/google/adk/telemetry/tracing.py`
- Added `thoughts_token_count` → `gen_ai.usage.experimental.reasoning_tokens` span attribute export in `trace_generate_content_result()` (~line 746) and `trace_inference_result()` (~line 789)
- Uses the same `try/except AttributeError` guard pattern as `trace_call_llm()` for backward compatibility with older SDK versions

`tests/unittests/telemetry/test_spans.py`
- `test_trace_inference_result_with_thinking_tokens`: verifies the attribute is exported when `thoughts_token_count` is non-None
- `test_trace_inference_result_without_thinking_tokens`: verifies no attribute is set when `thoughts_token_count` is None
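The guard pattern above can be sketched roughly as follows. This is an illustrative stand-in, not the actual ADK code: only the field name (`thoughts_token_count`) and the attribute name come from the PR, while the helper function and the `FakeSpan`/`FakeUsage` classes are hypothetical.

```python
# Illustrative sketch of the try/except AttributeError guard pattern.
# Only the field name and span attribute name come from the PR; the
# helper itself is hypothetical.

def export_reasoning_tokens(span, usage_metadata):
  """Exports thoughts_token_count to the span when the field exists and is set."""
  try:
    thoughts = usage_metadata.thoughts_token_count
  except AttributeError:
    # Older SDK versions lack the field entirely; skip silently.
    return
  if thoughts is not None:
    span.set_attribute(
        'gen_ai.usage.experimental.reasoning_tokens', thoughts
    )


# Minimal stand-ins so the sketch runs without OpenTelemetry installed:
class FakeSpan:
  def __init__(self):
    self.attributes = {}

  def set_attribute(self, key, value):
    self.attributes[key] = value


class FakeUsage:
  def __init__(self, thoughts_token_count=None):
    self.thoughts_token_count = thoughts_token_count


span = FakeSpan()
export_reasoning_tokens(span, FakeUsage(thoughts_token_count=50))
print(span.attributes)  # → {'gen_ai.usage.experimental.reasoning_tokens': 50}
```

The guard keeps the tracing code tolerant of SDK versions where `UsageMetadata` has no `thoughts_token_count` attribute at all, while the `is not None` check avoids emitting the attribute when the model returned no thinking tokens.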
Testing Plan

Unit Tests
All 23 telemetry tests pass.
New tests specifically verify:
- `thoughts_token_count=50` → span attribute `gen_ai.usage.experimental.reasoning_tokens=50` is set
- `thoughts_token_count=None` → no `gen_ai.usage.experimental.reasoning_tokens` attribute on the span
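A pytest-style sketch of what those two checks might look like. The `FakeSpan` class and `export_reasoning_tokens` helper here are hypothetical stand-ins for illustration, not the actual fixtures or code in `test_spans.py`.

```python
# Hypothetical pytest-style sketch of the two new tests; FakeSpan and the
# helper are stand-ins, not the real fixtures in test_spans.py.
from types import SimpleNamespace

ATTR = 'gen_ai.usage.experimental.reasoning_tokens'


class FakeSpan:
  def __init__(self):
    self.attributes = {}

  def set_attribute(self, key, value):
    self.attributes[key] = value


def export_reasoning_tokens(span, usage_metadata):
  # Same guard pattern the PR applies inside the tracing functions.
  try:
    thoughts = usage_metadata.thoughts_token_count
  except AttributeError:
    return
  if thoughts is not None:
    span.set_attribute(ATTR, thoughts)


def test_trace_inference_result_with_thinking_tokens():
  span = FakeSpan()
  export_reasoning_tokens(span, SimpleNamespace(thoughts_token_count=50))
  assert span.attributes[ATTR] == 50


def test_trace_inference_result_without_thinking_tokens():
  span = FakeSpan()
  export_reasoning_tokens(span, SimpleNamespace(thoughts_token_count=None))
  assert ATTR not in span.attributes
```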
Verification

Before fix: `Event.usage_metadata.thoughts_token_count` is non-zero, but Cloud Trace spans only show `gen_ai.usage.input_tokens` and `gen_ai.usage.output_tokens`.

After fix: `gen_ai.usage.experimental.reasoning_tokens` appears alongside the existing token attributes in all three tracing functions.