
Commit 2307d25

harsh543 and claude committed

refactor: address review feedback on LoggerAdapter extra modes

- Remove json mode (no known use case, simplifies code)
- Merge key/prefix params into single key param (prefix derived via replace)
- Revert unrelated README changes
- Parameterize unit tests, remove json-mode tests
- Remove json integration tests from test_activity and test_workflow

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

1 parent d3c5ad0 commit 2307d25

File tree

7 files changed: +103 −477 lines changed

README.md

Lines changed: 3 additions & 20 deletions

@@ -1018,21 +1018,10 @@ By default the sandbox completely reloads non-standard-library and non-Temporal
 the sandbox quicker and use less memory when importing known-side-effect-free modules, they can be marked
 as passthrough modules.
 
-**Passthrough modules are about import-time behavior and determinism, not just about whether a module is third-party.**
-Any module that is side-effect-free on import and makes only deterministic calls can be passed through, including
-first-party application modules, third-party libraries, and non-workflow-specific code. The key criteria are:
-1. The module does not have import-time side effects (e.g., file I/O, network calls, random values)
-2. Calls made from the module are deterministic within workflow code
-
-**For performance and behavior reasons, users are encouraged to pass through all non-workflow imports whose calls will be
+**For performance and behavior reasons, users are encouraged to pass through all third party modules whose calls will be
 deterministic.** In particular, this advice extends to modules containing the activities to be referenced in workflows,
 and modules containing dataclasses and Pydantic models, which can be particularly expensive to import.
 
-**Note on datetime and similar stdlib modules:** If you need to use non-deterministic functions like `datetime.date.today()`,
-do **not** mark `datetime` as passthrough. Instead, use `with_child_unrestricted()` to allow specific invalid members
-(see [Invalid Module Members](#invalid-module-members) below). Passthrough affects import reloading, while
-`invalid_module_members` controls which calls are allowed at runtime.
-
 One way to pass through a module is at import time in the workflow file using the `imports_passed_through` context
 manager like so:

@@ -1082,7 +1071,7 @@ Note, some calls from the module may still be checked for invalid calls at runti
 
 `SandboxRestrictions.invalid_module_members` contains a root matcher that applies to all module members. This already
 has a default set which includes things like `datetime.date.today()` which should never be called from a workflow. To
-remove this restriction and allow a specific call like `datetime.date.today()`:
+remove this restriction:
 
 ```python
 my_restrictions = dataclasses.replace(

@@ -1094,12 +1083,6 @@ my_restrictions = dataclasses.replace(
 my_worker = Worker(..., workflow_runner=SandboxedWorkflowRunner(restrictions=my_restrictions))
 ```
 
-**This is the correct approach for allowing non-deterministic stdlib calls.** Do not use passthrough modules for this
-purpose—passthrough controls import reloading, while `invalid_module_members` controls runtime call restrictions.
-
-For a complete example showing both passthrough modules and unrestricted invalid members, see the
-[pydantic_converter sample worker configuration](https://github.com/temporalio/samples-python/blob/4303a9b15f4ddc4cd770bc0ba33afef90a25d3ae/pydantic_converter/worker.py#L45-L65).
-
 Restrictions can also be added by `|`'ing together matchers, for example to restrict the `datetime.date` class from
 being used altogether:

@@ -1146,7 +1129,7 @@ To mitigate this, users should:
 
 * Define workflows in files that have as few non-standard-library imports as possible
 * Alter the max workflow cache and/or max concurrent workflows settings if memory grows too large
-* Set non-workflow imports as passthrough modules if they are known to be side-effect free on import and deterministic
+* Set third-party libraries as passthrough modules if they are known to be side-effect free
 
 ###### Extending Restricted Classes
 

temporalio/_log_utils.py

Lines changed: 11 additions & 39 deletions

@@ -5,11 +5,10 @@
 
 from __future__ import annotations
 
-import json
 from collections.abc import Mapping, MutableMapping
 from typing import Any, Literal
 
-TemporalLogExtraMode = Literal["dict", "flatten", "json"]
+TemporalLogExtraMode = Literal["dict", "flatten"]
 """Mode controlling how Temporal context is added to log record extra.
 
 Values:

@@ -20,48 +19,41 @@
     namespaced prefix. Values that are not primitives (str/int/float/bool)
     are converted to strings. This mode is recommended for OpenTelemetry
     and other logging pipelines that require flat, scalar attributes.
-  json: Add context as a JSON string under a single key. Useful when
-    downstream systems expect string values but you want structured data.
 """
 
 
 def _apply_temporal_context_to_extra(
     extra: MutableMapping[str, Any],
     *,
     key: str,
-    prefix: str,
     ctx: Mapping[str, Any],
     mode: TemporalLogExtraMode,
 ) -> None:
     """Apply temporal context to log record extra based on the configured mode.
 
     Args:
         extra: The mutable extra dict to update.
-        key: The key to use for dict/json modes (e.g., "temporal_workflow").
-        prefix: The prefix to use for flatten mode keys (e.g., "temporal.workflow").
+        key: The base key (e.g., "temporal_workflow"). In dict mode this is
+            used directly. In flatten mode the prefix is derived by replacing
+            underscores with dots (e.g., "temporal.workflow").
         ctx: The context mapping containing temporal fields.
         mode: The mode controlling how context is added.
     """
-    if mode == "dict":
-        extra[key] = dict(ctx)
-    elif mode == "json":
-        extra[key] = json.dumps(ctx, separators=(",", ":"), default=str)
-    elif mode == "flatten":
+    if mode == "flatten":
+        prefix = key.replace("_", ".")
         for k, v in ctx.items():
             # Ensure value is a primitive type safe for OTel attributes
             if not isinstance(v, (str, int, float, bool, type(None))):
                 v = str(v)
             extra[f"{prefix}.{k}"] = v
     else:
-        # Fallback to dict for any unknown mode (shouldn't happen with typing)
         extra[key] = dict(ctx)
 
 
 def _update_temporal_context_in_extra(
     extra: MutableMapping[str, Any],
     *,
     key: str,
-    prefix: str,
     update_ctx: Mapping[str, Any],
     mode: TemporalLogExtraMode,
 ) -> None:

@@ -71,38 +63,18 @@ def _update_temporal_context_in_extra(
 
     Args:
         extra: The mutable extra dict to update.
-        key: The key used for dict/json modes (e.g., "temporal_workflow").
-        prefix: The prefix used for flatten mode keys (e.g., "temporal.workflow").
+        key: The base key (e.g., "temporal_workflow"). In dict mode this is
+            used directly. In flatten mode the prefix is derived by replacing
+            underscores with dots (e.g., "temporal.workflow").
         update_ctx: Additional context fields to add/update.
         mode: The mode controlling how context is added.
     """
-    if mode == "dict":
-        extra.setdefault(key, {}).update(update_ctx)
-    elif mode == "json":
-        # For JSON mode, we need to parse, update, and re-serialize
-        existing = extra.get(key)
-        if existing is not None:
-            try:
-                existing_dict = json.loads(existing)
-                existing_dict.update(update_ctx)
-                extra[key] = json.dumps(
-                    existing_dict, separators=(",", ":"), default=str
-                )
-            except (json.JSONDecodeError, TypeError):
-                # If parsing fails, just create a new JSON object with update_ctx
-                extra[key] = json.dumps(
-                    dict(update_ctx), separators=(",", ":"), default=str
-                )
-        else:
-            extra[key] = json.dumps(
-                dict(update_ctx), separators=(",", ":"), default=str
-            )
-    elif mode == "flatten":
+    if mode == "flatten":
+        prefix = key.replace("_", ".")
         for k, v in update_ctx.items():
             # Ensure value is a primitive type safe for OTel attributes
             if not isinstance(v, (str, int, float, bool, type(None))):
                 v = str(v)
             extra[f"{prefix}.{k}"] = v
     else:
-        # Fallback to dict for any unknown mode
         extra.setdefault(key, {}).update(update_ctx)
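After this commit the apply helper keeps only the two modes. Its behavior can be sketched stand-alone as follows (the function body mirrors the post-change diff above; the name `apply_temporal_context` and the sample context values are illustrative, not SDK API):

```python
from collections.abc import Mapping, MutableMapping
from typing import Any, Literal

TemporalLogExtraMode = Literal["dict", "flatten"]


def apply_temporal_context(
    extra: MutableMapping[str, Any],
    *,
    key: str,
    ctx: Mapping[str, Any],
    mode: TemporalLogExtraMode,
) -> None:
    # Mirrors the post-commit helper: in flatten mode the prefix is
    # derived from the key by replacing underscores with dots.
    if mode == "flatten":
        prefix = key.replace("_", ".")
        for k, v in ctx.items():
            # Coerce non-primitives to strings (OTel-safe scalar attributes)
            if not isinstance(v, (str, int, float, bool, type(None))):
                v = str(v)
            extra[f"{prefix}.{k}"] = v
    else:
        extra[key] = dict(ctx)


extra_dict: dict[str, Any] = {}
apply_temporal_context(
    extra_dict, key="temporal_workflow", ctx={"run_id": "abc", "attempt": 2}, mode="dict"
)
print(extra_dict)  # {'temporal_workflow': {'run_id': 'abc', 'attempt': 2}}

extra_flat: dict[str, Any] = {}
apply_temporal_context(
    extra_flat, key="temporal_workflow", ctx={"run_id": "abc", "attempt": 2}, mode="flatten"
)
print(extra_flat)  # {'temporal.workflow.run_id': 'abc', 'temporal.workflow.attempt': 2}
```

Deriving the prefix via `key.replace("_", ".")` is what lets the commit drop the separate `prefix` parameter everywhere the helper is called.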

temporalio/activity.py

Lines changed: 1 addition & 3 deletions

@@ -504,8 +504,7 @@ class LoggerAdapter(logging.LoggerAdapter):
         temporal_extra_mode: Controls how activity context is added to log
             ``extra``. Default is ``"dict"`` (current behavior). Set to
             ``"flatten"`` for OpenTelemetry compatibility (scalar attributes
-            with ``temporal.activity.`` prefix), or ``"json"`` for a single JSON
-            string value.
+            with ``temporal.activity.`` prefix).
     """
 
     def __init__(self, logger: logging.Logger, extra: Mapping[str, Any] | None) -> None:

@@ -535,7 +534,6 @@ def process(
         _apply_temporal_context_to_extra(
             extra,
             key="temporal_activity",
-            prefix="temporal.activity",
             ctx=context.logger_details,
             mode=self.temporal_extra_mode,
         )
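How flatten-mode extra surfaces on a log record can be shown with the standard library alone. This is a hypothetical stand-in (not the SDK's `LoggerAdapter`; the context dict and adapter name are invented for the demo), but the key derivation matches the diff: `"temporal_activity"` becomes the `temporal.activity.` prefix.

```python
import logging


class FlattenDemoAdapter(logging.LoggerAdapter):
    """Hypothetical stand-in for the SDK adapter's flatten mode."""

    def process(self, msg, kwargs):
        extra = dict(kwargs.get("extra") or {})
        # Stand-in activity context; the real adapter pulls this from
        # context.logger_details.
        ctx = {"activity_id": "1", "attempt": 2}
        prefix = "temporal_activity".replace("_", ".")  # "temporal.activity"
        for k, v in ctx.items():
            extra[f"{prefix}.{k}"] = v
        kwargs["extra"] = extra
        return msg, kwargs


# Capture emitted records instead of writing them anywhere.
records: list[logging.LogRecord] = []
handler = logging.Handler()
handler.emit = records.append
logger = logging.getLogger("flatten-demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.propagate = False

FlattenDemoAdapter(logger, {}).info("activity ran")
# The dotted keys land directly on the LogRecord's __dict__, which is
# what flat-attribute pipelines (e.g. OTel log exporters) read.
print(records[0].__dict__["temporal.activity.attempt"])  # 2
```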

temporalio/workflow.py

Lines changed: 1 addition & 4 deletions

@@ -1577,8 +1577,7 @@ class LoggerAdapter(logging.LoggerAdapter):
         temporal_extra_mode: Controls how workflow context is added to log
             ``extra``. Default is ``"dict"`` (current behavior). Set to
             ``"flatten"`` for OpenTelemetry compatibility (scalar attributes
-            with ``temporal.workflow.`` prefix), or ``"json"`` for a single JSON
-            string value.
+            with ``temporal.workflow.`` prefix).
 
     Values added to ``extra`` are merged with the ``extra`` dictionary from a
     logging call, with values from the logging call taking precedence. I.e. the

@@ -1616,7 +1615,6 @@ def process(
         _apply_temporal_context_to_extra(
             extra,
             key="temporal_workflow",
-            prefix="temporal.workflow",
             ctx=workflow_details,
             mode=self.temporal_extra_mode,
         )

@@ -1631,7 +1629,6 @@ def process(
         _update_temporal_context_in_extra(
             extra,
             key="temporal_workflow",
-            prefix="temporal.workflow",
             update_ctx=update_details,
             mode=self.temporal_extra_mode,
         )
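The second call site above uses the update helper, which merges additional fields into context already applied by the first. A minimal sketch of that merge, mirroring the post-commit `_update_temporal_context_in_extra` from the `_log_utils.py` diff (the stand-alone name is illustrative):

```python
from collections.abc import Mapping, MutableMapping
from typing import Any


def update_temporal_context(
    extra: MutableMapping[str, Any],
    *,
    key: str,
    update_ctx: Mapping[str, Any],
    mode: str,
) -> None:
    # Mirrors the post-commit update helper for the two remaining modes.
    if mode == "flatten":
        prefix = key.replace("_", ".")
        for k, v in update_ctx.items():
            # Coerce non-primitives to strings (OTel-safe scalar attributes)
            if not isinstance(v, (str, int, float, bool, type(None))):
                v = str(v)
            extra[f"{prefix}.{k}"] = v
    else:
        # dict mode: merge into the existing nested context dict in place
        extra.setdefault(key, {}).update(update_ctx)


# Context applied earlier by the apply helper, now extended with
# hypothetical update-handler fields.
extra = {"temporal_workflow": {"run_id": "abc"}}
update_temporal_context(
    extra, key="temporal_workflow", update_ctx={"update_id": "u1"}, mode="dict"
)
print(extra["temporal_workflow"])  # {'run_id': 'abc', 'update_id': 'u1'}
```

In dict mode `setdefault(key, {}).update(...)` preserves fields set by the earlier apply call, which is why dropping the json mode (parse, update, re-serialize) removes the most complex branch of the old code.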

0 commit comments