I have a class with a function that is slow the first time it's called (common for ML code such as TensorFlow and PyTorch). To get more consistent timing, this function does a warmup run during __init__. I'd like to make sure I don't profile this warmup run when running line_profiler. This is related to #30.
I tried to do this by modifying the explicit_profiler's internal _profile object, but it didn't work. Here is a minimal example:
import os
from time import sleep

os.environ["LINE_PROFILE"] = "1"
import line_profiler

class test:
    def __init__(self):
        self.warm = 0
        # Disable profiling for warmup
        line_profiler.explicit_profiler.profile._profile.disable()
        self.f()
        line_profiler.explicit_profiler.profile._profile.enable()

    @line_profiler.profile
    def f(self):
        if not self.warm:
            sleep(0.5)
            self.warm = 1
        return 5

o = test()
o.f()
I think this is a generally useful feature so I hope there can be a documented way to achieve this!
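In the meantime, one workaround that avoids touching profiler internals entirely is to call the undecorated function for the warmup run. Below is a minimal sketch using a stand-in decorator; it assumes the profiling decorator applies functools.wraps, which exposes the original function as __wrapped__ (line_profiler's decorator wraps the function, but relying on __wrapped__ being present is an assumption here, not documented API):

```python
import functools
from time import sleep

def profile(func):
    # Stand-in for a profiling decorator such as line_profiler.profile.
    # functools.wraps exposes the undecorated function as __wrapped__.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

class Model:
    def __init__(self):
        self.warm = 0
        # Warm up via the undecorated function so the profiler never
        # sees the slow first call.
        Model.f.__wrapped__(self)

    @profile
    def f(self):
        if not self.warm:
            sleep(0.01)  # stands in for the slow first-call work
            self.warm = 1
        return 5

m = Model()
m.f()
```

With line_profiler, the equivalent would be calling type(self).f.__wrapped__(self) in __init__, keeping the decorated path untouched for real calls.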
EDIT:
It looks like this works:

    line_profiler.explicit_profiler.profile._profile.enable_count = -1
    self.f()
    line_profiler.explicit_profiler.profile._profile.enable()

but I think there should be a more obvious way to do it.
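To make the disable/enable dance less error-prone, the two calls could be wrapped in a context manager so profiling is re-enabled even if the warmup raises. A sketch, assuming only that the profiler object (e.g. the _profile object used above) exposes disable() and enable():

```python
from contextlib import contextmanager

@contextmanager
def profiling_paused(prof):
    # prof is any object exposing disable()/enable(), such as the
    # line_profiler.explicit_profiler.profile._profile object above.
    prof.disable()
    try:
        yield
    finally:
        prof.enable()  # re-enabled even if the body raises

# Hypothetical usage inside __init__:
#     with profiling_paused(line_profiler.explicit_profiler.profile._profile):
#         self.f()
```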