Closed as not planned
Labels: 3.12 (only security fixes), interpreter-core (Objects, Python, Grammar, and Parser dirs), topic-profiling, type-crash (a hard crash of the interpreter, possibly with a core dump)
Description
Crash report
What happened?
If you don't stop the Py4J session, PySpark leaks its callback thread; with PYTHONPERFSUPPORT=1 set, the interpreter then exits with a segfault during shutdown. I tested this against a Python built from the 3.12.8 branch.
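For context, PYTHONPERFSUPPORT=1 enables CPython 3.12's perf trampoline (the same thing as running python -X perf). The sketch below shows the equivalent runtime toggle via the sys API, only to illustrate what the environment variable switches on; the sleep is a placeholder for profiled work:
import sys
import time

# Equivalent to PYTHONPERFSUPPORT=1 or `python -X perf`: generate perf
# trampolines for Python frames so `perf` can symbolize them.
sys.activate_stack_trampoline("perf")
print(sys.is_stack_trampoline_active())  # True

time.sleep(1)  # stand-in for the code being profiled

sys.deactivate_stack_trampoline()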
# minimal_pyspark_leak.py
from pyspark.sql import SparkSession
import time
# Create a SparkSession but do NOT call spark.stop()
spark = SparkSession.builder \
    .appName("LeakTest") \
    .master("local[*]") \
    .getOrCreate()
print("Spark started, not stopping...")
# Sleep so Spark's Py4J callback thread stays alive across shutdown
time.sleep(3.12)
To obtain a segfault, I had to launch several instances in parallel, like so:
PYTHONPERFSUPPORT=1 python minimal_pyspark_leak.py &
PYTHONPERFSUPPORT=1 python minimal_pyspark_leak.py &
...
PYTHONPERFSUPPORT=1 python minimal_pyspark_leak.py &
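For reference, here is a dependency-free sketch of the suspected race, assuming the crash comes from a leaked non-main thread that is still executing Python frames through the perf trampoline while the interpreter finalizes. Whether this exact script segfaults is unverified; the PySpark script above is the confirmed reproducer, and the file name is hypothetical:
# suspected_race_sketch.py (hypothetical stand-in for the leaked Py4J thread)
import threading
import time

def spin():
    # Keep creating new Python frames so the perf trampoline stays busy
    # while the main thread tears down the interpreter.
    while True:
        sum(range(100))

threading.Thread(target=spin, daemon=True).start()
time.sleep(0.1)  # exit while the daemon thread is still running
Run it the same way: PYTHONPERFSUPPORT=1 python suspected_race_sketch.py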
CPython versions tested on:
3.12
Operating systems tested on:
Linux
Output from running 'python -VV' on the command line:
Python 3.12.8 (main, Nov 4 2025, 09:59:07) [GCC 15.1.0]