"Spawn" locks and semaphores crash during pickling even with a correct "spawn" process launch #125738

@ben-black-tec

Description

Bug report

Bug description:

"Spawn" locks and semaphores are crashing during the process argument pickling.

This issue is new as of Python 3.11; earlier versions are unaffected. The change that triggered it appears to be pull request #108378, which was backported to Python 3.11 and 3.12 but unfortunately does not seem to have correct semantics.

import multiprocessing as mp


def sub_task(lock_dict):
    with lock_dict['lock']:
        pass


def main():
    ctx = mp.get_context("spawn")
    lock = ctx.Lock()
    lock_dict = {"lock": lock}
    proc = ctx.Process(target=sub_task, args=(lock_dict,))
    proc.start()
    proc.join()


if __name__ == "__main__":
    main()

Gives the error:

Traceback (most recent call last):
  File "/home/ben/work/training/pipeline-lib/test/test_execution.py", line 59, in raises_from
    yield
  File "/home/ben/work/training/pipeline-lib/test/test_execution.py", line 354, in test_single_worker_unexpected_exit
    execute(tasks, parallelism)
  File "/home/ben/work/training/pipeline-lib/pipeline_lib/execution.py", line 49, in execute
    execute_mp(tasks, "spawn", inactivity_timeout=inactivity_timeout)
  File "/home/ben/work/training/pipeline-lib/pipeline_lib/mp_execution.py", line 658, in execute_mp
    process.start()
  File "/usr/lib/python3.11/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/context.py", line 288, in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/usr/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/usr/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/usr/lib/python3.11/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
  File "/usr/lib/python3.11/multiprocessing/synchronize.py", line 107, in __getstate__
    raise RuntimeError('A SemLock created in a fork context is being '
RuntimeError: A SemLock created in a fork context is being shared with a process in a spawn context. This is not supported. Please use the same context to create multiprocessing objects and Process.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/ben/work/training/pipeline-lib/test/test_execution.py", line 641, in <module>
    test_single_worker_unexpected_exit("process-spawn")
  File "/home/ben/work/training/pipeline-lib/test/test_execution.py", line 353, in test_single_worker_unexpected_exit
    with raises_from(pipeline_lib.pipeline_task.TaskError):
  File "/usr/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/ben/work/training/pipeline-lib/test/test_execution.py", line 66, in raises_from
    raise AssertionError(f"expected error of type {err_type} got error {err}")
AssertionError: expected error of type <class 'pipeline_lib.pipeline_task.TaskError'> got error A SemLock created in a fork context is being shared with a process in a spawn context. This is not supported. Please use the same context to create multiprocessing objects and Process.
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python3.11/multiprocessing/spawn.py", line 122, in spawn_main
    exitcode = _main(fd, parent_sentinel)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/spawn.py", line 132, in _main
    self = reduction.pickle.load(from_parent)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/synchronize.py", line 115, in __setstate__
    self._semlock = _multiprocessing.SemLock._rebuild(*state)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory
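Until this is fixed, one possible workaround (my suggestion, not part of the original report) is to share a manager-proxied lock instead of a raw SemLock. A `Manager().Lock()` returns a proxy object, so pickling it sends the manager's address rather than going through `synchronize.SemLock.__getstate__`, which is where the faulty context check lives. A minimal sketch of the same repro using a manager lock:

```python
import multiprocessing as mp


def sub_task(lock_dict):
    # The lock here is a manager proxy; acquiring it talks to the
    # manager process, so no SemLock state is ever pickled.
    with lock_dict["lock"]:
        pass


def main():
    ctx = mp.get_context("spawn")
    with ctx.Manager() as manager:
        lock_dict = {"lock": manager.Lock()}
        proc = ctx.Process(target=sub_task, args=(lock_dict,))
        proc.start()
        proc.join()
        return proc.exitcode


if __name__ == "__main__":
    main()
```

A manager lock is slower than a SemLock (every acquire/release is a round trip to the manager process), so this is a stopgap rather than a drop-in replacement.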

CPython versions tested on:

3.11

Operating systems tested on:

Linux

Metadata


Labels

type-bug: An unexpected behavior, bug, or error
